---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: xlsr-nm-clp
    results: []
---

# xlsr-nm-clp

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.3632
- WER: 0.5241
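
A minimal inference sketch, assuming the checkpoint was pushed to the Hub as `susmitabhatt/xlsr-nm-clp` (a hypothetical repo id inferred from this card) and that the processor was saved alongside the model:

```python
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical repo id; substitute the actual Hub path for this checkpoint.
model_id = "susmitabhatt/xlsr-nm-clp"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

def transcribe(speech):
    """Greedy CTC decoding; `speech` is a 1-D float array sampled at 16 kHz."""
    inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    pred_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(pred_ids)[0]
```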

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 100
- mixed_precision_training: Native AMP
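
A hedged sketch of the corresponding `TrainingArguments` (argument names follow the standard `transformers` Trainer API; the output directory is an assumption):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlsr-nm-clp",       # assumed output path
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=100,
    fp16=True,                      # Native AMP mixed precision
)
```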

### Training results

| Training Loss | Epoch   | Step | Validation Loss | WER    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 5.0552        | 4.8780  | 200  | 3.0646          | 1.0    |
| 3.0248        | 9.7561  | 400  | 2.9305          | 1.0    |
| 2.8381        | 14.6341 | 600  | 2.7349          | 1.0    |
| 2.2963        | 19.5122 | 800  | 1.9857          | 0.9550 |
| 1.3557        | 24.3902 | 1000 | 1.3196          | 0.7685 |
| 0.6411        | 29.2683 | 1200 | 1.3063          | 0.6881 |
| 0.394         | 34.1463 | 1400 | 1.2477          | 0.6527 |
| 0.2608        | 39.0244 | 1600 | 1.1584          | 0.6013 |
| 0.1804        | 43.9024 | 1800 | 1.2374          | 0.6013 |
| 0.1442        | 48.7805 | 2000 | 1.3478          | 0.5643 |
| 0.1264        | 53.6585 | 2200 | 1.2854          | 0.5740 |
| 0.0892        | 58.5366 | 2400 | 1.2293          | 0.5900 |
| 0.0813        | 63.4146 | 2600 | 1.2025          | 0.5482 |
| 0.0597        | 68.2927 | 2800 | 1.3339          | 0.5466 |
| 0.0495        | 73.1707 | 3000 | 1.4527          | 0.5595 |
| 0.0453        | 78.0488 | 3200 | 1.4188          | 0.5257 |
| 0.0402        | 82.9268 | 3400 | 1.2740          | 0.5289 |
| 0.0367        | 87.8049 | 3600 | 1.3237          | 0.5161 |
| 0.0324        | 92.6829 | 3800 | 1.3321          | 0.5177 |
| 0.0267        | 97.5610 | 4000 | 1.3632          | 0.5241 |
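
The WER column is the word error rate on the evaluation set, where lower is better. A minimal sketch of how it can be computed with the `evaluate` library (the prediction/reference strings below are placeholders, not data from this run):

```python
import evaluate

wer_metric = evaluate.load("wer")

# Placeholder strings; in training these come from decoding model outputs.
predictions = ["the quick brown fox", "hello world"]
references = ["the quick brown fox", "hello word"]

# WER = (substitutions + insertions + deletions) / words in the references
wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```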

### Framework versions

- Transformers 4.47.0.dev0
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0