---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: xlsr-a-nomi
    results: []
---

# xlsr-a-nomi

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.3688
- WER: 0.3324
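
A minimal inference sketch for loading the checkpoint and transcribing a clip (the hub id `susmitabhatt/xlsr-a-nomi` is an assumption based on this card; substitute a local checkpoint path if needed):

```python
# Minimal inference sketch; "susmitabhatt/xlsr-a-nomi" is an assumed hub id.
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "susmitabhatt/xlsr-a-nomi"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-xlsr expects 16 kHz mono audio; resample on load.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```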

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 100
- mixed_precision_training: Native AMP
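
A sketch of how these settings map onto `transformers.TrainingArguments`; the `output_dir` is a placeholder, and model/dataset wiring is omitted:

```python
# Hyperparameters above expressed as TrainingArguments (sketch only).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlsr-a-nomi",       # placeholder
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed precision
)
```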

### Training results

| Training Loss | Epoch   | Step | Validation Loss | WER    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 4.7382        | 2.2727  | 200  | 2.5107          | 1.0    |
| 1.3613        | 4.5455  | 400  | 0.3782          | 0.5943 |
| 0.2498        | 6.8182  | 600  | 0.2562          | 0.4209 |
| 0.1205        | 9.0909  | 800  | 0.2575          | 0.3548 |
| 0.0772        | 11.3636 | 1000 | 0.2902          | 0.3432 |
| 0.0625        | 13.6364 | 1200 | 0.3199          | 0.3458 |
| 0.0461        | 15.9091 | 1400 | 0.2814          | 0.3351 |
| 0.0348        | 18.1818 | 1600 | 0.3389          | 0.3396 |
| 0.0323        | 20.4545 | 1800 | 0.3000          | 0.3423 |
| 0.0341        | 22.7273 | 2000 | 0.3097          | 0.3342 |
| 0.0271        | 25.0    | 2200 | 0.3270          | 0.3342 |
| 0.0236        | 27.2727 | 2400 | 0.3370          | 0.3423 |
| 0.0245        | 29.5455 | 2600 | 0.3201          | 0.3387 |
| 0.0143        | 31.8182 | 2800 | 0.3483          | 0.3315 |
| 0.0183        | 34.0909 | 3000 | 0.3245          | 0.3333 |
| 0.0149        | 36.3636 | 3200 | 0.3269          | 0.3342 |
| 0.0128        | 38.6364 | 3400 | 0.3180          | 0.3324 |
| 0.0121        | 40.9091 | 3600 | 0.3465          | 0.3387 |
| 0.0145        | 43.1818 | 3800 | 0.3465          | 0.3378 |
| 0.014         | 45.4545 | 4000 | 0.3181          | 0.3342 |
| 0.0101        | 47.7273 | 4200 | 0.3438          | 0.3333 |
| 0.0057        | 50.0    | 4400 | 0.3405          | 0.3387 |
| 0.0101        | 52.2727 | 4600 | 0.3508          | 0.3396 |
| 0.0084        | 54.5455 | 4800 | 0.3602          | 0.3360 |
| 0.0057        | 56.8182 | 5000 | 0.3369          | 0.3378 |
| 0.0143        | 59.0909 | 5200 | 0.3584          | 0.3387 |
| 0.0062        | 61.3636 | 5400 | 0.3748          | 0.3360 |
| 0.0068        | 63.6364 | 5600 | 0.3625          | 0.3369 |
| 0.006         | 65.9091 | 5800 | 0.3773          | 0.3369 |
| 0.0059        | 68.1818 | 6000 | 0.3666          | 0.3351 |
| 0.008         | 70.4545 | 6200 | 0.3597          | 0.3378 |
| 0.0061        | 72.7273 | 6400 | 0.3703          | 0.3396 |
| 0.0041        | 75.0    | 6600 | 0.3843          | 0.3387 |
| 0.0055        | 77.2727 | 6800 | 0.3829          | 0.3360 |
| 0.0028        | 79.5455 | 7000 | 0.3877          | 0.3378 |
| 0.0025        | 81.8182 | 7200 | 0.3898          | 0.3333 |
| 0.0021        | 84.0909 | 7400 | 0.3910          | 0.3342 |
| 0.0021        | 86.3636 | 7600 | 0.3889          | 0.3360 |
| 0.0025        | 88.6364 | 7800 | 0.3871          | 0.3342 |
| 0.0025        | 90.9091 | 8000 | 0.3787          | 0.3333 |
| 0.0016        | 93.1818 | 8200 | 0.3676          | 0.3307 |
| 0.0017        | 95.4545 | 8400 | 0.3651          | 0.3307 |
| 0.0015        | 97.7273 | 8600 | 0.3685          | 0.3324 |
| 0.0015        | 100.0   | 8800 | 0.3688          | 0.3324 |
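
The WER column is the standard word error rate. This card does not state the metric tooling used; a minimal sketch, assuming the `evaluate` library:

```python
# WER computation sketch with the `evaluate` library (assumed tooling;
# the card does not say how WER was computed during training).
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]
# WER = (substitutions + insertions + deletions) / reference word count
print(wer_metric.compute(predictions=predictions, references=references))
```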

### Framework versions

- Transformers 4.47.0.dev0
- PyTorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0