---
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: tun_wav2vec_final
    results: []
---

# tun_wav2vec_final

This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set (a sketch of how these metrics can be recomputed follows the list):

- Loss: 1.3109
- Wer: 0.5737
- Cer: 0.1609
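
The word error rate (WER) and character error rate (CER) above can be reproduced with the `evaluate` library. This is a minimal sketch, not the card author's evaluation script: the `predictions` and `references` lists are hypothetical placeholders standing in for the model's transcripts and the evaluation-set transcripts.

```python
# Hedged sketch: recomputing WER/CER with the `evaluate` library (requires jiwer).
# `predictions` and `references` are hypothetical placeholders, not card data.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["transcript produced by the model"]
references = ["reference transcript from the evaluation set"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```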

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged configuration sketch follows the list):

- learning_rate: 0.0001
- train_batch_size: 10
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 80
- num_epochs: 100
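
For context, the hyperparameters above map onto Hugging Face `TrainingArguments` roughly as shown below. This is a minimal sketch under stated assumptions, not the exact training script used for this model: the output directory is a hypothetical placeholder, and the Adam betas/epsilon shown are the `Trainer` defaults, which match the values listed on this card.

```python
# Hedged sketch: expressing the listed hyperparameters with the Hugging Face Trainer.
# Only the values from the list above come from this card; the output_dir is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tun_wav2vec_final",  # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=10,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                  # Trainer defaults, matching the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=80,
    num_train_epochs=100,
)
```

These arguments would then be passed to a `Trainer` together with the model, data collator, and train/eval datasets.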

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 0.2068        | 5.0   | 300  | 0.9499          | 0.6270 | 0.1709 |
| 0.1824        | 10.0  | 600  | 1.1143          | 0.6395 | 0.1740 |
| 0.1519        | 15.0  | 900  | 1.4216          | 0.6520 | 0.1852 |
| 0.1387        | 20.0  | 1200 | 1.1372          | 0.6176 | 0.1632 |
| 0.1221        | 25.0  | 1500 | 1.3203          | 0.6364 | 0.1694 |
| 0.1182        | 30.0  | 1800 | 1.3959          | 0.6270 | 0.1782 |
| 0.099         | 35.0  | 2100 | 1.6996          | 0.6176 | 0.1798 |
| 0.1098        | 40.0  | 2400 | 1.3228          | 0.6113 | 0.1713 |
| 0.0834        | 45.0  | 2700 | 1.2459          | 0.6082 | 0.1582 |
| 0.0801        | 50.0  | 3000 | 1.1573          | 0.5956 | 0.1516 |
| 0.107         | 55.0  | 3300 | 1.2025          | 0.6019 | 0.1640 |
| 0.0954        | 60.0  | 3600 | 1.2703          | 0.5611 | 0.1593 |
| 0.0581        | 65.0  | 3900 | 1.2382          | 0.5768 | 0.1566 |
| 0.0582        | 70.0  | 4200 | 1.1088          | 0.5799 | 0.1566 |
| 0.0434        | 75.0  | 4500 | 1.3048          | 0.5831 | 0.1597 |
| 0.0451        | 80.0  | 4800 | 1.3257          | 0.5768 | 0.1640 |
| 0.0383        | 85.0  | 5100 | 1.3002          | 0.5611 | 0.1532 |
| 0.0384        | 90.0  | 5400 | 1.4335          | 0.5768 | 0.1620 |
| 0.0518        | 95.0  | 5700 | 1.2875          | 0.5737 | 0.1570 |
| 0.0434        | 100.0 | 6000 | 1.3109          | 0.5737 | 0.1609 |

### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1