---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-large
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec-large-en
    results: []
---

# wav2vec-large-en

This model is a fine-tuned version of [facebook/wav2vec2-large](https://huggingface.co/facebook/wav2vec2-large) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 2.9034
- Wer: 1.0
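
The `wer` metric in the metadata is the standard word error rate; a Wer of 1.0 means every reference word was mistranscribed. A minimal sketch of how it is typically computed with the `evaluate` library (an assumption here, since the card does not name the metric implementation):

```python
import evaluate

# Load the standard word error rate metric.
wer_metric = evaluate.load("wer")

# Toy example: predictions vs. ground-truth transcripts.
predictions = ["hello wrld"]
references = ["hello world"]

# WER = (substitutions + insertions + deletions) / reference word count
print(wer_metric.compute(predictions=predictions, references=references))  # 0.5
```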

## Model description

More information needed

## Intended uses & limitations

More information needed
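
Pending fuller documentation, a minimal transcription sketch for a fine-tuned wav2vec2 CTC checkpoint follows; the repo id `dongim04/wav2vec-large-en` is assumed from the card title, and the silent input is a placeholder:

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Repo id assumed from the card title; replace with the actual checkpoint path.
repo_id = "dongim04/wav2vec-large-en"
model = Wav2Vec2ForCTC.from_pretrained(repo_id)
processor = Wav2Vec2Processor.from_pretrained(repo_id)

# wav2vec2 expects 16 kHz mono audio; one second of silence as a placeholder.
waveform = np.zeros(16000, dtype=np.float32)
inputs = processor(waveform, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```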

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 5e-06
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
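
For reference, the hyperparameters above map onto the 🤗 Trainer API roughly as follows; this is a sketch, with `output_dir` as a placeholder not stated on the card:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="wav2vec-large-en",
    learning_rate=5e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)
```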

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:---:|
| No log        | 0.3521 | 50   | 3.5027          | 1.0 |
| 5.2013        | 0.7042 | 100  | 2.9715          | 1.0 |
| 5.2013        | 1.0563 | 150  | 3.0431          | 1.0 |
| 3.1541        | 1.4085 | 200  | 2.8890          | 1.0 |
| 3.1541        | 1.7606 | 250  | 2.9611          | 1.0 |
| 2.8986        | 2.1127 | 300  | 2.8821          | 1.0 |
| 2.8986        | 2.4648 | 350  | 2.8843          | 1.0 |
| 2.9209        | 2.8169 | 400  | 2.9202          | 1.0 |
| 2.9209        | 3.1690 | 450  | 2.9015          | 1.0 |
| 2.9283        | 3.5211 | 500  | 2.8956          | 1.0 |
| 2.9283        | 3.8732 | 550  | 2.8913          | 1.0 |
| 2.8832        | 4.2254 | 600  | 2.9140          | 1.0 |
| 2.8832        | 4.5775 | 650  | 2.8749          | 1.0 |
| 2.8789        | 4.9296 | 700  | 2.8970          | 1.0 |
| 2.8789        | 5.2817 | 750  | 2.9359          | 1.0 |
| 2.8753        | 5.6338 | 800  | 2.8821          | 1.0 |
| 2.8753        | 5.9859 | 850  | 2.8868          | 1.0 |
| 2.8859        | 6.3380 | 900  | 2.9063          | 1.0 |
| 2.8859        | 6.6901 | 950  | 2.8784          | 1.0 |
| 2.8693        | 7.0423 | 1000 | 2.9513          | 1.0 |
| 2.8693        | 7.3944 | 1050 | 2.8936          | 1.0 |
| 2.8911        | 7.7465 | 1100 | 2.8821          | 1.0 |
| 2.8911        | 8.0986 | 1150 | 2.9227          | 1.0 |
| 2.8625        | 8.4507 | 1200 | 2.8949          | 1.0 |
| 2.8625        | 8.8028 | 1250 | 2.9313          | 1.0 |
| 2.8641        | 9.1549 | 1300 | 2.8978          | 1.0 |
| 2.8641        | 9.5070 | 1350 | 2.9034          | 1.0 |

### Framework versions

- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Tokenizers 0.20.3