---
language:
  - evn
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-large-xlsr-53
    results: []
---

# wav2vec2-large-xlsr-53

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the Evenki dataset. It achieves the following results on the evaluation set:

- Loss: 0.8728
- WER: 65.9678
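As a usage sketch (not part of the original card): the checkpoint can be loaded for inference like any `Wav2Vec2ForCTC` model. The model id and audio file below are placeholders, and `librosa` is just one option for loading 16 kHz audio.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder id: substitute the actual Hub repo or local path of this checkpoint.
model_id = "path/to/wav2vec2-large-xlsr-53-evenki"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id).eval()

# XLSR models expect mono 16 kHz input; "sample.wav" is a placeholder file.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```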

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 15
- mixed_precision_training: Native AMP
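A minimal sketch of how these values map onto `transformers.TrainingArguments` (the `output_dir` and everything outside the arguments object are assumptions, not taken from this card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="results",            # placeholder output directory
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,   # 4 x 8 = 32 effective train batch size
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=15,
    fp16=True,                       # "Native AMP" mixed precision
)
```

The optimizer settings (Adam with betas=(0.9, 0.999) and epsilon=1e-08) are the `TrainingArguments` defaults, so they need no explicit flags.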

### Training results

| Training Loss | Epoch   | Step | Validation Loss | WER     |
|:-------------:|:-------:|:----:|:---------------:|:-------:|
| 8.8625        | 0.6279  | 100  | 4.7672          | 100.0   |
| 3.5933        | 1.2559  | 200  | 3.5125          | 100.0   |
| 3.4942        | 1.8838  | 300  | 3.4852          | 100.0   |
| 3.5053        | 2.5118  | 400  | 3.4885          | 100.0   |
| 3.2185        | 3.1397  | 500  | 2.8873          | 100.0   |
| 2.2076        | 3.7677  | 600  | 1.6237          | 99.9844 |
| 1.6129        | 4.3956  | 700  | 1.2754          | 92.5833 |
| 1.5995        | 5.0235  | 800  | 1.1585          | 84.6033 |
| 1.333         | 5.6515  | 900  | 1.0689          | 80.4882 |
| 1.2571        | 6.2794  | 1000 | 1.0283          | 77.4840 |
| 1.1675        | 6.9074  | 1100 | 0.9761          | 75.3716 |
| 1.1193        | 7.5353  | 1200 | 0.9367          | 73.3532 |
| 1.0054        | 8.1633  | 1300 | 0.9723          | 72.4300 |
| 1.0211        | 8.7912  | 1400 | 0.8911          | 70.4115 |
| 0.9408        | 9.4192  | 1500 | 0.9405          | 70.6775 |
| 0.9115        | 10.0471 | 1600 | 0.8998          | 68.2835 |
| 0.8533        | 10.6750 | 1700 | 0.9073          | 68.3461 |
| 0.7981        | 11.3030 | 1800 | 0.8838          | 67.8141 |
| 0.8154        | 11.9309 | 1900 | 0.8872          | 66.7345 |
| 0.7603        | 12.5589 | 2000 | 0.8681          | 66.9379 |
| 0.7711        | 13.1868 | 2100 | 0.8723          | 66.5154 |
| 0.6974        | 13.8148 | 2200 | 0.8634          | 66.1242 |
| 0.7224        | 14.4427 | 2300 | 0.8728          | 65.9678 |
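For reference, the WER column can be reproduced with the `evaluate` library; the strings below are illustrative placeholders, not data from the Evenki evaluation set.

```python
import evaluate

wer_metric = evaluate.load("wer")

# In training these come from decoding the model's predictions and the
# ground-truth transcripts of the evaluation set; here they are placeholders.
predictions = ["a decoded hypothesis"]
references = ["the reference transcript"]

# compute() returns a fraction; scale by 100 to match the table's
# percentage-style numbers (e.g. 65.9678).
print(100 * wer_metric.compute(predictions=predictions, references=references))
```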

## Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1