---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
  - automatic-speech-recognition
  - DewiBrynJones/banc-trawsgrifiadau-bangor-clean-with-ccv
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-xlsr-53-ft-btb-ccv-cy
    results: []
---

# wav2vec2-xlsr-53-ft-btb-ccv-cy

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the DewiBrynJones/banc-trawsgrifiadau-bangor-clean-with-ccv dataset (default configuration). It achieves the following results on the evaluation set:

- Loss: 0.4908
- Wer: 0.3964
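
For quick use, here is a minimal transcription sketch with the `transformers` pipeline; the repository id below is inferred from this card's name and the input file is a placeholder, so adjust both as needed:

```python
# Minimal inference sketch. The model id is an assumption based on this
# card's name; substitute the actual repository id if it differs.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="DewiBrynJones/wav2vec2-xlsr-53-ft-btb-ccv-cy",  # assumed repo id
)

# wav2vec2 XLSR models expect 16 kHz audio; the pipeline decodes and
# resamples supported file formats automatically.
print(asr("speech.wav")["text"])  # "speech.wav" is a placeholder path
```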

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 20000
- mixed_precision_training: Native AMP
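
As a rough guide only, these settings map onto `transformers.TrainingArguments` as sketched below. The `output_dir` and the evaluation cadence are assumptions (the 200-step cadence is read off the results table, not the original script), and the Adam betas and epsilon listed above are the library defaults, so they need not be passed explicitly:

```python
# Sketch of the training configuration; output_dir and eval cadence are
# assumptions, not taken verbatim from the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-53-ft-btb-ccv-cy",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=20_000,
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="steps",
    eval_steps=200,  # matches the 200-step cadence in the results table
)
```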

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Wer    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| No log        | 0.0079 | 200   | 3.1856          | 1.0    |
| No log        | 0.0157 | 400   | 2.6492          | 1.0    |
| 4.6997        | 0.0236 | 600   | 1.3869          | 0.8722 |
| 4.6997        | 0.0314 | 800   | 1.2302          | 0.8308 |
| 1.0569        | 0.0393 | 1000  | 1.1380          | 0.7958 |
| 1.0569        | 0.0472 | 1200  | 1.0668          | 0.7698 |
| 1.0569        | 0.0550 | 1400  | 1.0208          | 0.7310 |
| 0.8131        | 0.0629 | 1600  | 0.9702          | 0.7151 |
| 0.8131        | 0.0707 | 1800  | 0.9408          | 0.6882 |
| 0.7194        | 0.0786 | 2000  | 0.9250          | 0.6804 |
| 0.7194        | 0.0864 | 2200  | 0.9052          | 0.6726 |
| 0.7194        | 0.0943 | 2400  | 0.8986          | 0.6573 |
| 0.6688        | 0.1022 | 2600  | 0.8815          | 0.6473 |
| 0.6688        | 0.1100 | 2800  | 0.8588          | 0.6445 |
| 0.645         | 0.1179 | 3000  | 0.8758          | 0.6487 |
| 0.645         | 0.1257 | 3200  | 0.8725          | 0.6691 |
| 0.645         | 0.1336 | 3400  | 0.8296          | 0.6298 |
| 0.6077        | 0.1415 | 3600  | 0.8356          | 0.6552 |
| 0.6077        | 0.1493 | 3800  | 0.8263          | 0.6229 |
| 0.5983        | 0.1572 | 4000  | 0.8711          | 0.6885 |
| 0.5983        | 0.1650 | 4200  | 0.7837          | 0.5918 |
| 0.5983        | 0.1729 | 4400  | 0.8097          | 0.6598 |
| 0.5788        | 0.1808 | 4600  | 0.7777          | 0.5869 |
| 0.5788        | 0.1886 | 4800  | 0.7913          | 0.5896 |
| 0.5501        | 0.1965 | 5000  | 0.7924          | 0.5900 |
| 0.5501        | 0.2043 | 5200  | 0.7603          | 0.5737 |
| 0.5501        | 0.2122 | 5400  | 0.7750          | 0.5932 |
| 0.5694        | 0.2200 | 5600  | 0.7517          | 0.5711 |
| 0.5694        | 0.2279 | 5800  | 0.7651          | 0.5698 |
| 0.5424        | 0.2358 | 6000  | 0.7548          | 0.5820 |
| 0.5424        | 0.2436 | 6200  | 0.7305          | 0.5681 |
| 0.5424        | 0.2515 | 6400  | 0.7314          | 0.5589 |
| 0.521         | 0.2593 | 6600  | 0.7228          | 0.5654 |
| 0.521         | 0.2672 | 6800  | 0.7350          | 0.5633 |
| 0.5119        | 0.2751 | 7000  | 0.7079          | 0.5347 |
| 0.5119        | 0.2829 | 7200  | 0.7105          | 0.5601 |
| 0.5119        | 0.2908 | 7400  | 0.6876          | 0.5378 |
| 0.5007        | 0.2986 | 7600  | 0.6835          | 0.5303 |
| 0.5007        | 0.3065 | 7800  | 0.7132          | 0.5351 |
| 0.4934        | 0.3144 | 8000  | 0.6972          | 0.5242 |
| 0.4934        | 0.3222 | 8200  | 0.6800          | 0.5227 |
| 0.4934        | 0.3301 | 8400  | 0.6916          | 0.5365 |
| 0.4762        | 0.3379 | 8600  | 0.6802          | 0.5255 |
| 0.4762        | 0.3458 | 8800  | 0.6978          | 0.5337 |
| 0.4774        | 0.3536 | 9000  | 0.6567          | 0.5211 |
| 0.4774        | 0.3615 | 9200  | 0.6479          | 0.5152 |
| 0.4774        | 0.3694 | 9400  | 0.6551          | 0.5147 |
| 0.4632        | 0.3772 | 9600  | 0.6358          | 0.4955 |
| 0.4632        | 0.3851 | 9800  | 0.6466          | 0.5109 |
| 0.4483        | 0.3929 | 10000 | 0.6306          | 0.5044 |
| 0.4483        | 0.4008 | 10200 | 0.6360          | 0.5004 |
| 0.4483        | 0.4087 | 10400 | 0.6302          | 0.4914 |
| 0.4454        | 0.4165 | 10600 | 0.6163          | 0.4851 |
| 0.4454        | 0.4244 | 10800 | 0.6221          | 0.4911 |
| 0.4302        | 0.4322 | 11000 | 0.6396          | 0.5001 |
| 0.4302        | 0.4401 | 11200 | 0.6212          | 0.4841 |
| 0.4302        | 0.4480 | 11400 | 0.6268          | 0.4938 |
| 0.4261        | 0.4558 | 11600 | 0.6098          | 0.4820 |
| 0.4261        | 0.4637 | 11800 | 0.6009          | 0.4689 |
| 0.4026        | 0.4715 | 12000 | 0.6091          | 0.4810 |
| 0.4026        | 0.4794 | 12200 | 0.6019          | 0.4806 |
| 0.4026        | 0.4872 | 12400 | 0.5947          | 0.4671 |
| 0.4027        | 0.4951 | 12600 | 0.5994          | 0.4709 |
| 0.4027        | 0.5030 | 12800 | 0.5982          | 0.4761 |
| 0.3978        | 0.5108 | 13000 | 0.5890          | 0.4632 |
| 0.3978        | 0.5187 | 13200 | 0.5871          | 0.4567 |
| 0.3978        | 0.5265 | 13400 | 0.5873          | 0.4635 |
| 0.3875        | 0.5344 | 13600 | 0.5772          | 0.4539 |
| 0.3875        | 0.5423 | 13800 | 0.5604          | 0.4419 |
| 0.404         | 0.5501 | 14000 | 0.5689          | 0.4454 |
| 0.404         | 0.5580 | 14200 | 0.5595          | 0.4433 |
| 0.404         | 0.5658 | 14400 | 0.5575          | 0.4406 |
| 0.3878        | 0.5737 | 14600 | 0.5522          | 0.4353 |
| 0.3878        | 0.5816 | 14800 | 0.5522          | 0.4352 |
| 0.3622        | 0.5894 | 15000 | 0.5570          | 0.4401 |
| 0.3622        | 0.5973 | 15200 | 0.5467          | 0.4280 |
| 0.3622        | 0.6051 | 15400 | 0.5511          | 0.4340 |
| 0.3545        | 0.6130 | 15600 | 0.5437          | 0.4245 |
| 0.3545        | 0.6208 | 15800 | 0.5489          | 0.4296 |
| 0.3486        | 0.6287 | 16000 | 0.5420          | 0.4278 |
| 0.3486        | 0.6366 | 16200 | 0.5352          | 0.4213 |
| 0.3486        | 0.6444 | 16400 | 0.5377          | 0.4259 |
| 0.3374        | 0.6523 | 16600 | 0.5336          | 0.4305 |
| 0.3374        | 0.6601 | 16800 | 0.5294          | 0.4188 |
| 0.3389        | 0.6680 | 17000 | 0.5253          | 0.4169 |
| 0.3389        | 0.6759 | 17200 | 0.5194          | 0.4144 |
| 0.3389        | 0.6837 | 17400 | 0.5232          | 0.4171 |
| 0.3258        | 0.6916 | 17600 | 0.5179          | 0.4165 |
| 0.3258        | 0.6994 | 17800 | 0.5132          | 0.4104 |
| 0.327         | 0.7073 | 18000 | 0.5096          | 0.4044 |
| 0.327         | 0.7152 | 18200 | 0.5041          | 0.4034 |
| 0.327         | 0.7230 | 18400 | 0.5013          | 0.3981 |
| 0.316         | 0.7309 | 18600 | 0.5074          | 0.4065 |
| 0.316         | 0.7387 | 18800 | 0.5014          | 0.4055 |
| 0.3162        | 0.7466 | 19000 | 0.4959          | 0.3998 |
| 0.3162        | 0.7545 | 19200 | 0.4930          | 0.3982 |
| 0.3162        | 0.7623 | 19400 | 0.4925          | 0.3982 |
| 0.3145        | 0.7702 | 19600 | 0.4922          | 0.3970 |
| 0.3145        | 0.7780 | 19800 | 0.4908          | 0.3969 |
| 0.3095        | 0.7859 | 20000 | 0.4908          | 0.3964 |
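
The Wer column is the word error rate on the evaluation set (lower is better; the final 0.3964 is roughly 39.6% WER). A minimal sketch of how WER can be computed with the `evaluate` library follows; the Welsh strings are invented examples, not drawn from the dataset:

```python
# Illustrative WER computation; the strings below are made-up examples,
# not drawn from the evaluation set.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["bore da i chi"]      # hypothetical model output
references = ["bore da i chi gyd"]   # hypothetical reference transcript

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 1 deletion over 5 reference words -> 0.2000
```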

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1