---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- automatic-speech-recognition
- DewiBrynJones/banc-trawsgrifiadau-bangor-clean-with-ccv
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xlsr-53-ft-btb-ccv-cy
  results: []
---
# wav2vec2-xlsr-53-ft-btb-ccv-cy
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the DewiBrynJones/banc-trawsgrifiadau-bangor-clean-with-ccv dataset (default configuration). It achieves the following results on the evaluation set:
- Loss: 2.8074
- Wer: 0.9983
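A minimal transcription sketch, assuming the repository publishes the fine-tuned CTC weights together with a matching processor; the repository id and the audio file name below are placeholders, and the audio is resampled to the 16 kHz expected by wav2vec2-large-xlsr-53.

```python
# Hedged inference sketch: the repo id and audio path are placeholders.
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-xlsr-53-ft-btb-ccv-cy"  # placeholder: local path or Hub repo id

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-large-xlsr-53 expects 16 kHz mono input.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```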
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
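The card does not document which splits were used. As a hedged sketch, the dataset named in the tags can be pulled from the Hub with the Datasets library and resampled for the model; the `"audio"` column name is an assumption.

```python
# Hedged sketch of loading the fine-tuning dataset; the "audio" column name is assumed.
from datasets import load_dataset, Audio

ds = load_dataset("DewiBrynJones/banc-trawsgrifiadau-bangor-clean-with-ccv")
print(ds)  # inspect the available splits and columns

# Resample to the 16 kHz rate expected by wav2vec2-large-xlsr-53.
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
```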
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 64
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 10000
- mixed_precision_training: Native AMP
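A hedged sketch of how the values above map onto `TrainingArguments`; settings not recorded in this card (output directory, gradient accumulation, data collator, and so on) are assumed or omitted.

```python
# Sketch mapping the listed hyperparameters onto TrainingArguments.
# Values not listed in the card (e.g. output_dir) are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-53-ft-btb-ccv-cy",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=10_000,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="steps",
    eval_steps=100,               # matches the 100-step eval interval in the results table
    logging_steps=500,            # matches the 500-step training-loss interval in the results table
    # The Adam betas (0.9, 0.999) and epsilon 1e-08 listed above are the Trainer defaults.
)
```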
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
No log | 0.0566 | 100 | 4.1582 | 1.0 |
No log | 0.1133 | 200 | 3.1276 | 1.0 |
No log | 0.1699 | 300 | 3.4072 | 1.0 |
No log | 0.2265 | 400 | 2.1967 | 0.9899 |
4.6703 | 0.2831 | 500 | 1.0576 | 0.7174 |
4.6703 | 0.3398 | 600 | 0.8877 | 0.6405 |
4.6703 | 0.3964 | 700 | 0.7894 | 0.5861 |
4.6703 | 0.4530 | 800 | 0.7699 | 0.5877 |
4.6703 | 0.5096 | 900 | 0.7605 | 0.5403 |
0.5242 | 0.5663 | 1000 | 0.7429 | 0.5495 |
0.5242 | 0.6229 | 1100 | 0.6762 | 0.5081 |
0.5242 | 0.6795 | 1200 | 0.6703 | 0.5072 |
0.5242 | 0.7361 | 1300 | 0.6187 | 0.4623 |
0.5242 | 0.7928 | 1400 | 0.6205 | 0.4742 |
0.4093 | 0.8494 | 1500 | 0.6089 | 0.4607 |
0.4093 | 0.9060 | 1600 | 0.6079 | 0.4564 |
0.4093 | 0.9626 | 1700 | 0.5752 | 0.4487 |
0.4093 | 1.0193 | 1800 | 0.5519 | 0.4174 |
0.4093 | 1.0759 | 1900 | 0.5468 | 0.4107 |
0.3366 | 1.1325 | 2000 | 0.5372 | 0.4080 |
0.3366 | 1.1891 | 2100 | 0.5359 | 0.4072 |
0.3366 | 1.2458 | 2200 | 0.5304 | 0.4023 |
0.3366 | 1.3024 | 2300 | 0.5311 | 0.4011 |
0.3366 | 1.3590 | 2400 | 0.5186 | 0.3864 |
0.2939 | 1.4156 | 2500 | 0.5234 | 0.3934 |
0.2939 | 1.4723 | 2600 | 0.5213 | 0.3973 |
0.2939 | 1.5289 | 2700 | 0.5156 | 0.3877 |
0.2939 | 1.5855 | 2800 | 0.5052 | 0.3898 |
0.2939 | 1.6421 | 2900 | 0.4981 | 0.3833 |
0.2838 | 1.6988 | 3000 | 0.4990 | 0.3804 |
0.2838 | 1.7554 | 3100 | 0.5000 | 0.3807 |
0.2838 | 1.8120 | 3200 | 0.4961 | 0.3751 |
0.2838 | 1.8686 | 3300 | 0.4859 | 0.3731 |
0.2838 | 1.9253 | 3400 | 0.4812 | 0.3657 |
0.2694 | 1.9819 | 3500 | 0.4779 | 0.3620 |
0.2694 | 2.0385 | 3600 | 0.4943 | 0.3633 |
0.2694 | 2.0951 | 3700 | 0.4880 | 0.3677 |
0.2694 | 2.1518 | 3800 | 0.4990 | 0.3662 |
0.2694 | 2.2084 | 3900 | 0.5101 | 0.3699 |
0.2419 | 2.2650 | 4000 | 0.5393 | 0.3902 |
0.2419 | 2.3216 | 4100 | 0.6454 | 0.4513 |
0.2419 | 2.3783 | 4200 | 0.9892 | 0.5937 |
0.2419 | 2.4349 | 4300 | 0.7712 | 0.5167 |
0.2419 | 2.4915 | 4400 | 0.6338 | 0.4788 |
0.46 | 2.5481 | 4500 | 0.5562 | 0.4156 |
0.46 | 2.6048 | 4600 | 0.5377 | 0.3906 |
0.46 | 2.6614 | 4700 | 0.5687 | 0.4008 |
0.46 | 2.7180 | 4800 | 0.6321 | 0.4291 |
0.46 | 2.7746 | 4900 | 0.5834 | 0.4203 |
0.299 | 2.8313 | 5000 | 0.5302 | 0.3930 |
0.299 | 2.8879 | 5100 | 0.5316 | 0.3860 |
0.299 | 2.9445 | 5200 | 0.5344 | 0.3800 |
0.299 | 3.0011 | 5300 | 0.5349 | 0.3842 |
0.299 | 3.0578 | 5400 | 0.5776 | 0.4183 |
0.2839 | 3.1144 | 5500 | 0.5883 | 0.4100 |
0.2839 | 3.1710 | 5600 | 0.5723 | 0.4044 |
0.2839 | 3.2276 | 5700 | 0.5630 | 0.4078 |
0.2839 | 3.2843 | 5800 | 0.5810 | 0.4191 |
0.2839 | 3.3409 | 5900 | 0.5996 | 0.4228 |
0.3019 | 3.3975 | 6000 | 0.5682 | 0.4016 |
0.3019 | 3.4541 | 6100 | 0.5561 | 0.4057 |
0.3019 | 3.5108 | 6200 | 0.5905 | 0.4146 |
0.3019 | 3.5674 | 6300 | 0.5875 | 0.4190 |
0.3019 | 3.6240 | 6400 | 0.5878 | 0.4446 |
0.2944 | 3.6806 | 6500 | 0.5939 | 0.4404 |
0.2944 | 3.7373 | 6600 | 0.5903 | 0.4183 |
0.2944 | 3.7939 | 6700 | 0.5808 | 0.4059 |
0.2944 | 3.8505 | 6800 | 0.6155 | 0.4101 |
0.2944 | 3.9071 | 6900 | 0.7987 | 0.5823 |
0.3918 | 3.9638 | 7000 | 0.9750 | 0.5545 |
0.3918 | 4.0204 | 7100 | 1.0540 | 0.5689 |
0.3918 | 4.0770 | 7200 | 0.6851 | 0.4396 |
0.3918 | 4.1336 | 7300 | 0.7332 | 0.4973 |
0.3918 | 4.1903 | 7400 | 0.9466 | 0.6395 |
0.5378 | 4.2469 | 7500 | 0.8257 | 0.4851 |
0.5378 | 4.3035 | 7600 | 0.8490 | 0.4867 |
0.5378 | 4.3601 | 7700 | 0.8717 | 0.4711 |
0.5378 | 4.4168 | 7800 | 0.8839 | 0.5860 |
0.5378 | 4.4734 | 7900 | 2.9113 | 1.0 |
1.3847 | 4.5300 | 8000 | 2.8576 | 1.0 |
1.3847 | 4.5866 | 8100 | 2.8391 | 1.0 |
1.3847 | 4.6433 | 8200 | 2.8406 | 1.0 |
1.3847 | 4.6999 | 8300 | 2.8566 | 1.0 |
1.3847 | 4.7565 | 8400 | 2.8454 | 0.9998 |
2.8136 | 4.8131 | 8500 | 2.8340 | 0.9999 |
2.8136 | 4.8698 | 8600 | 2.8367 | 0.9999 |
2.8136 | 4.9264 | 8700 | 2.8334 | 0.9999 |
2.8136 | 4.9830 | 8800 | 2.8321 | 0.9999 |
2.8136 | 5.0396 | 8900 | 2.8025 | 0.9982 |
2.8007 | 5.0963 | 9000 | 2.8024 | 0.9975 |
2.8007 | 5.1529 | 9100 | 2.8043 | 0.9981 |
2.8007 | 5.2095 | 9200 | 2.8106 | 0.9992 |
2.8007 | 5.2661 | 9300 | 2.8067 | 0.9993 |
2.8007 | 5.3228 | 9400 | 2.8053 | 0.9986 |
2.7935 | 5.3794 | 9500 | 2.8077 | 0.9978 |
2.7935 | 5.4360 | 9600 | 2.8083 | 0.9987 |
2.7935 | 5.4926 | 9700 | 2.8080 | 0.9989 |
2.7935 | 5.5493 | 9800 | 2.8086 | 0.9986 |
2.7935 | 5.6059 | 9900 | 2.8079 | 0.9982 |
2.7861 | 5.6625 | 10000 | 2.8074 | 0.9983 |
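The table shows training destabilising between steps 7,800 and 7,900: validation loss jumps from 0.8839 to 2.9113 and WER returns to roughly 1.0, so the final WER of 0.9983 reflects that collapse rather than the best intermediate checkpoint (WER 0.3620 at step 3,500). For reference, WER values like those above can be computed with the `evaluate` library; a minimal sketch with placeholder strings:

```python
# Minimal word-error-rate computation with the `evaluate` library.
# The reference/prediction strings are placeholders for illustration only.
import evaluate

wer_metric = evaluate.load("wer")

references = ["mae hi'n braf heddiw"]
predictions = ["mae hi braf heddiw"]

print(f"WER: {wer_metric.compute(predictions=predictions, references=references):.4f}")
```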
### Framework versions
- Transformers 4.40.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1