---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- automatic-speech-recognition
- DewiBrynJones/banc-trawsgrifiadau-bangor-clean-with-ccv
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xlsr-53-ft-btb-ccv-cy
  results: []
---

# wav2vec2-xlsr-53-ft-btb-ccv-cy

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the DewiBrynJones/banc-trawsgrifiadau-bangor-clean-with-ccv dataset (default configuration).
It achieves the following results on the evaluation set:
- Loss: inf
- Wer: 0.2962
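
Since this is a CTC fine-tune of wav2vec2, transcription should follow the standard `transformers` wav2vec2 recipe. The snippet below is a minimal sketch rather than code from this repository: the repo id placeholder, the example audio path, and the 16 kHz mono resampling step are assumptions based on how wav2vec2-large-xlsr-53 checkpoints are normally used.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical repo id: replace with the actual Hugging Face Hub id of this model.
model_id = "wav2vec2-xlsr-53-ft-btb-ccv-cy"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio file and convert it to the 16 kHz mono input expected by XLSR models.
waveform, sample_rate = torchaudio.load("sample.wav")  # placeholder path
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000).mean(dim=0)

inputs = processor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```

The `automatic-speech-recognition` pipeline in `transformers` wraps the same steps if a higher-level interface is preferred.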
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 200000
- mixed_precision_training: Native AMP
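
These settings correspond roughly to the `transformers.TrainingArguments` sketched below. This is an illustrative reconstruction, not the actual training script: the output directory and any options not listed above are placeholders, and the Adam betas/epsilon shown are the `TrainingArguments` defaults that match the values reported here.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above; output_dir and any
# unlisted options are placeholders, not values from the original run.
training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-53-ft-btb-ccv-cy",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=200_000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```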
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:------:|:---------------:|:------:|
| 1.059 | 0.0383 | 1000 | inf | 0.7913 |
| 0.7527 | 0.0765 | 2000 | inf | 0.7347 |
| 0.6861 | 0.1148 | 3000 | inf | 0.6767 |
| 0.651 | 0.1531 | 4000 | inf | 0.6462 |
| 0.6372 | 0.1913 | 5000 | inf | 0.6245 |
| 0.6078 | 0.2296 | 6000 | inf | 0.5932 |
| 0.6006 | 0.2679 | 7000 | inf | 0.6101 |
| 0.6008 | 0.3061 | 8000 | inf | 0.5832 |
| 0.592 | 0.3444 | 9000 | inf | 0.5834 |
| 0.5638 | 0.3827 | 10000 | inf | 0.5573 |
| 0.5585 | 0.4209 | 11000 | inf | 0.5664 |
| 0.5569 | 0.4592 | 12000 | inf | 0.5488 |
| 0.5293 | 0.4975 | 13000 | inf | 0.5435 |
| 0.5388 | 0.5357 | 14000 | inf | 0.5418 |
| 0.5163 | 0.5740 | 15000 | inf | 0.5408 |
| 0.5226 | 0.6123 | 16000 | inf | 0.5311 |
| 0.4952 | 0.6505 | 17000 | inf | 0.5289 |
| 0.524 | 0.6888 | 18000 | inf | 0.5215 |
| 0.5076 | 0.7271 | 19000 | inf | 0.5187 |
| 0.492 | 0.7653 | 20000 | inf | 0.5095 |
| 0.4934 | 0.8036 | 21000 | inf | 0.5061 |
| 0.4985 | 0.8419 | 22000 | inf | 0.5129 |
| 0.4887 | 0.8801 | 23000 | inf | 0.4928 |
| 0.484 | 0.9184 | 24000 | inf | 0.4949 |
| 0.4741 | 0.9567 | 25000 | inf | 0.4865 |
| 0.4816 | 0.9949 | 26000 | inf | 0.5055 |
| 0.44 | 1.0332 | 27000 | inf | 0.4797 |
| 0.4359 | 1.0715 | 28000 | inf | 0.4912 |
| 0.411 | 1.1098 | 29000 | inf | 0.4774 |
| 0.4298 | 1.1480 | 30000 | inf | 0.4773 |
| 0.4305 | 1.1863 | 31000 | inf | 0.4898 |
| 0.4126 | 1.2246 | 32000 | inf | 0.4739 |
| 0.4234 | 1.2628 | 33000 | inf | 0.4844 |
| 0.4252 | 1.3011 | 34000 | inf | 0.4763 |
| 0.4106 | 1.3394 | 35000 | inf | 0.4709 |
| 0.4254 | 1.3776 | 36000 | inf | 0.4737 |
| 0.4245 | 1.4159 | 37000 | inf | 0.4534 |
| 0.4154 | 1.4542 | 38000 | inf | 0.4566 |
| 0.4071 | 1.4924 | 39000 | inf | 0.4635 |
| 0.4065 | 1.5307 | 40000 | inf | 0.4668 |
| 0.4086 | 1.5690 | 41000 | inf | 0.4607 |
| 0.4037 | 1.6072 | 42000 | inf | 0.4616 |
| 0.4071 | 1.6455 | 43000 | inf | 0.4607 |
| 0.394 | 1.6838 | 44000 | inf | 0.4431 |
| 0.4103 | 1.7220 | 45000 | inf | 0.4398 |
| 0.3909 | 1.7603 | 46000 | inf | 0.4455 |
| 0.3909 | 1.7986 | 47000 | inf | 0.4421 |
| 0.3982 | 1.8368 | 48000 | inf | 0.4371 |
| 0.3896 | 1.8751 | 49000 | inf | 0.4418 |
| 0.3986 | 1.9134 | 50000 | inf | 0.4382 |
| 0.3968 | 1.9516 | 51000 | inf | 0.4275 |
| 0.4025 | 1.9899 | 52000 | inf | 0.4204 |
| 0.3404 | 2.0282 | 53000 | inf | 0.4272 |
| 0.3354 | 2.0664 | 54000 | inf | 0.4298 |
| 0.3352 | 2.1047 | 55000 | inf | 0.4224 |
| 0.3384 | 2.1430 | 56000 | inf | 0.4267 |
| 0.3342 | 2.1812 | 57000 | inf | 0.4187 |
| 0.3425 | 2.2195 | 58000 | inf | 0.4199 |
| 0.3417 | 2.2578 | 59000 | inf | 0.4174 |
| 0.3355 | 2.2960 | 60000 | inf | 0.4158 |
| 0.3501 | 2.3343 | 61000 | inf | 0.4128 |
| 0.3358 | 2.3726 | 62000 | inf | 0.4116 |
| 0.3343 | 2.4108 | 63000 | inf | 0.4164 |
| 0.3343 | 2.4491 | 64000 | inf | 0.4179 |
| 0.3367 | 2.4874 | 65000 | inf | 0.4117 |
| 0.3237 | 2.5256 | 66000 | inf | 0.4068 |
| 0.3335 | 2.5639 | 67000 | inf | 0.4080 |
| 0.3254 | 2.6022 | 68000 | inf | 0.3982 |
| 0.3295 | 2.6404 | 69000 | inf | 0.4136 |
| 0.3326 | 2.6787 | 70000 | inf | 0.4045 |
| 0.3167 | 2.7170 | 71000 | inf | 0.4044 |
| 0.3376 | 2.7552 | 72000 | inf | 0.3942 |
| 0.3245 | 2.7935 | 73000 | inf | 0.3958 |
| 0.315 | 2.8318 | 74000 | inf | 0.4065 |
| 0.327 | 2.8700 | 75000 | inf | 0.4010 |
| 0.3211 | 2.9083 | 76000 | inf | 0.3926 |
| 0.323 | 2.9466 | 77000 | inf | 0.4005 |
| 0.323 | 2.9848 | 78000 | inf | 0.3864 |
| 0.2747 | 3.0231 | 79000 | inf | 0.3988 |
| 0.2706 | 3.0614 | 80000 | inf | 0.3861 |
| 0.2696 | 3.0996 | 81000 | inf | 0.3878 |
| 0.2792 | 3.1379 | 82000 | inf | 0.3945 |
| 0.2809 | 3.1762 | 83000 | inf | 0.3949 |
| 0.2709 | 3.2144 | 84000 | inf | 0.3852 |
| 0.2808 | 3.2527 | 85000 | inf | 0.3913 |
| 0.2746 | 3.2910 | 86000 | inf | 0.3856 |
| 0.2633 | 3.3293 | 87000 | inf | 0.3885 |
| 0.2745 | 3.3675 | 88000 | inf | 0.3849 |
| 0.2832 | 3.4058 | 89000 | inf | 0.3821 |
| 0.2806 | 3.4441 | 90000 | inf | 0.3857 |
| 0.2756 | 3.4823 | 91000 | inf | 0.3810 |
| 0.2733 | 3.5206 | 92000 | inf | 0.3738 |
| 0.2807 | 3.5589 | 93000 | inf | 0.3857 |
| 0.2773 | 3.5971 | 94000 | inf | 0.3720 |
| 0.2725 | 3.6354 | 95000 | inf | 0.3690 |
| 0.2614 | 3.6737 | 96000 | inf | 0.3753 |
| 0.2674 | 3.7119 | 97000 | inf | 0.3826 |
| 0.2605 | 3.7502 | 98000 | inf | 0.3733 |
| 0.2649 | 3.7885 | 99000 | inf | 0.3691 |
| 0.2638 | 3.8267 | 100000 | inf | 0.3753 |
| 0.2749 | 3.8650 | 101000 | inf | 0.3675 |
| 0.2635 | 3.9033 | 102000 | inf | 0.3667 |
| 0.2639 | 3.9415 | 103000 | inf | 0.3673 |
| 0.2602 | 3.9798 | 104000 | inf | 0.3629 |
| 0.2217 | 4.0181 | 105000 | inf | 0.3645 |
| 0.2226 | 4.0563 | 106000 | inf | 0.3569 |
| 0.2209 | 4.0946 | 107000 | inf | 0.3550 |
| 0.2326 | 4.1329 | 108000 | inf | 0.3595 |
| 0.2203 | 4.1711 | 109000 | inf | 0.3556 |
| 0.2267 | 4.2094 | 110000 | inf | 0.3509 |
| 0.223 | 4.2477 | 111000 | inf | 0.3581 |
| 0.2273 | 4.2859 | 112000 | inf | 0.3548 |
| 0.2278 | 4.3242 | 113000 | inf | 0.3493 |
| 0.2372 | 4.3625 | 114000 | inf | 0.3601 |
| 0.22 | 4.4007 | 115000 | inf | 0.3549 |
| 0.228 | 4.4390 | 116000 | inf | 0.3499 |
| 0.2291 | 4.4773 | 117000 | inf | 0.3485 |
| 0.2301 | 4.5155 | 118000 | inf | 0.3488 |
| 0.2084 | 4.5538 | 119000 | inf | 0.3515 |
| 0.2251 | 4.5921 | 120000 | inf | 0.3509 |
| 0.2205 | 4.6303 | 121000 | inf | 0.3446 |
| 0.2174 | 4.6686 | 122000 | inf | 0.3459 |
| 0.2136 | 4.7069 | 123000 | inf | 0.3499 |
| 0.2142 | 4.7451 | 124000 | inf | 0.3449 |
| 0.2152 | 4.7834 | 125000 | inf | 0.3466 |
| 0.2216 | 4.8217 | 126000 | inf | 0.3443 |
| 0.2209 | 4.8599 | 127000 | inf | 0.3455 |
| 0.2183 | 4.8982 | 128000 | inf | 0.3404 |
| 0.2174 | 4.9365 | 129000 | inf | 0.3403 |
| 0.2165 | 4.9747 | 130000 | inf | 0.3420 |
| 0.1806 | 5.0130 | 131000 | inf | 0.3381 |
| 0.1821 | 5.0513 | 132000 | inf | 0.3426 |
| 0.1825 | 5.0895 | 133000 | inf | 0.3400 |
| 0.1876 | 5.1278 | 134000 | inf | 0.3381 |
| 0.1858 | 5.1661 | 135000 | inf | 0.3342 |
| 0.1729 | 5.2043 | 136000 | inf | 0.3325 |
| 0.1843 | 5.2426 | 137000 | inf | 0.3314 |
| 0.1828 | 5.2809 | 138000 | inf | 0.3338 |
| 0.1878 | 5.3191 | 139000 | inf | 0.3299 |
| 0.1784 | 5.3574 | 140000 | inf | 0.3305 |
| 0.1791 | 5.3957 | 141000 | inf | 0.3263 |
| 0.1861 | 5.4340 | 142000 | inf | 0.3238 |
| 0.176 | 5.4722 | 143000 | inf | 0.3245 |
| 0.1821 | 5.5105 | 144000 | inf | 0.3216 |
| 0.176 | 5.5488 | 145000 | inf | 0.3245 |
| 0.1799 | 5.5870 | 146000 | inf | 0.3251 |
| 0.1696 | 5.6253 | 147000 | inf | 0.3222 |
| 0.1711 | 5.6636 | 148000 | inf | 0.3243 |
| 0.1794 | 5.7018 | 149000 | inf | 0.3212 |
| 0.1806 | 5.7401 | 150000 | inf | 0.3201 |
| 0.1736 | 5.7784 | 151000 | inf | 0.3236 |
| 0.1664 | 5.8166 | 152000 | inf | 0.3222 |
| 0.1704 | 5.8549 | 153000 | inf | 0.3200 |
| 0.1713 | 5.8932 | 154000 | inf | 0.3300 |
| 0.1701 | 5.9314 | 155000 | inf | 0.3172 |
| 0.1687 | 5.9697 | 156000 | inf | 0.3186 |
| 0.1543 | 6.0080 | 157000 | inf | 0.3141 |
| 0.142 | 6.0462 | 158000 | inf | 0.3166 |
| 0.1438 | 6.0845 | 159000 | inf | 0.3156 |
| 0.1433 | 6.1228 | 160000 | inf | 0.3159 |
| 0.1442 | 6.1610 | 161000 | inf | 0.3143 |
| 0.1494 | 6.1993 | 162000 | inf | 0.3107 |
| 0.1355 | 6.2376 | 163000 | inf | 0.3166 |
| 0.1403 | 6.2758 | 164000 | inf | 0.3117 |
| 0.1435 | 6.3141 | 165000 | inf | 0.3124 |
| 0.1446 | 6.3524 | 166000 | inf | 0.3123 |
| 0.1385 | 6.3906 | 167000 | inf | 0.3140 |
| 0.1437 | 6.4289 | 168000 | inf | 0.3103 |
| 0.1328 | 6.4672 | 169000 | inf | 0.3102 |
| 0.1354 | 6.5054 | 170000 | inf | 0.3112 |
| 0.1394 | 6.5437 | 171000 | inf | 0.3094 |
| 0.1385 | 6.5820 | 172000 | inf | 0.3055 |
| 0.138 | 6.6202 | 173000 | inf | 0.3055 |
| 0.138 | 6.6585 | 174000 | inf | 0.3061 |
| 0.1313 | 6.6968 | 175000 | inf | 0.3061 |
| 0.1427 | 6.7350 | 176000 | inf | 0.3083 |
| 0.1432 | 6.7733 | 177000 | inf | 0.3048 |
| 0.136 | 6.8116 | 178000 | inf | 0.3039 |
| 0.1424 | 6.8498 | 179000 | inf | 0.3016 |
| 0.1347 | 6.8881 | 180000 | inf | 0.3039 |
| 0.1307 | 6.9264 | 181000 | inf | 0.3029 |
| 0.1293 | 6.9646 | 182000 | inf | 0.3026 |
| 0.1259 | 7.0029 | 183000 | inf | 0.3025 |
| 0.1151 | 7.0412 | 184000 | inf | 0.3034 |
| 0.1143 | 7.0794 | 185000 | inf | 0.3025 |
| 0.1105 | 7.1177 | 186000 | inf | 0.3006 |
| 0.1126 | 7.1560 | 187000 | inf | 0.3006 |
| 0.1139 | 7.1942 | 188000 | inf | 0.2996 |
| 0.1101 | 7.2325 | 189000 | inf | 0.2982 |
| 0.1187 | 7.2708 | 190000 | inf | 0.2988 |
| 0.1174 | 7.3090 | 191000 | inf | 0.2993 |
| 0.1132 | 7.3473 | 192000 | inf | 0.2996 |
| 0.1108 | 7.3856 | 193000 | inf | 0.2995 |
| 0.1119 | 7.4238 | 194000 | inf | 0.2991 |
| 0.1098 | 7.4621 | 195000 | inf | 0.2985 |
| 0.1053 | 7.5004 | 196000 | inf | 0.2977 |
| 0.11 | 7.5386 | 197000 | inf | 0.2975 |
| 0.1091 | 7.5769 | 198000 | inf | 0.2959 |
| 0.108 | 7.6152 | 199000 | inf | 0.2963 |
| 0.1077 | 7.6535 | 200000 | inf | 0.2962 |
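
The Wer column is word error rate on the validation set (lower is better; the final 0.2962 is roughly 29.6%). Below is a minimal sketch of computing the same metric offline, assuming the `evaluate` library and plain reference/hypothesis transcripts; the Welsh toy strings are illustrative only (the `cy` suffix in the model name is the ISO code for Welsh).

```python
import evaluate

wer_metric = evaluate.load("wer")

# Toy example: references are ground-truth transcripts, predictions are model output.
references = ["mae hi'n braf heddiw"]
predictions = ["mae hin braf heddiw"]

# One substituted word out of four gives WER = 0.25.
wer = wer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}")
```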
### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1