---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xlsr-53-ft-btb-ccv-cy
  results: []
---

# wav2vec2-xlsr-53-ft-btb-ccv-cy

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.3890
- Wer: 0.3056
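The reported Wer is the word error rate: the word-level edit distance between the model's transcription and the reference, divided by the number of reference words (so 0.3056 means roughly three errors per ten words). The card's figure comes from the training script's `wer` metric; the following is only a minimal pure-Python sketch of the definition, with made-up Welsh example strings:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance (substitutions,
    insertions, deletions) divided by the number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Single-row dynamic-programming Levenshtein distance over words.
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            prev, d[j] = d[j], min(d[j] + 1,         # deletion
                                   d[j - 1] + 1,     # insertion
                                   prev + (r != h))  # substitution or match
    return d[-1] / len(ref)

# One substituted word out of four -> WER 0.25
print(wer("mae hi yn braf", "mae hi yn wych"))
```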

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 100000
- mixed_precision_training: Native AMP
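With `lr_scheduler_type: linear` and 500 warmup steps, the learning rate ramps from 0 up to 0.0003 over the first 500 steps, then decays linearly back to 0 by step 100000. The actual schedule is produced by the Transformers trainer; this is just a sketch of the curve implied by the values above:

```python
def linear_schedule_lr(step: int, base_lr: float = 3e-4,
                       warmup_steps: int = 500,
                       total_steps: int = 100_000) -> float:
    """Learning rate at a given optimizer step: linear warmup from 0
    to base_lr, then linear decay from base_lr back down to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_schedule_lr(250))      # halfway through warmup: 1.5e-4
print(linear_schedule_lr(500))      # peak learning rate: 3e-4
print(linear_schedule_lr(100_000))  # end of training: 0.0
```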

### Training results

| Training Loss | Epoch  | Step   | Validation Loss | Wer    |
|:-------------:|:------:|:------:|:---------------:|:------:|
| 1.043         | 0.0393 | 1000   | 1.1074          | 0.7791 |
| 0.716         | 0.0786 | 2000   | 0.9459          | 0.7145 |
| 0.6667        | 0.1179 | 3000   | 0.8828          | 0.6616 |
| 0.6238        | 0.1572 | 4000   | 0.8482          | 0.6633 |
| 0.591         | 0.1965 | 5000   | 0.8121          | 0.6215 |
| 0.588         | 0.2358 | 6000   | 0.7926          | 0.5952 |
| 0.5732        | 0.2751 | 7000   | 0.7582          | 0.5707 |
| 0.5557        | 0.3144 | 8000   | 0.7591          | 0.5600 |
| 0.5587        | 0.3536 | 9000   | 0.7245          | 0.5741 |
| 0.531         | 0.3929 | 10000  | 0.7107          | 0.5469 |
| 0.5275        | 0.4322 | 11000  | 0.7102          | 0.5448 |
| 0.5101        | 0.4715 | 12000  | 0.6905          | 0.5404 |
| 0.5215        | 0.5108 | 13000  | 0.6824          | 0.5255 |
| 0.5293        | 0.5501 | 14000  | 0.6682          | 0.5163 |
| 0.4981        | 0.5894 | 15000  | 0.6615          | 0.5140 |
| 0.4891        | 0.6287 | 16000  | 0.6634          | 0.5249 |
| 0.4813        | 0.6680 | 17000  | 0.6469          | 0.5105 |
| 0.4799        | 0.7073 | 18000  | 0.6421          | 0.5014 |
| 0.4799        | 0.7466 | 19000  | 0.6144          | 0.4819 |
| 0.471         | 0.7859 | 20000  | 0.6184          | 0.4914 |
| 0.4644        | 0.8252 | 21000  | 0.6190          | 0.4983 |
| 0.4645        | 0.8645 | 22000  | 0.6085          | 0.4784 |
| 0.4506        | 0.9038 | 23000  | 0.6067          | 0.4700 |
| 0.4439        | 0.9431 | 24000  | 0.5994          | 0.4773 |
| 0.4476        | 0.9824 | 25000  | 0.5946          | 0.4710 |
| 0.3905        | 1.0217 | 26000  | 0.5904          | 0.4539 |
| 0.3807        | 1.0609 | 27000  | 0.5839          | 0.4573 |
| 0.3782        | 1.1002 | 28000  | 0.5694          | 0.4477 |
| 0.3777        | 1.1395 | 29000  | 0.5712          | 0.4581 |
| 0.3879        | 1.1788 | 30000  | 0.5694          | 0.4539 |
| 0.3817        | 1.2181 | 31000  | 0.5558          | 0.4414 |
| 0.3755        | 1.2574 | 32000  | 0.5634          | 0.4343 |
| 0.3629        | 1.2967 | 33000  | 0.5455          | 0.4342 |
| 0.3636        | 1.3360 | 34000  | 0.5472          | 0.4346 |
| 0.3566        | 1.3753 | 35000  | 0.5467          | 0.4322 |
| 0.3683        | 1.4146 | 36000  | 0.5441          | 0.4325 |
| 0.3581        | 1.4539 | 37000  | 0.5279          | 0.4192 |
| 0.3448        | 1.4932 | 38000  | 0.5341          | 0.4195 |
| 0.3558        | 1.5325 | 39000  | 0.5194          | 0.4212 |
| 0.3492        | 1.5718 | 40000  | 0.5243          | 0.4139 |
| 0.3461        | 1.6111 | 41000  | 0.5144          | 0.4046 |
| 0.3412        | 1.6504 | 42000  | 0.5345          | 0.4237 |
| 0.3424        | 1.6897 | 43000  | 0.5192          | 0.4092 |
| 0.341         | 1.7289 | 44000  | 0.5131          | 0.4056 |
| 0.3428        | 1.7682 | 45000  | 0.5110          | 0.4030 |
| 0.3337        | 1.8075 | 46000  | 0.5063          | 0.4051 |
| 0.3286        | 1.8468 | 47000  | 0.5045          | 0.3976 |
| 0.3422        | 1.8861 | 48000  | 0.4938          | 0.4027 |
| 0.3271        | 1.9254 | 49000  | 0.4979          | 0.3910 |
| 0.3313        | 1.9647 | 50000  | 0.4907          | 0.3974 |
| 0.3069        | 2.0040 | 51000  | 0.4899          | 0.3852 |
| 0.2771        | 2.0433 | 52000  | 0.4836          | 0.3845 |
| 0.2705        | 2.0826 | 53000  | 0.4929          | 0.3825 |
| 0.2654        | 2.1219 | 54000  | 0.4843          | 0.3813 |
| 0.2794        | 2.1612 | 55000  | 0.4820          | 0.3781 |
| 0.2644        | 2.2005 | 56000  | 0.4742          | 0.3755 |
| 0.2624        | 2.2398 | 57000  | 0.4685          | 0.3681 |
| 0.2689        | 2.2791 | 58000  | 0.4650          | 0.3665 |
| 0.2584        | 2.3184 | 59000  | 0.4691          | 0.3658 |
| 0.2535        | 2.3577 | 60000  | 0.4627          | 0.3713 |
| 0.2623        | 2.3970 | 61000  | 0.4667          | 0.3670 |
| 0.2502        | 2.4362 | 62000  | 0.4592          | 0.3681 |
| 0.2593        | 2.4755 | 63000  | 0.4569          | 0.3676 |
| 0.2521        | 2.5148 | 64000  | 0.4576          | 0.3590 |
| 0.2415        | 2.5541 | 65000  | 0.4510          | 0.3542 |
| 0.2349        | 2.5934 | 66000  | 0.4454          | 0.3534 |
| 0.2482        | 2.6327 | 67000  | 0.4531          | 0.3586 |
| 0.2527        | 2.6720 | 68000  | 0.4418          | 0.3522 |
| 0.2473        | 2.7113 | 69000  | 0.4437          | 0.3583 |
| 0.2334        | 2.7506 | 70000  | 0.4338          | 0.3457 |
| 0.2314        | 2.7899 | 71000  | 0.4286          | 0.3456 |
| 0.2318        | 2.8292 | 72000  | 0.4275          | 0.3371 |
| 0.2347        | 2.8685 | 73000  | 0.4266          | 0.3408 |
| 0.2313        | 2.9078 | 74000  | 0.4238          | 0.3354 |
| 0.2253        | 2.9471 | 75000  | 0.4199          | 0.3316 |
| 0.217         | 2.9864 | 76000  | 0.4222          | 0.3333 |
| 0.194         | 3.0257 | 77000  | 0.4252          | 0.3342 |
| 0.181         | 3.0650 | 78000  | 0.4228          | 0.3364 |
| 0.187         | 3.1042 | 79000  | 0.4197          | 0.3356 |
| 0.1855        | 3.1435 | 80000  | 0.4215          | 0.3410 |
| 0.1886        | 3.1828 | 81000  | 0.4177          | 0.3319 |
| 0.1821        | 3.2221 | 82000  | 0.4128          | 0.3293 |
| 0.1786        | 3.2614 | 83000  | 0.4102          | 0.3226 |
| 0.1758        | 3.3007 | 84000  | 0.4147          | 0.3264 |
| 0.171         | 3.3400 | 85000  | 0.4131          | 0.3200 |
| 0.1767        | 3.3793 | 86000  | 0.4098          | 0.3172 |
| 0.1804        | 3.4186 | 87000  | 0.4091          | 0.3209 |
| 0.1699        | 3.4579 | 88000  | 0.4044          | 0.3179 |
| 0.1645        | 3.4972 | 89000  | 0.4041          | 0.3167 |
| 0.1707        | 3.5365 | 90000  | 0.4008          | 0.3202 |
| 0.1838        | 3.5758 | 91000  | 0.3981          | 0.3165 |
| 0.1653        | 3.6151 | 92000  | 0.3987          | 0.3132 |
| 0.1679        | 3.6544 | 93000  | 0.3982          | 0.3110 |
| 0.1631        | 3.6937 | 94000  | 0.3904          | 0.3074 |
| 0.1561        | 3.7330 | 95000  | 0.3934          | 0.3091 |
| 0.1699        | 3.7723 | 96000  | 0.3917          | 0.3068 |
| 0.1591        | 3.8115 | 97000  | 0.3918          | 0.3057 |
| 0.1609        | 3.8508 | 98000  | 0.3908          | 0.3050 |
| 0.1675        | 3.8901 | 99000  | 0.3902          | 0.3059 |
| 0.1666        | 3.9294 | 100000 | 0.3890          | 0.3056 |

### Framework versions

- Transformers 4.41.2
- PyTorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1