bambara_mms_20_hour_jeli_asr_dataset

This model is a fine-tuned version of facebook/mms-1b-all on 20 hours of Bambara speech from the Jeli-ASR dataset, as indicated by the model name. It achieves the following results on the evaluation set (a usage sketch follows the metric list):

  • Loss: 2.5960
  • WER: 0.1951
  • CER: 0.0927
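
A minimal inference sketch, assuming the standard transformers and torchaudio APIs (`sample.wav` is a placeholder path; the 16 kHz resampling reflects the sampling rate the Wav2Vec2/MMS feature extractor expects):

```python
import torch
import torchaudio
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "asr-africa/bambara_mms_20_hour_jeli_asr_dataset"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz the feature extractor expects.
waveform, sr = torchaudio.load("sample.wav")  # placeholder path
if sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```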

Model description

facebook/mms-1b-all is Meta's Massively Multilingual Speech (MMS) wav2vec 2.0 CTC model; this checkpoint adapts it to Bambara automatic speech recognition. The model has roughly 965M parameters, stored as F32 safetensors.

Intended uses & limitations

Intended for automatic speech recognition of Bambara speech. Beyond the evaluation metrics reported above, limitations (domain, speaker, and recording-condition sensitivity) have not been documented.

Training and evaluation data

Per the model name, training used the 20-hour Bambara Jeli-ASR dataset; the exact train/evaluation split is not documented.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a code sketch reproducing them follows this list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 50
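
These settings map directly onto transformers' TrainingArguments; a sketch reproducing them (output_dir is a placeholder, and the Adam betas and epsilon listed above are the library defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bambara_mms_20_hour_jeli_asr_dataset",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=50,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the transformers defaults.
)
```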

Training results

Training Loss | Epoch | Step | Validation Loss | WER | CER
2.3781 0.4365 500 1.7790 0.9007 0.4152
1.5271 0.8730 1000 1.4809 0.7424 0.3462
1.4216 1.3095 1500 1.4134 0.8316 0.4020
1.3995 1.7460 2000 1.3669 0.7422 0.3379
1.3521 2.1825 2500 1.2531 0.6969 0.3176
1.3161 2.6189 3000 1.3363 0.6643 0.3061
1.2759 3.0554 3500 1.2953 0.6607 0.3018
1.2604 3.4919 4000 1.2630 0.6280 0.2889
1.2424 3.9284 4500 1.4642 0.6308 0.2847
1.1639 4.3649 5000 1.3593 0.6216 0.2797
1.2122 4.8014 5500 1.2043 0.5966 0.2716
1.1504 5.2379 6000 1.1364 0.5892 0.2688
1.156 5.6744 6500 1.1405 0.6049 0.2726
1.1371 6.1109 7000 1.1854 0.5817 0.2633
1.0981 6.5474 7500 1.1306 0.5876 0.2625
1.1021 6.9838 8000 1.1026 0.5910 0.2659
1.0295 7.4203 8500 1.2506 0.5671 0.2600
1.0702 7.8568 9000 1.1672 0.5521 0.2529
1.0277 8.2933 9500 1.0966 0.5616 0.2544
1.0023 8.7298 10000 1.1389 0.5353 0.2403
0.9945 9.1663 10500 1.3434 0.5302 0.2420
0.9533 9.6028 11000 1.1546 0.5391 0.2517
0.9675 10.0393 11500 1.1966 0.5355 0.2451
0.9061 10.4758 12000 1.1808 0.5116 0.2310
0.9243 10.9123 12500 1.1189 0.5095 0.2290
0.8834 11.3488 13000 1.2189 0.4979 0.2226
0.8819 11.7852 13500 1.2035 0.4910 0.2158
0.8522 12.2217 14000 1.1385 0.4961 0.2173
0.8417 12.6582 14500 1.1060 0.4787 0.2110
0.8352 13.0947 15000 1.1295 0.4957 0.2237
0.7857 13.5312 15500 1.0946 0.4814 0.2142
0.7963 13.9677 16000 1.0891 0.4844 0.2240
0.762 14.4042 16500 1.0606 0.4832 0.2177
0.7594 14.8407 17000 1.0415 0.4529 0.1992
0.7368 15.2772 17500 1.0882 0.4399 0.1930
0.7158 15.7137 18000 1.0872 0.4521 0.1972
0.7102 16.1502 18500 1.0949 0.4259 0.1842
0.6789 16.5866 19000 1.1207 0.4138 0.1821
0.6898 17.0231 19500 1.1287 0.4105 0.1792
0.6463 17.4596 20000 1.2131 0.4103 0.1793
0.6525 17.8961 20500 1.1986 0.4001 0.1733
0.6116 18.3326 21000 1.2255 0.4058 0.1778
0.6138 18.7691 21500 1.2027 0.3946 0.1762
0.6033 19.2056 22000 1.1681 0.3870 0.1686
0.5816 19.6421 22500 1.1464 0.3822 0.1662
0.5826 20.0786 23000 1.1767 0.3817 0.1651
0.5504 20.5151 23500 1.2805 0.3796 0.1664
0.5656 20.9515 24000 1.1895 0.3634 0.1581
0.5204 21.3880 24500 1.2111 0.3569 0.1531
0.5186 21.8245 25000 1.2840 0.3526 0.1541
0.5074 22.2610 25500 1.2123 0.3564 0.1558
0.4966 22.6975 26000 1.1740 0.3467 0.1511
0.4886 23.1340 26500 1.3208 0.3351 0.1459
0.4628 23.5705 27000 1.3905 0.3277 0.1439
0.4743 24.0070 27500 1.3396 0.3378 0.1469
0.4408 24.4435 28000 1.3767 0.3164 0.1384
0.4457 24.8800 28500 1.2607 0.3231 0.1364
0.4242 25.3165 29000 1.2562 0.3181 0.1383
0.4279 25.7529 29500 1.2523 0.3198 0.1379
0.4116 26.1894 30000 1.3625 0.3086 0.1332
0.3963 26.6259 30500 1.2143 0.3132 0.1346
0.3945 27.0624 31000 1.2973 0.2993 0.1320
0.3733 27.4989 31500 1.3542 0.2955 0.1296
0.3816 27.9354 32000 1.3804 0.2946 0.1307
0.3487 28.3719 32500 1.4206 0.2841 0.1233
0.3521 28.8084 33000 1.4294 0.2819 0.1236
0.3351 29.2449 33500 1.5658 0.2797 0.1218
0.3285 29.6814 34000 1.5103 0.2803 0.1235
0.3253 30.1179 34500 1.4957 0.2704 0.1209
0.308 30.5543 35000 1.6964 0.2648 0.1173
0.3184 30.9908 35500 1.4796 0.2609 0.1153
0.2941 31.4273 36000 1.5527 0.2597 0.1169
0.2897 31.8638 36500 1.5907 0.2574 0.1150
0.2883 32.3003 37000 1.5718 0.2536 0.1132
0.2792 32.7368 37500 1.5505 0.2527 0.1134
0.2773 33.1733 38000 1.6607 0.2480 0.1102
0.2579 33.6098 38500 1.8962 0.2461 0.1108
0.2658 34.0463 39000 1.9136 0.2426 0.1116
0.2539 34.4828 39500 1.9131 0.2440 0.1113
0.2501 34.9192 40000 1.7290 0.2368 0.1083
0.2358 35.3557 40500 1.9586 0.2309 0.1059
0.2395 35.7922 41000 1.7617 0.2295 0.1040
0.2267 36.2287 41500 1.8779 0.2264 0.1021
0.2239 36.6652 42000 1.8696 0.2248 0.1019
0.2183 37.1017 42500 1.8445 0.2229 0.1026
0.2149 37.5382 43000 1.9865 0.2265 0.1041
0.2125 37.9747 43500 1.8998 0.2252 0.1039
0.2001 38.4112 44000 2.0591 0.2214 0.1010
0.204 38.8477 44500 1.9436 0.2145 0.1004
0.1967 39.2842 45000 2.0536 0.2148 0.0994
0.1902 39.7206 45500 2.0584 0.2161 0.1000
0.1862 40.1571 46000 2.0744 0.2146 0.1004
0.182 40.5936 46500 2.0731 0.2108 0.0982
0.1876 41.0301 47000 2.0895 0.2096 0.0978
0.1744 41.4666 47500 2.2355 0.2038 0.0956
0.178 41.9031 48000 2.1099 0.2099 0.0969
0.1677 42.3396 48500 2.2260 0.2043 0.0959
0.1688 42.7761 49000 2.2102 0.2046 0.0949
0.1676 43.2126 49500 2.2165 0.2047 0.0965
0.1589 43.6491 50000 2.2741 0.2016 0.0937
0.1613 44.0856 50500 2.2069 0.2021 0.0947
0.1529 44.5220 51000 2.3035 0.2018 0.0958
0.1513 44.9585 51500 2.4750 0.2007 0.0957
0.1515 45.3950 52000 2.5079 0.2011 0.0956
0.1463 45.8315 52500 2.5247 0.1994 0.0944
0.1433 46.2680 53000 2.5564 0.1968 0.0934
0.143 46.7045 53500 2.5230 0.1970 0.0933
0.1399 47.1410 54000 2.5532 0.1954 0.0927
0.1377 47.5775 54500 2.4811 0.1985 0.0933
0.1358 48.0140 55000 2.6065 0.1980 0.0934
0.1373 48.4505 55500 2.5808 0.1951 0.0926
0.1327 48.8869 56000 2.5458 0.1954 0.0930
0.1329 49.3234 56500 2.5948 0.1942 0.0927
0.1361 49.7599 57000 2.5960 0.1951 0.0927
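
The WER and CER columns follow the standard word- and character-error-rate definitions and can be computed with the evaluate library; a minimal sketch (the transcripts below are illustrative placeholders, not taken from the evaluation set):

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Illustrative placeholder transcripts, not from the evaluation set.
references = ["i ni ce"]
predictions = ["i ni se"]

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```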

Framework versions

  • Transformers 4.45.1
  • PyTorch 2.1.0+cu118
  • Datasets 2.17.0
  • Tokenizers 0.20.3