bambara-mms-20-hours-oza75bambara-asr-hf

This model is a fine-tuned version of facebook/mms-1b-all. The training dataset is not documented here, although the model name suggests roughly 20 hours of Bambara speech from the oza75 Bambara ASR dataset. It achieves the following results on the evaluation set:

  • Loss: 2.0263
  • WER: 0.4776
  • CER: 0.2282

Model description

This checkpoint adapts facebook/mms-1b-all, Meta's massively multilingual wav2vec 2.0 speech model (about 965M parameters, stored as F32 Safetensors), to Bambara automatic speech recognition.

Intended uses & limitations

More information needed
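
In the absence of documented usage guidance, the following is a minimal inference sketch, assuming standard transformers CTC inference for MMS-family models. The repo id is taken from this card; the audio file path is a hypothetical placeholder.

```python
# Minimal inference sketch (an assumption, not an official usage example).
# Assumes standard greedy CTC decoding for MMS-family wav2vec 2.0 models.
import soundfile as sf
import torch
from transformers import AutoModelForCTC, AutoProcessor

model_id = "asr-africa/bambara-mms-20-hours-oza75bambara-asr-hf"  # repo id from this card
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# "sample.wav" is a hypothetical file; MMS models expect 16 kHz mono audio.
audio, sample_rate = sf.read("sample.wav")
inputs = processor(audio, sampling_rate=sample_rate, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the most likely token at each frame,
# then collapse repeats and blanks in batch_decode.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```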

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 50
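
The training script itself is not published in this card. As a reference point, the sketch below shows how these values might map onto transformers TrainingArguments; the field names and output directory are assumptions, and the Adam betas and epsilon are the library defaults, matching the values reported above.

```python
# Hedged sketch: the reported hyperparameters expressed as TrainingArguments.
# Field names are assumptions; the actual training script is not published here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bambara-mms-20-hours-oza75bambara-asr-hf",  # hypothetical
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,  # 8 x 2 = total train batch size of 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=50,
    adam_beta1=0.9,    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the
    adam_beta2=0.999,  # transformers defaults, matching the values above
    adam_epsilon=1e-8,
)
```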

Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER    | CER    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 1.5836        | 0.4167  | 500   | 1.3427          | 0.7568 | 0.3696 |
| 1.4893        | 0.8333  | 1000  | 1.3016          | 0.7671 | 0.3789 |
| 1.4043        | 1.25    | 1500  | 1.1957          | 0.7125 | 0.3471 |
| 1.377         | 1.6667  | 2000  | 1.1697          | 0.7138 | 0.3396 |
| 1.3016        | 2.0833  | 2500  | 1.1237          | 0.6892 | 0.3352 |
| 1.2902        | 2.5     | 3000  | 1.1183          | 0.6653 | 0.3203 |
| 1.2622        | 2.9167  | 3500  | 1.0832          | 0.6598 | 0.3265 |
| 1.1992        | 3.3333  | 4000  | 1.0645          | 0.6525 | 0.3101 |
| 1.2156        | 3.75    | 4500  | 1.0081          | 0.6232 | 0.3013 |
| 1.1711        | 4.1667  | 5000  | 1.0372          | 0.6438 | 0.3067 |
| 1.148         | 4.5833  | 5500  | 1.0092          | 0.6264 | 0.2950 |
| 1.1433        | 5.0     | 6000  | 1.0095          | 0.6200 | 0.2958 |
| 1.1042        | 5.4167  | 6500  | 1.0094          | 0.6265 | 0.3055 |
| 1.0931        | 5.8333  | 7000  | 0.9810          | 0.6115 | 0.2890 |
| 1.0745        | 6.25    | 7500  | 0.9724          | 0.6200 | 0.2899 |
| 1.0488        | 6.6667  | 8000  | 0.9712          | 0.5995 | 0.2814 |
| 1.0608        | 7.0833  | 8500  | 0.9923          | 0.6103 | 0.2938 |
| 1.0132        | 7.5     | 9000  | 0.9634          | 0.5933 | 0.2852 |
| 1.0209        | 7.9167  | 9500  | 0.9360          | 0.5860 | 0.2786 |
| 0.9742        | 8.3333  | 10000 | 0.9615          | 0.6037 | 0.2950 |
| 0.9778        | 8.75    | 10500 | 0.9622          | 0.5878 | 0.2831 |
| 0.9595        | 9.1667  | 11000 | 0.9613          | 0.5834 | 0.2734 |
| 0.9298        | 9.5833  | 11500 | 0.9635          | 0.5758 | 0.2707 |
| 0.9464        | 10.0    | 12000 | 0.9312          | 0.5862 | 0.2719 |
| 0.8894        | 10.4167 | 12500 | 1.0033          | 0.5756 | 0.2821 |
| 0.9103        | 10.8333 | 13000 | 0.9104          | 0.5697 | 0.2733 |
| 0.8705        | 11.25   | 13500 | 0.9330          | 0.5572 | 0.2626 |
| 0.8669        | 11.6667 | 14000 | 0.9328          | 0.5581 | 0.2684 |
| 0.8551        | 12.0833 | 14500 | 0.9438          | 0.5747 | 0.2721 |
| 0.8189        | 12.5    | 15000 | 0.9608          | 0.5536 | 0.2617 |
| 0.8239        | 12.9167 | 15500 | 0.9347          | 0.5511 | 0.2664 |
| 0.7917        | 13.3333 | 16000 | 0.9275          | 0.5422 | 0.2617 |
| 0.7978        | 13.75   | 16500 | 0.9308          | 0.5539 | 0.2663 |
| 0.783         | 14.1667 | 17000 | 0.9918          | 0.5460 | 0.2586 |
| 0.7641        | 14.5833 | 17500 | 0.9407          | 0.5579 | 0.2663 |
| 0.7743        | 15.0    | 18000 | 0.9389          | 0.5487 | 0.2650 |
| 0.7221        | 15.4167 | 18500 | 0.9997          | 0.5353 | 0.2564 |
| 0.7293        | 15.8333 | 19000 | 0.9576          | 0.5547 | 0.2643 |
| 0.7073        | 16.25   | 19500 | 0.9581          | 0.5441 | 0.2585 |
| 0.696         | 16.6667 | 20000 | 0.9683          | 0.5376 | 0.2538 |
| 0.6889        | 17.0833 | 20500 | 1.0302          | 0.5393 | 0.2580 |
| 0.6594        | 17.5    | 21000 | 1.0046          | 0.5283 | 0.2479 |
| 0.6618        | 17.9167 | 21500 | 0.9584          | 0.5340 | 0.2557 |
| 0.637         | 18.3333 | 22000 | 0.9824          | 0.5215 | 0.2466 |
| 0.6391        | 18.75   | 22500 | 1.0197          | 0.5219 | 0.2478 |
| 0.6243        | 19.1667 | 23000 | 0.9848          | 0.5316 | 0.2517 |
| 0.6038        | 19.5833 | 23500 | 1.0653          | 0.5180 | 0.2462 |
| 0.6181        | 20.0    | 24000 | 0.9717          | 0.5252 | 0.2516 |
| 0.5748        | 20.4167 | 24500 | 1.0184          | 0.5163 | 0.2473 |
| 0.5895        | 20.8333 | 25000 | 1.0450          | 0.5275 | 0.2509 |
| 0.5541        | 21.25   | 25500 | 1.0550          | 0.5271 | 0.2478 |
| 0.5444        | 21.6667 | 26000 | 1.0680          | 0.5168 | 0.2505 |
| 0.5562        | 22.0833 | 26500 | 1.0354          | 0.5331 | 0.2522 |
| 0.5186        | 22.5    | 27000 | 1.1237          | 0.5165 | 0.2452 |
| 0.5349        | 22.9167 | 27500 | 1.0960          | 0.5065 | 0.2450 |
| 0.4976        | 23.3333 | 28000 | 1.0851          | 0.5164 | 0.2462 |
| 0.4993        | 23.75   | 28500 | 1.1112          | 0.5139 | 0.2443 |
| 0.4936        | 24.1667 | 29000 | 1.1059          | 0.5124 | 0.2427 |
| 0.475         | 24.5833 | 29500 | 1.1079          | 0.5180 | 0.2458 |
| 0.4787        | 25.0    | 30000 | 1.0897          | 0.5120 | 0.2435 |
| 0.4469        | 25.4167 | 30500 | 1.1467          | 0.5095 | 0.2508 |
| 0.4548        | 25.8333 | 31000 | 1.1535          | 0.5208 | 0.2482 |
| 0.4414        | 26.25   | 31500 | 1.1973          | 0.5155 | 0.2460 |
| 0.416         | 26.6667 | 32000 | 1.1981          | 0.5191 | 0.2471 |
| 0.4317        | 27.0833 | 32500 | 1.2177          | 0.5108 | 0.2416 |
| 0.4014        | 27.5    | 33000 | 1.2161          | 0.5173 | 0.2482 |
| 0.4063        | 27.9167 | 33500 | 1.1857          | 0.5129 | 0.2457 |
| 0.3817        | 28.3333 | 34000 | 1.3049          | 0.5098 | 0.2444 |
| 0.3828        | 28.75   | 34500 | 1.1981          | 0.5180 | 0.2467 |
| 0.381         | 29.1667 | 35000 | 1.2967          | 0.5086 | 0.2410 |
| 0.3592        | 29.5833 | 35500 | 1.3089          | 0.5088 | 0.2433 |
| 0.3709        | 30.0    | 36000 | 1.2478          | 0.5070 | 0.2394 |
| 0.3422        | 30.4167 | 36500 | 1.3559          | 0.5067 | 0.2418 |
| 0.3442        | 30.8333 | 37000 | 1.3932          | 0.5066 | 0.2428 |
| 0.3304        | 31.25   | 37500 | 1.3619          | 0.5113 | 0.2454 |
| 0.3278        | 31.6667 | 38000 | 1.3764          | 0.5068 | 0.2439 |
| 0.3208        | 32.0833 | 38500 | 1.3906          | 0.4952 | 0.2378 |
| 0.3069        | 32.5    | 39000 | 1.4492          | 0.4996 | 0.2383 |
| 0.3041        | 32.9167 | 39500 | 1.4623          | 0.5009 | 0.2395 |
| 0.2938        | 33.3333 | 40000 | 1.4528          | 0.5022 | 0.2375 |
| 0.2931        | 33.75   | 40500 | 1.4347          | 0.5000 | 0.2392 |
| 0.2807        | 34.1667 | 41000 | 1.4737          | 0.4997 | 0.2382 |
| 0.2735        | 34.5833 | 41500 | 1.4860          | 0.5050 | 0.2388 |
| 0.2654        | 35.0    | 42000 | 1.5288          | 0.4930 | 0.2374 |
| 0.2561        | 35.4167 | 42500 | 1.5550          | 0.4983 | 0.2376 |
| 0.2626        | 35.8333 | 43000 | 1.4984          | 0.4931 | 0.2410 |
| 0.2478        | 36.25   | 43500 | 1.5513          | 0.4964 | 0.2366 |
| 0.2466        | 36.6667 | 44000 | 1.6438          | 0.4897 | 0.2377 |
| 0.2417        | 37.0833 | 44500 | 1.5970          | 0.4918 | 0.2367 |
| 0.2323        | 37.5    | 45000 | 1.5682          | 0.4945 | 0.2380 |
| 0.2351        | 37.9167 | 45500 | 1.5746          | 0.5009 | 0.2380 |
| 0.2213        | 38.3333 | 46000 | 1.6518          | 0.4922 | 0.2362 |
| 0.2191        | 38.75   | 46500 | 1.6544          | 0.4918 | 0.2359 |
| 0.2194        | 39.1667 | 47000 | 1.6425          | 0.4947 | 0.2370 |
| 0.209         | 39.5833 | 47500 | 1.7225          | 0.4851 | 0.2337 |
| 0.2114        | 40.0    | 48000 | 1.6719          | 0.4879 | 0.2331 |
| 0.2011        | 40.4167 | 48500 | 1.7090          | 0.4902 | 0.2327 |
| 0.1983        | 40.8333 | 49000 | 1.7086          | 0.4853 | 0.2321 |
| 0.1969        | 41.25   | 49500 | 1.7467          | 0.4841 | 0.2310 |
| 0.1896        | 41.6667 | 50000 | 1.7615          | 0.4843 | 0.2300 |
| 0.1834        | 42.0833 | 50500 | 1.7834          | 0.4854 | 0.2325 |
| 0.1831        | 42.5    | 51000 | 1.8245          | 0.4809 | 0.2289 |
| 0.1827        | 42.9167 | 51500 | 1.8147          | 0.4832 | 0.2329 |
| 0.1718        | 43.3333 | 52000 | 1.8478          | 0.4790 | 0.2289 |
| 0.1734        | 43.75   | 52500 | 1.8383          | 0.4832 | 0.2305 |
| 0.1675        | 44.1667 | 53000 | 1.9167          | 0.4862 | 0.2315 |
| 0.1669        | 44.5833 | 53500 | 1.9083          | 0.4847 | 0.2303 |
| 0.1682        | 45.0    | 54000 | 1.8658          | 0.4794 | 0.2275 |
| 0.1633        | 45.4167 | 54500 | 1.9068          | 0.4811 | 0.2290 |
| 0.1606        | 45.8333 | 55000 | 1.9087          | 0.4806 | 0.2279 |
| 0.1519        | 46.25   | 55500 | 1.9602          | 0.4845 | 0.2299 |
| 0.1538        | 46.6667 | 56000 | 1.9242          | 0.4803 | 0.2305 |
| 0.1526        | 47.0833 | 56500 | 1.9976          | 0.4821 | 0.2302 |
| 0.1519        | 47.5    | 57000 | 2.0323          | 0.4841 | 0.2305 |
| 0.1525        | 47.9167 | 57500 | 1.9856          | 0.4802 | 0.2282 |
| 0.1472        | 48.3333 | 58000 | 2.0402          | 0.4792 | 0.2295 |
| 0.1479        | 48.75   | 58500 | 1.9940          | 0.4798 | 0.2282 |
| 0.1408        | 49.1667 | 59000 | 2.0308          | 0.4784 | 0.2282 |
| 0.1407        | 49.5833 | 59500 | 2.0308          | 0.4776 | 0.2282 |
| 0.1468        | 50.0    | 60000 | 2.0263          | 0.4776 | 0.2282 |
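
The WER and CER columns are word- and character-error rate on the validation set; note that validation loss rises steadily after roughly epoch 10 even as WER and CER continue to improve slowly. As a reference, the following is a minimal sketch of computing these metrics with the Hugging Face evaluate library; the example strings are hypothetical.

```python
# Hedged sketch: computing WER/CER like the values reported above,
# using the Hugging Face `evaluate` library.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["an ka taa"]    # hypothetical model transcription
references = ["an ka taa so"]  # hypothetical ground-truth transcript

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```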

Framework versions

  • Transformers 4.45.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.17.0
  • Tokenizers 0.20.3