
wav2vec2-large-mms-1b-azz-adapter-all_data_10epochs

This model is a fine-tuned version of facebook/mms-1b-all on a dataset loaded with the `audiofolder` loader. It achieves the following results on the evaluation set:

  • Loss: 0.4129
  • WER: 0.3069
  • CER: 0.0941
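
To transcribe audio with this checkpoint, the standard `transformers` CTC inference loop applies. Below is a minimal sketch, assuming the repo id shown in this card; `example.wav` is a hypothetical 16 kHz mono recording:

```python
# Minimal inference sketch (untested). MMS models expect 16 kHz mono audio.
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "Lguyogiro/wav2vec2-large-mms-1b-azz-adapter-all_data_10epochs"

processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load and resample the audio to 16 kHz.
speech, _ = librosa.load("example.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding.
ids = torch.argmax(logits, dim=-1)[0]
print(processor.decode(ids))
```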

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 20
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 20
  • mixed_precision_training: Native AMP
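
For reproduction, the list above maps onto `transformers.TrainingArguments` roughly as follows. This is a hedged sketch, not the exact training script: `output_dir` is a placeholder, and `fp16=True` stands in for "Native AMP".

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-mms-1b-azz-adapter",  # placeholder path
    learning_rate=1e-3,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=20,
    fp16=True,  # "Native AMP" mixed-precision training
)
```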

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 1.0574 | 0.1294 | 200 | 0.9724 | 0.6573 | 0.1971 |
| 0.9322 | 0.2589 | 400 | 0.7179 | 0.5833 | 0.1726 |
| 0.9043 | 0.3883 | 600 | 0.7195 | 0.5468 | 0.1631 |
| 0.8735 | 0.5178 | 800 | 0.6374 | 0.5238 | 0.1556 |
| 0.773 | 0.6472 | 1000 | 0.6531 | 0.5076 | 0.1517 |
| 0.7799 | 0.7767 | 1200 | 0.6242 | 0.5147 | 0.1520 |
| 0.9846 | 0.9061 | 1400 | 0.5878 | 0.5035 | 0.1466 |
| 0.8117 | 1.0356 | 1600 | 0.5782 | 0.4827 | 0.1430 |
| 0.746 | 1.1650 | 1800 | 0.5716 | 0.4745 | 0.1407 |
| 0.7802 | 1.2945 | 2000 | 0.5586 | 0.4717 | 0.1390 |
| 0.9001 | 1.4239 | 2200 | 0.5548 | 0.4672 | 0.1400 |
| 0.7452 | 1.5534 | 2400 | 0.5631 | 0.4572 | 0.1366 |
| 0.676 | 1.6828 | 2600 | 0.5342 | 0.4490 | 0.1339 |
| 0.7153 | 1.8123 | 2800 | 0.5681 | 0.4486 | 0.1334 |
| 0.7538 | 1.9417 | 3000 | 0.5299 | 0.4455 | 0.1326 |
| 0.681 | 2.0712 | 3200 | 0.5385 | 0.4448 | 0.1316 |
| 0.652 | 2.2006 | 3400 | 0.5315 | 0.4323 | 0.1280 |
| 0.6037 | 2.3301 | 3600 | 0.5700 | 0.4439 | 0.1322 |
| 0.6534 | 2.4595 | 3800 | 0.5563 | 0.4281 | 0.1288 |
| 0.6403 | 2.5890 | 4000 | 0.5152 | 0.4275 | 0.1284 |
| 0.674 | 2.7184 | 4200 | 0.5150 | 0.4247 | 0.1265 |
| 0.7599 | 2.8479 | 4400 | 0.5017 | 0.4218 | 0.1259 |
| 0.6729 | 2.9773 | 4600 | 0.5041 | 0.4059 | 0.1220 |
| 0.5941 | 3.1068 | 4800 | 0.5145 | 0.4202 | 0.1272 |
| 0.6739 | 3.2362 | 5000 | 0.5028 | 0.4193 | 0.1241 |
| 0.6256 | 3.3657 | 5200 | 0.5290 | 0.4175 | 0.1227 |
| 0.7149 | 3.4951 | 5400 | 0.4998 | 0.4088 | 0.1226 |
| 0.6998 | 3.6246 | 5600 | 0.4919 | 0.4124 | 0.1231 |
| 0.6055 | 3.7540 | 5800 | 0.4959 | 0.4096 | 0.1206 |
| 0.6154 | 3.8835 | 6000 | 0.4943 | 0.4031 | 0.1216 |
| 0.7407 | 4.0129 | 6200 | 0.5014 | 0.4283 | 0.1234 |
| 0.6168 | 4.1424 | 6400 | 0.4956 | 0.3929 | 0.1173 |
| 0.6959 | 4.2718 | 6600 | 0.4954 | 0.3898 | 0.1174 |
| 0.6978 | 4.4013 | 6800 | 0.5512 | 0.3913 | 0.1179 |
| 0.9123 | 4.5307 | 7000 | 0.4915 | 0.3923 | 0.1170 |
| 0.5721 | 4.6602 | 7200 | 0.4913 | 0.4000 | 0.1209 |
| 0.7162 | 4.7896 | 7400 | 0.4745 | 0.3923 | 0.1167 |
| 0.6501 | 4.9191 | 7600 | 0.4815 | 0.3801 | 0.1145 |
| 0.6957 | 5.0485 | 7800 | 0.4814 | 0.4044 | 0.1183 |
| 0.5791 | 5.1780 | 8000 | 0.4766 | 0.3839 | 0.1152 |
| 0.6509 | 5.3074 | 8200 | 0.4840 | 0.4042 | 0.1203 |
| 0.6117 | 5.4369 | 8400 | 0.4800 | 0.3892 | 0.1163 |
| 0.7194 | 5.5663 | 8600 | 0.4920 | 0.3815 | 0.1149 |
| 0.8941 | 5.6958 | 8800 | 0.4678 | 0.3752 | 0.1130 |
| 0.5965 | 5.8252 | 9000 | 0.4702 | 0.3815 | 0.1143 |
| 0.6351 | 5.9547 | 9200 | 0.4703 | 0.3816 | 0.1148 |
| 0.5887 | 6.0841 | 9400 | 0.4665 | 0.3759 | 0.1138 |
| 0.5603 | 6.2136 | 9600 | 0.4866 | 0.3698 | 0.1112 |
| 0.6516 | 6.3430 | 9800 | 0.4685 | 0.3717 | 0.1124 |
| 0.6041 | 6.4725 | 10000 | 0.4708 | 0.3757 | 0.1131 |
| 0.5621 | 6.6019 | 10200 | 0.4669 | 0.3638 | 0.1102 |
| 0.6136 | 6.7314 | 10400 | 0.4792 | 0.3687 | 0.1113 |
| 0.5835 | 6.8608 | 10600 | 0.4657 | 0.3707 | 0.1119 |
| 0.5732 | 6.9903 | 10800 | 0.4723 | 0.3654 | 0.1109 |
| 0.6285 | 7.1197 | 11000 | 0.4668 | 0.3661 | 0.1094 |
| 0.6128 | 7.2492 | 11200 | 0.4785 | 0.3695 | 0.1129 |
| 0.5489 | 7.3786 | 11400 | 0.5141 | 0.3643 | 0.1105 |
| 0.5681 | 7.5081 | 11600 | 0.4582 | 0.3612 | 0.1093 |
| 0.4984 | 7.6375 | 11800 | 0.4705 | 0.3602 | 0.1083 |
| 0.8323 | 7.7670 | 12000 | 0.4689 | 0.3560 | 0.1073 |
| 0.5723 | 7.8964 | 12200 | 0.4647 | 0.3558 | 0.1064 |
| 0.62 | 8.0259 | 12400 | 0.4581 | 0.3555 | 0.1075 |
| 0.5197 | 8.1553 | 12600 | 0.4551 | 0.3538 | 0.1072 |
| 0.6087 | 8.2848 | 12800 | 0.4591 | 0.3643 | 0.1090 |
| 0.583 | 8.4142 | 13000 | 0.4526 | 0.3500 | 0.1066 |
| 0.7788 | 8.5437 | 13200 | 0.4548 | 0.3618 | 0.1084 |
| 0.6503 | 8.6731 | 13400 | 0.4511 | 0.3545 | 0.1056 |
| 0.7021 | 8.8026 | 13600 | 0.4653 | 0.3519 | 0.1066 |
| 0.5428 | 8.9320 | 13800 | 0.4473 | 0.3523 | 0.1056 |
| 0.5716 | 9.0615 | 14000 | 0.4517 | 0.3513 | 0.1070 |
| 0.5345 | 9.1909 | 14200 | 0.4431 | 0.3503 | 0.1057 |
| 0.6278 | 9.3204 | 14400 | 0.4400 | 0.3489 | 0.1056 |
| 0.5128 | 9.4498 | 14600 | 0.4501 | 0.3427 | 0.1032 |
| 0.5278 | 9.5793 | 14800 | 0.4649 | 0.3462 | 0.1058 |
| 0.6367 | 9.7087 | 15000 | 0.4427 | 0.3563 | 0.1072 |
| 0.5131 | 9.8382 | 15200 | 0.4422 | 0.3492 | 0.1053 |
| 0.5187 | 9.9676 | 15400 | 0.4361 | 0.3452 | 0.1044 |
| 0.4976 | 10.0971 | 15600 | 0.4317 | 0.3445 | 0.1041 |
| 0.5494 | 10.2265 | 15800 | 0.4462 | 0.3416 | 0.1032 |
| 0.5362 | 10.3560 | 16000 | 0.4295 | 0.3404 | 0.1032 |
| 0.5069 | 10.4854 | 16200 | 0.4403 | 0.3418 | 0.1026 |
| 0.5938 | 10.6149 | 16400 | 0.4305 | 0.3375 | 0.1025 |
| 0.5548 | 10.7443 | 16600 | 0.4394 | 0.3369 | 0.1023 |
| 0.5127 | 10.8738 | 16800 | 0.4429 | 0.3407 | 0.1025 |
| 0.5588 | 11.0032 | 17000 | 0.4441 | 0.3463 | 0.1045 |
| 0.517 | 11.1327 | 17200 | 0.4326 | 0.3357 | 0.1014 |
| 0.5102 | 11.2621 | 17400 | 0.4562 | 0.3330 | 0.1017 |
| 0.6477 | 11.3916 | 17600 | 0.4327 | 0.3358 | 0.1028 |
| 0.5468 | 11.5210 | 17800 | 0.4289 | 0.3360 | 0.1019 |
| 0.5697 | 11.6505 | 18000 | 0.4340 | 0.3333 | 0.1010 |
| 0.5501 | 11.7799 | 18200 | 0.4476 | 0.3372 | 0.1016 |
| 0.5557 | 11.9094 | 18400 | 0.4474 | 0.3315 | 0.1006 |
| 0.5543 | 12.0388 | 18600 | 0.4251 | 0.3365 | 0.1024 |
| 0.7196 | 12.1683 | 18800 | 0.4364 | 0.3306 | 0.1005 |
| 0.4728 | 12.2977 | 19000 | 0.4313 | 0.3295 | 0.1009 |
| 0.5342 | 12.4272 | 19200 | 0.4267 | 0.3365 | 0.1017 |
| 0.5437 | 12.5566 | 19400 | 0.4339 | 0.3323 | 0.1011 |
| 0.5251 | 12.6861 | 19600 | 0.4206 | 0.3332 | 0.1015 |
| 0.4648 | 12.8155 | 19800 | 0.4297 | 0.3324 | 0.1004 |
| 0.5792 | 12.9450 | 20000 | 0.4347 | 0.3259 | 0.0998 |
| 0.4869 | 13.0744 | 20200 | 0.4296 | 0.3256 | 0.0988 |
| 0.5087 | 13.2039 | 20400 | 0.4338 | 0.3254 | 0.0991 |
| 0.6692 | 13.3333 | 20600 | 0.4212 | 0.3256 | 0.0994 |
| 0.5254 | 13.4628 | 20800 | 0.4253 | 0.3228 | 0.0985 |
| 0.5053 | 13.5922 | 21000 | 0.4294 | 0.3263 | 0.0989 |
| 0.5353 | 13.7217 | 21200 | 0.4273 | 0.3217 | 0.0979 |
| 0.5015 | 13.8511 | 21400 | 0.4223 | 0.3280 | 0.0997 |
| 0.5073 | 13.9806 | 21600 | 0.4308 | 0.3204 | 0.0983 |
| 0.5079 | 14.1100 | 21800 | 0.4306 | 0.3220 | 0.0986 |
| 0.5243 | 14.2395 | 22000 | 0.4302 | 0.3218 | 0.0980 |
| 0.4713 | 14.3689 | 22200 | 0.4391 | 0.3213 | 0.0981 |
| 0.475 | 14.4984 | 22400 | 0.4197 | 0.3248 | 0.0992 |
| 0.5342 | 14.6278 | 22600 | 0.4215 | 0.3214 | 0.0979 |
| 0.4778 | 14.7573 | 22800 | 0.4230 | 0.3223 | 0.0975 |
| 0.5256 | 14.8867 | 23000 | 0.4285 | 0.3213 | 0.0975 |
| 0.4872 | 15.0162 | 23200 | 0.4250 | 0.3187 | 0.0974 |
| 0.7514 | 15.1456 | 23400 | 0.4163 | 0.3283 | 0.0985 |
| 0.5381 | 15.2751 | 23600 | 0.4219 | 0.3192 | 0.0968 |
| 0.4458 | 15.4045 | 23800 | 0.4266 | 0.3241 | 0.0980 |
| 0.474 | 15.5340 | 24000 | 0.4292 | 0.3164 | 0.0966 |
| 0.4808 | 15.6634 | 24200 | 0.4171 | 0.3228 | 0.0983 |
| 0.4972 | 15.7929 | 24400 | 0.4156 | 0.3187 | 0.0970 |
| 0.5605 | 15.9223 | 24600 | 0.4204 | 0.3196 | 0.0966 |
| 0.4825 | 16.0518 | 24800 | 0.4194 | 0.3209 | 0.0972 |
| 0.5396 | 16.1812 | 25000 | 0.4109 | 0.3183 | 0.0967 |
| 0.4674 | 16.3107 | 25200 | 0.4227 | 0.3130 | 0.0956 |
| 0.5416 | 16.4401 | 25400 | 0.4178 | 0.3147 | 0.0962 |
| 0.575 | 16.5696 | 25600 | 0.4213 | 0.3153 | 0.0961 |
| 0.4864 | 16.6990 | 25800 | 0.4116 | 0.3127 | 0.0953 |
| 0.4773 | 16.8285 | 26000 | 0.4151 | 0.3137 | 0.0956 |
| 0.5163 | 16.9579 | 26200 | 0.4213 | 0.3130 | 0.0954 |
| 0.4877 | 17.0874 | 26400 | 0.4152 | 0.3108 | 0.0951 |
| 0.4689 | 17.2168 | 26600 | 0.4075 | 0.3147 | 0.0957 |
| 0.5275 | 17.3463 | 26800 | 0.4172 | 0.3127 | 0.0956 |
| 0.5345 | 17.4757 | 27000 | 0.4218 | 0.3139 | 0.0955 |
| 0.4446 | 17.6052 | 27200 | 0.4206 | 0.3105 | 0.0949 |
| 0.435 | 17.7346 | 27400 | 0.4152 | 0.3121 | 0.0956 |
| 0.5809 | 17.8641 | 27600 | 0.4131 | 0.3107 | 0.0952 |
| 0.4526 | 17.9935 | 27800 | 0.4105 | 0.3109 | 0.0950 |
| 0.468 | 18.1230 | 28000 | 0.4153 | 0.3075 | 0.0946 |
| 0.4247 | 18.2524 | 28200 | 0.4174 | 0.3077 | 0.0944 |
| 0.5555 | 18.3819 | 28400 | 0.4171 | 0.3095 | 0.0947 |
| 0.4383 | 18.5113 | 28600 | 0.4162 | 0.3077 | 0.0942 |
| 0.4817 | 18.6408 | 28800 | 0.4103 | 0.3085 | 0.0946 |
| 0.4931 | 18.7702 | 29000 | 0.4098 | 0.3091 | 0.0944 |
| 0.73 | 18.8997 | 29200 | 0.4126 | 0.3060 | 0.0941 |
| 0.4573 | 19.0291 | 29400 | 0.4119 | 0.3081 | 0.0945 |
| 0.4527 | 19.1586 | 29600 | 0.4124 | 0.3067 | 0.0942 |
| 0.446 | 19.2880 | 29800 | 0.4130 | 0.3074 | 0.0942 |
| 0.4757 | 19.4175 | 30000 | 0.4109 | 0.3070 | 0.0940 |
| 0.5375 | 19.5469 | 30200 | 0.4133 | 0.3059 | 0.0939 |
| 0.4405 | 19.6764 | 30400 | 0.4141 | 0.3067 | 0.0941 |
| 0.5613 | 19.8058 | 30600 | 0.4130 | 0.3072 | 0.0941 |
| 0.4159 | 19.9353 | 30800 | 0.4129 | 0.3069 | 0.0941 |
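
The WER and CER columns are the standard word and character error rates. A minimal sketch of how they are typically computed with the `evaluate` library follows; the reference and prediction strings are made-up placeholders, not data from this model's evaluation set:

```python
# Sketch of WER/CER computation (requires: pip install evaluate jiwer).
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["this is a reference transcript"]   # hypothetical ground truth
predictions = ["this is a referense transcript"]  # hypothetical model output

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```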

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.4.0
  • Datasets 2.19.1
  • Tokenizers 0.19.1