---
model-index:
- name: tamasheq-99-final
  results: []
---

# tamasheq-99-final

This model is a fine-tuned version of [jonatasgrosman/wav2vec2-large-xlsr-53-arabic](https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-arabic) on an unspecified dataset. It achieves the following results on the evaluation set:

- Cer: 16.2959
- Wer: 55.5334
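
For quick experimentation, the checkpoint can be loaded with the Transformers automatic-speech-recognition pipeline. This is a minimal sketch, assuming the model is hosted on the Hub as `ad019el/tamasheq-99-final` (the repo id of this card) and that `audio.wav` is a placeholder path to a speech recording:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
asr = pipeline("automatic-speech-recognition", model="ad019el/tamasheq-99-final")

# Transcribe a local file; "audio.wav" is a placeholder. wav2vec2 expects
# 16 kHz mono audio; the pipeline decodes and resamples via ffmpeg.
print(asr("audio.wav")["text"])
```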

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
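
The list above maps onto Hugging Face `TrainingArguments` roughly as follows. This is a hedged reconstruction, not the author's training script: `output_dir`, `max_steps`, and the evaluation cadence are assumptions (the results table below suggests evaluation every 300 steps up to step 11100).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tamasheq-99-final",   # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,    # effective train batch size: 16 * 2 = 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    evaluation_strategy="steps",      # assumed from the 300-step results table
    eval_steps=300,                   # assumed
    max_steps=11100,                  # assumed from the last table row
)
```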

### Training results

| Step | Tamasheq WER | Arabic WER | Tamasheq CER | Arabic CER |
|:-----|:------------:|:----------:|:------------:|:----------:|
| before training | 104.985 | 23.1305 | 67.4458 | 7.30972 |
| 300 | 99.5513 | 23.0544 | 49.7078 | 7.1043 |
| 600 | 95.1147 | 22.5267 | 41.4515 | 6.0098 |
| 900 | 93.5194 | 21.0404 | 38.0867 | 5.52939 |
| 1200 | 92.5723 | 20.6224 | 37.0877 | 5.39751 |
| 1500 | 92.3009 | 20.9238 | 36.9915 | 5.6718 |
| 1800 | 92.0738 | 21.2699 | 36.3713 | 6.08877 |
| 2100 | 88.7338 | 21.9693 | 33.3648 | 5.9156 |
| 2400 | 87.1884 | 21.1333 | 31.8379 | 5.52939 |
| 2700 | 88.299 | 21.0705 | 31.4599 | 5.5078 |
| 3000 | 87.7866 | 21.5021 | 30.9039 | 6.29239 |
| 3300 | 84.2971 | 21.666 | 29.7455 | 5.97212 |
| 3600 | 83.8983 | 21.5732 | 28.6145 | 6.04748 |
| 3900 | 81.8544 | 22.1087 | 27.9359 | 5.99096 |
| 4200 | 82.9741 | 23.392 | 27.4288 | 6.4013 |
| 4500 | 83.8485 | 24.2452 | 27.0575 | 6.79164 |
| 4800 | 81.6052 | 22.666 | 26.6918 | 6.09457 |
| 5100 | 77.9661 | 22.4803 | 25.1084 | 6.0098 |
| 5400 | 77.2183 | 21.83 | 24.656 | 5.9156 |
| 5700 | 76.672 | 22.1078 | 24.2606 | 6.0802 |
| 6000 | 76.2712 | 22.7589 | 23.9236 | 6.41485 |
| 6300 | 75.7228 | 23.8737 | 23.7135 | 6.78222 |
| 6600 | 71.2363 | 23.177 | 22.196 | 6.39601 |
| 6900 | 69.8405 | 22.7125 | 21.574 | 6.21703 |
| 7200 | 72.9452 | 23.6679 | 21.0775 | 6.6918 |
| 7500 | 75.9222 | 24.7097 | 20.8999 | 7.17784 |
| 7800 | 67.4975 | 23.1305 | 20.6786 | 6.65034 |
| 8100 | 65.2542 | 23.1305 | 19.7361 | 6.49962 |
| 8400 | 61.7149 | 22.3874 | 18.426 | 6.12283 |
| 8700 | 63.8046 | 23.6679 | 18.2166 | 6.2679 |
| 9000 | 64.7059 | 24.1059 | 17.9952 | 6.66918 |
| 9300 | 67.5474 | 24.7097 | 17.6078 | 7.16843 |
| 9600 | 57.1286 | 23.3163 | 17.2385 | 6.66918 |
| 9900 | 58.2752 | 22.8054 | 17.1065 | 6.4431 |
| 10200 | 57.7767 | 24.2917 | 16.848 | 6.68802 |
| 10500 | 55.2841 | 25.1277 | 16.5033 | 7.12133 |
| 10800 | 52.5424 | 23.8272 | 15.9566 | 6.80106 |
| 11100 | 55.5334 | 24.6168 | 16.2959 | 6.94235 |
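
The WER and CER columns appear to be percentages. They can be reproduced with the `evaluate` library; below is a minimal sketch in which `predictions` and `references` are hypothetical placeholder strings, not data from this model:

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder strings; a real evaluation would pair the model's decoded
# outputs with ground-truth Tamasheq transcripts.
predictions = ["example transcription one", "example transcription two"]
references = ["example transcription one", "example transcript two"]

# evaluate returns fractions; multiply by 100 to match the table's scale.
print("WER:", 100 * wer_metric.compute(predictions=predictions, references=references))
print("CER:", 100 * cer_metric.compute(predictions=predictions, references=references))
```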

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3