---
language:
  - ga
  - en
license: apache-2.0
base_model: openai/whisper-small
tags:
  - generated_from_trainer
datasets:
  - ymoslem/IWSLT2023-GA-EN
  - ymoslem/FLEURS-GA-EN
  - ymoslem/BitesizeIrish-GA-EN
  - ymoslem/SpokenWords-GA-EN-MTed
  - ymoslem/Tatoeba-Speech-Irish
  - ymoslem/Wikimedia-Speech-Irish
metrics:
  - bleu
  - wer
model-index:
  - name: Whisper Medium GA-EN Speech Translation
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: IWSLT-2023, FLEURS, BiteSize, SpokenWords, Tatoeba, and Wikimedia
          type: ymoslem/IWSLT2023-GA-EN
        metrics:
          - name: Bleu
            type: bleu
            value: 35.04
          - name: Wer
            type: wer
            value: 57.90184601530842
---

# Whisper Medium GA-EN Speech Translation

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the IWSLT-2023, FLEURS, BiteSize, SpokenWords, Tatoeba, and Wikimedia datasets. It achieves the following results on the evaluation set:

- Loss: 1.2966
- Bleu: 35.04
- Chrf: 55.03
- Wer: 57.9018
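
For a quick smoke test, the checkpoint can be loaded through the `transformers` ASR pipeline. This is a minimal sketch: the Hub id and audio file name below are placeholders, not values taken from this card.

```python
from transformers import pipeline

# Placeholder Hub id; replace with this repository's actual model id.
model_id = "ymoslem/whisper-ga2en"

# Whisper checkpoints run under the ASR pipeline; since this model was
# fine-tuned for speech translation, the decoded output is English text.
translator = pipeline("automatic-speech-recognition", model=model_id)

# "sample_ga.wav" is a placeholder path to an Irish-language recording.
result = translator("sample_ga.wav", generate_kwargs={"task": "translate"})
print(result["text"])
```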

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

The model was trained and evaluated on a combination of the six Irish-English speech corpora listed in this card's metadata: ymoslem/IWSLT2023-GA-EN, ymoslem/FLEURS-GA-EN, ymoslem/BitesizeIrish-GA-EN, ymoslem/SpokenWords-GA-EN-MTed, ymoslem/Tatoeba-Speech-Irish, and ymoslem/Wikimedia-Speech-Irish; a loading sketch follows below.
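
A hedged sketch of how these corpora could be combined with the `datasets` library, assuming each repository exposes a `train` split with a compatible schema:

```python
from datasets import load_dataset, concatenate_datasets

# The six GA-EN corpora listed in this card's metadata.
corpora = [
    "ymoslem/IWSLT2023-GA-EN",
    "ymoslem/FLEURS-GA-EN",
    "ymoslem/BitesizeIrish-GA-EN",
    "ymoslem/SpokenWords-GA-EN-MTed",
    "ymoslem/Tatoeba-Speech-Irish",
    "ymoslem/Wikimedia-Speech-Irish",
]

# Assumes identical column features across datasets; in practice columns
# may need renaming or casting before concatenation.
train_set = concatenate_datasets(
    [load_dataset(name, split="train") for name in corpora]
)
print(train_set)
```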

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.03
- training_steps: 7000
- mixed_precision_training: Native AMP
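
A sketch of how these settings map onto `Seq2SeqTrainingArguments`; the output directory and the evaluation cadence (every 100 steps, per the table below) are assumptions, and the Adam betas/epsilon above are the optimizer's defaults, so they are not set explicitly:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-ga2en",   # assumed name, not from this card
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.03,
    max_steps=7000,
    fp16=True,                      # "Native AMP" mixed precision
    eval_strategy="steps",          # assumed from the 100-step eval cadence
    eval_steps=100,
    predict_with_generate=True,     # decode text so BLEU/ChrF/WER can be scored
)
```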

### Training results

| Training Loss | Epoch  | Step | Bleu  | Chrf  | Validation Loss | Wer      |
|:-------------:|:------:|:----:|:-----:|:-----:|:---------------:|:--------:|
| 2.5164        | 0.0328 | 100  | 2.56  | 17.46 | 2.0060          | 162.9896 |
| 2.656         | 0.0657 | 200  | 8.49  | 26.0  | 2.0232          | 99.5498  |
| 2.5156        | 0.0985 | 300  | 7.55  | 25.1  | 1.9253          | 141.2877 |
| 2.4722        | 0.1314 | 400  | 12.52 | 30.49 | 1.8289          | 90.4548  |
| 2.3376        | 0.1642 | 500  | 17.39 | 33.23 | 1.6839          | 81.1796  |
| 2.1733        | 0.1970 | 600  | 9.62  | 32.48 | 1.7342          | 137.9559 |
| 2.3382        | 0.2299 | 700  | 12.54 | 34.43 | 1.6570          | 112.2467 |
| 2.0041        | 0.2627 | 800  | 17.55 | 36.73 | 1.6048          | 85.1418  |
| 2.1142        | 0.2956 | 900  | 17.58 | 35.74 | 1.6256          | 82.7105  |
| 2.024         | 0.3284 | 1000 | 14.4  | 37.22 | 1.5861          | 86.7177  |
| 1.7556        | 0.3612 | 1100 | 17.21 | 38.88 | 1.5415          | 84.5115  |
| 1.6904        | 0.3941 | 1200 | 19.6  | 38.84 | 1.4902          | 85.3670  |
| 1.674         | 0.4269 | 1300 | 20.33 | 41.3  | 1.4748          | 88.3836  |
| 1.6899        | 0.4598 | 1400 | 22.74 | 43.25 | 1.4479          | 80.9995  |
| 1.5234        | 0.4926 | 1500 | 20.13 | 42.08 | 1.3763          | 80.6844  |
| 1.364         | 0.5255 | 1600 | 23.12 | 41.78 | 1.4164          | 72.9851  |
| 1.5267        | 0.5583 | 1700 | 19.94 | 41.63 | 1.3855          | 91.7605  |
| 1.4282        | 0.5911 | 1800 | 23.96 | 44.84 | 1.3729          | 74.6961  |
| 1.3611        | 0.6240 | 1900 | 23.1  | 45.41 | 1.3562          | 81.8100  |
| 1.1396        | 0.6568 | 2000 | 27.9  | 46.89 | 1.3131          | 67.2670  |
| 1.1849        | 0.6897 | 2100 | 24.38 | 45.25 | 1.3483          | 75.8667  |
| 1.0871        | 0.7225 | 2200 | 28.64 | 48.93 | 1.2848          | 66.6817  |
| 1.1822        | 0.7553 | 2300 | 28.41 | 47.25 | 1.2782          | 68.6628  |
| 1.1272        | 0.7882 | 2400 | 27.24 | 48.57 | 1.2549          | 75.9568  |
| 1.0241        | 0.8210 | 2500 | 25.74 | 47.44 | 1.2922          | 74.4710  |
| 0.9629        | 0.8539 | 2600 | 23.93 | 44.61 | 1.3209          | 82.1252  |
| 0.8251        | 0.8867 | 2700 | 32.21 | 51.64 | 1.2273          | 65.5110  |
| 0.7921        | 0.9195 | 2800 | 26.38 | 48.31 | 1.2881          | 80.2792  |
| 0.8873        | 0.9524 | 2900 | 26.57 | 50.09 | 1.2268          | 77.1724  |
| 0.7967        | 0.9852 | 3000 | 29.35 | 51.53 | 1.2036          | 69.6533  |
| 0.3119        | 1.0181 | 3100 | 31.77 | 51.57 | 1.2231          | 62.3143  |
| 0.3009        | 1.0509 | 3200 | 31.8  | 50.44 | 1.2446          | 61.8190  |
| 0.2855        | 1.0837 | 3300 | 30.48 | 50.86 | 1.2240          | 66.7717  |
| 0.2535        | 1.1166 | 3400 | 31.96 | 52.82 | 1.2287          | 63.3949  |
| 0.2162        | 1.1494 | 3500 | 33.91 | 52.17 | 1.2398          | 61.3688  |
| 0.2307        | 1.1823 | 3600 | 32.11 | 51.67 | 1.2280          | 64.7456  |
| 0.2184        | 1.2151 | 3700 | 34.59 | 53.32 | 1.2149          | 59.9730  |
| 0.2365        | 1.2479 | 3800 | 32.51 | 52.98 | 1.2044          | 62.3593  |
| 0.1958        | 1.2808 | 3900 | 32.45 | 52.86 | 1.2116          | 63.1697  |
| 0.2081        | 1.3136 | 4000 | 32.53 | 52.88 | 1.2087          | 62.8095  |
| 0.2768        | 1.3465 | 4100 | 30.73 | 49.53 | 1.3177          | 64.3854  |
| 0.3241        | 1.3793 | 4200 | 24.44 | 46.88 | 1.3363          | 78.2981  |
| 0.3326        | 1.4122 | 4300 | 27.77 | 47.05 | 1.3622          | 68.7528  |
| 0.3623        | 1.4450 | 4400 | 27.0  | 47.25 | 1.3232          | 70.4187  |
| 0.3114        | 1.4778 | 4500 | 25.64 | 46.53 | 1.3530          | 73.7506  |
| 0.2933        | 1.5107 | 4600 | 29.95 | 47.77 | 1.3674          | 65.3760  |
| 0.3162        | 1.5435 | 4700 | 28.58 | 47.12 | 1.4011          | 66.2765  |
| 0.2687        | 1.5764 | 4800 | 32.67 | 50.02 | 1.2875          | 61.7740  |
| 0.2733        | 1.6092 | 4900 | 30.86 | 50.51 | 1.3090          | 63.2148  |
| 0.2552        | 1.6420 | 5000 | 27.95 | 49.41 | 1.2946          | 69.8334  |
| 0.2781        | 1.6749 | 5100 | 34.16 | 52.07 | 1.2971          | 61.5489  |
| 0.2367        | 1.7077 | 5200 | 32.3  | 51.69 | 1.2990          | 63.3949  |
| 0.244         | 1.7406 | 5300 | 32.17 | 50.59 | 1.3185          | 62.0891  |
| 0.2118        | 1.7734 | 5400 | 32.85 | 52.14 | 1.2813          | 60.8735  |
| 0.1986        | 1.8062 | 5500 | 30.35 | 50.78 | 1.3007          | 64.9707  |
| 0.2393        | 1.8391 | 5600 | 34.09 | 53.08 | 1.2729          | 59.3426  |
| 0.1803        | 1.8719 | 5700 | 33.92 | 53.57 | 1.2481          | 59.7929  |
| 0.199         | 1.9048 | 5800 | 34.53 | 52.74 | 1.2670          | 58.9824  |
| 0.2           | 1.9376 | 5900 | 33.57 | 53.24 | 1.2591          | 60.0180  |
| 0.1585        | 1.9704 | 6000 | 31.51 | 52.67 | 1.2855          | 64.0702  |
| 0.132         | 2.0033 | 6100 | 30.79 | 51.84 | 1.2915          | 66.5466  |
| 0.0555        | 2.0361 | 6200 | 34.44 | 51.8  | 1.3077          | 61.2337  |
| 0.0623        | 2.0690 | 6300 | 35.52 | 53.58 | 1.3224          | 59.4327  |
| 0.0455        | 2.1018 | 6400 | 35.34 | 53.46 | 1.2942          | 58.9824  |
| 0.0573        | 2.1346 | 6500 | 34.32 | 53.93 | 1.3020          | 59.5227  |
| 0.0487        | 2.1675 | 6600 | 35.64 | 54.4  | 1.3091          | 58.9824  |
| 0.0646        | 2.2003 | 6700 | 34.75 | 53.92 | 1.3184          | 59.0725  |
| 0.0454        | 2.2332 | 6800 | 35.48 | 55.12 | 1.3062          | 58.2620  |
| 0.0574        | 2.2660 | 6900 | 34.97 | 55.31 | 1.2996          | 58.6673  |
| 0.051         | 2.2989 | 7000 | 35.04 | 55.03 | 1.2966          | 57.9018  |
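
The BLEU, ChrF, and WER figures above can be computed with the `evaluate` library; a minimal sketch with placeholder predictions and references:

```python
import evaluate

bleu = evaluate.load("sacrebleu")
chrf = evaluate.load("chrf")
wer = evaluate.load("wer")

# Placeholder data: real scoring uses the decoded eval-set hypotheses
# and their English reference translations.
predictions = ["a translated hypothesis"]
references = ["the reference translation"]

# sacrebleu and chrf expect one list of references per prediction.
print(bleu.compute(predictions=predictions,
                   references=[[r] for r in references])["score"])
print(chrf.compute(predictions=predictions,
                   references=[[r] for r in references])["score"])
print(wer.compute(predictions=predictions, references=references))
```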

### Framework versions

- Transformers 4.41.2
- PyTorch 2.2.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1