
whisper-small-nomi

This model is a fine-tuned version of openai/whisper-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0000
  • WER: 0.6383%
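
For reference, a minimal inference sketch using the transformers automatic-speech-recognition pipeline; the audio path "sample.wav" is a placeholder, to be replaced with a real recording:

```python
# Minimal transcription sketch; "sample.wav" is a placeholder path.
import torch
from transformers import pipeline

device = "cuda:0" if torch.cuda.is_available() else "cpu"
asr = pipeline(
    "automatic-speech-recognition",
    model="susmitabhatt/whisper-small-nomi",
    device=device,
)

# The pipeline resamples the input audio to 16 kHz before passing it to Whisper.
print(asr("sample.wav")["text"])
```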

Model description

This is a fine-tuned openai/whisper-small checkpoint with roughly 242M parameters, stored as F32 safetensors. No further description has been provided.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0004
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 132
  • num_epochs: 30
  • mixed_precision_training: Native AMP
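
For illustration, a hedged sketch of how these values map onto Seq2SeqTrainingArguments; the output_dir is a placeholder, and the Adam betas and epsilon listed above match the optimizer defaults:

```python
# Sketch of the configuration above as Seq2SeqTrainingArguments.
# output_dir is a placeholder; Adam betas=(0.9, 0.999) and eps=1e-8
# are the defaults, so they are not set explicitly here.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-nomi",   # placeholder
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,       # effective train batch size 16
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=30,
    seed=42,
    fp16=True,                           # native AMP mixed precision
)
```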

Training results

| Training Loss | Epoch   | Step | Validation Loss | WER (%) |
|:-------------:|:-------:|:----:|:---------------:|:-------:|
| 1.4305        | 1.9417  | 100  | 0.1954          | 36.8085 |
| 0.3179        | 3.8835  | 200  | 0.1760          | 71.7021 |
| 0.1876        | 5.8252  | 300  | 0.2092          | 24.4681 |
| 0.1377        | 7.7670  | 400  | 0.1063          | 23.1915 |
| 0.0933        | 9.7087  | 500  | 0.0556          | 7.8723  |
| 0.0713        | 11.6505 | 600  | 0.0556          | 9.5745  |
| 0.0548        | 13.5922 | 700  | 0.0289          | 5.1064  |
| 0.0436        | 15.5340 | 800  | 0.0330          | 3.6170  |
| 0.0258        | 17.4757 | 900  | 0.0041          | 0.8511  |
| 0.0094        | 19.4175 | 1000 | 0.0072          | 2.1277  |
| 0.0047        | 21.3592 | 1100 | 0.0001          | 1.7021  |
| 0.0010        | 23.3010 | 1200 | 0.0001          | 0.8511  |
| 0.0000        | 25.2427 | 1300 | 0.0000          | 0.6383  |
| 0.0000        | 27.1845 | 1400 | 0.0000          | 0.6383  |
| 0.0000        | 29.1262 | 1500 | 0.0000          | 0.6383  |
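
The WER column reads as a percentage. A small sketch of computing WER the same way with the evaluate library; the reference and prediction strings are invented examples, not model output:

```python
# Illustrative WER computation; the strings are made-up examples.
import evaluate

wer_metric = evaluate.load("wer")
wer = 100 * wer_metric.compute(
    references=["the cat sat on the mat"],
    predictions=["the cat sat on a mat"],
)
# evaluate returns a fraction; multiplying by 100 matches the table's scale.
print(f"WER: {wer:.4f}")
```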

Framework versions

  • Transformers 4.45.0.dev0
  • Pytorch 2.4.0
  • Datasets 2.21.0
  • Tokenizers 0.19.1
