---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-base
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: whisper-base-zh
    results: []
---

# whisper-base-zh

This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.4316
- WER: 85.1032
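
A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub as `dongim04/whisper-base-zh` (substitute a local path otherwise; the audio file name is a placeholder):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint through the ASR pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="dongim04/whisper-base-zh",
)

# "sample.wav" is a placeholder path to a local audio file.
# Passing language="zh" pins Whisper's decoding to Chinese.
result = asr("sample.wav", generate_kwargs={"language": "zh"})
print(result["text"])
```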

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 4000
- mixed_precision_training: Native AMP
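
A sketch of how these settings map onto `Seq2SeqTrainingArguments`, assuming the standard `Seq2SeqTrainer` recipe for Whisper fine-tuning (the card does not include the training script; `output_dir` and the 100-step evaluation cadence are inferred from this card):

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: maps the hyperparameters listed above onto
# Seq2SeqTrainingArguments. Adam betas=(0.9, 0.999) and epsilon=1e-08
# are the library defaults, so they need no explicit arguments.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-base-zh",
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=4000,
    fp16=True,                   # Native AMP mixed precision
    eval_strategy="steps",
    eval_steps=100,              # matches the eval cadence in the results table below
    predict_with_generate=True,  # assumption: WER is scored on generated transcripts
)
```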

### Training results

| Training Loss | Epoch | Step | Validation Loss | WER (%) |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.5085 | 0.2070 | 100 | 0.4641 | 89.6326 |
| 0.4047 | 0.4141 | 200 | 0.4315 | 88.2235 |
| 0.4156 | 0.6211 | 300 | 0.4055 | 86.8646 |
| 0.3826 | 0.8282 | 400 | 0.3907 | 87.2169 |
| 0.3087 | 1.0352 | 500 | 0.3806 | 86.9149 |
| 0.2709 | 1.2422 | 600 | 0.3775 | 87.0659 |
| 0.2707 | 1.4493 | 700 | 0.3681 | 85.7071 |
| 0.273 | 1.6563 | 800 | 0.3633 | 85.5561 |
| 0.2638 | 1.8634 | 900 | 0.3592 | 85.4051 |
| 0.1698 | 2.0704 | 1000 | 0.3613 | 85.6568 |
| 0.1849 | 2.2774 | 1100 | 0.3628 | 85.7574 |
| 0.1819 | 2.4845 | 1200 | 0.3645 | 85.4555 |
| 0.1712 | 2.6915 | 1300 | 0.3611 | 84.8515 |
| 0.1768 | 2.8986 | 1400 | 0.3573 | 84.3986 |
| 0.1265 | 3.1056 | 1500 | 0.3640 | 85.2038 |
| 0.1275 | 3.3126 | 1600 | 0.3661 | 85.4051 |
| 0.1277 | 3.5197 | 1700 | 0.3698 | 84.3986 |
| 0.1195 | 3.7267 | 1800 | 0.3684 | 85.3045 |
| 0.1176 | 3.9337 | 1900 | 0.3668 | 84.9522 |
| 0.0843 | 4.1408 | 2000 | 0.3764 | 84.9522 |
| 0.077 | 4.3478 | 2100 | 0.3801 | 84.8012 |
| 0.0853 | 4.5549 | 2200 | 0.3804 | 85.4051 |
| 0.0826 | 4.7619 | 2300 | 0.3827 | 84.8515 |
| 0.0759 | 4.9689 | 2400 | 0.3803 | 84.9522 |
| 0.0519 | 5.1760 | 2500 | 0.3959 | 84.9522 |
| 0.0578 | 5.3830 | 2600 | 0.3946 | 85.0528 |
| 0.0594 | 5.5901 | 2700 | 0.3956 | 85.0528 |
| 0.0598 | 5.7971 | 2800 | 0.3971 | 85.1535 |
| 0.0519 | 6.0041 | 2900 | 0.3980 | 84.7509 |
| 0.0421 | 6.2112 | 3000 | 0.4103 | 85.5058 |
| 0.0379 | 6.4182 | 3100 | 0.4124 | 84.9019 |
| 0.0363 | 6.6253 | 3200 | 0.4131 | 84.9019 |
| 0.0377 | 6.8323 | 3300 | 0.4149 | 85.2038 |
| 0.0286 | 7.0393 | 3400 | 0.4205 | 84.8515 |
| 0.0263 | 7.2464 | 3500 | 0.4257 | 84.6502 |
| 0.0309 | 7.4534 | 3600 | 0.4276 | 85.3548 |
| 0.0295 | 7.6605 | 3700 | 0.4282 | 85.3045 |
| 0.0266 | 7.8675 | 3800 | 0.4291 | 85.2038 |
| 0.0241 | 8.0745 | 3900 | 0.4308 | 85.1535 |
| 0.025 | 8.2816 | 4000 | 0.4316 | 85.1032 |
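
The reported metric is word error rate scaled to percent. A hypothetical sketch of computing it with the `evaluate` library (the actual evaluation script is not part of this card; note that whitespace-based WER depends on how the Chinese transcripts are segmented):

```python
import evaluate

# Hypothetical evaluation sketch: the card reports WER * 100. The
# strings here are toy placeholders; a real run would use decoded
# model outputs and reference transcripts from the eval set.
wer_metric = evaluate.load("wer")

predictions = ["今天 天气 很 好"]
references = ["今天 天气 真 好"]

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # the metric splits on whitespace, so segmentation matters
```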

### Framework versions

- Transformers 4.44.2
- PyTorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.19.1