---
library_name: transformers
language:
- nl
license: apache-2.0
base_model: openai/whisper-large-v2
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Whisper Large V2
  results: []
---

# Whisper Large V2

This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) for Dutch (`nl`) speech recognition; the fine-tuning dataset is not recorded in this card.
It achieves the following results on the evaluation set:
- Loss: 0.3489
- WER: 17.3755
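
Since the card metadata names `transformers` as the library and Whisper as the base checkpoint, transcription should work through the standard ASR pipeline. A minimal sketch; the repository id below is a placeholder, as this card does not state where the checkpoint is published:

```python
from transformers import pipeline

# NOTE: placeholder repository id -- substitute the id this checkpoint
# is actually published under.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-large-v2-nl",
    chunk_length_s=30,  # Whisper processes audio in 30-second windows
)

# Accepts a file path or a 16 kHz float array; force Dutch transcription.
result = asr("audio.wav", generate_kwargs={"language": "dutch", "task": "transcribe"})
print(result["text"])
```
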
## Model description

Whisper Large V2 is OpenAI's encoder-decoder Transformer for speech recognition. This checkpoint fine-tunes it for Dutch (`nl`) transcription, as indicated by the card metadata; no further details about the model configuration were recorded by the training script.
## Intended uses & limitations

This model is intended for automatic speech recognition of Dutch audio sampled at 16 kHz. It reaches a WER of roughly 17.4% on its (unspecified) evaluation set, so transcripts will contain errors and should be reviewed before downstream use; performance on other languages, domains, or noisy recordings has not been evaluated here.
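
When driving the checkpoint directly rather than through the pipeline shown above, Dutch transcription should be forced explicitly, since Whisper otherwise auto-detects the language and may translate instead of transcribe. A sketch, again under the placeholder-id assumption:

```python
import librosa
import torch
from transformers import WhisperForConditionalGeneration, WhisperProcessor

MODEL_ID = "your-username/whisper-large-v2-nl"  # placeholder id

processor = WhisperProcessor.from_pretrained(MODEL_ID)
model = WhisperForConditionalGeneration.from_pretrained(MODEL_ID)
model.eval()

# Whisper expects 16 kHz mono audio.
audio, _ = librosa.load("audio.wav", sr=16000)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    # Pin language and task so the model transcribes Dutch rather than
    # auto-detecting the language or translating to English.
    ids = model.generate(inputs.input_features, language="dutch", task="transcribe")

print(processor.batch_decode(ids, skip_special_tokens=True)[0])
```
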
## Training and evaluation data

The training and evaluation datasets are not recorded in this card; only the language (`nl`) and the evaluation results above are known.
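
Although the datasets are unspecified, the reported WER can be recomputed from any set of predictions and reference transcripts with the `evaluate` library; a minimal sketch (the example strings are illustrative only):

```python
import evaluate

wer_metric = evaluate.load("wer")  # requires the jiwer package

predictions = ["de kat zat op de mat"]         # model transcripts (illustrative)
references = ["de kat zat vandaag op de mat"]  # ground-truth transcripts (illustrative)

# evaluate returns a fraction; this card reports WER as a percentage.
print(100 * wer_metric.compute(predictions=predictions, references=references))
```
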
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
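
These values map onto `transformers.Seq2SeqTrainingArguments` roughly as follows. This is a reconstruction, not the original training script; `output_dir` and the evaluation cadence (inferred from the 15-step intervals in the results table) are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-large-v2-nl",  # assumption: actual path unknown
    learning_rate=3e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,       # the Adam settings listed above are the
    adam_beta2=0.999,     # transformers defaults
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
    eval_strategy="steps",  # the results table reports eval every 15 steps
    eval_steps=15,
)
```
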
### Training results

| Training Loss | Epoch | Step | Validation Loss | WER |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.7517 | 0.0449 | 15 | 0.5024 | 40.5912 |
| 0.4299 | 0.0898 | 30 | 0.3868 | 34.1310 |
| 0.363 | 0.1347 | 45 | 0.3704 | 25.2001 |
| 0.3744 | 0.1796 | 60 | 0.3537 | 22.5218 |
| 0.3731 | 0.2246 | 75 | 0.3554 | 24.5736 |
| 0.3743 | 0.2695 | 90 | 0.3388 | 22.7566 |
| 0.3001 | 0.3144 | 105 | 0.3401 | 22.3597 |
| 0.3382 | 0.3593 | 120 | 0.3262 | 33.1556 |
| 0.3353 | 0.4042 | 135 | 0.3266 | 28.0469 |
| 0.325 | 0.4491 | 150 | 0.3247 | 26.7473 |
| 0.3303 | 0.4940 | 165 | 0.3147 | 22.8616 |
| 0.2925 | 0.5389 | 180 | 0.3147 | 21.2041 |
| 0.3109 | 0.5838 | 195 | 0.3108 | 23.1859 |
| 0.2989 | 0.6287 | 210 | 0.3084 | 24.3570 |
| 0.3111 | 0.6737 | 225 | 0.3018 | 18.4117 |
| 0.2918 | 0.7186 | 240 | 0.3033 | 17.6076 |
| 0.3099 | 0.7635 | 255 | 0.2971 | 21.7151 |
| 0.2997 | 0.8084 | 270 | 0.2987 | 21.5361 |
| 0.2898 | 0.8533 | 285 | 0.2923 | 21.5828 |
| 0.2848 | 0.8982 | 300 | 0.2914 | 17.6452 |
| 0.285 | 0.9431 | 315 | 0.2874 | 17.7425 |
| 0.2624 | 0.9880 | 330 | 0.2861 | 16.8489 |
| 0.169 | 1.0329 | 345 | 0.2948 | 18.5687 |
| 0.1515 | 1.0778 | 360 | 0.2927 | 26.6540 |
| 0.1504 | 1.1228 | 375 | 0.2918 | 18.9422 |
| 0.1484 | 1.1677 | 390 | 0.2916 | 18.3482 |
| 0.1358 | 1.2126 | 405 | 0.2904 | 17.2198 |
| 0.128 | 1.2575 | 420 | 0.2895 | 17.6764 |
| 0.1417 | 1.3024 | 435 | 0.2895 | 23.2572 |
| 0.1561 | 1.3473 | 450 | 0.2876 | 17.7775 |
| 0.1445 | 1.3922 | 465 | 0.2874 | 17.5415 |
| 0.1384 | 1.4371 | 480 | 0.2825 | 16.1420 |
| 0.1488 | 1.4820 | 495 | 0.2857 | 17.3832 |
| 0.1701 | 1.5269 | 510 | 0.2779 | 22.6826 |
| 0.1475 | 1.5719 | 525 | 0.2857 | 25.9860 |
| 0.144 | 1.6168 | 540 | 0.2790 | 16.3145 |
| 0.1402 | 1.6617 | 555 | 0.2874 | 21.3948 |
| 0.1575 | 1.7066 | 570 | 0.2756 | 15.9786 |
| 0.1409 | 1.7515 | 585 | 0.2815 | 17.0862 |
| 0.1388 | 1.7964 | 600 | 0.2792 | 18.9176 |
| 0.1273 | 1.8413 | 615 | 0.2803 | 23.6165 |
| 0.1537 | 1.8862 | 630 | 0.2758 | 17.5454 |
| 0.1537 | 1.9311 | 645 | 0.2764 | 15.8373 |
| 0.1474 | 1.9760 | 660 | 0.2708 | 16.4935 |
| 0.1111 | 2.0210 | 675 | 0.2805 | 19.4337 |
| 0.0745 | 2.0659 | 690 | 0.2924 | 18.5388 |
| 0.0639 | 2.1108 | 705 | 0.2917 | 15.8269 |
| 0.0673 | 2.1557 | 720 | 0.2945 | 16.9306 |
| 0.066 | 2.2006 | 735 | 0.2955 | 16.3677 |
| 0.0714 | 2.2455 | 750 | 0.2933 | 16.2289 |
| 0.0701 | 2.2904 | 765 | 0.2911 | 20.4558 |
| 0.0631 | 2.3353 | 780 | 0.2971 | 17.1316 |
| 0.064 | 2.3802 | 795 | 0.2916 | 15.3846 |
| 0.0659 | 2.4251 | 810 | 0.2971 | 15.1602 |
| 0.0615 | 2.4701 | 825 | 0.2878 | 20.4480 |
| 0.0723 | 2.5150 | 840 | 0.2935 | 14.7569 |
| 0.0695 | 2.5599 | 855 | 0.2846 | 15.6570 |
| 0.0704 | 2.6048 | 870 | 0.2919 | 19.4000 |
| 0.0642 | 2.6497 | 885 | 0.2849 | 17.7373 |
| 0.0684 | 2.6946 | 900 | 0.2888 | 15.9164 |
| 0.077 | 2.7395 | 915 | 0.2828 | 15.5052 |
| 0.0708 | 2.7844 | 930 | 0.2858 | 17.0538 |
| 0.065 | 2.8293 | 945 | 0.2829 | 20.8617 |
| 0.0788 | 2.8743 | 960 | 0.2854 | 19.5621 |
| 0.0677 | 2.9192 | 975 | 0.2825 | 16.6984 |
| 0.0642 | 2.9641 | 990 | 0.2887 | 16.1537 |
| 0.0627 | 3.0090 | 1005 | 0.2828 | 16.0331 |
| 0.0262 | 3.0539 | 1020 | 0.3084 | 15.0202 |
| 0.0266 | 3.0988 | 1035 | 0.3129 | 16.9708 |
| 0.024 | 3.1437 | 1050 | 0.3114 | 14.9722 |
| 0.0271 | 3.1886 | 1065 | 0.3152 | 14.5416 |
| 0.026 | 3.2335 | 1080 | 0.3135 | 16.4533 |
| 0.0281 | 3.2784 | 1095 | 0.3151 | 17.0123 |
| 0.0295 | 3.3234 | 1110 | 0.3160 | 15.4183 |
| 0.0259 | 3.3683 | 1125 | 0.3101 | 14.8269 |
| 0.0276 | 3.4132 | 1140 | 0.3194 | 14.1175 |
| 0.0271 | 3.4581 | 1155 | 0.3172 | 17.3314 |
| 0.0304 | 3.5030 | 1170 | 0.3111 | 18.0577 |
| 0.0268 | 3.5479 | 1185 | 0.3129 | 14.0928 |
| 0.0256 | 3.5928 | 1200 | 0.3083 | 14.7374 |
| 0.0281 | 3.6377 | 1215 | 0.3079 | 14.9125 |
| 0.0274 | 3.6826 | 1230 | 0.3180 | 14.4586 |
| 0.0282 | 3.7275 | 1245 | 0.3091 | 14.6622 |
| 0.0224 | 3.7725 | 1260 | 0.3139 | 14.4132 |
| 0.0254 | 3.8174 | 1275 | 0.3141 | 14.0747 |
| 0.0279 | 3.8623 | 1290 | 0.3110 | 18.3676 |
| 0.0245 | 3.9072 | 1305 | 0.3119 | 15.0565 |
| 0.0256 | 3.9521 | 1320 | 0.3149 | 16.3560 |
| 0.0273 | 3.9970 | 1335 | 0.3128 | 16.3405 |
| 0.0126 | 4.0419 | 1350 | 0.3265 | 14.9385 |
| 0.0087 | 4.0868 | 1365 | 0.3411 | 14.4547 |
| 0.009 | 4.1317 | 1380 | 0.3394 | 14.6298 |
| 0.0093 | 4.1766 | 1395 | 0.3424 | 14.4547 |
| 0.0082 | 4.2216 | 1410 | 0.3457 | 14.4780 |
| 0.0093 | 4.2665 | 1425 | 0.3472 | 13.8192 |
| 0.0072 | 4.3114 | 1440 | 0.3491 | 15.0189 |
| 0.0093 | 4.3563 | 1455 | 0.3490 | 16.3962 |
| 0.0098 | 4.4012 | 1470 | 0.3455 | 16.3755 |
| 0.0077 | 4.4461 | 1485 | 0.3429 | 16.9410 |
| 0.0089 | 4.4910 | 1500 | 0.3452 | 17.0966 |
| 0.0099 | 4.5359 | 1515 | 0.3469 | 18.3897 |
| 0.0066 | 4.5808 | 1530 | 0.3465 | 19.0083 |
| 0.0074 | 4.6257 | 1545 | 0.3455 | 19.6867 |
| 0.0069 | 4.6707 | 1560 | 0.3489 | 18.5440 |
| 0.008 | 4.7156 | 1575 | 0.3502 | 18.4078 |
| 0.0079 | 4.7605 | 1590 | 0.3503 | 18.1057 |
| 0.0077 | 4.8054 | 1605 | 0.3501 | 18.2574 |
| 0.0058 | 4.8503 | 1620 | 0.3492 | 18.1653 |
| 0.0076 | 4.8952 | 1635 | 0.3486 | 17.7905 |
| 0.0064 | 4.9401 | 1650 | 0.3487 | 17.3858 |
| 0.0057 | 4.9850 | 1665 | 0.3489 | 17.3755 |
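
Validation WER bottoms out mid-training (13.8192 at step 1425) while validation loss is lowest much earlier (0.2708 at step 660) and climbs steadily through epochs 3 to 5, a typical overfitting pattern. The final checkpoint's WER of 17.3755 is therefore not the best observed; if intermediate checkpoints were saved, the one around step 1425 may be preferable.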
### Framework versions

- Transformers 4.45.0.dev0
- PyTorch 2.1.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1