---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_syl_cv12_pad_lob100__0075
  results: []
---

# whisper_syl_cv12_pad_lob100__0075

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0029
- Train Accuracy: 0.0362
- Train Wermet: 0.6039
- Validation Loss: 0.6222
- Validation Accuracy: 0.0239
- Validation Wermet: 3.0512
- Epoch: 74

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
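As a reference only, the logged optimizer configuration above corresponds roughly to the following sketch using the TensorFlow `AdamWeightDecay` class from `transformers`; this is a reconstruction of the listed settings, not the original training script, and the constant learning rate is an assumption based on the logged `decay: 0.0`.

```python
# Sketch only: rebuilds the logged optimizer settings with transformers'
# TF AdamWeightDecay; not the author's original training code.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-05,      # constant LR, assuming decay=0.0 means no schedule
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
```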
### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.0233 | 0.0115 | 1.6383 | 3.8616 | 0.0117 | 0.9516 | 0 |
| 4.4412 | 0.0127 | 0.8560 | 3.5410 | 0.0125 | 0.8971 | 1 |
| 4.0719 | 0.0138 | 0.8366 | 3.2944 | 0.0132 | 0.8706 | 2 |
| 3.8091 | 0.0146 | 0.8133 | 3.1691 | 0.0134 | 0.8487 | 3 |
| 3.6239 | 0.0152 | 0.7866 | 3.0647 | 0.0136 | 0.8282 | 4 |
| 3.4749 | 0.0156 | 0.7589 | 2.9835 | 0.0139 | 0.8049 | 5 |
| 3.3444 | 0.0161 | 0.7359 | 2.9351 | 0.0140 | 0.7979 | 6 |
| 3.2215 | 0.0165 | 0.7138 | 2.8468 | 0.0145 | 0.7589 | 7 |
| 3.0754 | 0.0172 | 0.6873 | 2.7530 | 0.0148 | 0.7413 | 8 |
| 2.8713 | 0.0181 | 0.6484 | 2.5226 | 0.0157 | 0.7017 | 9 |
| 2.5469 | 0.0197 | 0.5934 | 2.1931 | 0.0168 | 0.6285 | 10 |
| 2.0233 | 0.0225 | 0.4997 | 1.6411 | 0.0189 | 0.5215 | 11 |
| 1.3808 | 0.0264 | 0.3852 | 1.2401 | 0.0205 | 0.4238 | 12 |
| 0.9722 | 0.0290 | 0.3123 | 1.0195 | 0.0215 | 0.3682 | 13 |
| 0.7388 | 0.0305 | 0.2828 | 0.8773 | 0.0221 | 0.3322 | 14 |
| 0.5787 | 0.0317 | 0.2751 | 0.7970 | 0.0225 | 0.3083 | 15 |
| 0.4642 | 0.0325 | 0.2878 | 0.7315 | 0.0227 | 0.2964 | 16 |
| 0.3752 | 0.0332 | 0.4217 | 0.6897 | 0.0229 | 0.3297 | 17 |
| 0.3042 | 0.0338 | 0.7294 | 0.6572 | 0.0231 | 0.4453 | 18 |
| 0.2444 | 0.0343 | 1.1298 | 0.6369 | 0.0232 | 0.6637 | 19 |
| 0.1949 | 0.0348 | 1.6370 | 0.6180 | 0.0233 | 1.6119 | 20 |
| 0.1544 | 0.0352 | 1.6151 | 0.6149 | 0.0233 | 1.6843 | 21 |
| 0.1212 | 0.0355 | 1.3832 | 0.6066 | 0.0233 | 0.8721 | 22 |
| 0.0931 | 0.0357 | 1.2799 | 0.6034 | 0.0234 | 0.5109 | 23 |
| 0.0725 | 0.0359 | 1.0940 | 0.6102 | 0.0234 | 1.0111 | 24 |
| 0.0551 | 0.0361 | 1.2865 | 0.6000 | 0.0234 | 1.1393 | 25 |
| 0.0411 | 0.0361 | 1.8511 | 0.6037 | 0.0235 | 2.0574 | 26 |
| 0.0311 | 0.0362 | 1.7179 | 0.6018 | 0.0235 | 1.4847 | 27 |
| 0.0253 | 0.0362 | 0.9801 | 0.6010 | 0.0235 | 0.4457 | 28 |
| 0.0231 | 0.0362 | 0.9376 | 0.6046 | 0.0235 | 0.9247 | 29 |
| 0.0196 | 0.0362 | 0.6466 | 0.6078 | 0.0235 | 0.5271 | 30 |
| 0.0177 | 0.0362 | 0.4041 | 0.6155 | 0.0235 | 0.4352 | 31 |
| 0.0139 | 0.0362 | 0.4202 | 0.6037 | 0.0236 | 0.5585 | 32 |
| 0.0137 | 0.0362 | 0.8151 | 0.6015 | 0.0236 | 1.8476 | 33 |
| 0.0122 | 0.0362 | 3.4515 | 0.6043 | 0.0236 | 3.8210 | 34 |
| 0.0098 | 0.0362 | 1.1787 | 0.5985 | 0.0236 | 0.8094 | 35 |
| 0.0071 | 0.0362 | 0.9920 | 0.5992 | 0.0236 | 0.8755 | 36 |
| 0.0055 | 0.0362 | 2.4665 | 0.6047 | 0.0236 | 2.0127 | 37 |
| 0.0124 | 0.0362 | 4.2468 | 0.6089 | 0.0236 | 2.8886 | 38 |
| 0.0109 | 0.0362 | 2.0177 | 0.6097 | 0.0236 | 0.3417 | 39 |
| 0.0073 | 0.0362 | 0.9927 | 0.6057 | 0.0237 | 2.5519 | 40 |
| 0.0080 | 0.0362 | 1.7341 | 0.6099 | 0.0236 | 1.3119 | 41 |
| 0.0063 | 0.0362 | 2.4288 | 0.6058 | 0.0237 | 1.3465 | 42 |
| 0.0038 | 0.0362 | 1.4535 | 0.6022 | 0.0237 | 1.6804 | 43 |
| 0.0028 | 0.0362 | 2.2629 | 0.6001 | 0.0238 | 3.4388 | 44 |
| 0.0021 | 0.0362 | 3.5877 | 0.6018 | 0.0238 | 2.6165 | 45 |
| 0.0017 | 0.0362 | 3.0080 | 0.6043 | 0.0238 | 2.6827 | 46 |
| 0.0061 | 0.0362 | 2.5182 | 0.6545 | 0.0235 | 0.2316 | 47 |
| 0.0126 | 0.0362 | 0.2097 | 0.6206 | 0.0236 | 0.6194 | 48 |
| 0.0071 | 0.0362 | 0.3045 | 0.6047 | 0.0237 | 0.7476 | 49 |
| 0.0053 | 0.0362 | 1.2045 | 0.6010 | 0.0238 | 0.6553 | 50 |
| 0.0040 | 0.0362 | 0.2626 | 0.5964 | 0.0238 | 0.7027 | 51 |
| 0.0021 | 0.0362 | 0.5023 | 0.5950 | 0.0238 | 0.3812 | 52 |
| 0.0014 | 0.0362 | 0.7108 | 0.6233 | 0.0237 | 1.4647 | 53 |
| 0.0017 | 0.0362 | 0.3475 | 0.6087 | 0.0238 | 0.2213 | 54 |
| 0.0011 | 0.0362 | 0.1825 | 0.5984 | 0.0239 | 0.2391 | 55 |
| 0.0021 | 0.0362 | 1.0757 | 0.6211 | 0.0238 | 7.3766 | 56 |
| 0.0078 | 0.0362 | 2.1996 | 0.6349 | 0.0237 | 5.2774 | 57 |
| 0.0071 | 0.0362 | 1.2499 | 0.6225 | 0.0237 | 0.9927 | 58 |
| 0.0045 | 0.0362 | 5.3986 | 0.6088 | 0.0238 | 27.5186 | 59 |
| 0.0027 | 0.0362 | 9.4813 | 0.6035 | 0.0239 | 0.2741 | 60 |
| 0.0015 | 0.0362 | 20.4251 | 0.6005 | 0.0239 | 73.4792 | 61 |
| 0.0012 | 0.0362 | 17.1227 | 0.6148 | 0.0238 | 4.2506 | 62 |
| 0.0024 | 0.0362 | 3.7081 | 0.6249 | 0.0238 | 5.8937 | 63 |
| 0.0050 | 0.0362 | 2.2590 | 0.6136 | 0.0238 | 9.6813 | 64 |
| 0.0026 | 0.0362 | 3.1954 | 0.6060 | 0.0239 | 15.4541 | 65 |
| 0.0032 | 0.0362 | 5.1838 | 0.6233 | 0.0238 | 10.2566 | 66 |
| 0.0053 | 0.0362 | 3.1310 | 0.6178 | 0.0239 | 1.4216 | 67 |
| 0.0030 | 0.0362 | 1.1169 | 0.6106 | 0.0239 | 0.9273 | 68 |
| 0.0018 | 0.0362 | 0.9183 | 0.6034 | 0.0239 | 1.7868 | 69 |
| 0.0011 | 0.0362 | 0.3862 | 0.6116 | 0.0239 | 0.5909 | 70 |
| 0.0014 | 0.0362 | 0.6235 | 0.6143 | 0.0239 | 0.9794 | 71 |
| 0.0025 | 0.0362 | 0.5583 | 0.6510 | 0.0237 | 0.3524 | 72 |
| 0.0058 | 0.0362 | 1.9614 | 0.6179 | 0.0239 | 1.2838 | 73 |
| 0.0029 | 0.0362 | 0.6039 | 0.6222 | 0.0239 | 3.0512 | 74 |

### Framework versions

- Transformers 4.33.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3
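With the framework versions above, a checkpoint exported by this Keras setup can typically be loaded for inference through the TensorFlow Whisper classes in `transformers`. The sketch below is an assumption, not part of the original card: the repository id is hypothetical and should be replaced with the actual Hub path, and the processor can alternatively be loaded from `openai/whisper-tiny` if it is not stored alongside the fine-tuned weights.

```python
# Minimal loading sketch, assuming the weights are published under a repo id
# matching the model name above; adjust "repo_id" to the real Hub location.
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

repo_id = "whisper_syl_cv12_pad_lob100__0075"  # hypothetical repo id
processor = WhisperProcessor.from_pretrained(repo_id)
model = TFWhisperForConditionalGeneration.from_pretrained(repo_id)

# Dummy one-second clip of silence at 16 kHz, standing in for real audio.
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")
generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True))
```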