
whisper_syl_cv12_pad_lob100_low__0135

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. It achieves the following results on the evaluation set:

  • Train Loss: 0.0001
  • Train Accuracy: 0.0362
  • Train Wermet: 0.0011
  • Validation Loss: 0.7303
  • Validation Accuracy: 0.0236
  • Validation Wermet: 0.2229
  • Epoch: 134
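
Since the card does not yet include usage notes, the following is a minimal transcription sketch, assuming the checkpoint is published on the Hub as bigmorning/whisper_syl_cv12_pad_lob100_low__0135 and that real 16 kHz mono audio replaces the silent placeholder:

```python
# Minimal usage sketch (not part of the original card): load the checkpoint and
# transcribe one audio clip with the TensorFlow Whisper classes.
# Assumption: real 16 kHz mono audio would replace the silent placeholder below.
import numpy as np
from transformers import TFWhisperForConditionalGeneration, WhisperProcessor

model_id = "bigmorning/whisper_syl_cv12_pad_lob100_low__0135"
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

waveform = np.zeros(16000, dtype=np.float32)  # placeholder: 1 s of silence at 16 kHz

# Convert raw audio to the log-mel input features Whisper expects.
inputs = processor(waveform, sampling_rate=16000, return_tensors="tf")

# Generate token ids and decode them to text.
generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```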

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reconstruction sketch follows the list):

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
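
The optimizer dictionary above corresponds to the TensorFlow AdamWeightDecay class shipped with Transformers; a minimal reconstruction sketch follows (passing it to model.compile is an assumption, not something stated in this card):

```python
# Sketch: rebuild the optimizer from the hyperparameter dictionary above.
# Assumption: this is the TensorFlow AdamWeightDecay class exported by Transformers.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
# The card's 'decay': 0.0 is the inherited Keras learning-rate decay, left at its default here.
# The optimizer would typically be passed to model.compile(optimizer=optimizer) before training.
```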

Training results

Train Loss Train Accuracy Train Wermet Validation Loss Validation Accuracy Validation Wermet Epoch
5.2930 0.0113 2.0658 3.9415 0.0117 0.9401 0
4.6215 0.0121 0.8917 3.7803 0.0120 0.9294 1
4.4086 0.0128 0.8403 3.6070 0.0124 0.9223 2
4.1842 0.0135 0.8337 3.4291 0.0128 0.8867 3
3.9981 0.0141 0.8182 3.3251 0.0131 0.8750 4
3.8531 0.0145 0.8058 3.2385 0.0133 0.8699 5
3.7345 0.0149 0.7925 3.1751 0.0134 0.8665 6
3.6307 0.0152 0.7851 3.1031 0.0136 0.8507 7
3.5437 0.0155 0.7717 3.0752 0.0138 0.8286 8
3.4649 0.0157 0.7651 3.0334 0.0139 0.8417 9
3.3926 0.0159 0.7531 3.0022 0.0139 0.8413 10
3.3262 0.0162 0.7462 2.9669 0.0140 0.8264 11
3.2625 0.0164 0.7367 2.9342 0.0141 0.8520 12
3.1979 0.0166 0.7231 2.9046 0.0144 0.8196 13
3.1319 0.0169 0.7133 2.8607 0.0145 0.8026 14
3.0616 0.0172 0.7007 2.8165 0.0146 0.7788 15
2.9792 0.0176 0.6816 2.7552 0.0149 0.7643 16
2.8905 0.0180 0.6641 2.6788 0.0151 0.7473 17
2.7749 0.0186 0.6424 2.5824 0.0155 0.7241 18
2.6263 0.0193 0.6159 2.4206 0.0161 0.7047 19
2.4352 0.0203 0.5829 2.2230 0.0168 0.6500 20
2.1941 0.0216 0.5411 2.0349 0.0175 0.5980 21
1.9184 0.0231 0.4922 1.7850 0.0184 0.5659 22
1.6174 0.0249 0.4371 1.5664 0.0192 0.5081 23
1.3542 0.0265 0.3851 1.3992 0.0199 0.4690 24
1.1499 0.0278 0.3408 1.2512 0.0205 0.4299 25
0.9878 0.0288 0.3029 1.1479 0.0209 0.4013 26
0.8600 0.0297 0.2735 1.0527 0.0213 0.3755 27
0.7516 0.0305 0.2441 0.9803 0.0216 0.3570 28
0.6626 0.0311 0.2197 0.9314 0.0219 0.3416 29
0.5863 0.0316 0.1993 0.8730 0.0221 0.3238 30
0.5187 0.0321 0.1775 0.8357 0.0223 0.3136 31
0.4608 0.0326 0.1610 0.8059 0.0224 0.3033 32
0.4087 0.0330 0.1467 0.7746 0.0226 0.2949 33
0.3642 0.0334 0.1298 0.7476 0.0227 0.2847 34
0.3221 0.0337 0.1168 0.7330 0.0228 0.2802 35
0.2837 0.0340 0.1030 0.7093 0.0229 0.2728 36
0.2509 0.0343 0.0882 0.6941 0.0229 0.2687 37
0.2209 0.0346 0.0747 0.6892 0.0230 0.2656 38
0.1934 0.0349 0.0670 0.6824 0.0230 0.2630 39
0.1688 0.0351 0.0542 0.6773 0.0230 0.2625 40
0.1469 0.0353 0.0429 0.6700 0.0231 0.2633 41
0.1268 0.0355 0.0365 0.6680 0.0231 0.2578 42
0.1086 0.0357 0.0284 0.6643 0.0231 0.2540 43
0.0920 0.0358 0.0221 0.6645 0.0231 0.2530 44
0.0783 0.0359 0.0169 0.6621 0.0232 0.2540 45
0.0667 0.0360 0.0121 0.6714 0.0232 0.2532 46
0.0563 0.0361 0.0094 0.6604 0.0232 0.2503 47
0.0477 0.0361 0.0072 0.6620 0.0232 0.2489 48
0.0397 0.0362 0.0055 0.6611 0.0232 0.2502 49
0.0330 0.0362 0.0045 0.6686 0.0232 0.2496 50
0.0283 0.0362 0.0033 0.6705 0.0232 0.2503 51
0.0242 0.0362 0.0034 0.6686 0.0232 0.2486 52
0.0212 0.0362 0.0031 0.6686 0.0232 0.2493 53
0.0197 0.0362 0.0028 0.6688 0.0232 0.2530 54
0.0226 0.0362 0.0041 0.6598 0.0233 0.2451 55
0.0158 0.0362 0.0024 0.6605 0.0233 0.2428 56
0.0115 0.0362 0.0018 0.6648 0.0233 0.2435 57
0.0094 0.0362 0.0017 0.6672 0.0233 0.2446 58
0.0081 0.0362 0.0018 0.6731 0.0233 0.2439 59
0.0071 0.0362 0.0017 0.6762 0.0233 0.2429 60
0.0062 0.0362 0.0017 0.6794 0.0233 0.2426 61
0.0055 0.0362 0.0017 0.6825 0.0233 0.2429 62
0.0048 0.0362 0.0017 0.6895 0.0233 0.2450 63
0.0042 0.0362 0.0019 0.6914 0.0233 0.2424 64
0.0037 0.0362 0.0018 0.6938 0.0233 0.2423 65
0.0224 0.0361 0.0080 0.6695 0.0234 0.2409 66
0.0127 0.0362 0.0037 0.6685 0.0234 0.2383 67
0.0065 0.0362 0.0017 0.6714 0.0234 0.2359 68
0.0045 0.0362 0.0017 0.6645 0.0234 0.2347 69
0.0034 0.0362 0.0016 0.6671 0.0234 0.2353 70
0.0028 0.0362 0.0014 0.6715 0.0234 0.2354 71
0.0024 0.0362 0.0014 0.6745 0.0234 0.2358 72
0.0022 0.0362 0.0014 0.6778 0.0234 0.2356 73
0.0020 0.0362 0.0013 0.6797 0.0234 0.2357 74
0.0018 0.0362 0.0014 0.6833 0.0234 0.2355 75
0.0016 0.0362 0.0013 0.6885 0.0234 0.2363 76
0.0068 0.0362 0.0035 0.7270 0.0232 0.2500 77
0.0131 0.0362 0.0076 0.6965 0.0234 0.2397 78
0.0054 0.0362 0.0088 0.6764 0.0235 0.2339 79
0.0029 0.0362 0.0041 0.6806 0.0235 0.2334 80
0.0019 0.0362 0.0039 0.6723 0.0235 0.2316 81
0.0016 0.0362 0.0028 0.6765 0.0235 0.2315 82
0.0014 0.0362 0.0025 0.6786 0.0235 0.2306 83
0.0013 0.0362 0.0023 0.6805 0.0235 0.2304 84
0.0012 0.0362 0.0022 0.6830 0.0235 0.2301 85
0.0011 0.0362 0.0022 0.6881 0.0235 0.2308 86
0.0010 0.0362 0.0022 0.6875 0.0235 0.2303 87
0.0009 0.0362 0.0022 0.6909 0.0235 0.2307 88
0.0008 0.0362 0.0020 0.6934 0.0235 0.2299 89
0.0007 0.0362 0.0022 0.6968 0.0235 0.2307 90
0.0007 0.0362 0.0020 0.7005 0.0235 0.2300 91
0.0006 0.0362 0.0021 0.7040 0.0235 0.2307 92
0.0006 0.0362 0.0020 0.7086 0.0235 0.2309 93
0.0005 0.0362 0.0020 0.7116 0.0235 0.2318 94
0.0005 0.0362 0.0018 0.7151 0.0235 0.2305 95
0.0111 0.0362 0.2014 0.7185 0.0234 0.2861 96
0.0069 0.0362 0.0051 0.7036 0.0235 0.2337 97
0.0028 0.0362 0.0015 0.6946 0.0235 0.2324 98
0.0023 0.0362 0.0018 0.6937 0.0235 0.2295 99
0.0017 0.0362 0.0013 0.6886 0.0235 0.2283 100
0.0010 0.0362 0.0008 0.6891 0.0236 0.2274 101
0.0009 0.0362 0.0013 0.6901 0.0236 0.2275 102
0.0008 0.0362 0.0015 0.6922 0.0236 0.2273 103
0.0006 0.0362 0.0015 0.6923 0.0236 0.2274 104
0.0008 0.0362 0.0014 0.6996 0.0235 0.2288 105
0.0006 0.0362 0.0014 0.6967 0.0236 0.2266 106
0.0005 0.0362 0.0013 0.6988 0.0236 0.2260 107
0.0004 0.0362 0.0027 0.7008 0.0236 0.2278 108
0.0004 0.0362 0.0017 0.7034 0.0236 0.2261 109
0.0004 0.0362 0.0018 0.7036 0.0236 0.2265 110
0.0004 0.0362 0.0015 0.7090 0.0236 0.2255 111
0.0112 0.0362 0.0059 0.7014 0.0235 0.2271 112
0.0034 0.0362 0.0023 0.6869 0.0236 0.2252 113
0.0015 0.0362 0.0015 0.6863 0.0236 0.2234 114
0.0008 0.0362 0.0010 0.6893 0.0236 0.2227 115
0.0006 0.0362 0.0011 0.6911 0.0236 0.2232 116
0.0005 0.0362 0.0009 0.6923 0.0236 0.2227 117
0.0004 0.0362 0.0009 0.6938 0.0236 0.2225 118
0.0004 0.0362 0.0010 0.6958 0.0236 0.2226 119
0.0003 0.0362 0.0010 0.6966 0.0236 0.2226 120
0.0003 0.0362 0.0010 0.6983 0.0236 0.2230 121
0.0003 0.0362 0.0010 0.7005 0.0236 0.2229 122
0.0003 0.0362 0.0010 0.7022 0.0236 0.2233 123
0.0002 0.0362 0.0010 0.7041 0.0236 0.2226 124
0.0002 0.0362 0.0011 0.7065 0.0236 0.2228 125
0.0002 0.0362 0.0011 0.7081 0.0236 0.2227 126
0.0002 0.0362 0.0011 0.7101 0.0236 0.2224 127
0.0002 0.0362 0.0011 0.7130 0.0236 0.2224 128
0.0002 0.0362 0.0011 0.7157 0.0236 0.2229 129
0.0002 0.0362 0.0011 0.7183 0.0236 0.2225 130
0.0001 0.0362 0.0011 0.7212 0.0236 0.2230 131
0.0001 0.0362 0.0012 0.7250 0.0236 0.2230 132
0.0001 0.0362 0.0012 0.7268 0.0236 0.2229 133
0.0001 0.0362 0.0011 0.7303 0.0236 0.2229 134

Framework versions

  • Transformers 4.33.0.dev0
  • TensorFlow 2.13.0
  • Tokenizers 0.13.3
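
For reproducibility, a quick environment check against the versions above might look like this (a sketch; 4.33.0.dev0 is a development build, so an exact match would require installing Transformers from source):

```python
# Sketch: print installed versions to compare against the ones listed above.
import tensorflow as tf
import tokenizers
import transformers

print("Transformers:", transformers.__version__)  # card: 4.33.0.dev0
print("TensorFlow:", tf.__version__)              # card: 2.13.0
print("Tokenizers:", tokenizers.__version__)      # card: 0.13.3
```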