---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
  - generated_from_keras_callback
model-index:
  - name: train_from_raw_cv12_true__0095
    results: []
---

# train_from_raw_cv12_true__0095

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results at the final training epoch:

- Train Loss: 0.0041
- Train Accuracy: 0.1114
- Train Wermet: 4.2042
- Validation Loss: 0.5504
- Validation Accuracy: 0.0645
- Validation Wermet: 9.8599
- Epoch: 94
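
The checkpoint was uploaded as a `TFWhisperForConditionalGeneration` model, so it can be loaded for speech transcription with `transformers`. A minimal inference sketch follows; the Hub repo id `bigmorning/train_from_raw_cv12_true__0095` and the dummy LibriSpeech clip are assumptions made for illustration, not part of this card.

```python
# Minimal inference sketch (assumes the checkpoint is published on the
# Hugging Face Hub under the card's name; adjust model_id if it differs).
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration
from datasets import load_dataset

model_id = "bigmorning/train_from_raw_cv12_true__0095"  # hypothetical repo id
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

# One 16 kHz audio sample as a smoke test; any 16 kHz waveform works.
ds = load_dataset(
    "hf-internal-testing/librispeech_asr_dummy", "clean", split="validation"
)
inputs = processor(
    ds[0]["audio"]["array"], sampling_rate=16000, return_tensors="tf"
)

# Whisper generates token ids from log-mel input features.
generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```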

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the optimizer sketch after this list):

- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
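
The serialized optimizer config matches the Keras `AdamWeightDecay` class that ships with `transformers`. A minimal sketch of how it could be re-instantiated, assuming a TensorFlow install of `transformers`; this is a reconstruction from the config above, not the author's training script:

```python
# Rebuild the optimizer from the serialized config; every value is
# copied verbatim from the hyperparameter list above.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```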

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 2.3396     | 0.0444         | 2.7124       | 1.8124          | 0.0331              | 9.4418            | 0     |
| 1.7496     | 0.0562         | 3.2708       | 1.6529          | 0.0359              | 9.7805            | 1     |
| 1.6254     | 0.0594         | 3.1714       | 1.5874          | 0.0371              | 10.4139           | 2     |
| 1.5490     | 0.0615         | 3.0208       | 1.5102          | 0.0382              | 8.2054            | 3     |
| 1.4910     | 0.0631         | 2.8298       | 1.4213          | 0.0402              | 8.9930            | 4     |
| 1.4233     | 0.0650         | 2.7100       | 1.3319          | 0.0421              | 8.8483            | 5     |
| 1.3203     | 0.0679         | 2.5125       | 1.1806          | 0.0450              | 7.0373            | 6     |
| 1.1453     | 0.0730         | 2.5001       | 1.0029          | 0.0484              | 5.7328            | 7     |
| 0.9537     | 0.0789         | 2.6295       | 0.7799          | 0.0529              | 6.7373            | 8     |
| 0.7822     | 0.0844         | 2.7501       | 0.6499          | 0.0556              | 8.5538            | 9     |
| 0.6317     | 0.0895         | 2.9507       | 0.5467          | 0.0578              | 8.4990            | 10    |
| 0.5010     | 0.0940         | 3.1604       | 0.4597          | 0.0597              | 9.4002            | 11    |
| 0.4026     | 0.0975         | 3.2910       | 0.3984          | 0.0610              | 9.8173            | 12    |
| 0.3372     | 0.0998         | 3.5302       | 0.3571          | 0.0619              | 9.6433            | 13    |
| 0.2922     | 0.1013         | 3.5614       | 0.3391          | 0.0623              | 9.6152            | 14    |
| 0.2572     | 0.1025         | 3.5685       | 0.3157          | 0.0628              | 9.4497            | 15    |
| 0.2258     | 0.1036         | 3.5179       | 0.3104          | 0.0630              | 9.7735            | 16    |
| 0.1995     | 0.1045         | 3.5362       | 0.2944          | 0.0634              | 9.8154            | 17    |
| 0.1762     | 0.1054         | 3.5227       | 0.2820          | 0.0637              | 9.8672            | 18    |
| 0.1551     | 0.1061         | 3.5489       | 0.2849          | 0.0638              | 9.6569            | 19    |
| 0.1356     | 0.1068         | 3.4990       | 0.2821          | 0.0639              | 9.9735            | 20    |
| 0.1174     | 0.1074         | 3.5119       | 0.2841          | 0.0640              | 9.8770            | 21    |
| 0.1016     | 0.1080         | 3.5233       | 0.2903          | 0.0640              | 10.0309           | 22    |
| 0.0847     | 0.1087         | 3.5368       | 0.3013          | 0.0640              | 9.8095            | 23    |
| 0.0713     | 0.1092         | 3.5250       | 0.3040          | 0.0640              | 9.7686            | 24    |
| 0.0596     | 0.1096         | 3.5310       | 0.3137          | 0.0640              | 9.9239            | 25    |
| 0.0478     | 0.1101         | 3.5776       | 0.3228          | 0.0641              | 10.2774           | 26    |
| 0.0400     | 0.1104         | 3.6155       | 0.3316          | 0.0641              | 9.9082            | 27    |
| 0.0301     | 0.1107         | 3.6545       | 0.3446          | 0.0641              | 9.9672            | 28    |
| 0.0227     | 0.1110         | 3.7827       | 0.3579          | 0.0641              | 10.2859           | 29    |
| 0.0179     | 0.1112         | 3.7672       | 0.3728          | 0.0640              | 10.1965           | 30    |
| 0.0148     | 0.1112         | 3.7575       | 0.3829          | 0.0641              | 10.5114           | 31    |
| 0.0116     | 0.1113         | 3.7682       | 0.3959          | 0.0641              | 10.4941           | 32    |
| 0.0100     | 0.1114         | 3.8332       | 0.4056          | 0.0641              | 10.6330           | 33    |
| 0.0140     | 0.1112         | 3.8661       | 0.4283          | 0.0638              | 10.5741           | 34    |
| 0.0159     | 0.1111         | 3.8054       | 0.4203          | 0.0640              | 10.5108           | 35    |
| 0.0085     | 0.1114         | 3.7278       | 0.4236          | 0.0641              | 10.3731           | 36    |
| 0.0052     | 0.1115         | 3.8251       | 0.4366          | 0.0641              | 10.5728           | 37    |
| 0.0058     | 0.1115         | 3.9767       | 0.4459          | 0.0641              | 10.5720           | 38    |
| 0.0074     | 0.1114         | 4.0721       | 0.4775          | 0.0637              | 10.9922           | 39    |
| 0.0133     | 0.1111         | 3.9087       | 0.4560          | 0.0640              | 9.8398            | 40    |
| 0.0072     | 0.1114         | 3.9828       | 0.4533          | 0.0642              | 10.7116           | 41    |
| 0.0045     | 0.1115         | 3.9705       | 0.4668          | 0.0641              | 10.9326           | 42    |
| 0.0050     | 0.1114         | 4.1288       | 0.4677          | 0.0641              | 10.5865           | 43    |
| 0.0057     | 0.1114         | 4.1164       | 0.4709          | 0.0642              | 10.9947           | 44    |
| 0.0148     | 0.1111         | 4.2062       | 0.4594          | 0.0643              | 10.5855           | 45    |
| 0.0053     | 0.1114         | 4.0964       | 0.4598          | 0.0644              | 10.9385           | 46    |
| 0.0024     | 0.1115         | 4.1318       | 0.4660          | 0.0644              | 11.2717           | 47    |
| 0.0018     | 0.1115         | 4.1024       | 0.4696          | 0.0644              | 11.0203           | 48    |
| 0.0024     | 0.1115         | 4.0167       | 0.4845          | 0.0642              | 10.2553           | 49    |
| 0.0065     | 0.1114         | 4.1337       | 0.5009          | 0.0640              | 10.6290           | 50    |
| 0.0085     | 0.1113         | 4.1551       | 0.4943          | 0.0642              | 10.7108           | 51    |
| 0.0051     | 0.1114         | 4.2572       | 0.4838          | 0.0644              | 11.0773           | 52    |
| 0.0029     | 0.1115         | 4.2158       | 0.4869          | 0.0644              | 10.8471           | 53    |
| 0.0029     | 0.1115         | 4.2389       | 0.4904          | 0.0644              | 10.9344           | 54    |
| 0.0076     | 0.1114         | 4.4629       | 0.5367          | 0.0637              | 10.0070           | 55    |
| 0.0109     | 0.1112         | 3.8409       | 0.4880          | 0.0644              | 10.0557           | 56    |
| 0.0026     | 0.1115         | 3.8716       | 0.4823          | 0.0645              | 10.3257           | 57    |
| 0.0014     | 0.1115         | 3.8076       | 0.4879          | 0.0645              | 10.2943           | 58    |
| 0.0008     | 0.1115         | 3.8851       | 0.4854          | 0.0646              | 10.3687           | 59    |
| 0.0009     | 0.1115         | 3.9626       | 0.5204          | 0.0643              | 10.2350           | 60    |
| 0.0114     | 0.1111         | 4.0524       | 0.4985          | 0.0643              | 10.5375           | 61    |
| 0.0051     | 0.1114         | 4.0766       | 0.4958          | 0.0645              | 10.3052           | 62    |
| 0.0022     | 0.1115         | 4.2414       | 0.4972          | 0.0645              | 10.6064           | 63    |
| 0.0018     | 0.1115         | 4.5264       | 0.5014          | 0.0645              | 10.4899           | 64    |
| 0.0033     | 0.1115         | 4.4869       | 0.5182          | 0.0644              | 9.8788            | 65    |
| 0.0048     | 0.1114         | 4.2916       | 0.5082          | 0.0644              | 10.8155           | 66    |
| 0.0039     | 0.1114         | 4.5326       | 0.5101          | 0.0644              | 12.0401           | 67    |
| 0.0025     | 0.1115         | 4.4297       | 0.5056          | 0.0645              | 10.9017           | 68    |
| 0.0025     | 0.1115         | 4.2703       | 0.5064          | 0.0646              | 11.1319           | 69    |
| 0.0018     | 0.1115         | 4.2014       | 0.5228          | 0.0644              | 10.4356           | 70    |
| 0.0046     | 0.1114         | 4.1774       | 0.5267          | 0.0644              | 11.5421           | 71    |
| 0.0070     | 0.1113         | 4.0521       | 0.5185          | 0.0644              | 11.1608           | 72    |
| 0.0045     | 0.1114         | 4.2080       | 0.5041          | 0.0646              | 10.4894           | 73    |
| 0.0018     | 0.1115         | 4.1639       | 0.5001          | 0.0647              | 10.9893           | 74    |
| 0.0012     | 0.1115         | 4.2831       | 0.5026          | 0.0647              | 11.1654           | 75    |
| 0.0026     | 0.1115         | 4.4333       | 0.5219          | 0.0645              | 11.3173           | 76    |
| 0.0056     | 0.1114         | 4.1342       | 0.5351          | 0.0643              | 10.6952           | 77    |
| 0.0045     | 0.1114         | 4.2413       | 0.5161          | 0.0645              | 10.2386           | 78    |
| 0.0024     | 0.1115         | 4.0259       | 0.5122          | 0.0646              | 9.9376            | 79    |
| 0.0020     | 0.1115         | 3.9288       | 0.5131          | 0.0646              | 10.5318           | 80    |
| 0.0015     | 0.1115         | 4.2598       | 0.5160          | 0.0647              | 10.0045           | 81    |
| 0.0014     | 0.1115         | 4.1082       | 0.5311          | 0.0645              | 9.7244            | 82    |
| 0.0061     | 0.1113         | 4.1157       | 0.5400          | 0.0644              | 10.6306           | 83    |
| 0.0041     | 0.1114         | 4.2934       | 0.5179          | 0.0646              | 10.7082           | 84    |
| 0.0019     | 0.1115         | 4.0879       | 0.5148          | 0.0646              | 11.0278           | 85    |
| 0.0021     | 0.1115         | 4.1005       | 0.5350          | 0.0645              | 10.6981           | 86    |
| 0.0023     | 0.1115         | 4.0001       | 0.5218          | 0.0647              | 11.4887           | 87    |
| 0.0019     | 0.1115         | 4.2459       | 0.5329          | 0.0646              | 10.6841           | 88    |
| 0.0019     | 0.1115         | 4.1767       | 0.5281          | 0.0646              | 10.8256           | 89    |
| 0.0063     | 0.1113         | 4.0552       | 0.5270          | 0.0646              | 10.1748           | 90    |
| 0.0031     | 0.1114         | 4.1025       | 0.5115          | 0.0648              | 10.4663           | 91    |
| 0.0011     | 0.1115         | 4.1763       | 0.5090          | 0.0648              | 10.9994           | 92    |
| 0.0010     | 0.1115         | 4.1623       | 0.5189          | 0.0648              | 10.6794           | 93    |
| 0.0041     | 0.1114         | 4.2042       | 0.5504          | 0.0645              | 9.8599            | 94    |

### Framework versions

- Transformers 4.33.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3