---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
  - generated_from_keras_callback
model-index:
  - name: whisper_syl_cv12_pad_lob100__0010
    results: []
---

# whisper_syl_cv12_pad_lob100__0010

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results on the training and evaluation sets:

- Train Loss: 2.8713
- Train Accuracy: 0.0181
- Train Wermet: 0.6484
- Validation Loss: 2.5226
- Validation Accuracy: 0.0157
- Validation Wermet: 0.7017
- Epoch: 9
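Since the card does not yet include a usage example, here is a minimal, hypothetical inference sketch for loading the checkpoint with the TensorFlow Whisper classes in Transformers. The Hub repo id (`bigmorning/whisper_syl_cv12_pad_lob100__0010`) and the use of the base model's processor (`openai/whisper-tiny`) are assumptions, and the audio tensor is a placeholder.

```python
import tensorflow as tf
from transformers import TFWhisperForConditionalGeneration, WhisperProcessor

# Assumed Hub repo id; the processor is taken from the base model because the
# card does not state that a processor was uploaded alongside the weights.
model_id = "bigmorning/whisper_syl_cv12_pad_lob100__0010"
processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

# Placeholder: one second of silence at 16 kHz; replace with real audio.
audio = tf.zeros(16000).numpy()
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")

# Generate token ids, then decode them to text.
generated_ids = model.generate(inputs.input_features)
transcription = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(transcription)
```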

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch reconstructing the optimizer from this config follows the list):

- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
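As a reference for how the serialized optimizer config above maps back onto code, here is a minimal sketch using the `AdamWeightDecay` optimizer that ships with Transformers for TensorFlow. It assumes a flat 2e-05 learning rate and default weight-decay exclusions, since the card records only the config shown above.

```python
from transformers import AdamWeightDecay

# Rebuild the optimizer from the recorded config (a sketch; any
# exclude_from_weight_decay settings used in training are not recorded).
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)

# The model would then be compiled in float32, per `training_precision`:
# model.compile(optimizer=optimizer)
```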

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.0233     | 0.0115         | 1.6383       | 3.8616          | 0.0117              | 0.9516            | 0     |
| 4.4412     | 0.0127         | 0.8560       | 3.5410          | 0.0125              | 0.8971            | 1     |
| 4.0719     | 0.0138         | 0.8366       | 3.2944          | 0.0132              | 0.8706            | 2     |
| 3.8091     | 0.0146         | 0.8133       | 3.1691          | 0.0134              | 0.8487            | 3     |
| 3.6239     | 0.0152         | 0.7866       | 3.0647          | 0.0136              | 0.8282            | 4     |
| 3.4749     | 0.0156         | 0.7589       | 2.9835          | 0.0139              | 0.8049            | 5     |
| 3.3444     | 0.0161         | 0.7359       | 2.9351          | 0.0140              | 0.7979            | 6     |
| 3.2215     | 0.0165         | 0.7138       | 2.8468          | 0.0145              | 0.7589            | 7     |
| 3.0754     | 0.0172         | 0.6873       | 2.7530          | 0.0148              | 0.7413            | 8     |
| 2.8713     | 0.0181         | 0.6484       | 2.5226          | 0.0157              | 0.7017            | 9     |

### Framework versions

- Transformers 4.33.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3
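A quick way to confirm that a local environment roughly matches the versions listed above (this runtime check is an addition to the card, not part of the original training setup):

```python
import tensorflow as tf
import tokenizers
import transformers

# Compare against the versions recorded in this card.
print("transformers:", transformers.__version__)  # card: 4.33.0.dev0
print("tensorflow: ", tf.__version__)             # card: 2.13.0
print("tokenizers: ", tokenizers.__version__)     # card: 0.13.3
```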