Medical Whisper
Fine-tuned Whisper Large v3 on Doctor/Patient consultations.
This model is a fine-tuned version of openai/whisper-large-v3 on the primock_data dataset.
It was adapted for medical transcription through exhaustive transfer learning on Doctor/Patient consultations, using the Na0s/Medical_Augmented_data dataset.
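As a sketch of how such a checkpoint could be loaded for inference with the `transformers` library (the repository id and audio file name below are placeholders, not confirmed by this card):

```python
import torch
from transformers import pipeline

# Placeholder repository id: substitute the fine-tuned checkpoint
# published with this card.
MODEL_ID = "openai/whisper-large-v3"

asr = pipeline(
    "automatic-speech-recognition",
    model=MODEL_ID,
    torch_dtype=torch.float16 if torch.cuda.is_available() else torch.float32,
    device=0 if torch.cuda.is_available() else -1,
)

# Transcribe a recorded consultation; long audio is processed in 30 s chunks.
result = asr("consultation.wav", chunk_length_s=30)
print(result["text"])
```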
| Model Name | WER | CER | Number of Parameters |
|---|---|---|---|
| Whisper Tiny | 0.46 | 0.27 | 39M |
| Whisper Base | 0.42 | 0.26 | 74M |
| Whisper Small | 0.39 | 0.26 | 244M |
| Whisper Medium | 0.37 | 0.23 | 769M |
| Whisper Large v3 | 0.33 | 0.18 | 1.55B |
| Whisper Medical | 0.19 | 0.10 | 1.55B |
Table: Performance of the foundation Whisper models vs. Whisper Medical on the validation set.
| Model Name | WER | CER | Number of Parameters |
|---|---|---|---|
| Whisper Medical | 0.24 | 0.13 | 1.55B |
Table: Performance of Whisper Medical on the test set.
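The WER and CER figures reported above are standard Levenshtein (edit) distances computed over words and characters respectively, normalized by reference length. A minimal self-contained sketch (in practice a library such as `jiwer` would typically be used):

```python
def edit_distance(ref, hyp):
    # Dynamic-programming Levenshtein distance over two token sequences.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[len(ref)][len(hyp)]

def wer(reference, hypothesis):
    # Word error rate: word-level edit distance / number of reference words.
    ref = reference.split()
    return edit_distance(ref, hypothesis.split()) / len(ref)

def cer(reference, hypothesis):
    # Character error rate: character-level edit distance / reference length.
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

For example, `wer("the patient has a fever", "the patient has fever")` is 0.2: one deleted word out of five reference words.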
Base model: openai/whisper-large-v3