---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xlsr-53-ft-btb-ccv-cy
  results: []
---

# wav2vec2-xlsr-53-ft-btb-ccv-cy

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5324
- Wer: 0.4014

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 600
- training_steps: 10000
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Wer    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 4.7051        | 0.0321 | 500   | 1.7504          | 0.9570 |
| 1.0409        | 0.0641 | 1000  | 1.1511          | 0.7761 |
| 0.8183        | 0.0962 | 1500  | 1.0506          | 0.7097 |
| 0.7091        | 0.1283 | 2000  | 0.9421          | 0.6610 |
| 0.6547        | 0.1603 | 2500  | 0.8726          | 0.6128 |
| 0.6088        | 0.1924 | 3000  | 0.8246          | 0.5990 |
| 0.5781        | 0.2244 | 3500  | 0.8025          | 0.5747 |
| 0.5429        | 0.2565 | 4000  | 0.7360          | 0.5305 |
| 0.5104        | 0.2886 | 4500  | 0.7335          | 0.5394 |
| 0.501         | 0.3206 | 5000  | 0.6933          | 0.5088 |
| 0.4708        | 0.3527 | 5500  | 0.6770          | 0.5113 |
| 0.4526        | 0.3848 | 6000  | 0.6609          | 0.4806 |
| 0.4235        | 0.4168 | 6500  | 0.6373          | 0.4858 |
| 0.4032        | 0.4489 | 7000  | 0.6048          | 0.4466 |
| 0.3863        | 0.4810 | 7500  | 0.5946          | 0.4432 |
| 0.3766        | 0.5130 | 8000  | 0.5737          | 0.4298 |
| 0.3746        | 0.5451 | 8500  | 0.5668          | 0.4248 |
| 0.3586        | 0.5771 | 9000  | 0.5485          | 0.4101 |
| 0.3552        | 0.6092 | 9500  | 0.5378          | 0.4032 |
| 0.3326        | 0.6413 | 10000 | 0.5324          | 0.4014 |

### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
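
### Training configuration (sketch)

The hyperparameters listed above map directly onto a Transformers `TrainingArguments` configuration. The snippet below is a minimal sketch of that mapping only, not the original training script: the `output_dir` is a placeholder, the 500-step evaluation/logging interval is inferred from the results table, and the model, processor, data collator and `Trainer` wiring are omitted.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters above expressed as TrainingArguments
# (Transformers 4.44.0). output_dir is a placeholder; the model, processor,
# data collator and Trainer setup are not shown.
training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-53-ft-btb-ccv-cy",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=600,
    max_steps=10_000,
    fp16=True,               # "Native AMP" mixed-precision training
    eval_strategy="steps",   # 500-step interval inferred from the results table
    eval_steps=500,
    logging_steps=500,
)
```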
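
## Example usage (sketch)

If the checkpoint is published on the Hugging Face Hub with the usual processor and tokenizer files produced by CTC fine-tuning, it can be used for transcription with the standard wav2vec 2.0 CTC pipeline. This is a minimal sketch, not a tested recipe: the Hub id below is a placeholder, and the input must be 16 kHz mono audio.

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder Hub id: replace "<namespace>" with the account that hosts this model.
model_id = "<namespace>/wav2vec2-xlsr-53-ft-btb-ccv-cy"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Stand-in input: one second of silence at 16 kHz. Replace with real
# 16 kHz mono speech, e.g. loaded with torchaudio or librosa.
speech = np.zeros(16_000, dtype=np.float32)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame, then
# collapse repeats and remove blanks via the tokenizer.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```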
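
### Scoring with WER (sketch)

The Wer figures reported above are word error rates: the word-level edit distance between the model's transcriptions and the reference transcriptions, divided by the number of reference words. The same metric can be computed as sketched below, assuming the `evaluate` library; the two strings are hypothetical stand-ins, not outputs of this model.

```python
import evaluate

# Word error rate over a list of (prediction, reference) pairs.
wer_metric = evaluate.load("wer")
wer = wer_metric.compute(
    predictions=["a hypothesis transcription"],  # hypothetical model output
    references=["a reference transcription"],    # hypothetical ground truth
)
print(f"WER: {wer:.4f}")
```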