---
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xlsr-53-ft-btb-ccv-cy
  results: []
---

# wav2vec2-xlsr-53-ft-btb-ccv-cy

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6889
- Wer: 0.9480

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 600
- training_steps: 6000
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| No log        | 0.0672 | 200  | 3.0784          | 1.0    |
| No log        | 0.1344 | 400  | 2.9962          | 1.0    |
| 4.6966        | 0.2016 | 600  | 3.0380          | 1.0    |
| 4.6966        | 0.2688 | 800  | 2.9616          | 1.0    |
| 3.0067        | 0.3360 | 1000 | 2.9147          | 1.0    |
| 3.0067        | 0.4032 | 1200 | 2.9158          | 1.0    |
| 3.0067        | 0.4704 | 1400 | 2.9126          | 1.0    |
| 2.9095        | 0.5376 | 1600 | 2.8195          | 1.0    |
| 2.9095        | 0.6048 | 1800 | 2.3190          | 0.9804 |
| 2.3631        | 0.6720 | 2000 | 1.5925          | 0.9349 |
| 2.3631        | 0.7392 | 2200 | 1.3529          | 0.8784 |
| 2.3631        | 0.8065 | 2400 | 1.2185          | 0.8690 |
| 1.3164        | 0.8737 | 2600 | 1.1746          | 0.8566 |
| 1.3164        | 0.9409 | 2800 | 1.2047          | 0.8444 |
| 1.2708        | 1.0081 | 3000 | 1.3975          | 0.8781 |
| 1.2708        | 1.0753 | 3200 | 1.5273          | 0.9086 |
| 1.2708        | 1.1425 | 3400 | 1.5937          | 0.9166 |
| 1.5876        | 1.2097 | 3600 | 1.4998          | 0.9331 |
| 1.5876        | 1.2769 | 3800 | 1.6366          | 0.9646 |
| 1.6623        | 1.3441 | 4000 | 1.6667          | 0.9701 |
| 1.6623        | 1.4113 | 4200 | 1.5727          | 0.9483 |
| 1.6623        | 1.4785 | 4400 | 1.6119          | 0.9611 |
| 1.6759        | 1.5457 | 4600 | 1.5941          | 0.9337 |
| 1.6759        | 1.6129 | 4800 | 1.4534          | 0.9059 |
| 1.5779        | 1.6801 | 5000 | 2.0221          | 0.9572 |
| 1.5779        | 1.7473 | 5200 | 1.7697          | 0.9399 |
| 1.5779        | 1.8145 | 5400 | 1.6657          | 0.9377 |
| 1.775         | 1.8817 | 5600 | 1.7365          | 0.9714 |
| 1.775         | 1.9489 | 5800 | 1.6953          | 0.9580 |
| 1.7507        | 2.0161 | 6000 | 1.6889          | 0.9480 |

### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
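
## Example usage (sketch)

A minimal sketch of how this checkpoint could be loaded for CTC inference with `transformers`. The repo id / path, the 16 kHz mono input, and the placeholder audio below are assumptions for illustration, not details confirmed by this card.

```python
# Minimal inference sketch. Assumptions: the checkpoint is reachable at the
# repo id / local path below, expects 16 kHz mono audio, and uses the
# standard Wav2Vec2 CTC head and processor saved alongside the model.
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-xlsr-53-ft-btb-ccv-cy"  # hypothetical repo id or local path

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Placeholder input: one second of silence at 16 kHz; replace with real audio.
audio = np.zeros(16_000, dtype=np.float32)

inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```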