---
license: apache-2.0
base_model: smutuvi/wav2vec2-large-xlsr-sw
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-large-xlsr-sw_ndizi_782_without_NF4
  results: []
---

# wav2vec2-large-xlsr-sw_ndizi_782_without_NF4

This model is a fine-tuned version of [smutuvi/wav2vec2-large-xlsr-sw](https://huggingface.co/smutuvi/wav2vec2-large-xlsr-sw) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0702
- Wer: 0.4635

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.3466        | 5.97  | 400  | 1.0027          | 0.4830 |
| 0.7355        | 11.94 | 800  | 1.1488          | 0.4873 |
| 0.4853        | 17.91 | 1200 | 1.4337          | 0.4678 |
| 0.3254        | 23.88 | 1600 | 1.5478          | 0.4895 |
| 0.2251        | 29.85 | 2000 | 1.7495          | 0.4736 |
| 0.1674        | 35.82 | 2400 | 1.8199          | 0.4751 |
| 0.1258        | 41.79 | 2800 | 2.0479          | 0.4606 |
| 0.1096        | 47.76 | 3200 | 2.0702          | 0.4635 |

### Framework versions

- Transformers 4.37.1
- Pytorch 2.2.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
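
For reference, a minimal sketch of how the hyperparameters listed above map onto `transformers.TrainingArguments`. The output directory and the evaluation/save interval of 400 steps are assumptions inferred from the results table; dataset loading, the data collator, and the WER `compute_metrics` function are omitted.

```python
from transformers import TrainingArguments

# Sketch only: values documented in this card, plus assumed output_dir and
# step intervals. Adam betas (0.9, 0.999) and epsilon 1e-08 are the defaults.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-sw_ndizi_782_without_NF4",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # total train batch size of 8
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=50,
    evaluation_strategy="steps",     # assumption: eval every 400 steps, as in the table
    eval_steps=400,
    save_steps=400,
)
```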
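
Since the usage sections above are still placeholders, the following is a minimal, hedged sketch of CTC transcription with this checkpoint. The repository id and the availability of a bundled `Wav2Vec2Processor` are assumptions based on the base model; the audio array here is a silent placeholder.

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical repository id; adjust to the actual published checkpoint.
model_id = "smutuvi/wav2vec2-large-xlsr-sw_ndizi_782_without_NF4"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Placeholder: one second of silence at 16 kHz; replace with a real mono waveform.
audio = np.zeros(16_000, dtype=np.float32)

inputs = processor(audio, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding to text.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```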