---
library_name: transformers
license: cc-by-nc-4.0
base_model: facebook/mms-1b-all
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: mms_kik
  results: []
---

# mms_kik

This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: inf
- Wer: 0.1756

The infinite loss values reported here and below most likely reflect numerical overflow in the CTC loss under mixed-precision (Native AMP) training; the WER column is the meaningful evaluation signal.

## Model description

More information needed

## Intended uses & limitations

More information needed. A minimal inference sketch is provided under "How to use" at the end of this card.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
- mixed_precision_training: Native AMP

A hedged `TrainingArguments` sketch of this configuration is given after the framework versions below.

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 4.4384        | 0.1576 | 100  | inf             | 0.4287 |
| 0.5264        | 0.3152 | 200  | inf             | 0.3938 |
| 0.4716        | 0.4728 | 300  | inf             | 0.3655 |
| 0.4084        | 0.6304 | 400  | inf             | 0.3319 |
| 0.3953        | 0.7880 | 500  | inf             | 0.3340 |
| 0.3605        | 0.9456 | 600  | inf             | 0.3109 |
| 0.3601        | 1.1032 | 700  | inf             | 0.2919 |
| 0.3368        | 1.2608 | 800  | inf             | 0.2746 |
| 0.3102        | 1.4184 | 900  | inf             | 0.2691 |
| 0.3209        | 1.5760 | 1000 | inf             | 0.2602 |
| 0.2975        | 1.7336 | 1100 | inf             | 0.2488 |
| 0.2741        | 1.8913 | 1200 | inf             | 0.2356 |
| 0.271         | 2.0489 | 1300 | inf             | 0.2297 |
| 0.2494        | 2.2065 | 1400 | inf             | 0.2233 |
| 0.254         | 2.3641 | 1500 | inf             | 0.2110 |
| 0.2484        | 2.5217 | 1600 | inf             | 0.2117 |
| 0.2416        | 2.6793 | 1700 | inf             | 0.2020 |
| 0.2366        | 2.8369 | 1800 | inf             | 0.1985 |
| 0.2313        | 2.9945 | 1900 | inf             | 0.1959 |
| 0.2228        | 3.1521 | 2000 | inf             | 0.1897 |
| 0.2138        | 3.3097 | 2100 | inf             | 0.1868 |
| 0.2116        | 3.4673 | 2200 | inf             | 0.1822 |
| 0.223         | 3.6249 | 2300 | inf             | 0.1788 |
| 0.2144        | 3.7825 | 2400 | inf             | 0.1774 |
| 0.2131        | 3.9401 | 2500 | inf             | 0.1756 |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1
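### Reproducing the configuration (sketch)

The hyperparameters listed above map onto `transformers.TrainingArguments` roughly as sketched here. This is a hedged reconstruction, not the training script actually used: `output_dir` and every option not listed on this card are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms_kik",            # assumption: any writable directory works
    learning_rate=1e-3,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=8,   # 4 * 8 = total train batch size of 32
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=4,
    fp16=True,                       # "Native AMP" mixed-precision training
    # The default optimizer settings (betas=(0.9, 0.999), epsilon=1e-08)
    # match the values listed on this card.
)
```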
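## How to use

The card does not ship inference code; the following is a minimal sketch assuming this checkpoint loads like its base model, `facebook/mms-1b-all` (a `Wav2Vec2ForCTC` model expecting 16 kHz mono audio). The repository id `mms_kik` and the file `sample.wav` are placeholders.

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "mms_kik"  # placeholder: replace with the actual Hub repo id or local path

processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# MMS checkpoints expect 16 kHz mono input
speech, _ = librosa.load("sample.wav", sr=16_000)  # placeholder audio file

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```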