---
library_name: transformers
license: cc-by-nc-4.0
base_model: mms-meta/mms-zeroshot-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: mms-zeroshot-300m-genbed-f-model
  results: []
---

# mms-zeroshot-300m-genbed-f-model

This model is a fine-tuned version of [mms-meta/mms-zeroshot-300m](https://huggingface.co/mms-meta/mms-zeroshot-300m) on an unknown dataset.
It achieves the following results on the evaluation set:

- Loss: 0.2131
- Wer: 0.3720
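
A minimal inference sketch, assuming the repository ID is `csikasote/mms-zeroshot-300m-genbed-f-model` and that the checkpoint loads like other wav2vec2-style CTC models (the audio file name below is a placeholder; verify against the actual repository):

```python
# Hedged sketch: the repository ID, audio file, and processor class are
# assumptions based on the model name and the MMS zero-shot (wav2vec2 CTC)
# architecture -- they are not confirmed by this card.
import librosa
import torch
from transformers import AutoModelForCTC, AutoProcessor

model_id = "csikasote/mms-zeroshot-300m-genbed-f-model"  # assumed repo ID
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-style models expect 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16_000)  # placeholder file

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```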

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):

- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 30.0
- mixed_precision_training: Native AMP
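
A minimal sketch of how these settings map onto `transformers.TrainingArguments`; the `output_dir` is an assumption, and the actual training script is not included in this card:

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters; only the values below
# are taken from the card, and "output_dir" is assumed.
training_args = TrainingArguments(
    output_dir="mms-zeroshot-300m-genbed-f-model",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```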

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| No log        | 0.5479  | 200  | 2.3236          | 1.0    |
| No log        | 1.0959  | 400  | 0.3331          | 0.5504 |
| 2.6731        | 1.6438  | 600  | 0.2969          | 0.5190 |
| 2.6731        | 2.1918  | 800  | 0.2806          | 0.5122 |
| 0.4193        | 2.7397  | 1000 | 0.2701          | 0.4742 |
| 0.4193        | 3.2877  | 1200 | 0.2703          | 0.4770 |
| 0.4193        | 3.8356  | 1400 | 0.2574          | 0.4758 |
| 0.367         | 4.3836  | 1600 | 0.2487          | 0.4547 |
| 0.367         | 4.9315  | 1800 | 0.2472          | 0.4337 |
| 0.3377        | 5.4795  | 2000 | 0.2424          | 0.4467 |
| 0.3377        | 6.0274  | 2200 | 0.2372          | 0.4274 |
| 0.3377        | 6.5753  | 2400 | 0.2366          | 0.4225 |
| 0.3282        | 7.1233  | 2600 | 0.2339          | 0.4104 |
| 0.3282        | 7.6712  | 2800 | 0.2352          | 0.4193 |
| 0.3018        | 8.2192  | 3000 | 0.2249          | 0.4097 |
| 0.3018        | 8.7671  | 3200 | 0.2254          | 0.4065 |
| 0.3018        | 9.3151  | 3400 | 0.2251          | 0.4021 |
| 0.2945        | 9.8630  | 3600 | 0.2248          | 0.3969 |
| 0.2945        | 10.4110 | 3800 | 0.2212          | 0.4002 |
| 0.2843        | 10.9589 | 4000 | 0.2200          | 0.3920 |
| 0.2843        | 11.5068 | 4200 | 0.2183          | 0.3853 |
| 0.2843        | 12.0548 | 4400 | 0.2174          | 0.3890 |
| 0.2755        | 12.6027 | 4600 | 0.2163          | 0.3955 |
| 0.2755        | 13.1507 | 4800 | 0.2197          | 0.3894 |
| 0.2699        | 13.6986 | 5000 | 0.2163          | 0.3899 |
| 0.2699        | 14.2466 | 5200 | 0.2129          | 0.3769 |
| 0.2699        | 14.7945 | 5400 | 0.2114          | 0.3759 |
| 0.2568        | 15.3425 | 5600 | 0.2100          | 0.3721 |
| 0.2568        | 15.8904 | 5800 | 0.2140          | 0.3670 |
| 0.2521        | 16.4384 | 6000 | 0.2149          | 0.3743 |
| 0.2521        | 16.9863 | 6200 | 0.2131          | 0.3720 |
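
The Wer column above is the word error rate on the validation set. As a hedged illustration (the actual evaluation code is not part of this card), such values are typically computed with the `evaluate` library's `wer` metric:

```python
import evaluate

# Hedged sketch: the transcripts below are hypothetical placeholders; this
# only shows how a WER value like 0.3720 is commonly computed.
wer_metric = evaluate.load("wer")

predictions = ["this is a model output"]     # hypothetical decoded transcript
references = ["this is the reference text"]  # hypothetical ground truth

print(wer_metric.compute(predictions=predictions, references=references))
```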

### Framework versions

- Transformers 4.46.0.dev0
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0