---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xlsr-53-ft-btb-ccv-cy
  results: []
---

# wav2vec2-xlsr-53-ft-btb-ccv-cy

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.5418
- Wer: 0.4178
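
For illustration, below is a minimal inference sketch using the Transformers `pipeline` API. The Hub repo id `DewiBrynJones/wav2vec2-xlsr-53-ft-btb-ccv-cy` and the audio file path are assumptions, not confirmed by this card.

```python
# Minimal inference sketch (repo id and audio path are assumptions, not confirmed by this card).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="DewiBrynJones/wav2vec2-xlsr-53-ft-btb-ccv-cy",  # assumed Hub repo id
)

# wav2vec2 XLSR models expect 16 kHz mono audio; the pipeline decodes and resamples
# file inputs via ffmpeg before running the model.
print(asr("speech_sample.wav")["text"])  # placeholder path
```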

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- training_steps: 3000
- mixed_precision_training: Native AMP
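
For reference, here is a sketch of how the settings above map onto `transformers.TrainingArguments`. This is an approximation of the listed hyperparameters, not the author's actual training script; the output directory is a placeholder.

```python
from transformers import TrainingArguments

# Approximate mapping of the listed hyperparameters; the Adam betas/epsilon match the
# library defaults, and fp16=True corresponds to "Native AMP" mixed precision.
training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-53-ft-btb-ccv-cy",  # placeholder output path
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=300,
    max_steps=3000,
    fp16=True,
)
```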

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| No log        | 0.0672 | 200  | 3.1445          | 1.0    |
| No log        | 0.1344 | 400  | 2.7407          | 1.0000 |
| 4.0188        | 0.2016 | 600  | 1.2700          | 0.8484 |
| 4.0188        | 0.2688 | 800  | 0.9953          | 0.7435 |
| 1.0707        | 0.3360 | 1000 | 0.8647          | 0.6541 |
| 1.0707        | 0.4032 | 1200 | 0.7889          | 0.5784 |
| 1.0707        | 0.4704 | 1400 | 0.7465          | 0.5440 |
| 0.8175        | 0.5376 | 1600 | 0.6828          | 0.5042 |
| 0.8175        | 0.6048 | 1800 | 0.6549          | 0.4952 |
| 0.7148        | 0.6720 | 2000 | 0.6290          | 0.4906 |
| 0.7148        | 0.7392 | 2200 | 0.6113          | 0.4576 |
| 0.7148        | 0.8065 | 2400 | 0.5719          | 0.4405 |
| 0.6374        | 0.8737 | 2600 | 0.5644          | 0.4314 |
| 0.6374        | 0.9409 | 2800 | 0.5483          | 0.4190 |
| 0.6013        | 1.0081 | 3000 | 0.5418          | 0.4178 |
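
The Wer column above is the word error rate on the evaluation set. A minimal sketch of computing WER with the `evaluate` library is shown below; the reference and prediction strings are placeholders, not data from this card.

```python
import evaluate

# Minimal WER computation sketch; the strings below are placeholders only.
wer_metric = evaluate.load("wer")
score = wer_metric.compute(
    references=["mae hi'n braf heddiw"],
    predictions=["mae hin braf heddiw"],
)
print(f"WER: {score:.4f}")
```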

### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1