---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
  - automatic-speech-recognition
  - DewiBrynJones/banc-trawsgrifiadau-bangor-clean-with-ccv
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-xlsr-53-ft-btb-ccv-cy
    results: []
---

# wav2vec2-xlsr-53-ft-btb-ccv-cy

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the DEWIBRYNJONES/BANC-TRAWSGRIFIADAU-BANGOR-CLEAN-WITH-CCV - DEFAULT dataset. It achieves the following results on the evaluation set:

- Loss: inf
- Wer: 0.3289
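The reported Wer is the word error rate: the word-level edit distance between reference and hypothesized transcripts, divided by the number of reference words. A minimal, self-contained sketch of the metric (the training script itself would typically use the `jiwer` or `evaluate` implementation):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One deleted word out of five reference words.
print(wer("mae hi yn braf heddiw", "mae yn braf heddiw"))  # → 0.2
```

A Wer of 0.3289 therefore means roughly one word-level error per three reference words on the evaluation set.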

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 6000
- mixed_precision_training: Native AMP
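The linear scheduler with 500 warmup steps ramps the learning rate from 0 up to 0.0003 over the first 500 optimizer steps, then decays it linearly back to 0 at step 6000. A small sketch of that schedule, mirroring the hyperparameters above (Transformers' `get_linear_schedule_with_warmup` produces the same shape):

```python
BASE_LR = 3e-4        # learning_rate
WARMUP_STEPS = 500    # lr_scheduler_warmup_steps
TOTAL_STEPS = 6000    # training_steps

def linear_warmup_lr(step: int) -> float:
    """Learning rate at a given optimizer step: linear warmup, then linear decay."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS
    # Decay from BASE_LR at the end of warmup down to 0 at TOTAL_STEPS.
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))

print(linear_warmup_lr(250))   # halfway through warmup: 1.5e-4
print(linear_warmup_lr(500))   # peak learning rate: 3e-4
print(linear_warmup_lr(6000))  # end of training: 0.0
```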

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| No log        | 0.0772 | 200  | inf             | 1.0    |
| No log        | 0.1544 | 400  | inf             | 0.9938 |
| 3.9317        | 0.2317 | 600  | inf             | 0.7576 |
| 3.9317        | 0.3089 | 800  | inf             | 0.6942 |
| 0.9699        | 0.3861 | 1000 | inf             | 0.5763 |
| 0.9699        | 0.4633 | 1200 | inf             | 0.5519 |
| 0.9699        | 0.5405 | 1400 | inf             | 0.5174 |
| 0.8031        | 0.6178 | 1600 | inf             | 0.5338 |
| 0.8031        | 0.6950 | 1800 | inf             | 0.4777 |
| 0.7169        | 0.7722 | 2000 | inf             | 0.4504 |
| 0.7169        | 0.8494 | 2200 | inf             | 0.4500 |
| 0.7169        | 0.9266 | 2400 | inf             | 0.4432 |
| 0.6687        | 1.0039 | 2600 | inf             | 0.4176 |
| 0.6687        | 1.0811 | 2800 | inf             | 0.4054 |
| 0.5609        | 1.1583 | 3000 | inf             | 0.4009 |
| 0.5609        | 1.2355 | 3200 | inf             | 0.4023 |
| 0.5609        | 1.3127 | 3400 | inf             | 0.3919 |
| 0.5324        | 1.3900 | 3600 | inf             | 0.3795 |
| 0.5324        | 1.4672 | 3800 | inf             | 0.3752 |
| 0.5196        | 1.5444 | 4000 | inf             | 0.3662 |
| 0.5196        | 1.6216 | 4200 | inf             | 0.3703 |
| 0.5196        | 1.6988 | 4400 | inf             | 0.3614 |
| 0.4967        | 1.7761 | 4600 | inf             | 0.3530 |
| 0.4967        | 1.8533 | 4800 | inf             | 0.3481 |
| 0.4735        | 1.9305 | 5000 | inf             | 0.3506 |
| 0.4735        | 2.0077 | 5200 | inf             | 0.3432 |
| 0.4735        | 2.0849 | 5400 | inf             | 0.3369 |
| 0.4244        | 2.1622 | 5600 | inf             | 0.3296 |
| 0.4244        | 2.2394 | 5800 | inf             | 0.3295 |
| 0.3674        | 2.3166 | 6000 | inf             | 0.3289 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1