---
library_name: transformers
license: mit
base_model: microsoft/speecht5_tts
tags:
  - generated_from_trainer
model-index:
  - name: speecht5_tts_wolof
    results: []
---

# speecht5_tts_wolof

This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.2993

## Model description

More information needed

## Intended uses & limitations

More information needed
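
The model targets Wolof text-to-speech and can be driven with the standard SpeechT5 inference flow in 🤗 Transformers. The snippet below is a minimal sketch rather than an official usage recipe: the repository id `Hawoly18/speecht5_tts_wolof`, the example sentence, and the zero speaker embedding are assumptions; in practice a real 512-dimensional x-vector for a target speaker should be supplied.

```python
import torch
import soundfile as sf
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan

# Assumed repository id for this checkpoint; adjust if the model lives elsewhere.
model_id = "Hawoly18/speecht5_tts_wolof"

processor = SpeechT5Processor.from_pretrained(model_id)
model = SpeechT5ForTextToSpeech.from_pretrained(model_id)
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

# Example Wolof input text.
inputs = processor(text="Salaam aleekum, na nga def?", return_tensors="pt")

# SpeechT5 expects a 512-dim speaker x-vector; a zero vector is only a placeholder.
speaker_embeddings = torch.zeros((1, 512))

speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech.wav", speech.numpy(), samplerate=16000)
```

SpeechT5 synthesizes audio at 16 kHz; for natural-sounding output, replace the placeholder embedding with an x-vector extracted from a reference recording of the target speaker.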

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a rough training-arguments sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
- mixed_precision_training: Native AMP
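
For orientation, these settings map roughly onto the following 🤗 Transformers `Seq2SeqTrainingArguments`. This is a minimal sketch, not the original training script; values not listed above (output directory, evaluation schedule) are assumptions, with the evaluation interval inferred from the results table below.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="speecht5_tts_wolof",   # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,     # effective train batch size: 16
    warmup_steps=500,
    max_steps=4000,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                         # "Native AMP" mixed precision
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",             # assumption, consistent with step-based evaluation
    eval_steps=50,                     # inferred from the results table (eval every 50 steps)
)
```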

### Training results

| Training Loss | Epoch   | Step | Validation Loss |
|:-------------:|:-------:|:----:|:---------------:|
| 1.1192        | 0.5952  | 50   | 0.4722          |
| 0.9979        | 1.1905  | 100  | 0.4139          |
| 0.8933        | 1.7857  | 150  | 0.3900          |
| 0.8718        | 2.3810  | 200  | 0.3818          |
| 0.8246        | 2.9762  | 250  | 0.3758          |
| 0.8062        | 3.5714  | 300  | 0.3615          |
| 0.7931        | 4.1667  | 350  | 0.3546          |
| 0.756         | 4.7619  | 400  | 0.3469          |
| 0.7462        | 5.3571  | 450  | 0.3393          |
| 0.7311        | 5.9524  | 500  | 0.3358          |
| 0.7298        | 6.5476  | 550  | 0.3315          |
| 0.7234        | 7.1429  | 600  | 0.3300          |
| 0.7199        | 7.7381  | 650  | 0.3287          |
| 0.697         | 8.3333  | 700  | 0.3250          |
| 0.7006        | 8.9286  | 750  | 0.3231          |
| 0.7081        | 9.5238  | 800  | 0.3218          |
| 0.6998        | 10.1190 | 850  | 0.3196          |
| 0.7074        | 10.7143 | 900  | 0.3202          |
| 0.6831        | 11.3095 | 950  | 0.3161          |
| 0.6899        | 11.9048 | 1000 | 0.3169          |
| 0.6935        | 12.5    | 1050 | 0.3160          |
| 0.6778        | 13.0952 | 1100 | 0.3145          |
| 0.6701        | 13.6905 | 1150 | 0.3122          |
| 0.6792        | 14.2857 | 1200 | 0.3121          |
| 0.6668        | 14.8810 | 1250 | 0.3117          |
| 0.6682        | 15.4762 | 1300 | 0.3120          |
| 0.6742        | 16.0714 | 1350 | 0.3103          |
| 0.6759        | 16.6667 | 1400 | 0.3103          |
| 0.6776        | 17.2619 | 1450 | 0.3100          |
| 0.6699        | 17.8571 | 1500 | 0.3099          |
| 0.6744        | 18.4524 | 1550 | 0.3092          |
| 0.6636        | 19.0476 | 1600 | 0.3083          |
| 0.6552        | 19.6429 | 1650 | 0.3067          |
| 0.6618        | 20.2381 | 1700 | 0.3074          |
| 0.6482        | 20.8333 | 1750 | 0.3059          |
| 0.6684        | 21.4286 | 1800 | 0.3063          |
| 0.6726        | 22.0238 | 1850 | 0.3060          |
| 0.648         | 22.6190 | 1900 | 0.3053          |
| 0.6542        | 23.2143 | 1950 | 0.3043          |
| 0.6516        | 23.8095 | 2000 | 0.3050          |
| 0.6654        | 24.4048 | 2050 | 0.3059          |
| 0.6556        | 25.0    | 2100 | 0.3050          |
| 0.6493        | 25.5952 | 2150 | 0.3051          |
| 0.6504        | 26.1905 | 2200 | 0.3033          |
| 0.6463        | 26.7857 | 2250 | 0.3033          |
| 0.655         | 27.3810 | 2300 | 0.3028          |
| 0.6474        | 27.9762 | 2350 | 0.3030          |
| 0.6434        | 28.5714 | 2400 | 0.3022          |
| 0.6427        | 29.1667 | 2450 | 0.3027          |
| 0.6611        | 29.7619 | 2500 | 0.3030          |
| 0.6536        | 30.3571 | 2550 | 0.3026          |
| 0.6478        | 30.9524 | 2600 | 0.3011          |
| 0.6471        | 31.5476 | 2650 | 0.3021          |
| 0.6424        | 32.1429 | 2700 | 0.3014          |
| 0.6424        | 32.7381 | 2750 | 0.3012          |
| 0.645         | 33.3333 | 2800 | 0.3010          |
| 0.6454        | 33.9286 | 2850 | 0.3010          |
| 0.6373        | 34.5238 | 2900 | 0.3006          |
| 0.6409        | 35.1190 | 2950 | 0.3005          |
| 0.6382        | 35.7143 | 3000 | 0.3007          |
| 0.6377        | 36.3095 | 3050 | 0.3005          |
| 0.643         | 36.9048 | 3100 | 0.3007          |
| 0.6383        | 37.5    | 3150 | 0.2999          |
| 0.6396        | 38.0952 | 3200 | 0.2998          |
| 0.6413        | 38.6905 | 3250 | 0.3006          |
| 0.6368        | 39.2857 | 3300 | 0.2998          |
| 0.6452        | 39.8810 | 3350 | 0.3006          |
| 0.6425        | 40.4762 | 3400 | 0.3000          |
| 0.6406        | 41.0714 | 3450 | 0.3001          |
| 0.657         | 41.6667 | 3500 | 0.2996          |
| 0.6353        | 42.2619 | 3550 | 0.2998          |
| 0.6369        | 42.8571 | 3600 | 0.2999          |
| 0.6314        | 43.4524 | 3650 | 0.2997          |
| 0.634         | 44.0476 | 3700 | 0.2992          |
| 0.6506        | 44.6429 | 3750 | 0.3010          |
| 0.63          | 45.2381 | 3800 | 0.2993          |
| 0.6395        | 45.8333 | 3850 | 0.2997          |
| 0.6393        | 46.4286 | 3900 | 0.2983          |
| 0.6344        | 47.0238 | 3950 | 0.2998          |
| 0.6432        | 47.6190 | 4000 | 0.2993          |

### Framework versions

- Transformers 4.47.0.dev0
- PyTorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0