
The best version of our STT model, built by members of the Oyqiz team!

Foziljon To'lqinov, Shaxboz Zohidov, Abduraxim Jabborov, Yahyoxon Rahimov, Mahmud Jumanazarov

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m, trained for 100 epochs on the MOZILLA-FOUNDATION/COMMON_VOICE_10_0 - UZ dataset. It achieves the following results:

  • Loss: 0.1963
  • Word error rate (WER): 0.2102
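
For convenience, here is a minimal inference sketch using the transformers ASR pipeline. It assumes the repository id oyqiz/uzbek_stt referenced on this card and a hypothetical local audio file; the model expects 16 kHz mono audio. It is not an official example from the authors.

```python
# Minimal inference sketch (not an official example from the authors).
# Assumes the repository id "oyqiz/uzbek_stt" and a local 16 kHz mono WAV file.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="oyqiz/uzbek_stt")

# "sample_uz.wav" is a hypothetical placeholder for your own recording.
result = asr("sample_uz.wav")
print(result["text"])
```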

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.00003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 2
  • total_train_batch_size: 16
  • total_eval_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 100.0
  • mixed_precision_training: Native AMP
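
As a rough illustration, the hyperparameters above map onto transformers.TrainingArguments as in the sketch below. This is not the team's original training script: the output directory is a hypothetical name, and the 2-GPU setup comes from launching the script with torchrun or accelerate rather than from these arguments.

```python
# Hedged sketch: the listed hyperparameters expressed as TrainingArguments.
# Not the original training script; output_dir is a hypothetical name, and
# the 2-GPU run is produced by the launcher (torchrun/accelerate), not here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-uz",  # hypothetical
    learning_rate=3e-5,
    per_device_train_batch_size=8,   # 8 per device x 2 GPUs = 16 total
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_steps=500,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                       # Native AMP mixed precision
)
```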