
The smallest version of the STT model built by members of the Oyqiz team. This is not a strong version: it was trained for only 15 epochs!

Foziljon To'lqinov, Shaxboz Zohidov, Abduraxim Jabborov, Yahyoxon Rahimov

This model was trained for 15 epochs from facebook/wav2vec2-base on the Uzbek (uz) split of the MOZILLA-FOUNDATION/COMMON_VOICE_10_0 dataset. Training results:

  • Loss: 0.5763
  • Word error rate (WER): 0.4502
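The WER reported above is the word-level edit distance between the model's transcript and the reference, divided by the number of reference words. A minimal sketch of how it is computed (this helper is illustrative, not taken from the training code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution
            )
    return d[-1][-1] / len(ref)


# One substituted word out of three gives WER = 1/3
print(wer("salom dunyo qalaysan", "salom dunyo qaley"))
```

A WER of 0.4502 therefore means roughly 45 words out of every 100 reference words are substituted, inserted, or deleted.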

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 2
  • total_train_batch_size: 8
  • total_eval_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 15.0
  • mixed_precision_training: Native AMP
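The hyperparameters above can be expressed as a transformers `TrainingArguments` configuration. This is a sketch under the assumption that the standard `Trainer` API was used (the actual training script is not part of this card), with a hypothetical `output_dir`; the per-device batch sizes combine with the 2 GPUs to give the listed totals:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-uz",   # hypothetical output path
    learning_rate=3e-4,
    per_device_train_batch_size=4,   # x 2 GPUs -> total train batch size 8
    per_device_eval_batch_size=8,    # x 2 GPUs -> total eval batch size 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=15.0,
    fp16=True,                       # native AMP mixed-precision training
)
```

Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the `Trainer` default optimizer, so it needs no explicit argument.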

Training results

Training Loss | Epoch | Step  | Validation Loss | WER
0.4736        | 3.4   | 10000 | 0.6247          | 0.6362
0.3392        | 6.8   | 20000 | 0.7254          | 0.5605
0.2085        | 10.19 | 30000 | 0.5465          | 0.5097
0.1387        | 13.59 | 40000 | 0.5984          | 0.4632
