
xlsr-nmcpc-nomi

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 0.3286
  • WER: 0.3266
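
Transcription example: a minimal sketch, assuming this checkpoint ships the standard Wav2Vec2 CTC head with a bundled processor, as is typical for XLSR fine-tunes; the audio file path is a placeholder.

```python
# Hedged sketch: assumes the repository exposes a Wav2Vec2Processor
# alongside the CTC model; "audio.wav" is a placeholder path.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "susmitabhatt/xlsr-nmcpc-nomi"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# XLSR models expect 16 kHz mono audio; resample if needed.
waveform, sr = torchaudio.load("audio.wav")
if sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(
    waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt"
)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the argmax over the vocabulary at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```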

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto TrainingArguments appears after the list):

  • learning_rate: 0.0004
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 132
  • num_epochs: 200
  • mixed_precision_training: Native AMP
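
The listed values map onto transformers.TrainingArguments roughly as below; this is a sketch, with output_dir and the evaluation/save cadence as illustrative assumptions not taken from the original run.

```python
# Hedged sketch: reproduces the listed hyperparameters; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlsr-nmcpc-nomi",    # placeholder, not from the original run
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # 8 * 2 = effective train batch size 16
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=200,
    fp16=True,                       # mixed precision via native AMP
)
```

Note that with per-device batch size 8 and gradient accumulation 2, the effective batch size matches the listed total_train_batch_size of 16.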

Training results

| Training Loss | Epoch    | Step | Validation Loss | WER    |
|:-------------:|:--------:|:----:|:---------------:|:------:|
| 4.8042        | 6.0606   | 200  | 3.0647          | 0.9452 |
| 2.8665        | 12.1212  | 400  | 2.2960          | 0.9838 |
| 1.4496        | 18.1818  | 600  | 0.5882          | 0.6085 |
| 0.4807        | 24.2424  | 800  | 0.4014          | 0.4828 |
| 0.275         | 30.3030  | 1000 | 0.4216          | 0.3996 |
| 0.1757        | 36.3636  | 1200 | 0.2956          | 0.3651 |
| 0.1298        | 42.4242  | 1400 | 0.4517          | 0.3712 |
| 0.1018        | 48.4848  | 1600 | 0.4099          | 0.3529 |
| 0.08          | 54.5455  | 1800 | 0.3337          | 0.3651 |
| 0.0729        | 60.6061  | 2000 | 0.3765          | 0.3671 |
| 0.0604        | 66.6667  | 2200 | 0.3915          | 0.3671 |
| 0.0504        | 72.7273  | 2400 | 0.3723          | 0.3590 |
| 0.0449        | 78.7879  | 2600 | 0.3246          | 0.3489 |
| 0.0392        | 84.8485  | 2800 | 0.3044          | 0.3428 |
| 0.036         | 90.9091  | 3000 | 0.2869          | 0.3286 |
| 0.0424        | 96.9697  | 3200 | 0.3328          | 0.3408 |
| 0.0308        | 103.0303 | 3400 | 0.3950          | 0.3387 |
| 0.0296        | 109.0909 | 3600 | 0.3217          | 0.3306 |
| 0.0209        | 115.1515 | 3800 | 0.3163          | 0.3347 |
| 0.0179        | 121.2121 | 4000 | 0.3692          | 0.3387 |
| 0.0234        | 127.2727 | 4200 | 0.3597          | 0.3327 |
| 0.0141        | 133.3333 | 4400 | 0.3497          | 0.3266 |
| 0.0125        | 139.3939 | 4600 | 0.3291          | 0.3225 |
| 0.0125        | 145.4545 | 4800 | 0.3130          | 0.3185 |
| 0.0116        | 151.5152 | 5000 | 0.3337          | 0.3327 |
| 0.0117        | 157.5758 | 5200 | 0.3424          | 0.3367 |
| 0.0088        | 163.6364 | 5400 | 0.3385          | 0.3347 |
| 0.0087        | 169.6970 | 5600 | 0.3302          | 0.3266 |
| 0.0062        | 175.7576 | 5800 | 0.3093          | 0.3286 |
| 0.0064        | 181.8182 | 6000 | 0.3367          | 0.3286 |
| 0.0053        | 187.8788 | 6200 | 0.3370          | 0.3306 |
| 0.007         | 193.9394 | 6400 | 0.3354          | 0.3266 |
| 0.0058        | 200.0    | 6600 | 0.3286          | 0.3266 |
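
For reference, WER figures like those in the table are conventionally computed with the `evaluate` library; a minimal sketch follows, with illustrative placeholder transcripts (the actual evaluation data is not specified in this card).

```python
# Hedged sketch: word error rate via the standard `evaluate` metric;
# the prediction/reference strings below are illustrative placeholders.
import evaluate

wer_metric = evaluate.load("wer")
wer = wer_metric.compute(
    predictions=["the model predicted this"],
    references=["the model predicted that"],
)
print(f"WER: {wer:.4f}")  # 1 substitution over 4 reference words -> 0.2500
```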

Framework versions

  • Transformers 4.47.0.dev0
  • PyTorch 2.4.0
  • Datasets 3.0.1
  • Tokenizers 0.20.0

Model details

  • Format: Safetensors
  • Model size: 315M params
  • Tensor type: F32
