
wallpad-record

This model is a fine-tuned version of namkyeong/facebook_wav2vec2-xls-r-300m_50h; the fine-tuning dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.3326
  • Cer: 0.0806
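
Since this is a CTC-based wav2vec2 checkpoint, it should be usable through the standard transformers CTC classes. The following is a minimal inference sketch, not a confirmed usage snippet: the hub id `namkyeong/wallpad-record` and the audio file name are assumptions, while the 16 kHz input rate is the standard rate for XLS-R models.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumption: the checkpoint is published under this hub path.
MODEL_ID = "namkyeong/wallpad-record"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load a waveform and resample to the 16 kHz rate XLS-R expects.
waveform, sample_rate = torchaudio.load("sample.wav")  # placeholder file
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```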

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 70
  • mixed_precision_training: Native AMP
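
These settings map directly onto transformers `TrainingArguments`. The sketch below shows the equivalent configuration, assuming a `Trainer`-based run; `output_dir` and the 50-step evaluation/save cadence (inferred from the results table below) are assumptions, and the Adam betas/epsilon listed above are the optimizer defaults.

```python
from transformers import TrainingArguments

# A sketch mapping the hyperparameters above onto TrainingArguments.
training_args = TrainingArguments(
    output_dir="wallpad-record",     # assumption: placeholder path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 8 * 4 = 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=70,
    fp16=True,                       # mixed precision via native AMP
    evaluation_strategy="steps",     # assumption: matches the 50-step eval cadence
    eval_steps=50,
    save_steps=50,
)
```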

Training results

| Training Loss | Epoch | Step | Validation Loss | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.5996        | 0.78  | 50   | 1.3224          | 0.3389 |
| 1.048         | 1.56  | 100  | 0.6909          | 0.1913 |
| 0.6576        | 2.34  | 150  | 0.5882          | 0.1624 |
| 0.4487        | 3.12  | 200  | 0.5037          | 0.1304 |
| 0.3178        | 3.91  | 250  | 0.4947          | 0.1328 |
| 0.2739        | 4.69  | 300  | 0.5637          | 0.1365 |
| 0.279         | 5.47  | 350  | 0.5534          | 0.1353 |
| 0.2661        | 6.25  | 400  | 0.5202          | 0.1242 |
| 0.3306        | 7.03  | 450  | 0.6083          | 0.1691 |
| 0.2876        | 7.81  | 500  | 0.6123          | 0.1667 |
| 0.2746        | 8.59  | 550  | 0.5902          | 0.1556 |
| 0.2399        | 9.38  | 600  | 0.5128          | 0.1421 |
| 0.2257        | 10.16 | 650  | 0.5274          | 0.1384 |
| 0.2041        | 10.94 | 700  | 0.5781          | 0.1581 |
| 0.1853        | 11.72 | 750  | 0.5515          | 0.1402 |
| 0.1612        | 12.5  | 800  | 0.5549          | 0.1464 |
| 0.1797        | 13.28 | 850  | 0.5097          | 0.1402 |
| 0.1423        | 14.06 | 900  | 0.5133          | 0.1433 |
| 0.1544        | 14.84 | 950  | 0.4362          | 0.1248 |
| 0.119         | 15.62 | 1000 | 0.3969          | 0.1002 |
| 0.1342        | 16.41 | 1050 | 0.4917          | 0.1248 |
| 0.1181        | 17.19 | 1100 | 0.6039          | 0.1507 |
| 0.1174        | 17.97 | 1150 | 0.4627          | 0.1199 |
| 0.0913        | 18.75 | 1200 | 0.5063          | 0.1267 |
| 0.0913        | 19.53 | 1250 | 0.5242          | 0.1310 |
| 0.0904        | 20.31 | 1300 | 0.5154          | 0.1298 |
| 0.0974        | 21.09 | 1350 | 0.4267          | 0.1132 |
| 0.0861        | 21.88 | 1400 | 0.4646          | 0.1273 |
| 0.0784        | 22.66 | 1450 | 0.4437          | 0.1095 |
| 0.0723        | 23.44 | 1500 | 0.4498          | 0.1187 |
| 0.0762        | 24.22 | 1550 | 0.4895          | 0.1205 |
| 0.0704        | 25.0  | 1600 | 0.5219          | 0.1230 |
| 0.0673        | 25.78 | 1650 | 0.4321          | 0.1125 |
| 0.059         | 26.56 | 1700 | 0.4554          | 0.1199 |
| 0.056         | 27.34 | 1750 | 0.4489          | 0.1113 |
| 0.0648        | 28.12 | 1800 | 0.4209          | 0.1082 |
| 0.0538        | 28.91 | 1850 | 0.4840          | 0.1242 |
| 0.0472        | 29.69 | 1900 | 0.4573          | 0.1070 |
| 0.056         | 30.47 | 1950 | 0.4232          | 0.1144 |
| 0.0414        | 31.25 | 2000 | 0.3984          | 0.1107 |
| 0.0458        | 32.03 | 2050 | 0.4103          | 0.0984 |
| 0.0399        | 32.81 | 2100 | 0.4675          | 0.1070 |
| 0.0392        | 33.59 | 2150 | 0.4009          | 0.0898 |
| 0.0418        | 34.38 | 2200 | 0.3986          | 0.0996 |
| 0.0428        | 35.16 | 2250 | 0.3776          | 0.0959 |
| 0.0365        | 35.94 | 2300 | 0.4121          | 0.1039 |
| 0.0331        | 36.72 | 2350 | 0.4141          | 0.1107 |
| 0.0298        | 37.5  | 2400 | 0.3763          | 0.0892 |
| 0.0416        | 38.28 | 2450 | 0.4031          | 0.1009 |
| 0.035         | 39.06 | 2500 | 0.3490          | 0.0935 |
| 0.0334        | 39.84 | 2550 | 0.3775          | 0.0904 |
| 0.0341        | 40.62 | 2600 | 0.3555          | 0.0843 |
| 0.0329        | 41.41 | 2650 | 0.3817          | 0.0879 |
| 0.0347        | 42.19 | 2700 | 0.3638          | 0.0892 |
| 0.0314        | 42.97 | 2750 | 0.3870          | 0.0966 |
| 0.0298        | 43.75 | 2800 | 0.3936          | 0.0959 |
| 0.03          | 44.53 | 2850 | 0.3997          | 0.0916 |
| 0.0251        | 45.31 | 2900 | 0.4687          | 0.1095 |
| 0.0263        | 46.09 | 2950 | 0.4156          | 0.0978 |
| 0.0249        | 46.88 | 3000 | 0.4065          | 0.0984 |
| 0.018         | 47.66 | 3050 | 0.3768          | 0.0904 |
| 0.0254        | 48.44 | 3100 | 0.3737          | 0.0892 |
| 0.0253        | 49.22 | 3150 | 0.3920          | 0.0916 |
| 0.0188        | 50.0  | 3200 | 0.3867          | 0.0892 |
| 0.0195        | 50.78 | 3250 | 0.3570          | 0.0910 |
| 0.0189        | 51.56 | 3300 | 0.3475          | 0.0941 |
| 0.0167        | 52.34 | 3350 | 0.3259          | 0.0824 |
| 0.0145        | 53.12 | 3400 | 0.3227          | 0.0812 |
| 0.014         | 53.91 | 3450 | 0.3716          | 0.0873 |
| 0.0165        | 54.69 | 3500 | 0.3610          | 0.0836 |
| 0.014         | 55.47 | 3550 | 0.3537          | 0.0830 |
| 0.0133        | 56.25 | 3600 | 0.3600          | 0.0830 |
| 0.0126        | 57.03 | 3650 | 0.3519          | 0.0830 |
| 0.0131        | 57.81 | 3700 | 0.3479          | 0.0830 |
| 0.0111        | 58.59 | 3750 | 0.3540          | 0.0830 |
| 0.0138        | 59.38 | 3800 | 0.3273          | 0.0812 |
| 0.0102        | 60.16 | 3850 | 0.3247          | 0.0756 |
| 0.0083        | 60.94 | 3900 | 0.3501          | 0.0775 |
| 0.0092        | 61.72 | 3950 | 0.3405          | 0.0787 |
| 0.0113        | 62.5  | 4000 | 0.3435          | 0.0800 |
| 0.0092        | 63.28 | 4050 | 0.3549          | 0.0806 |
| 0.0109        | 64.06 | 4100 | 0.3270          | 0.0781 |
| 0.0113        | 64.84 | 4150 | 0.3218          | 0.0763 |
| 0.0095        | 65.62 | 4200 | 0.3309          | 0.0787 |
| 0.0095        | 66.41 | 4250 | 0.3239          | 0.0769 |
| 0.01          | 67.19 | 4300 | 0.3191          | 0.0781 |
| 0.0077        | 67.97 | 4350 | 0.3247          | 0.0787 |
| 0.0082        | 68.75 | 4400 | 0.3317          | 0.0793 |
| 0.0098        | 69.53 | 4450 | 0.3326          | 0.0806 |
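
The Cer column reports character error rate: the character-level edit distance between the hypothesis and reference transcripts, divided by the number of reference characters. A minimal sketch of computing it with the jiwer package (jiwer is not among the dependencies listed on this card, and the example strings are illustrative):

```python
from jiwer import cer

reference = "turn on the living room light"
hypothesis = "turn on the living rom light"

# One character deleted out of 29 reference characters -> CER ~= 0.034
print(cer(reference, hypothesis))
```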

Framework versions

  • Transformers 4.17.0
  • Pytorch 1.11.0
  • Datasets 2.20.0
  • Tokenizers 0.19.1