
wav2vec2-large-xls-r-300m-tr

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the Turkish (TR) subset of Common Voice 8.0 (mozilla-foundation/common_voice_8_0). It achieves the following results on the evaluation set:

  • Loss: 0.2224
  • WER: 0.2869

Model description

wav2vec2-large-xls-r-300m-tr is facebook/wav2vec2-xls-r-300m, the 300M-parameter XLS-R cross-lingual speech representation model, with a CTC head fine-tuned for Turkish automatic speech recognition on Common Voice 8.0.

Intended uses & limitations

The model is intended for Turkish automatic speech recognition on 16 kHz audio. As it is fine-tuned on read-aloud Common Voice recordings, accuracy can be expected to drop on noisy, spontaneous, or strongly domain-specific speech.
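
A minimal transcription sketch using the standard transformers CTC classes (the audio path is a placeholder, and the snippet assumes the repository ships the usual processor files saved by the fine-tuning script; XLS-R expects 16 kHz input):

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

processor = Wav2Vec2Processor.from_pretrained("emre/wav2vec2-large-xls-r-300m-tr")
model = Wav2Vec2ForCTC.from_pretrained("emre/wav2vec2-large-xls-r-300m-tr")

# Load and resample to the 16 kHz rate the model was trained on.
speech, _ = librosa.load("example.wav", sr=16_000)  # placeholder path

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```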

Training and evaluation data

Training and evaluation used the Turkish (tr) split of Common Voice 8.0 (mozilla-foundation/common_voice_8_0); the results below are reported on its evaluation split.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 100.0
  • mixed_precision_training: Native AMP
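
Expressed as transformers TrainingArguments, these settings would look roughly like the sketch below (not the exact training script; the output_dir is a placeholder and fp16=True stands in for the "Native AMP" entry):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xls-r-300m-tr",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,        # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=100.0,
    fp16=True,             # native AMP mixed-precision training
)
```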

Training results

| Training Loss | Epoch | Step | Validation Loss | WER |
|:---:|:---:|:---:|:---:|:---:|
| 6.8222 | 0.64 | 500 | 3.5026 | 1.0 |
| 3.2136 | 1.28 | 1000 | 3.0593 | 1.0000 |
| 2.8882 | 1.91 | 1500 | 2.4670 | 0.9939 |
| 2.3743 | 2.55 | 2000 | 1.1844 | 0.8657 |
| 1.9456 | 3.19 | 2500 | 0.8228 | 0.7397 |
| 1.7781 | 3.83 | 3000 | 0.6826 | 0.6753 |
| 1.6848 | 4.46 | 3500 | 0.5885 | 0.6140 |
| 1.6228 | 5.1 | 4000 | 0.5274 | 0.5789 |
| 1.5768 | 5.74 | 4500 | 0.4900 | 0.5519 |
| 1.5431 | 6.38 | 5000 | 0.4508 | 0.5238 |
| 1.5019 | 7.02 | 5500 | 0.4248 | 0.5021 |
| 1.4684 | 7.65 | 6000 | 0.4009 | 0.4827 |
| 1.4635 | 8.29 | 6500 | 0.3830 | 0.4700 |
| 1.4291 | 8.93 | 7000 | 0.3707 | 0.4595 |
| 1.4271 | 9.57 | 7500 | 0.3570 | 0.4514 |
| 1.3938 | 10.2 | 8000 | 0.3479 | 0.4378 |
| 1.3914 | 10.84 | 8500 | 0.3396 | 0.4368 |
| 1.3767 | 11.48 | 9000 | 0.3253 | 0.4262 |
| 1.3641 | 12.12 | 9500 | 0.3251 | 0.4178 |
| 1.355 | 12.76 | 10000 | 0.3138 | 0.4136 |
| 1.336 | 13.39 | 10500 | 0.3121 | 0.4069 |
| 1.3292 | 14.03 | 11000 | 0.3041 | 0.4014 |
| 1.3249 | 14.67 | 11500 | 0.3014 | 0.3931 |
| 1.3156 | 15.31 | 12000 | 0.3014 | 0.3929 |
| 1.313 | 15.94 | 12500 | 0.2969 | 0.3968 |
| 1.3068 | 16.58 | 13000 | 0.2965 | 0.3966 |
| 1.2785 | 17.22 | 13500 | 0.2943 | 0.3850 |
| 1.2867 | 17.86 | 14000 | 0.2912 | 0.3782 |
| 1.2714 | 18.49 | 14500 | 0.2819 | 0.3747 |
| 1.2844 | 19.13 | 15000 | 0.2840 | 0.3740 |
| 1.2684 | 19.77 | 15500 | 0.2913 | 0.3828 |
| 1.26 | 20.41 | 16000 | 0.2739 | 0.3674 |
| 1.2543 | 21.05 | 16500 | 0.2740 | 0.3691 |
| 1.2532 | 21.68 | 17000 | 0.2709 | 0.3756 |
| 1.2409 | 22.32 | 17500 | 0.2669 | 0.3593 |
| 1.2404 | 22.96 | 18000 | 0.2673 | 0.3576 |
| 1.2347 | 23.6 | 18500 | 0.2678 | 0.3643 |
| 1.2351 | 24.23 | 19000 | 0.2715 | 0.3650 |
| 1.2409 | 24.87 | 19500 | 0.2637 | 0.3571 |
| 1.2152 | 25.51 | 20000 | 0.2785 | 0.3609 |
| 1.2046 | 26.15 | 20500 | 0.2610 | 0.3508 |
| 1.2082 | 26.79 | 21000 | 0.2619 | 0.3461 |
| 1.2109 | 27.42 | 21500 | 0.2597 | 0.3502 |
| 1.2014 | 28.06 | 22000 | 0.2608 | 0.3468 |
| 1.1948 | 28.7 | 22500 | 0.2573 | 0.3457 |
| 1.205 | 29.34 | 23000 | 0.2619 | 0.3464 |
| 1.2019 | 29.97 | 23500 | 0.2559 | 0.3474 |
| 1.1917 | 30.61 | 24000 | 0.2601 | 0.3462 |
| 1.1939 | 31.25 | 24500 | 0.2575 | 0.3387 |
| 1.1882 | 31.89 | 25000 | 0.2535 | 0.3368 |
| 1.191 | 32.53 | 25500 | 0.2489 | 0.3365 |
| 1.1767 | 33.16 | 26000 | 0.2501 | 0.3347 |
| 1.167 | 33.8 | 26500 | 0.2504 | 0.3347 |
| 1.1678 | 34.44 | 27000 | 0.2480 | 0.3378 |
| 1.1803 | 35.08 | 27500 | 0.2487 | 0.3345 |
| 1.167 | 35.71 | 28000 | 0.2442 | 0.3319 |
| 1.1661 | 36.35 | 28500 | 0.2495 | 0.3334 |
| 1.164 | 36.99 | 29000 | 0.2472 | 0.3292 |
| 1.1578 | 37.63 | 29500 | 0.2442 | 0.3242 |
| 1.1584 | 38.27 | 30000 | 0.2431 | 0.3314 |
| 1.1526 | 38.9 | 30500 | 0.2441 | 0.3347 |
| 1.1542 | 39.54 | 31000 | 0.2437 | 0.3330 |
| 1.1508 | 40.18 | 31500 | 0.2433 | 0.3294 |
| 1.1406 | 40.82 | 32000 | 0.2434 | 0.3271 |
| 1.1514 | 41.45 | 32500 | 0.2426 | 0.3255 |
| 1.1418 | 42.09 | 33000 | 0.2432 | 0.3233 |
| 1.1365 | 42.73 | 33500 | 0.2436 | 0.3240 |
| 1.1348 | 43.37 | 34000 | 0.2483 | 0.3257 |
| 1.1301 | 44.01 | 34500 | 0.2420 | 0.3271 |
| 1.1268 | 44.64 | 35000 | 0.2472 | 0.3225 |
| 1.1224 | 45.28 | 35500 | 0.2382 | 0.3205 |
| 1.1224 | 45.92 | 36000 | 0.2388 | 0.3184 |
| 1.1198 | 46.56 | 36500 | 0.2382 | 0.3202 |
| 1.1274 | 47.19 | 37000 | 0.2404 | 0.3172 |
| 1.1147 | 47.83 | 37500 | 0.2394 | 0.3164 |
| 1.121 | 48.47 | 38000 | 0.2406 | 0.3202 |
| 1.1109 | 49.11 | 38500 | 0.2384 | 0.3154 |
| 1.1164 | 49.74 | 39000 | 0.2375 | 0.3169 |
| 1.1105 | 50.38 | 39500 | 0.2387 | 0.3173 |
| 1.1054 | 51.02 | 40000 | 0.2362 | 0.3120 |
| 1.0893 | 51.66 | 40500 | 0.2399 | 0.3130 |
| 1.0913 | 52.3 | 41000 | 0.2357 | 0.3088 |
| 1.1017 | 52.93 | 41500 | 0.2345 | 0.3084 |
| 1.0937 | 53.57 | 42000 | 0.2330 | 0.3140 |
| 1.0945 | 54.21 | 42500 | 0.2399 | 0.3107 |
| 1.0933 | 54.85 | 43000 | 0.2383 | 0.3134 |
| 1.0912 | 55.48 | 43500 | 0.2372 | 0.3077 |
| 1.0898 | 56.12 | 44000 | 0.2339 | 0.3083 |
| 1.0903 | 56.76 | 44500 | 0.2367 | 0.3065 |
| 1.0947 | 57.4 | 45000 | 0.2352 | 0.3104 |
| 1.0751 | 58.04 | 45500 | 0.2334 | 0.3084 |
| 1.09 | 58.67 | 46000 | 0.2328 | 0.3100 |
| 1.0876 | 59.31 | 46500 | 0.2276 | 0.3050 |
| 1.076 | 59.95 | 47000 | 0.2309 | 0.3047 |
| 1.086 | 60.59 | 47500 | 0.2293 | 0.3047 |
| 1.082 | 61.22 | 48000 | 0.2328 | 0.3027 |
| 1.0714 | 61.86 | 48500 | 0.2290 | 0.3020 |
| 1.0746 | 62.5 | 49000 | 0.2313 | 0.3059 |
| 1.076 | 63.14 | 49500 | 0.2342 | 0.3050 |
| 1.0648 | 63.78 | 50000 | 0.2286 | 0.3025 |
| 1.0586 | 64.41 | 50500 | 0.2338 | 0.3044 |
| 1.0753 | 65.05 | 51000 | 0.2308 | 0.3045 |
| 1.0664 | 65.69 | 51500 | 0.2273 | 0.3009 |
| 1.0739 | 66.33 | 52000 | 0.2298 | 0.3027 |
| 1.0695 | 66.96 | 52500 | 0.2247 | 0.2996 |
| 1.06 | 67.6 | 53000 | 0.2276 | 0.3015 |
| 1.0742 | 68.24 | 53500 | 0.2280 | 0.2974 |
| 1.0618 | 68.88 | 54000 | 0.2291 | 0.2989 |
| 1.062 | 69.52 | 54500 | 0.2302 | 0.2971 |
| 1.0572 | 70.15 | 55000 | 0.2280 | 0.2990 |
| 1.055 | 70.79 | 55500 | 0.2278 | 0.2983 |
| 1.0553 | 71.43 | 56000 | 0.2282 | 0.2991 |
| 1.0509 | 72.07 | 56500 | 0.2261 | 0.2959 |
| 1.0469 | 72.7 | 57000 | 0.2216 | 0.2919 |
| 1.0476 | 73.34 | 57500 | 0.2267 | 0.2989 |
| 1.0494 | 73.98 | 58000 | 0.2260 | 0.2960 |
| 1.0517 | 74.62 | 58500 | 0.2297 | 0.2989 |
| 1.0458 | 75.26 | 59000 | 0.2246 | 0.2923 |
| 1.0382 | 75.89 | 59500 | 0.2255 | 0.2922 |
| 1.0462 | 76.53 | 60000 | 0.2258 | 0.2954 |
| 1.0375 | 77.17 | 60500 | 0.2251 | 0.2929 |
| 1.0332 | 77.81 | 61000 | 0.2277 | 0.2940 |
| 1.0423 | 78.44 | 61500 | 0.2243 | 0.2896 |
| 1.0379 | 79.08 | 62000 | 0.2274 | 0.2928 |
| 1.0398 | 79.72 | 62500 | 0.2237 | 0.2928 |
| 1.0395 | 80.36 | 63000 | 0.2265 | 0.2956 |
| 1.0397 | 80.99 | 63500 | 0.2240 | 0.2920 |
| 1.0262 | 81.63 | 64000 | 0.2244 | 0.2934 |
| 1.0335 | 82.27 | 64500 | 0.2265 | 0.2936 |
| 1.0385 | 82.91 | 65000 | 0.2238 | 0.2928 |
| 1.0289 | 83.55 | 65500 | 0.2219 | 0.2912 |
| 1.0372 | 84.18 | 66000 | 0.2236 | 0.2898 |
| 1.0279 | 84.82 | 66500 | 0.2219 | 0.2902 |
| 1.0325 | 85.46 | 67000 | 0.2240 | 0.2908 |
| 1.0202 | 86.1 | 67500 | 0.2206 | 0.2886 |
| 1.0166 | 86.73 | 68000 | 0.2219 | 0.2886 |
| 1.0259 | 87.37 | 68500 | 0.2235 | 0.2897 |
| 1.0337 | 88.01 | 69000 | 0.2210 | 0.2873 |
| 1.0264 | 88.65 | 69500 | 0.2216 | 0.2882 |
| 1.0231 | 89.29 | 70000 | 0.2223 | 0.2899 |
| 1.0281 | 89.92 | 70500 | 0.2214 | 0.2872 |
| 1.0135 | 90.56 | 71000 | 0.2218 | 0.2868 |
| 1.0291 | 91.2 | 71500 | 0.2209 | 0.2863 |
| 1.0321 | 91.84 | 72000 | 0.2199 | 0.2876 |
| 1.028 | 92.47 | 72500 | 0.2214 | 0.2858 |
| 1.0213 | 93.11 | 73000 | 0.2219 | 0.2875 |
| 1.0261 | 93.75 | 73500 | 0.2232 | 0.2869 |
| 1.0197 | 94.39 | 74000 | 0.2227 | 0.2866 |
| 1.0298 | 95.03 | 74500 | 0.2228 | 0.2868 |
| 1.0192 | 95.66 | 75000 | 0.2230 | 0.2865 |
| 1.0156 | 96.3 | 75500 | 0.2220 | 0.2869 |
| 1.0075 | 96.94 | 76000 | 0.2223 | 0.2866 |
| 1.0201 | 97.58 | 76500 | 0.2219 | 0.2866 |
| 1.0159 | 98.21 | 77000 | 0.2219 | 0.2876 |
| 1.0087 | 98.85 | 77500 | 0.2219 | 0.2873 |
| 1.0159 | 99.49 | 78000 | 0.2223 | 0.2867 |
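
The reported WER can be approximated with a simple greedy-decoding evaluation loop. A rough sketch using the evaluate and datasets libraries (Common Voice 8.0 is gated on the Hub, so the dataset terms must be accepted and a token configured first; the training script's exact text normalization is not reproduced here, so numbers will differ slightly):

```python
import torch
import evaluate
from datasets import load_dataset, Audio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

processor = Wav2Vec2Processor.from_pretrained("emre/wav2vec2-large-xls-r-300m-tr")
model = Wav2Vec2ForCTC.from_pretrained("emre/wav2vec2-large-xls-r-300m-tr")
wer = evaluate.load("wer")

# Gated dataset: requires accepting the terms and `huggingface-cli login`.
ds = load_dataset("mozilla-foundation/common_voice_8_0", "tr", split="test")
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))

preds, refs = [], []
for sample in ds.select(range(100)):  # small subset for a quick check
    inputs = processor(sample["audio"]["array"],
                       sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    preds.append(processor.batch_decode(torch.argmax(logits, dim=-1))[0])
    refs.append(sample["sentence"].lower())  # crude normalization

print("WER:", wer.compute(predictions=preds, references=refs))
```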

Framework versions

  • Transformers 4.17.0.dev0
  • Pytorch 1.10.2+cu102
  • Datasets 1.18.2.dev0
  • Tokenizers 0.11.0
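
The dev versions above were built from source; a rough approximation of the environment (not the exact one used for training) can be installed from PyPI with, e.g., `pip install torch==1.10.2 transformers==4.17.0 datasets==1.18.2 tokenizers==0.11.0`.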