
wav2vec2-large-xlsr-53-english-ser-cosine

This model is a fine-tuned version of jonatasgrosman/wav2vec2-large-xlsr-53-english on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4677
  • Accuracy: 0.8677
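
Since the intended use is not documented below, the following is a minimal, hedged inference sketch. The repository path, the audio file name, and the emotion label set are assumptions (the card does not state them).

```python
from transformers import pipeline

# Hedged sketch: the model path is an assumption (no full hub namespace is
# given in this card), and "speech.wav" is a hypothetical 16 kHz mono file.
classifier = pipeline(
    "audio-classification",
    model="wav2vec2-large-xlsr-53-english-ser-cosine",
)

predictions = classifier("speech.wav", top_k=5)
print(predictions)  # list of {"label": ..., "score": ...} entries
```

The underlying wav2vec2-large-xlsr-53 base model expects 16 kHz input, so resample audio to 16 kHz before inference if needed.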

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001076429938136877
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_warmup_steps: 18
  • num_epochs: 3.0
  • mixed_precision_training: Native AMP
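
For reference only, a hedged sketch of how these values would map onto Hugging Face TrainingArguments; the output directory is an assumption, and the dataset, model head, and metric wiring are not documented here.

```python
from transformers import TrainingArguments

# Hedged sketch of the listed hyperparameters. The Adam betas/epsilon above
# match the TrainingArguments defaults, so they are not set explicitly.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-53-english-ser-cosine",  # assumed
    learning_rate=1.076429938136877e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,   # effective total train batch size of 8
    lr_scheduler_type="cosine_with_restarts",
    warmup_steps=18,
    num_train_epochs=3.0,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```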

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
1.7885 0.01 10 1.7963 0.1791
1.7907 0.02 20 1.7973 0.2638
1.8288 0.02 30 1.7546 0.2465
1.7803 0.03 40 1.7500 0.2087
1.7941 0.04 50 1.6953 0.2950
1.7934 0.05 60 1.6342 0.3714
1.6559 0.06 70 1.6199 0.2892
1.6214 0.07 80 1.5400 0.4117
1.5226 0.07 90 1.3802 0.4519
1.4954 0.08 100 1.3506 0.4717
1.4062 0.09 110 1.3328 0.4766
1.4507 0.1 120 1.1985 0.5464
1.2812 0.11 130 1.2826 0.4922
1.1494 0.12 140 1.0960 0.6187
1.1035 0.12 150 1.1925 0.5645
1.2784 0.13 160 1.0955 0.6015
1.0302 0.14 170 1.0418 0.6072
1.0068 0.15 180 0.9261 0.6804
1.112 0.16 190 1.1529 0.5867
1.0308 0.16 200 0.8637 0.7058
1.0464 0.17 210 0.9205 0.6426
0.9531 0.18 220 0.9363 0.6886
1.0228 0.19 230 0.9637 0.6615
1.1446 0.2 240 1.3015 0.5489
1.1146 0.21 250 0.9328 0.6483
0.849 0.21 260 0.8504 0.6746
0.7977 0.22 270 0.9533 0.6697
0.9397 0.23 280 0.9300 0.7083
0.8625 0.24 290 1.1020 0.6401
1.333 0.25 300 0.9816 0.6442
1.0022 0.25 310 0.8472 0.7067
0.8002 0.26 320 0.7866 0.7149
0.8987 0.27 330 0.7979 0.6976
0.9473 0.28 340 0.8600 0.6878
0.9001 0.29 350 0.8141 0.7034
0.9793 0.3 360 0.9872 0.6450
0.9189 0.3 370 0.8561 0.6845
0.9281 0.31 380 0.9055 0.6919
0.7118 0.32 390 0.7937 0.6984
1.0565 0.33 400 0.7339 0.7313
0.8467 0.34 410 0.8262 0.6836
0.9601 0.35 420 0.7464 0.7346
0.8911 0.35 430 0.7229 0.7338
0.9033 0.36 440 0.7393 0.7223
0.8961 0.37 450 0.7272 0.7428
0.7216 0.38 460 0.7183 0.7436
0.6935 0.39 470 0.8003 0.7083
0.7588 0.39 480 0.8471 0.7116
0.8766 0.4 490 0.6976 0.7379
0.6866 0.41 500 0.6806 0.7584
0.6822 0.42 510 0.7669 0.7256
0.7067 0.43 520 0.6885 0.7461
0.6159 0.44 530 0.7020 0.7535
0.8814 0.44 540 0.7478 0.7256
0.7786 0.45 550 0.6302 0.7691
0.6363 0.46 560 0.6745 0.7691
0.8518 0.47 570 0.6242 0.7666
0.8194 0.48 580 0.7154 0.7379
0.6755 0.49 590 0.7056 0.7543
0.7743 0.49 600 0.6823 0.7486
0.6457 0.5 610 0.7160 0.7502
0.4976 0.51 620 0.8222 0.7149
0.929 0.52 630 0.7318 0.7371
0.7981 0.53 640 0.7417 0.7461
0.7243 0.53 650 0.6831 0.7461
0.7332 0.54 660 0.6273 0.7592
0.7827 0.55 670 0.6158 0.7724
0.7733 0.56 680 0.7515 0.7371
0.8527 0.57 690 0.7200 0.7412
0.8355 0.58 700 0.7738 0.7436
0.5383 0.58 710 0.9081 0.7132
1.0851 0.59 720 0.6135 0.7831
0.7345 0.6 730 0.7032 0.7642
0.6648 0.61 740 0.6146 0.7781
0.612 0.62 750 0.6338 0.7732
0.6101 0.62 760 0.6772 0.7740
0.6498 0.63 770 0.7153 0.7601
0.6258 0.64 780 0.7871 0.7329
0.7943 0.65 790 0.6975 0.7691
0.8176 0.66 800 0.7692 0.7313
0.6682 0.67 810 0.5766 0.8012
0.4808 0.67 820 0.5882 0.7847
0.6331 0.68 830 0.5855 0.7896
0.874 0.69 840 0.7082 0.7568
0.8984 0.7 850 0.6078 0.7732
0.5861 0.71 860 0.6469 0.7814
0.6896 0.72 870 0.6997 0.7560
0.8237 0.72 880 0.6279 0.7650
0.5818 0.73 890 0.6763 0.7691
0.4781 0.74 900 0.6867 0.7592
0.6851 0.75 910 0.6142 0.7724
0.455 0.76 920 0.9159 0.7141
0.808 0.76 930 0.7518 0.7617
1.0634 0.77 940 0.6015 0.7839
0.6956 0.78 950 0.5895 0.7872
0.5169 0.79 960 0.6394 0.7773
0.6213 0.8 970 0.6890 0.7699
0.5506 0.81 980 0.7471 0.7560
0.6233 0.81 990 0.6525 0.7872
0.7666 0.82 1000 0.8002 0.7403
0.5644 0.83 1010 0.7067 0.7387
0.6038 0.84 1020 0.6091 0.7823
0.6211 0.85 1030 0.6749 0.7707
0.6758 0.86 1040 0.7102 0.7502
0.7353 0.86 1050 0.6959 0.7560
0.5687 0.87 1060 0.6831 0.7675
0.5606 0.88 1070 0.5945 0.7847
0.7309 0.89 1080 0.6737 0.7412
0.5951 0.9 1090 0.6574 0.7675
0.6062 0.9 1100 0.6740 0.7502
0.9606 0.91 1110 0.5730 0.7839
0.6625 0.92 1120 0.5922 0.7749
0.7908 0.93 1130 0.5652 0.7823
0.6387 0.94 1140 0.5268 0.8118
0.7141 0.95 1150 0.5628 0.7896
0.5587 0.95 1160 0.6479 0.7609
0.4817 0.96 1170 0.5410 0.8044
0.4444 0.97 1180 0.5950 0.8044
0.6776 0.98 1190 0.5993 0.8012
0.5989 0.99 1200 0.5745 0.7987
0.6334 1.0 1210 0.6220 0.7913
0.5216 1.0 1220 0.5936 0.7938
0.5127 1.01 1230 0.6741 0.7839
0.5632 1.02 1240 0.6501 0.7954
0.5335 1.03 1250 0.5721 0.8061
0.511 1.04 1260 0.5630 0.8102
0.5424 1.04 1270 0.5396 0.8135
0.771 1.05 1280 0.5580 0.8012
0.435 1.06 1290 0.5764 0.8036
0.5203 1.07 1300 0.6032 0.7913
0.4689 1.08 1310 0.6431 0.7872
0.481 1.09 1320 0.6019 0.7987
0.5938 1.09 1330 0.6198 0.7938
0.3972 1.1 1340 0.5842 0.8061
0.368 1.11 1350 0.5066 0.8127
0.4644 1.12 1360 0.6058 0.8012
0.6914 1.13 1370 0.5384 0.8217
0.3341 1.13 1380 0.5535 0.8143
0.5301 1.14 1390 0.5916 0.8020
0.5294 1.15 1400 0.6297 0.7938
0.7029 1.16 1410 0.5581 0.8102
0.322 1.17 1420 0.6066 0.7831
0.6871 1.18 1430 0.5141 0.8151
0.4026 1.18 1440 0.6888 0.7716
0.4484 1.19 1450 0.5499 0.8077
0.3767 1.2 1460 0.4825 0.8225
0.4274 1.21 1470 0.4932 0.8274
0.4584 1.22 1480 0.5168 0.8299
0.5741 1.23 1490 0.6384 0.7798
0.3877 1.23 1500 0.5789 0.8044
0.3734 1.24 1510 0.6415 0.7855
0.7986 1.25 1520 0.5575 0.8077
0.5634 1.26 1530 0.5684 0.8143
0.5136 1.27 1540 0.5393 0.8143
0.5331 1.27 1550 0.5203 0.8176
0.2918 1.28 1560 0.5510 0.8151
0.4425 1.29 1570 0.5783 0.8094
0.4245 1.3 1580 0.5433 0.8209
0.3317 1.31 1590 0.5845 0.8085
0.4583 1.32 1600 0.6147 0.7954
0.3298 1.32 1610 0.6249 0.8053
0.5248 1.33 1620 0.5722 0.8094
0.665 1.34 1630 0.5446 0.8217
0.3917 1.35 1640 0.5316 0.8258
0.4321 1.36 1650 0.5598 0.8217
0.3005 1.37 1660 0.6190 0.8151
0.4992 1.37 1670 0.5546 0.8184
0.586 1.38 1680 0.6416 0.7913
0.6481 1.39 1690 0.5324 0.8135
0.4008 1.4 1700 0.5786 0.8012
0.3463 1.41 1710 0.5145 0.8209
0.4994 1.41 1720 0.5650 0.8192
0.4093 1.42 1730 0.5191 0.8365
0.6375 1.43 1740 0.5734 0.8135
0.2303 1.44 1750 0.5447 0.8102
0.4824 1.45 1760 0.5139 0.8250
0.5439 1.46 1770 0.4979 0.8258
0.4751 1.46 1780 0.4896 0.8340
0.534 1.47 1790 0.4656 0.8348
0.4526 1.48 1800 0.5322 0.8316
0.4618 1.49 1810 0.5216 0.8233
0.3825 1.5 1820 0.4792 0.8225
0.4557 1.5 1830 0.5071 0.8118
0.5725 1.51 1840 0.5152 0.8102
0.7004 1.52 1850 0.5080 0.8217
0.4367 1.53 1860 0.4920 0.8357
0.3682 1.54 1870 0.5253 0.8299
0.4411 1.55 1880 0.6186 0.8069
0.5391 1.55 1890 0.5074 0.8283
0.4673 1.56 1900 0.4858 0.8398
0.3542 1.57 1910 0.4767 0.8381
0.6483 1.58 1920 0.4694 0.8373
0.3837 1.59 1930 0.4678 0.8472
0.363 1.6 1940 0.4684 0.8463
0.6446 1.6 1950 0.4696 0.8365
0.5627 1.61 1960 0.4651 0.8472
0.3733 1.62 1970 0.5138 0.8291
0.5972 1.63 1980 0.5244 0.8250
0.2388 1.64 1990 0.5020 0.8266
0.6279 1.64 2000 0.5865 0.8118
0.5827 1.65 2010 0.5717 0.8176
0.4598 1.66 2020 0.4691 0.8439
0.3817 1.67 2030 0.5084 0.8340
0.2973 1.68 2040 0.4568 0.8447
0.4039 1.69 2050 0.4681 0.8505
0.4572 1.69 2060 0.4718 0.8389
0.3481 1.7 2070 0.4849 0.8283
0.4553 1.71 2080 0.4574 0.8414
0.4055 1.72 2090 0.4640 0.8463
0.4384 1.73 2100 0.5049 0.8431
0.5593 1.74 2110 0.5192 0.8513
0.3486 1.74 2120 0.4764 0.8480
0.4698 1.75 2130 0.4858 0.8447
0.211 1.76 2140 0.4976 0.8398
0.5209 1.77 2150 0.4934 0.8472
0.4281 1.78 2160 0.4714 0.8578
0.3902 1.78 2170 0.4863 0.8463
0.3083 1.79 2180 0.4807 0.8431
0.4642 1.8 2190 0.4712 0.8472
0.2382 1.81 2200 0.4641 0.8513
0.4154 1.82 2210 0.4900 0.8447
0.3637 1.83 2220 0.4790 0.8488
0.4864 1.83 2230 0.4742 0.8513
0.5024 1.84 2240 0.4803 0.8529
0.4139 1.85 2250 0.4672 0.8521
0.4131 1.86 2260 0.4895 0.8431
0.4851 1.87 2270 0.4432 0.8529
0.3846 1.88 2280 0.4417 0.8422
0.3778 1.88 2290 0.4477 0.8439
0.4128 1.89 2300 0.4336 0.8513
0.3755 1.9 2310 0.4678 0.8439
0.4672 1.91 2320 0.4740 0.8373
0.5216 1.92 2330 0.4343 0.8472
0.3469 1.92 2340 0.4542 0.8316
0.3283 1.93 2350 0.4587 0.8447
0.3495 1.94 2360 0.5050 0.8348
0.4518 1.95 2370 0.5309 0.8266
0.3023 1.96 2380 0.5113 0.8332
0.4014 1.97 2390 0.4989 0.8332
0.4963 1.97 2400 0.4539 0.8505
0.3421 1.98 2410 0.4889 0.8455
0.4126 1.99 2420 0.4696 0.8463
0.479 2.0 2430 0.4498 0.8513
0.3319 2.01 2440 0.4686 0.8488
0.2787 2.01 2450 0.4650 0.8447
0.2105 2.02 2460 0.4665 0.8505
0.4944 2.03 2470 0.4667 0.8488
0.2236 2.04 2480 0.4678 0.8463
0.3076 2.05 2490 0.4621 0.8513
0.2813 2.06 2500 0.4451 0.8562
0.2207 2.06 2510 0.4559 0.8562
0.3693 2.07 2520 0.4634 0.8513
0.3682 2.08 2530 0.4390 0.8562
0.2618 2.09 2540 0.4417 0.8529
0.3139 2.1 2550 0.4618 0.8529
0.1739 2.11 2560 0.4938 0.8488
0.4258 2.11 2570 0.4574 0.8496
0.2136 2.12 2580 0.4495 0.8529
0.2625 2.13 2590 0.4555 0.8570
0.3161 2.14 2600 0.4696 0.8537
0.2515 2.15 2610 0.4649 0.8661
0.3097 2.15 2620 0.4474 0.8685
0.3544 2.16 2630 0.4458 0.8603
0.2967 2.17 2640 0.4555 0.8669
0.4015 2.18 2650 0.4486 0.8652
0.079 2.19 2660 0.4624 0.8620
0.1754 2.2 2670 0.4805 0.8587
0.1854 2.2 2680 0.4803 0.8628
0.3181 2.21 2690 0.4792 0.8595
0.0808 2.22 2700 0.4740 0.8628
0.2027 2.23 2710 0.4846 0.8587
0.3211 2.24 2720 0.5074 0.8505
0.2448 2.25 2730 0.5276 0.8414
0.3618 2.25 2740 0.5133 0.8488
0.1822 2.26 2750 0.5002 0.8578
0.3095 2.27 2760 0.4827 0.8603
0.0762 2.28 2770 0.4792 0.8644
0.187 2.29 2780 0.4897 0.8644
0.5779 2.29 2790 0.4901 0.8652
0.292 2.3 2800 0.4764 0.8603
0.1865 2.31 2810 0.4798 0.8644
0.3594 2.32 2820 0.4837 0.8620
0.421 2.33 2830 0.4812 0.8562
0.1173 2.34 2840 0.4708 0.8603
0.278 2.34 2850 0.4693 0.8685
0.2294 2.35 2860 0.4724 0.8628
0.243 2.36 2870 0.4749 0.8620
0.3979 2.37 2880 0.4633 0.8628
0.4518 2.38 2890 0.4603 0.8669
0.2739 2.38 2900 0.4625 0.8685
0.1782 2.39 2910 0.4652 0.8677
0.3536 2.4 2920 0.4613 0.8644
0.0904 2.41 2930 0.4642 0.8611
0.2315 2.42 2940 0.4613 0.8661
0.1236 2.43 2950 0.4628 0.8652
0.1842 2.43 2960 0.4706 0.8620
0.2414 2.44 2970 0.4683 0.8652
0.3419 2.45 2980 0.4645 0.8677
0.2877 2.46 2990 0.4657 0.8636
0.2524 2.47 3000 0.4701 0.8652
0.1731 2.48 3010 0.4733 0.8644
0.1731 2.48 3020 0.4830 0.8595
0.0921 2.49 3030 0.4904 0.8603
0.1593 2.5 3040 0.4836 0.8595
0.467 2.51 3050 0.4706 0.8628
0.4225 2.52 3060 0.4598 0.8644
0.1251 2.52 3070 0.4511 0.8694
0.2181 2.53 3080 0.4487 0.8735
0.2247 2.54 3090 0.4452 0.8767
0.3722 2.55 3100 0.4469 0.8759
0.1069 2.56 3110 0.4536 0.8735
0.2174 2.57 3120 0.4571 0.8710
0.2586 2.57 3130 0.4626 0.8685
0.2803 2.58 3140 0.4665 0.8677
0.4484 2.59 3150 0.4581 0.8694
0.3104 2.6 3160 0.4539 0.8735
0.2411 2.61 3170 0.4531 0.8726
0.2157 2.62 3180 0.4565 0.8694
0.2342 2.62 3190 0.4549 0.8694
0.2921 2.63 3200 0.4570 0.8677
0.1988 2.64 3210 0.4590 0.8677
0.2142 2.65 3220 0.4601 0.8661
0.1666 2.66 3230 0.4652 0.8661
0.2296 2.66 3240 0.4709 0.8611
0.3847 2.67 3250 0.4676 0.8636
0.4149 2.68 3260 0.4654 0.8636
0.2602 2.69 3270 0.4614 0.8661
0.3786 2.7 3280 0.4605 0.8661
0.3509 2.71 3290 0.4590 0.8661
0.2254 2.71 3300 0.4564 0.8677
0.1775 2.72 3310 0.4553 0.8694
0.2269 2.73 3320 0.4546 0.8669
0.1792 2.74 3330 0.4549 0.8644
0.1107 2.75 3340 0.4580 0.8661
0.2062 2.75 3350 0.4598 0.8636
0.1641 2.76 3360 0.4621 0.8652
0.18 2.77 3370 0.4651 0.8652
0.0959 2.78 3380 0.4673 0.8661
0.217 2.79 3390 0.4672 0.8652
0.3293 2.8 3400 0.4673 0.8644
0.2691 2.8 3410 0.4669 0.8644
0.1945 2.81 3420 0.4659 0.8652
0.2712 2.82 3430 0.4660 0.8677
0.2287 2.83 3440 0.4663 0.8677
0.2103 2.84 3450 0.4661 0.8669
0.2713 2.85 3460 0.4663 0.8669
0.3182 2.85 3470 0.4665 0.8677
0.1698 2.86 3480 0.4668 0.8669
0.2663 2.87 3490 0.4669 0.8677
0.2091 2.88 3500 0.4670 0.8685
0.1406 2.89 3510 0.4677 0.8669
0.16 2.89 3520 0.4682 0.8661
0.1413 2.9 3530 0.4686 0.8661
0.3499 2.91 3540 0.4690 0.8661
0.205 2.92 3550 0.4688 0.8661
0.3849 2.93 3560 0.4684 0.8661
0.209 2.94 3570 0.4680 0.8669
0.1985 2.94 3580 0.4678 0.8677
0.1989 2.95 3590 0.4678 0.8677
0.2031 2.96 3600 0.4677 0.8677
0.2401 2.97 3610 0.4677 0.8677
0.2717 2.98 3620 0.4678 0.8677
0.2821 2.99 3630 0.4678 0.8677
0.1735 2.99 3640 0.4677 0.8677

Framework versions

  • Transformers 4.40.0.dev0
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.1.dev0
  • Tokenizers 0.15.2

Model size

  • 316M parameters (safetensors, F32)