mirfan899 committed on
Commit
4c7e894
1 Parent(s): e76a22f

update model card README.md

Files changed (1)
  1. README.md +44 -71
README.md CHANGED
@@ -14,8 +14,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.4558
- - Cer: 0.4079
+ - Loss: 0.6936
+ - Cer: 0.2531
 
 ## Model description
 
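The card summarized in this hunk reports an evaluation loss and CER for a CTC fine-tune of wav2vec2-large-xlsr-53 but includes no usage snippet. Below is a minimal inference sketch using the 🤗 Transformers Wav2Vec2 CTC classes; the repository id, the audio file name, and the 16 kHz input assumption are placeholders, since the card excerpt does not state them.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id -- substitute the actual Hub id of this fine-tuned checkpoint.
MODEL_ID = "your-username/wav2vec2-large-xlsr-53-finetuned"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load a clip (placeholder file), downmix to mono, and resample to the 16 kHz rate XLSR models expect.
speech, sample_rate = torchaudio.load("sample.wav")
speech = speech.mean(dim=0)
if sample_rate != 16_000:
    speech = torchaudio.functional.resample(speech, sample_rate, 16_000)

inputs = processor(speech.numpy(), sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding; CER against a labelled set can be computed with evaluate.load("cer").
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```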
@@ -34,85 +34,58 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
- - learning_rate: 0.0001
+ - learning_rate: 4e-05
 - train_batch_size: 2
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 50
+ - num_epochs: 30
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Cer |
 |:-------------:|:-----:|:-----:|:---------------:|:------:|
- | 3.0642 | 0.74 | 500 | 4.4995 | 1.0 |
- | 2.8486 | 1.48 | 1000 | 3.8639 | 1.0 |
- | 2.7909 | 2.22 | 1500 | 3.4712 | 1.0 |
- | 1.5475 | 2.96 | 2000 | 1.0263 | 0.6825 |
- | 0.7353 | 3.7 | 2500 | 0.8291 | 0.5760 |
- | 0.6036 | 4.44 | 3000 | 0.7387 | 0.5327 |
- | 0.5553 | 5.19 | 3500 | 0.7382 | 0.5023 |
- | 0.4271 | 5.93 | 4000 | 0.7244 | 0.4991 |
- | 0.43 | 6.67 | 4500 | 0.7152 | 0.4805 |
- | 0.3925 | 7.41 | 5000 | 0.7210 | 0.4587 |
- | 0.3719 | 8.15 | 5500 | 0.7888 | 0.4491 |
- | 0.3451 | 8.89 | 6000 | 0.7599 | 0.4433 |
- | 0.319 | 9.63 | 6500 | 0.7642 | 0.4508 |
- | 0.2638 | 10.37 | 7000 | 0.8490 | 0.4426 |
- | 0.3084 | 11.11 | 7500 | 0.9387 | 0.4315 |
- | 0.2553 | 11.85 | 8000 | 0.8477 | 0.4287 |
- | 0.2537 | 12.59 | 8500 | 0.8261 | 0.4301 |
- | 0.2058 | 13.33 | 9000 | 1.1093 | 0.4247 |
- | 0.2283 | 14.07 | 9500 | 0.7638 | 0.4230 |
- | 0.2043 | 14.81 | 10000 | 1.0104 | 0.4219 |
- | 0.1918 | 15.56 | 10500 | 0.9618 | 0.4194 |
- | 0.1764 | 16.3 | 11000 | 0.9460 | 0.4226 |
- | 0.1677 | 17.04 | 11500 | 0.9750 | 0.4233 |
- | 0.1751 | 17.78 | 12000 | 0.9600 | 0.4240 |
- | 0.1465 | 18.52 | 12500 | 1.1328 | 0.4172 |
- | 0.1239 | 19.26 | 13000 | 1.0746 | 0.4176 |
- | 0.1495 | 20.0 | 13500 | 1.2143 | 0.4194 |
- | 0.1444 | 20.74 | 14000 | 1.1595 | 0.4219 |
- | 0.134 | 21.48 | 14500 | 1.1601 | 0.4201 |
- | 0.1343 | 22.22 | 15000 | 1.1730 | 0.4233 |
- | 0.1051 | 22.96 | 15500 | 1.1257 | 0.4172 |
- | 0.1067 | 23.7 | 16000 | 1.1206 | 0.4190 |
- | 0.0959 | 24.44 | 16500 | 1.1539 | 0.4133 |
- | 0.1028 | 25.19 | 17000 | 1.2425 | 0.4126 |
- | 0.1028 | 25.93 | 17500 | 1.2008 | 0.4144 |
- | 0.1052 | 26.67 | 18000 | 1.1974 | 0.4094 |
- | 0.0813 | 27.41 | 18500 | 1.0960 | 0.4133 |
- | 0.0973 | 28.15 | 19000 | 1.1153 | 0.4101 |
- | 0.0783 | 28.89 | 19500 | 1.1596 | 0.4126 |
- | 0.0704 | 29.63 | 20000 | 1.1881 | 0.4087 |
- | 0.068 | 30.37 | 20500 | 1.2289 | 0.4040 |
- | 0.0664 | 31.11 | 21000 | 1.2289 | 0.4079 |
- | 0.0747 | 31.85 | 21500 | 1.2642 | 0.4122 |
- | 0.0663 | 32.59 | 22000 | 1.3062 | 0.4101 |
- | 0.0668 | 33.33 | 22500 | 1.3486 | 0.4101 |
- | 0.0592 | 34.07 | 23000 | 1.3346 | 0.4040 |
- | 0.0513 | 34.81 | 23500 | 1.2958 | 0.4097 |
- | 0.0511 | 35.56 | 24000 | 1.3798 | 0.4108 |
- | 0.0557 | 36.3 | 24500 | 1.3521 | 0.4065 |
- | 0.049 | 37.04 | 25000 | 1.4192 | 0.4094 |
- | 0.0465 | 37.78 | 25500 | 1.4308 | 0.4108 |
- | 0.0474 | 38.52 | 26000 | 1.4004 | 0.4058 |
- | 0.0428 | 39.26 | 26500 | 1.3988 | 0.4054 |
- | 0.0509 | 40.0 | 27000 | 1.4218 | 0.4069 |
- | 0.0386 | 40.74 | 27500 | 1.3819 | 0.4104 |
- | 0.0426 | 41.48 | 28000 | 1.4681 | 0.4090 |
- | 0.0408 | 42.22 | 28500 | 1.4543 | 0.4104 |
- | 0.0405 | 42.96 | 29000 | 1.4999 | 0.4108 |
- | 0.036 | 43.7 | 29500 | 1.4922 | 0.4072 |
- | 0.036 | 44.44 | 30000 | 1.4709 | 0.4087 |
- | 0.04 | 45.19 | 30500 | 1.4858 | 0.4094 |
- | 0.0343 | 45.93 | 31000 | 1.4606 | 0.4087 |
- | 0.0288 | 46.67 | 31500 | 1.4599 | 0.4044 |
- | 0.0454 | 47.41 | 32000 | 1.4288 | 0.4087 |
- | 0.0322 | 48.15 | 32500 | 1.4589 | 0.4083 |
- | 0.0327 | 48.89 | 33000 | 1.4502 | 0.4094 |
- | 0.0272 | 49.63 | 33500 | 1.4558 | 0.4079 |
+ | 3.2437 | 0.74 | 500 | 4.1235 | 1.0 |
+ | 2.8562 | 1.48 | 1000 | 3.5824 | 1.0 |
+ | 2.7606 | 2.22 | 1500 | 3.2239 | 1.0 |
+ | 2.0885 | 2.96 | 2000 | 1.1613 | 0.8147 |
+ | 1.0295 | 3.7 | 2500 | 0.7703 | 0.5125 |
+ | 0.796 | 4.44 | 3000 | 0.6539 | 0.4420 |
+ | 0.6484 | 5.19 | 3500 | 0.6259 | 0.3937 |
+ | 0.6099 | 5.93 | 4000 | 0.5749 | 0.3887 |
+ | 0.5772 | 6.67 | 4500 | 0.6031 | 0.3637 |
+ | 0.5158 | 7.41 | 5000 | 0.5978 | 0.3518 |
+ | 0.4923 | 8.15 | 5500 | 0.5621 | 0.3364 |
+ | 0.4679 | 8.89 | 6000 | 0.5371 | 0.3396 |
+ | 0.4385 | 9.63 | 6500 | 0.5804 | 0.3213 |
+ | 0.4818 | 10.37 | 7000 | 0.5469 | 0.3223 |
+ | 0.3797 | 11.11 | 7500 | 0.5789 | 0.3118 |
+ | 0.3669 | 11.85 | 8000 | 0.5733 | 0.2986 |
+ | 0.3777 | 12.59 | 8500 | 0.6053 | 0.3004 |
+ | 0.3613 | 13.33 | 9000 | 0.6061 | 0.2895 |
+ | 0.3454 | 14.07 | 9500 | 0.6072 | 0.2740 |
+ | 0.3532 | 14.81 | 10000 | 0.6119 | 0.2872 |
+ | 0.3087 | 15.56 | 10500 | 0.6020 | 0.2849 |
+ | 0.3277 | 16.3 | 11000 | 0.6397 | 0.2745 |
+ | 0.2978 | 17.04 | 11500 | 0.6216 | 0.2745 |
+ | 0.2939 | 17.78 | 12000 | 0.6377 | 0.2690 |
+ | 0.2675 | 18.52 | 12500 | 0.6752 | 0.2681 |
+ | 0.2873 | 19.26 | 13000 | 0.6677 | 0.2767 |
+ | 0.2779 | 20.0 | 13500 | 0.6748 | 0.2717 |
+ | 0.28 | 20.74 | 14000 | 0.6771 | 0.2645 |
+ | 0.2688 | 21.48 | 14500 | 0.6618 | 0.2604 |
+ | 0.2234 | 22.22 | 15000 | 0.6791 | 0.2613 |
+ | 0.2464 | 22.96 | 15500 | 0.6665 | 0.2626 |
+ | 0.2254 | 23.7 | 16000 | 0.7028 | 0.2572 |
+ | 0.2132 | 24.44 | 16500 | 0.6985 | 0.2567 |
+ | 0.2424 | 25.19 | 17000 | 0.6731 | 0.2590 |
+ | 0.2447 | 25.93 | 17500 | 0.6780 | 0.2544 |
+ | 0.2209 | 26.67 | 18000 | 0.6729 | 0.2567 |
+ | 0.2102 | 27.41 | 18500 | 0.6844 | 0.2563 |
+ | 0.2185 | 28.15 | 19000 | 0.6922 | 0.2585 |
+ | 0.2294 | 28.89 | 19500 | 0.6940 | 0.2563 |
+ | 0.2208 | 29.63 | 20000 | 0.6936 | 0.2531 |
 
 
 ### Framework versions
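For readers who want to reproduce the updated configuration, the hyperparameter list in the new revision maps onto 🤗 Transformers `TrainingArguments` roughly as sketched below. Only the listed values come from the card; `output_dir` and everything else (dataset, processor, data collator, warmup, mixed precision) is not stated in this excerpt and is left as an assumption.

```python
from transformers import TrainingArguments

# Sketch of the updated hyperparameters from the card; output_dir is a placeholder
# and any argument not listed in the card (warmup, fp16, logging, dataset) is unknown.
training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xlsr-53-finetuned",  # hypothetical
    learning_rate=4e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=30,
)
```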