salbatarni committed
Commit 34cee80
1 Parent(s): c3a605f

End of training

Files changed (1):
  1. README.md +92 -87
README.md CHANGED
@@ -3,20 +3,20 @@ base_model: aubmindlab/bert-base-arabertv02
  tags:
  - generated_from_trainer
  model-index:
- - name: arabert_cross_relevance_task6_fold5
+ - name: arabert_cross_relevance_task6_fold6
  results: []
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

- # arabert_cross_relevance_task6_fold5
+ # arabert_cross_relevance_task6_fold6

  This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.2242
- - Qwk: 0.3701
- - Mse: 0.2242
+ - Loss: 0.4430
+ - Qwk: 0.1465
+ - Mse: 0.4429

  ## Model description
@@ -45,88 +45,93 @@ The following hyperparameters were used during training:

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
- |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
- | No log | 0.125 | 2 | 0.4047 | 0.2582 | 0.4045 |
- | No log | 0.25 | 4 | 0.2750 | 0.5141 | 0.2744 |
- | No log | 0.375 | 6 | 0.2556 | 0.3452 | 0.2551 |
- | No log | 0.5 | 8 | 0.2742 | 0.2629 | 0.2739 |
- | No log | 0.625 | 10 | 0.2556 | 0.2281 | 0.2553 |
- | No log | 0.75 | 12 | 0.2574 | 0.2267 | 0.2571 |
- | No log | 0.875 | 14 | 0.2555 | 0.3050 | 0.2552 |
- | No log | 1.0 | 16 | 0.2258 | 0.2618 | 0.2256 |
- | No log | 1.125 | 18 | 0.2150 | 0.3044 | 0.2148 |
- | No log | 1.25 | 20 | 0.2079 | 0.3044 | 0.2077 |
- | No log | 1.375 | 22 | 0.2022 | 0.3172 | 0.2020 |
- | No log | 1.5 | 24 | 0.2011 | 0.3247 | 0.2009 |
- | No log | 1.625 | 26 | 0.1872 | 0.3453 | 0.1871 |
- | No log | 1.75 | 28 | 0.1813 | 0.3543 | 0.1813 |
- | No log | 1.875 | 30 | 0.1764 | 0.3676 | 0.1764 |
- | No log | 2.0 | 32 | 0.1744 | 0.4205 | 0.1743 |
- | No log | 2.125 | 34 | 0.1788 | 0.4236 | 0.1788 |
- | No log | 2.25 | 36 | 0.1817 | 0.3746 | 0.1817 |
- | No log | 2.375 | 38 | 0.1911 | 0.4353 | 0.1911 |
- | No log | 2.5 | 40 | 0.2063 | 0.6065 | 0.2062 |
- | No log | 2.625 | 42 | 0.1851 | 0.4860 | 0.1851 |
- | No log | 2.75 | 44 | 0.1768 | 0.3810 | 0.1769 |
- | No log | 2.875 | 46 | 0.1839 | 0.3478 | 0.1840 |
- | No log | 3.0 | 48 | 0.1903 | 0.3489 | 0.1904 |
- | No log | 3.125 | 50 | 0.1887 | 0.3703 | 0.1888 |
- | No log | 3.25 | 52 | 0.1832 | 0.3724 | 0.1832 |
- | No log | 3.375 | 54 | 0.1852 | 0.3795 | 0.1852 |
- | No log | 3.5 | 56 | 0.1937 | 0.4146 | 0.1936 |
- | No log | 3.625 | 58 | 0.1934 | 0.3962 | 0.1934 |
- | No log | 3.75 | 60 | 0.1920 | 0.3669 | 0.1920 |
- | No log | 3.875 | 62 | 0.1930 | 0.4021 | 0.1930 |
- | No log | 4.0 | 64 | 0.1970 | 0.4137 | 0.1969 |
- | No log | 4.125 | 66 | 0.1944 | 0.4137 | 0.1943 |
- | No log | 4.25 | 68 | 0.1936 | 0.3890 | 0.1936 |
- | No log | 4.375 | 70 | 0.1981 | 0.3528 | 0.1981 |
- | No log | 4.5 | 72 | 0.1927 | 0.3528 | 0.1927 |
- | No log | 4.625 | 74 | 0.1928 | 0.3477 | 0.1928 |
- | No log | 4.75 | 76 | 0.1965 | 0.3546 | 0.1964 |
- | No log | 4.875 | 78 | 0.2095 | 0.3432 | 0.2094 |
- | No log | 5.0 | 80 | 0.2044 | 0.3443 | 0.2043 |
- | No log | 5.125 | 82 | 0.2006 | 0.3433 | 0.2006 |
- | No log | 5.25 | 84 | 0.2036 | 0.3465 | 0.2036 |
- | No log | 5.375 | 86 | 0.2052 | 0.3465 | 0.2053 |
- | No log | 5.5 | 88 | 0.2022 | 0.3465 | 0.2022 |
- | No log | 5.625 | 90 | 0.2017 | 0.3508 | 0.2017 |
- | No log | 5.75 | 92 | 0.2073 | 0.3508 | 0.2073 |
- | No log | 5.875 | 94 | 0.2122 | 0.3508 | 0.2123 |
- | No log | 6.0 | 96 | 0.1997 | 0.3561 | 0.1997 |
- | No log | 6.125 | 98 | 0.1967 | 0.3819 | 0.1966 |
- | No log | 6.25 | 100 | 0.1968 | 0.3925 | 0.1967 |
- | No log | 6.375 | 102 | 0.2088 | 0.3623 | 0.2088 |
- | No log | 6.5 | 104 | 0.2513 | 0.3650 | 0.2515 |
- | No log | 6.625 | 106 | 0.2811 | 0.3611 | 0.2813 |
- | No log | 6.75 | 108 | 0.2626 | 0.3689 | 0.2628 |
- | No log | 6.875 | 110 | 0.2260 | 0.3650 | 0.2261 |
- | No log | 7.0 | 112 | 0.2040 | 0.3607 | 0.2041 |
- | No log | 7.125 | 114 | 0.1937 | 0.3826 | 0.1937 |
- | No log | 7.25 | 116 | 0.1925 | 0.3865 | 0.1924 |
- | No log | 7.375 | 118 | 0.1942 | 0.3972 | 0.1941 |
- | No log | 7.5 | 120 | 0.1957 | 0.3658 | 0.1956 |
- | No log | 7.625 | 122 | 0.1993 | 0.3561 | 0.1993 |
- | No log | 7.75 | 124 | 0.2083 | 0.3547 | 0.2083 |
- | No log | 7.875 | 126 | 0.2197 | 0.3557 | 0.2198 |
- | No log | 8.0 | 128 | 0.2322 | 0.3566 | 0.2324 |
- | No log | 8.125 | 130 | 0.2346 | 0.3557 | 0.2348 |
- | No log | 8.25 | 132 | 0.2299 | 0.3538 | 0.2300 |
- | No log | 8.375 | 134 | 0.2254 | 0.3538 | 0.2255 |
- | No log | 8.5 | 136 | 0.2201 | 0.3538 | 0.2202 |
- | No log | 8.625 | 138 | 0.2148 | 0.3538 | 0.2148 |
- | No log | 8.75 | 140 | 0.2117 | 0.3538 | 0.2117 |
- | No log | 8.875 | 142 | 0.2123 | 0.3557 | 0.2124 |
- | No log | 9.0 | 144 | 0.2127 | 0.3557 | 0.2127 |
- | No log | 9.125 | 146 | 0.2153 | 0.3557 | 0.2154 |
- | No log | 9.25 | 148 | 0.2195 | 0.3557 | 0.2195 |
- | No log | 9.375 | 150 | 0.2224 | 0.3607 | 0.2225 |
- | No log | 9.5 | 152 | 0.2248 | 0.3717 | 0.2249 |
- | No log | 9.625 | 154 | 0.2259 | 0.3709 | 0.2260 |
- | No log | 9.75 | 156 | 0.2256 | 0.3701 | 0.2257 |
- | No log | 9.875 | 158 | 0.2249 | 0.3701 | 0.2249 |
- | No log | 10.0 | 160 | 0.2242 | 0.3701 | 0.2242 |
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
+ |:-------------:|:------:|:----:|:---------------:|:------:|:------:|
+ | No log | 0.1176 | 2 | 0.7830 | 0.0204 | 0.7810 |
+ | No log | 0.2353 | 4 | 0.3944 | 0.0715 | 0.3940 |
+ | No log | 0.3529 | 6 | 0.3437 | 0.0480 | 0.3436 |
+ | No log | 0.4706 | 8 | 0.3309 | 0.1105 | 0.3305 |
+ | No log | 0.5882 | 10 | 0.2835 | 0.1105 | 0.2836 |
+ | No log | 0.7059 | 12 | 0.2697 | 0.1105 | 0.2702 |
+ | No log | 0.8235 | 14 | 0.2695 | 0.1185 | 0.2700 |
+ | No log | 0.9412 | 16 | 0.2715 | 0.1326 | 0.2721 |
+ | No log | 1.0588 | 18 | 0.2745 | 0.2102 | 0.2750 |
+ | No log | 1.1765 | 20 | 0.2896 | 0.2067 | 0.2899 |
+ | No log | 1.2941 | 22 | 0.2766 | 0.2206 | 0.2770 |
+ | No log | 1.4118 | 24 | 0.2712 | 0.2420 | 0.2717 |
+ | No log | 1.5294 | 26 | 0.2809 | 0.2235 | 0.2814 |
+ | No log | 1.6471 | 28 | 0.2904 | 0.2076 | 0.2908 |
+ | No log | 1.7647 | 30 | 0.2687 | 0.2352 | 0.2692 |
+ | No log | 1.8824 | 32 | 0.2581 | 0.2477 | 0.2588 |
+ | No log | 2.0 | 34 | 0.2565 | 0.2472 | 0.2572 |
+ | No log | 2.1176 | 36 | 0.2641 | 0.2099 | 0.2646 |
+ | No log | 2.2353 | 38 | 0.3196 | 0.2033 | 0.3198 |
+ | No log | 2.3529 | 40 | 0.3095 | 0.2076 | 0.3097 |
+ | No log | 2.4706 | 42 | 0.2662 | 0.2084 | 0.2667 |
+ | No log | 2.5882 | 44 | 0.2644 | 0.2790 | 0.2651 |
+ | No log | 2.7059 | 46 | 0.2676 | 0.2311 | 0.2682 |
+ | No log | 2.8235 | 48 | 0.2943 | 0.2347 | 0.2946 |
+ | No log | 2.9412 | 50 | 0.3231 | 0.2150 | 0.3232 |
+ | No log | 3.0588 | 52 | 0.3309 | 0.2064 | 0.3309 |
+ | No log | 3.1765 | 54 | 0.3053 | 0.2252 | 0.3056 |
+ | No log | 3.2941 | 56 | 0.2808 | 0.2266 | 0.2812 |
+ | No log | 3.4118 | 58 | 0.2831 | 0.2170 | 0.2836 |
+ | No log | 3.5294 | 60 | 0.2875 | 0.2211 | 0.2881 |
+ | No log | 3.6471 | 62 | 0.3017 | 0.2140 | 0.3022 |
+ | No log | 3.7647 | 64 | 0.3078 | 0.2125 | 0.3084 |
+ | No log | 3.8824 | 66 | 0.3215 | 0.2117 | 0.3220 |
+ | No log | 4.0 | 68 | 0.3285 | 0.2052 | 0.3288 |
+ | No log | 4.1176 | 70 | 0.3679 | 0.1824 | 0.3679 |
+ | No log | 4.2353 | 72 | 0.3668 | 0.1824 | 0.3668 |
+ | No log | 4.3529 | 74 | 0.3216 | 0.2135 | 0.3219 |
+ | No log | 4.4706 | 76 | 0.2935 | 0.2301 | 0.2940 |
+ | No log | 4.5882 | 78 | 0.2944 | 0.2406 | 0.2949 |
+ | No log | 4.7059 | 80 | 0.3279 | 0.2200 | 0.3282 |
+ | No log | 4.8235 | 82 | 0.3629 | 0.1977 | 0.3631 |
+ | No log | 4.9412 | 84 | 0.3823 | 0.1898 | 0.3823 |
+ | No log | 5.0588 | 86 | 0.3659 | 0.1977 | 0.3659 |
+ | No log | 5.1765 | 88 | 0.3351 | 0.2019 | 0.3353 |
+ | No log | 5.2941 | 90 | 0.3442 | 0.2019 | 0.3443 |
+ | No log | 5.4118 | 92 | 0.3693 | 0.1977 | 0.3694 |
+ | No log | 5.5294 | 94 | 0.3861 | 0.1937 | 0.3861 |
+ | No log | 5.6471 | 96 | 0.3622 | 0.1977 | 0.3624 |
+ | No log | 5.7647 | 98 | 0.3369 | 0.2023 | 0.3373 |
+ | No log | 5.8824 | 100 | 0.3520 | 0.1939 | 0.3524 |
+ | No log | 6.0 | 102 | 0.3764 | 0.1898 | 0.3766 |
+ | No log | 6.1176 | 104 | 0.4008 | 0.1975 | 0.4009 |
+ | No log | 6.2353 | 106 | 0.4228 | 0.1899 | 0.4229 |
+ | No log | 6.3529 | 108 | 0.4377 | 0.1725 | 0.4376 |
+ | No log | 6.4706 | 110 | 0.4032 | 0.1824 | 0.4033 |
+ | No log | 6.5882 | 112 | 0.3828 | 0.1937 | 0.3829 |
+ | No log | 6.7059 | 114 | 0.4023 | 0.1975 | 0.4023 |
+ | No log | 6.8235 | 116 | 0.4098 | 0.1650 | 0.4097 |
+ | No log | 6.9412 | 118 | 0.4555 | 0.1465 | 0.4553 |
+ | No log | 7.0588 | 120 | 0.5148 | 0.1339 | 0.5144 |
+ | No log | 7.1765 | 122 | 0.5125 | 0.1339 | 0.5122 |
+ | No log | 7.2941 | 124 | 0.4645 | 0.1465 | 0.4643 |
+ | No log | 7.4118 | 126 | 0.3966 | 0.1751 | 0.3967 |
+ | No log | 7.5294 | 128 | 0.3450 | 0.1979 | 0.3452 |
+ | No log | 7.6471 | 130 | 0.3262 | 0.2210 | 0.3265 |
+ | No log | 7.7647 | 132 | 0.3261 | 0.2210 | 0.3265 |
+ | No log | 7.8824 | 134 | 0.3438 | 0.1979 | 0.3441 |
+ | No log | 8.0 | 136 | 0.3772 | 0.2015 | 0.3774 |
+ | No log | 8.1176 | 138 | 0.4155 | 0.1719 | 0.4155 |
+ | No log | 8.2353 | 140 | 0.4512 | 0.1528 | 0.4511 |
+ | No log | 8.3529 | 142 | 0.4592 | 0.1404 | 0.4590 |
+ | No log | 8.4706 | 144 | 0.4422 | 0.1465 | 0.4421 |
+ | No log | 8.5882 | 146 | 0.4170 | 0.1620 | 0.4170 |
+ | No log | 8.7059 | 148 | 0.4105 | 0.1757 | 0.4105 |
+ | No log | 8.8235 | 150 | 0.4205 | 0.1592 | 0.4204 |
+ | No log | 8.9412 | 152 | 0.4328 | 0.1465 | 0.4327 |
+ | No log | 9.0588 | 154 | 0.4368 | 0.1465 | 0.4367 |
+ | No log | 9.1765 | 156 | 0.4351 | 0.1465 | 0.4350 |
+ | No log | 9.2941 | 158 | 0.4352 | 0.1465 | 0.4352 |
+ | No log | 9.4118 | 160 | 0.4333 | 0.1465 | 0.4332 |
+ | No log | 9.5294 | 162 | 0.4373 | 0.1465 | 0.4372 |
+ | No log | 9.6471 | 164 | 0.4392 | 0.1465 | 0.4391 |
+ | No log | 9.7647 | 166 | 0.4394 | 0.1465 | 0.4393 |
+ | No log | 9.8824 | 168 | 0.4416 | 0.1465 | 0.4415 |
+ | No log | 10.0 | 170 | 0.4430 | 0.1465 | 0.4429 |


  ### Framework versions
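A note on the metrics reported in both versions of the card: `Qwk` is the quadratic weighted kappa, an agreement statistic for ordinal labels, and `Mse` is the mean squared error (the near-identical Loss and Mse columns suggest an MSE regression objective). Below is a minimal sketch of computing both with scikit-learn; rounding continuous predictions to integer labels before the kappa computation is an assumption, not something the card specifies.

```python
# Minimal sketch (not from the card): the two reported eval metrics.
# Assumption: continuous relevance scores are rounded to integer labels
# before the kappa computation.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def qwk_mse(y_true, y_pred):
    mse = mean_squared_error(y_true, y_pred)
    # Quadratic weighted kappa: Cohen's kappa with quadratic penalty weights.
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse}

# Dummy scores for illustration only.
print(qwk_mse([0, 1, 2, 1], [0.1, 0.8, 1.6, 1.4]))
```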
 
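The training-results table comes from the `Trainer`'s periodic evaluation: validation metrics are computed every 2 optimizer steps across 10 epochs, and `No log` indicates the training loss was never logged (the run is shorter than the default logging interval). A sketch of the corresponding cadence in `TrainingArguments` follows; the output directory is a placeholder, and the card's own hyperparameter section remains the authoritative list.

```python
# Minimal sketch (values inferred from the table, not from the card's
# hyperparameter list): evaluate every 2 steps for 10 epochs.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert_cross_relevance_task6_fold6",  # placeholder name
    num_train_epochs=10,    # Epoch column ends at 10.0
    eval_strategy="steps",  # `evaluation_strategy` on older transformers
    eval_steps=2,           # Step column advances by 2
)
```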
 
 
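Because the card is auto-generated, it carries no usage example. Here is a minimal sketch of loading the fine-tuned checkpoint for inference; the repo id `salbatarni/arabert_cross_relevance_task6_fold6` and the single-logit regression head (suggested by the MSE/Qwk metrics) are assumptions.

```python
# Minimal sketch: loading the checkpoint for inference.
# Assumptions: the repo id below, and a single-logit regression head.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "salbatarni/arabert_cross_relevance_task6_fold6"  # assumed
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("نص المقال هنا", return_tensors="pt")  # sample Arabic input
with torch.no_grad():
    relevance = model(**inputs).logits.squeeze().item()
print(relevance)
```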