salbatarni committed on
Commit 5da9373
1 Parent(s): 782ddb3

End of training

Files changed (1)
  1. README.md +92 -87
README.md CHANGED
@@ -3,20 +3,20 @@ base_model: aubmindlab/bert-base-arabertv02
  tags:
  - generated_from_trainer
  model-index:
- - name: arabert_cross_relevance_task5_fold2
+ - name: arabert_cross_relevance_task5_fold3
  results: []
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

- # arabert_cross_relevance_task5_fold2
+ # arabert_cross_relevance_task5_fold3

  This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.3102
- - Qwk: 0.0
- - Mse: 0.3106
+ - Loss: 0.3040
+ - Qwk: 0.0224
+ - Mse: 0.3040

  ## Model description

@@ -45,88 +45,93 @@ The following hyperparameters were used during training:

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
- |:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|
- | No log | 0.125 | 2 | 0.8487 | 0.0102 | 0.8475 |
- | No log | 0.25 | 4 | 0.3505 | 0.1091 | 0.3504 |
- | No log | 0.375 | 6 | 0.4480 | 0.0599 | 0.4484 |
- | No log | 0.5 | 8 | 0.3079 | 0.0534 | 0.3080 |
- | No log | 0.625 | 10 | 0.3015 | 0.0245 | 0.3018 |
- | No log | 0.75 | 12 | 0.3131 | 0.0 | 0.3134 |
- | No log | 0.875 | 14 | 0.3140 | 0.0 | 0.3143 |
- | No log | 1.0 | 16 | 0.3539 | -0.0164 | 0.3543 |
- | No log | 1.125 | 18 | 0.3562 | -0.0164 | 0.3567 |
- | No log | 1.25 | 20 | 0.3449 | 0.0 | 0.3454 |
- | No log | 1.375 | 22 | 0.3408 | 0.0 | 0.3412 |
- | No log | 1.5 | 24 | 0.3416 | 0.0 | 0.3421 |
- | No log | 1.625 | 26 | 0.3321 | 0.0 | 0.3325 |
- | No log | 1.75 | 28 | 0.3068 | 0.0 | 0.3072 |
- | No log | 1.875 | 30 | 0.3135 | 0.0 | 0.3139 |
- | No log | 2.0 | 32 | 0.3564 | 0.0122 | 0.3570 |
- | No log | 2.125 | 34 | 0.3393 | 0.0 | 0.3399 |
- | No log | 2.25 | 36 | 0.2992 | 0.0 | 0.2996 |
- | No log | 2.375 | 38 | 0.3002 | 0.0 | 0.3005 |
- | No log | 2.5 | 40 | 0.3396 | 0.0122 | 0.3402 |
- | No log | 2.625 | 42 | 0.3703 | 0.0368 | 0.3710 |
- | No log | 2.75 | 44 | 0.3599 | 0.0203 | 0.3605 |
- | No log | 2.875 | 46 | 0.3243 | 0.0 | 0.3248 |
- | No log | 3.0 | 48 | 0.3164 | 0.0 | 0.3169 |
- | No log | 3.125 | 50 | 0.3455 | -0.0208 | 0.3460 |
- | No log | 3.25 | 52 | 0.3796 | -0.0707 | 0.3802 |
- | No log | 3.375 | 54 | 0.3560 | -0.0495 | 0.3565 |
- | No log | 3.5 | 56 | 0.3121 | 0.0 | 0.3125 |
- | No log | 3.625 | 58 | 0.2932 | 0.0 | 0.2935 |
- | No log | 3.75 | 60 | 0.2950 | 0.0 | 0.2953 |
- | No log | 3.875 | 62 | 0.3159 | 0.0 | 0.3163 |
- | No log | 4.0 | 64 | 0.3268 | 0.0 | 0.3273 |
- | No log | 4.125 | 66 | 0.3199 | 0.0 | 0.3203 |
- | No log | 4.25 | 68 | 0.3055 | 0.0 | 0.3059 |
- | No log | 4.375 | 70 | 0.3033 | 0.0 | 0.3037 |
- | No log | 4.5 | 72 | 0.3090 | 0.0 | 0.3094 |
- | No log | 4.625 | 74 | 0.3329 | 0.0 | 0.3333 |
- | No log | 4.75 | 76 | 0.3437 | 0.0 | 0.3442 |
- | No log | 4.875 | 78 | 0.3238 | 0.0 | 0.3242 |
- | No log | 5.0 | 80 | 0.2995 | 0.0 | 0.2998 |
- | No log | 5.125 | 82 | 0.2952 | 0.0 | 0.2954 |
- | No log | 5.25 | 84 | 0.3110 | 0.0122 | 0.3113 |
- | No log | 5.375 | 86 | 0.3451 | -0.0329 | 0.3455 |
- | No log | 5.5 | 88 | 0.3521 | -0.0373 | 0.3526 |
- | No log | 5.625 | 90 | 0.3417 | -0.0329 | 0.3422 |
- | No log | 5.75 | 92 | 0.3246 | 0.0122 | 0.3250 |
- | No log | 5.875 | 94 | 0.3260 | 0.0122 | 0.3264 |
- | No log | 6.0 | 96 | 0.3312 | 0.0 | 0.3316 |
- | No log | 6.125 | 98 | 0.3222 | 0.0 | 0.3226 |
- | No log | 6.25 | 100 | 0.3187 | 0.0 | 0.3191 |
- | No log | 6.375 | 102 | 0.3168 | 0.0122 | 0.3171 |
- | No log | 6.5 | 104 | 0.3231 | 0.0122 | 0.3235 |
- | No log | 6.625 | 106 | 0.3288 | 0.0122 | 0.3292 |
- | No log | 6.75 | 108 | 0.3225 | 0.0122 | 0.3229 |
- | No log | 6.875 | 110 | 0.3117 | 0.0122 | 0.3120 |
- | No log | 7.0 | 112 | 0.3032 | 0.0 | 0.3034 |
- | No log | 7.125 | 114 | 0.3022 | 0.0 | 0.3024 |
- | No log | 7.25 | 116 | 0.3089 | 0.0122 | 0.3091 |
- | No log | 7.375 | 118 | 0.3150 | 0.0122 | 0.3153 |
- | No log | 7.5 | 120 | 0.3157 | 0.0122 | 0.3161 |
- | No log | 7.625 | 122 | 0.3102 | 0.0122 | 0.3106 |
- | No log | 7.75 | 124 | 0.3060 | 0.0 | 0.3063 |
- | No log | 7.875 | 126 | 0.3061 | 0.0 | 0.3063 |
- | No log | 8.0 | 128 | 0.3089 | 0.0122 | 0.3092 |
- | No log | 8.125 | 130 | 0.3108 | 0.0122 | 0.3111 |
- | No log | 8.25 | 132 | 0.3154 | 0.0122 | 0.3158 |
- | No log | 8.375 | 134 | 0.3210 | 0.0122 | 0.3215 |
- | No log | 8.5 | 136 | 0.3202 | 0.0122 | 0.3206 |
- | No log | 8.625 | 138 | 0.3172 | 0.0122 | 0.3176 |
- | No log | 8.75 | 140 | 0.3139 | 0.0 | 0.3143 |
- | No log | 8.875 | 142 | 0.3121 | 0.0 | 0.3125 |
- | No log | 9.0 | 144 | 0.3115 | 0.0 | 0.3118 |
- | No log | 9.125 | 146 | 0.3118 | 0.0 | 0.3121 |
- | No log | 9.25 | 148 | 0.3121 | 0.0 | 0.3124 |
- | No log | 9.375 | 150 | 0.3120 | 0.0 | 0.3124 |
- | No log | 9.5 | 152 | 0.3116 | 0.0 | 0.3119 |
- | No log | 9.625 | 154 | 0.3110 | 0.0 | 0.3114 |
- | No log | 9.75 | 156 | 0.3106 | 0.0 | 0.3109 |
- | No log | 9.875 | 158 | 0.3103 | 0.0 | 0.3106 |
- | No log | 10.0 | 160 | 0.3102 | 0.0 | 0.3106 |
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
+ |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|
+ | No log | 0.1176 | 2 | 0.3814 | 0.0319 | 0.3814 |
+ | No log | 0.2353 | 4 | 0.4935 | 0.0934 | 0.4935 |
+ | No log | 0.3529 | 6 | 0.3556 | 0.1346 | 0.3556 |
+ | No log | 0.4706 | 8 | 0.2761 | 0.0 | 0.2761 |
+ | No log | 0.5882 | 10 | 0.3024 | 0.0 | 0.3024 |
+ | No log | 0.7059 | 12 | 0.2997 | -0.0517 | 0.2997 |
+ | No log | 0.8235 | 14 | 0.3140 | -0.1029 | 0.3140 |
+ | No log | 0.9412 | 16 | 0.2979 | -0.0473 | 0.2979 |
+ | No log | 1.0588 | 18 | 0.2994 | 0.0 | 0.2994 |
+ | No log | 1.1765 | 20 | 0.2824 | 0.0 | 0.2824 |
+ | No log | 1.2941 | 22 | 0.2570 | 0.0 | 0.2570 |
+ | No log | 1.4118 | 24 | 0.2638 | -0.0235 | 0.2638 |
+ | No log | 1.5294 | 26 | 0.2681 | -0.0235 | 0.2681 |
+ | No log | 1.6471 | 28 | 0.2587 | -0.0235 | 0.2587 |
+ | No log | 1.7647 | 30 | 0.2781 | 0.0 | 0.2781 |
+ | No log | 1.8824 | 32 | 0.3225 | 0.0 | 0.3225 |
+ | No log | 2.0 | 34 | 0.2895 | 0.0 | 0.2895 |
+ | No log | 2.1176 | 36 | 0.2779 | 0.0 | 0.2779 |
+ | No log | 2.2353 | 38 | 0.2807 | 0.0 | 0.2807 |
+ | No log | 2.3529 | 40 | 0.2808 | 0.0 | 0.2808 |
+ | No log | 2.4706 | 42 | 0.2648 | 0.0 | 0.2648 |
+ | No log | 2.5882 | 44 | 0.2627 | 0.0 | 0.2627 |
+ | No log | 2.7059 | 46 | 0.2723 | 0.0 | 0.2723 |
+ | No log | 2.8235 | 48 | 0.2669 | 0.0 | 0.2669 |
+ | No log | 2.9412 | 50 | 0.2817 | 0.0 | 0.2817 |
+ | No log | 3.0588 | 52 | 0.2862 | 0.0 | 0.2862 |
+ | No log | 3.1765 | 54 | 0.2710 | 0.0 | 0.2710 |
+ | No log | 3.2941 | 56 | 0.2691 | 0.0 | 0.2691 |
+ | No log | 3.4118 | 58 | 0.2758 | 0.0 | 0.2758 |
+ | No log | 3.5294 | 60 | 0.2757 | 0.0 | 0.2757 |
+ | No log | 3.6471 | 62 | 0.2734 | 0.0 | 0.2734 |
+ | No log | 3.7647 | 64 | 0.2619 | 0.0 | 0.2619 |
+ | No log | 3.8824 | 66 | 0.2661 | 0.0 | 0.2661 |
+ | No log | 4.0 | 68 | 0.2833 | 0.0 | 0.2833 |
+ | No log | 4.1176 | 70 | 0.2841 | 0.0 | 0.2841 |
+ | No log | 4.2353 | 72 | 0.2826 | 0.0 | 0.2826 |
+ | No log | 4.3529 | 74 | 0.2721 | 0.0 | 0.2721 |
+ | No log | 4.4706 | 76 | 0.2658 | 0.0 | 0.2658 |
+ | No log | 4.5882 | 78 | 0.2740 | 0.0 | 0.2740 |
+ | No log | 4.7059 | 80 | 0.2745 | 0.0 | 0.2745 |
+ | No log | 4.8235 | 82 | 0.2783 | 0.0 | 0.2783 |
+ | No log | 4.9412 | 84 | 0.2774 | 0.0 | 0.2774 |
+ | No log | 5.0588 | 86 | 0.2885 | 0.0 | 0.2885 |
+ | No log | 5.1765 | 88 | 0.3222 | 0.0 | 0.3222 |
+ | No log | 5.2941 | 90 | 0.3391 | 0.0 | 0.3391 |
+ | No log | 5.4118 | 92 | 0.3189 | 0.0 | 0.3189 |
+ | No log | 5.5294 | 94 | 0.3185 | 0.0224 | 0.3185 |
+ | No log | 5.6471 | 96 | 0.2891 | 0.0 | 0.2891 |
+ | No log | 5.7647 | 98 | 0.2697 | 0.0 | 0.2697 |
+ | No log | 5.8824 | 100 | 0.2703 | 0.0 | 0.2703 |
+ | No log | 6.0 | 102 | 0.2749 | 0.0 | 0.2749 |
+ | No log | 6.1176 | 104 | 0.2900 | 0.0 | 0.2900 |
+ | No log | 6.2353 | 106 | 0.3272 | 0.0 | 0.3272 |
+ | No log | 6.3529 | 108 | 0.3347 | 0.0224 | 0.3347 |
+ | No log | 6.4706 | 110 | 0.3020 | 0.0 | 0.3020 |
+ | No log | 6.5882 | 112 | 0.2731 | 0.0 | 0.2731 |
+ | No log | 6.7059 | 114 | 0.2683 | 0.0 | 0.2683 |
+ | No log | 6.8235 | 116 | 0.2736 | 0.0 | 0.2736 |
+ | No log | 6.9412 | 118 | 0.2940 | 0.0 | 0.2940 |
+ | No log | 7.0588 | 120 | 0.3391 | 0.0224 | 0.3391 |
+ | No log | 7.1765 | 122 | 0.3471 | 0.0224 | 0.3471 |
+ | No log | 7.2941 | 124 | 0.3232 | 0.0224 | 0.3232 |
+ | No log | 7.4118 | 126 | 0.2886 | 0.0 | 0.2886 |
+ | No log | 7.5294 | 128 | 0.2755 | 0.0 | 0.2755 |
+ | No log | 7.6471 | 130 | 0.2736 | 0.0 | 0.2736 |
+ | No log | 7.7647 | 132 | 0.2796 | 0.0 | 0.2796 |
+ | No log | 7.8824 | 134 | 0.2901 | 0.0 | 0.2901 |
+ | No log | 8.0 | 136 | 0.3054 | 0.0224 | 0.3054 |
+ | No log | 8.1176 | 138 | 0.3155 | 0.0224 | 0.3155 |
+ | No log | 8.2353 | 140 | 0.3163 | 0.0224 | 0.3163 |
+ | No log | 8.3529 | 142 | 0.3326 | 0.0224 | 0.3326 |
+ | No log | 8.4706 | 144 | 0.3464 | 0.0224 | 0.3464 |
+ | No log | 8.5882 | 146 | 0.3603 | 0.0123 | 0.3603 |
+ | No log | 8.7059 | 148 | 0.3747 | -0.0057 | 0.3747 |
+ | No log | 8.8235 | 150 | 0.3703 | 0.0123 | 0.3703 |
+ | No log | 8.9412 | 152 | 0.3512 | 0.0224 | 0.3512 |
+ | No log | 9.0588 | 154 | 0.3262 | 0.0224 | 0.3262 |
+ | No log | 9.1765 | 156 | 0.3095 | 0.0224 | 0.3095 |
+ | No log | 9.2941 | 158 | 0.3054 | 0.0224 | 0.3054 |
+ | No log | 9.4118 | 160 | 0.3038 | 0.0224 | 0.3038 |
+ | No log | 9.5294 | 162 | 0.3034 | 0.0224 | 0.3034 |
+ | No log | 9.6471 | 164 | 0.3016 | 0.0224 | 0.3016 |
+ | No log | 9.7647 | 166 | 0.3019 | 0.0224 | 0.3019 |
+ | No log | 9.8824 | 168 | 0.3032 | 0.0224 | 0.3032 |
+ | No log | 10.0 | 170 | 0.3040 | 0.0224 | 0.3040 |


  ### Framework versions
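
In the tables above, Qwk is Cohen's quadratically weighted kappa and Mse is mean squared error between predicted and gold relevance scores; a Qwk near 0.0 indicates agreement no better than chance. As a minimal sketch of how such metrics could be recomputed, assuming the checkpoint is published under the repo id `salbatarni/arabert_cross_relevance_task5_fold3` with a single-label regression head (both assumptions, not confirmed by the card), one might write:

```python
# Minimal sketch (not the author's training script): load the fine-tuned
# checkpoint and recompute Qwk/Mse-style metrics on placeholder data.
# The repo id, the single regression logit, and the rounding step are assumptions.
import torch
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "salbatarni/arabert_cross_relevance_task5_fold3"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

texts = ["نص تجريبي أول", "نص تجريبي ثان"]  # placeholder evaluation texts
gold = [1.0, 0.0]                            # placeholder gold relevance scores

with torch.no_grad():
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    preds = model(**enc).logits.squeeze(-1)  # one regression score per example

mse = mean_squared_error(gold, preds.tolist())
# Weighted kappa needs discrete categories, so round both sides first.
qwk = cohen_kappa_score(
    [round(g) for g in gold],
    [round(p) for p in preds.tolist()],
    weights="quadratic",
)
print(f"Mse: {mse:.4f}  Qwk: {qwk:.4f}")
```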