
Arabic_FineTuningAraBERT_AugV0_k4_task1_organization_fold0

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8154
  • Qwk: 0.6008
  • Mse: 0.8154
  • Rmse: 0.9030
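
The sketch below shows one way this checkpoint could be loaded for scoring. The single-output regression-style head is an assumption inferred from the MSE/RMSE/QWK metrics (it is not stated in this card), and the example input text is hypothetical.

```python
# Minimal usage sketch, assuming a sequence-classification/regression head
# on top of AraBERT; the exact head configuration is not stated in this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/Arabic_FineTuningAraBERT_AugV0_k4_task1_organization_fold0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "نص عربي قصير للتقييم"  # hypothetical example input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # predicted score(s) for the organization task
```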

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
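
As a rough sketch, these settings correspond to the following Hugging Face TrainingArguments; the output_dir and the commented-out dataset objects are hypothetical placeholders, and the Adam betas/epsilon listed above match the Trainer defaults, so they are not set explicitly.

```python
# Sketch mirroring the listed hyperparameters; output_dir and the dataset
# objects are hypothetical, not taken from this card.
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="arabert_task1_organization_fold0",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # hypothetical datasets
# trainer.train()
```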

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0588 2 5.0129 -0.0516 5.0129 2.2389
No log 0.1176 4 2.8173 0.1872 2.8173 1.6785
No log 0.1765 6 2.2136 -0.0383 2.2136 1.4878
No log 0.2353 8 1.5361 0.0443 1.5361 1.2394
No log 0.2941 10 1.2092 0.1621 1.2092 1.0996
No log 0.3529 12 1.4958 0.1075 1.4958 1.2230
No log 0.4118 14 1.5388 0.1119 1.5388 1.2405
No log 0.4706 16 1.5160 0.0372 1.5160 1.2313
No log 0.5294 18 1.4444 0.2145 1.4444 1.2018
No log 0.5882 20 1.4664 0.1075 1.4664 1.2109
No log 0.6471 22 1.5163 0.1075 1.5163 1.2314
No log 0.7059 24 1.5229 0.0801 1.5229 1.2341
No log 0.7647 26 1.6039 0.0 1.6039 1.2664
No log 0.8235 28 1.5753 0.0 1.5753 1.2551
No log 0.8824 30 1.4861 0.1075 1.4861 1.2191
No log 0.9412 32 1.4937 0.1075 1.4937 1.2222
No log 1.0 34 1.4969 0.2145 1.4969 1.2235
No log 1.0588 36 1.4441 0.2145 1.4441 1.2017
No log 1.1176 38 1.4261 0.2793 1.4261 1.1942
No log 1.1765 40 1.2291 0.5049 1.2291 1.1087
No log 1.2353 42 1.2232 0.4489 1.2232 1.1060
No log 1.2941 44 1.3270 0.4129 1.3270 1.1519
No log 1.3529 46 1.7030 0.3548 1.7030 1.3050
No log 1.4118 48 1.8269 0.2278 1.8269 1.3516
No log 1.4706 50 1.6784 0.2278 1.6784 1.2955
No log 1.5294 52 1.3571 0.2278 1.3571 1.1649
No log 1.5882 54 1.2220 0.2793 1.2220 1.1055
No log 1.6471 56 1.0662 0.3561 1.0662 1.0326
No log 1.7059 58 1.0652 0.5302 1.0652 1.0321
No log 1.7647 60 1.0683 0.4375 1.0683 1.0336
No log 1.8235 62 1.0448 0.5532 1.0448 1.0221
No log 1.8824 64 1.1419 0.2793 1.1419 1.0686
No log 1.9412 66 1.2778 0.2536 1.2778 1.1304
No log 2.0 68 1.3291 0.4104 1.3291 1.1529
No log 2.0588 70 1.2572 0.3593 1.2572 1.1212
No log 2.1176 72 1.1013 0.4081 1.1013 1.0494
No log 2.1765 74 1.0399 0.3531 1.0399 1.0197
No log 2.2353 76 1.0430 0.3531 1.0430 1.0213
No log 2.2941 78 1.1672 0.3119 1.1672 1.0804
No log 2.3529 80 1.2289 0.4571 1.2289 1.1086
No log 2.4118 82 1.1399 0.4068 1.1399 1.0676
No log 2.4706 84 1.0955 0.3531 1.0955 1.0467
No log 2.5294 86 0.9875 0.2435 0.9875 0.9937
No log 2.5882 88 1.0009 0.4059 1.0009 1.0005
No log 2.6471 90 1.0306 0.4310 1.0306 1.0152
No log 2.7059 92 1.0577 0.4015 1.0577 1.0284
No log 2.7647 94 1.1296 0.3226 1.1296 1.0628
No log 2.8235 96 1.2591 0.1893 1.2591 1.1221
No log 2.8824 98 1.2695 0.4783 1.2695 1.1267
No log 2.9412 100 1.2468 0.4194 1.2468 1.1166
No log 3.0 102 1.0005 0.6303 1.0005 1.0002
No log 3.0588 104 0.9247 0.5031 0.9247 0.9616
No log 3.1176 106 0.9413 0.5973 0.9413 0.9702
No log 3.1765 108 1.0525 0.5973 1.0525 1.0259
No log 3.2353 110 1.3132 0.4408 1.3132 1.1460
No log 3.2941 112 1.4198 0.4415 1.4198 1.1916
No log 3.3529 114 1.3006 0.3782 1.3006 1.1404
No log 3.4118 116 1.0834 0.4571 1.0834 1.0409
No log 3.4706 118 0.8984 0.5260 0.8984 0.9478
No log 3.5294 120 0.8354 0.5291 0.8354 0.9140
No log 3.5882 122 0.8330 0.4591 0.8330 0.9127
No log 3.6471 124 0.8559 0.5291 0.8559 0.9252
No log 3.7059 126 0.9729 0.5253 0.9729 0.9863
No log 3.7647 128 1.1811 0.4360 1.1811 1.0868
No log 3.8235 130 1.3160 0.4167 1.3160 1.1472
No log 3.8824 132 1.3168 0.3794 1.3168 1.1475
No log 3.9412 134 1.1811 0.5 1.1811 1.0868
No log 4.0 136 0.9728 0.5393 0.9728 0.9863
No log 4.0588 138 0.8334 0.5214 0.8334 0.9129
No log 4.1176 140 0.8114 0.6239 0.8114 0.9008
No log 4.1765 142 0.8147 0.6639 0.8147 0.9026
No log 4.2353 144 0.8330 0.5917 0.8330 0.9127
No log 4.2941 146 0.9435 0.6303 0.9435 0.9713
No log 4.3529 148 1.0325 0.5714 1.0325 1.0161
No log 4.4118 150 1.0812 0.5714 1.0812 1.0398
No log 4.4706 152 1.0417 0.5896 1.0417 1.0206
No log 4.5294 154 1.0208 0.5896 1.0208 1.0103
No log 4.5882 156 0.9389 0.6491 0.9389 0.9690
No log 4.6471 158 0.8951 0.5933 0.8951 0.9461
No log 4.7059 160 0.8325 0.6265 0.8325 0.9124
No log 4.7647 162 0.8160 0.5458 0.8160 0.9034
No log 4.8235 164 0.8190 0.5270 0.8190 0.9050
No log 4.8824 166 0.8467 0.5882 0.8467 0.9202
No log 4.9412 168 0.8977 0.6257 0.8977 0.9475
No log 5.0 170 0.9809 0.5200 0.9809 0.9904
No log 5.0588 172 1.1565 0.5188 1.1565 1.0754
No log 5.1176 174 1.2126 0.4808 1.2126 1.1012
No log 5.1765 176 1.1694 0.4806 1.1694 1.0814
No log 5.2353 178 1.0516 0.5769 1.0516 1.0255
No log 5.2941 180 0.9614 0.5804 0.9614 0.9805
No log 5.3529 182 0.9131 0.5804 0.9131 0.9556
No log 5.4118 184 0.8741 0.5804 0.8741 0.9349
No log 5.4706 186 0.8905 0.5629 0.8905 0.9437
No log 5.5294 188 0.8731 0.5600 0.8731 0.9344
No log 5.5882 190 0.8050 0.5629 0.8050 0.8972
No log 5.6471 192 0.7582 0.6257 0.7582 0.8707
No log 5.7059 194 0.7471 0.6610 0.7471 0.8644
No log 5.7647 196 0.7693 0.6610 0.7693 0.8771
No log 5.8235 198 0.8422 0.6610 0.8422 0.9177
No log 5.8824 200 0.9601 0.6610 0.9601 0.9798
No log 5.9412 202 0.9913 0.6161 0.9913 0.9956
No log 6.0 204 1.0106 0.6161 1.0106 1.0053
No log 6.0588 206 0.9522 0.6491 0.9522 0.9758
No log 6.1176 208 0.9329 0.6303 0.9329 0.9659
No log 6.1765 210 0.9673 0.6303 0.9673 0.9835
No log 6.2353 212 1.0462 0.5385 1.0462 1.0228
No log 6.2941 214 1.1080 0.5556 1.1080 1.0526
No log 6.3529 216 1.0823 0.5581 1.0823 1.0403
No log 6.4118 218 0.9983 0.5581 0.9983 0.9991
No log 6.4706 220 0.9298 0.5581 0.9298 0.9643
No log 6.5294 222 0.8591 0.5581 0.8591 0.9269
No log 6.5882 224 0.8290 0.5581 0.8290 0.9105
No log 6.6471 226 0.8368 0.6161 0.8368 0.9147
No log 6.7059 228 0.8911 0.5581 0.8911 0.9440
No log 6.7647 230 1.0036 0.5581 1.0036 1.0018
No log 6.8235 232 1.1029 0.5185 1.1029 1.0502
No log 6.8824 234 1.1714 0.4803 1.1714 1.0823
No log 6.9412 236 1.1705 0.4806 1.1705 1.0819
No log 7.0 238 1.1237 0.4806 1.1237 1.0600
No log 7.0588 240 1.0565 0.4806 1.0565 1.0278
No log 7.1176 242 0.9915 0.5188 0.9915 0.9957
No log 7.1765 244 0.9381 0.5629 0.9381 0.9686
No log 7.2353 246 0.8931 0.5629 0.8931 0.9450
No log 7.2941 248 0.8725 0.5629 0.8725 0.9341
No log 7.3529 250 0.8462 0.5629 0.8462 0.9199
No log 7.4118 252 0.8038 0.5629 0.8038 0.8965
No log 7.4706 254 0.7685 0.5629 0.7685 0.8766
No log 7.5294 256 0.7442 0.5629 0.7442 0.8627
No log 7.5882 258 0.7239 0.6008 0.7239 0.8508
No log 7.6471 260 0.7166 0.6610 0.7166 0.8465
No log 7.7059 262 0.7221 0.6610 0.7221 0.8497
No log 7.7647 264 0.7514 0.6610 0.7514 0.8668
No log 7.8235 266 0.7865 0.6610 0.7865 0.8869
No log 7.8824 268 0.8232 0.6008 0.8232 0.9073
No log 7.9412 270 0.8960 0.5965 0.8960 0.9466
No log 8.0 272 0.9492 0.5965 0.9492 0.9743
No log 8.0588 274 0.9536 0.5965 0.9536 0.9765
No log 8.1176 276 0.9515 0.5965 0.9515 0.9755
No log 8.1765 278 0.9126 0.5965 0.9126 0.9553
No log 8.2353 280 0.8606 0.6545 0.8606 0.9277
No log 8.2941 282 0.8099 0.6545 0.8099 0.9000
No log 8.3529 284 0.7997 0.6545 0.7997 0.8942
No log 8.4118 286 0.8092 0.6545 0.8092 0.8995
No log 8.4706 288 0.8134 0.6545 0.8134 0.9019
No log 8.5294 290 0.8376 0.6008 0.8376 0.9152
No log 8.5882 292 0.8403 0.5629 0.8403 0.9167
No log 8.6471 294 0.8367 0.5629 0.8367 0.9147
No log 8.7059 296 0.8215 0.5629 0.8215 0.9064
No log 8.7647 298 0.7958 0.6008 0.7958 0.8921
No log 8.8235 300 0.7767 0.6008 0.7767 0.8813
No log 8.8824 302 0.7571 0.6008 0.7571 0.8701
No log 8.9412 304 0.7570 0.6008 0.7570 0.8701
No log 9.0 306 0.7614 0.6008 0.7614 0.8726
No log 9.0588 308 0.7724 0.5629 0.7724 0.8789
No log 9.1176 310 0.7734 0.5629 0.7734 0.8794
No log 9.1765 312 0.7722 0.5629 0.7722 0.8787
No log 9.2353 314 0.7636 0.5629 0.7636 0.8738
No log 9.2941 316 0.7599 0.5629 0.7599 0.8717
No log 9.3529 318 0.7606 0.6008 0.7606 0.8721
No log 9.4118 320 0.7642 0.6008 0.7642 0.8742
No log 9.4706 322 0.7705 0.6008 0.7705 0.8778
No log 9.5294 324 0.7825 0.6008 0.7825 0.8846
No log 9.5882 326 0.7944 0.6008 0.7944 0.8913
No log 9.6471 328 0.8023 0.6008 0.8023 0.8957
No log 9.7059 330 0.8083 0.6008 0.8083 0.8991
No log 9.7647 332 0.8122 0.6008 0.8122 0.9012
No log 9.8235 334 0.8148 0.6008 0.8148 0.9026
No log 9.8824 336 0.8165 0.6008 0.8165 0.9036
No log 9.9412 338 0.8158 0.6008 0.8158 0.9032
No log 10.0 340 0.8154 0.6008 0.8154 0.9030
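
The Qwk, Mse, and Rmse columns above can be computed from model predictions and gold labels; a minimal sketch with scikit-learn follows, where the score arrays are illustrative only (continuous predictions would typically be rounded to the label scale before computing QWK).

```python
# Minimal sketch of the evaluation metrics reported above; y_true / y_pred
# are hypothetical integer scores, not data from this model.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])   # hypothetical gold scores
y_pred = np.array([3, 3, 4, 2, 2])   # hypothetical predicted scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```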

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1