
Arabic_FineTuningAraBERT_AugV0_k5_task1_organization_fold0

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (see the usage sketch after this list):

  • Loss: 0.7707
  • Qwk (quadratic weighted kappa): 0.7576
  • Mse (mean squared error): 0.7707
  • Rmse (root mean squared error): 0.8779
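
A minimal usage sketch, not included in the card itself: the repository id follows this card's naming (MayBashendy/Arabic_FineTuningAraBERT_AugV0_k5_task1_organization_fold0), and the single-output regression head is an assumption inferred from the MSE/QWK metrics above rather than documented behaviour.

```python
# Hedged usage sketch: the repository id and the regression-style head are
# assumptions based on this card's name and metrics, not documented API.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/Arabic_FineTuningAraBERT_AugV0_k5_task1_organization_fold0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "ضع هنا النص العربي المراد تقييمه"  # placeholder Arabic input
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits
# With a single-output head, the raw logit is read as the predicted score.
print(logits.squeeze().item())
```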

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
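
The values above map directly onto a transformers TrainingArguments configuration; a hedged sketch is below. The output directory name is a hypothetical placeholder, and the actual training script is not part of this card.

```python
# Hedged sketch of the listed hyperparameters expressed as TrainingArguments;
# output_dir is a hypothetical placeholder, not taken from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task1_organization_fold0",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```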

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0513 2 4.6963 0.0512 4.6963 2.1671
No log 0.1026 4 2.6587 0.2075 2.6587 1.6306
No log 0.1538 6 1.5451 0.1312 1.5451 1.2430
No log 0.2051 8 1.3150 0.2241 1.3150 1.1467
No log 0.2564 10 1.6117 0.1391 1.6117 1.2695
No log 0.3077 12 1.7262 0.0050 1.7262 1.3138
No log 0.3590 14 1.5949 0.0801 1.5949 1.2629
No log 0.4103 16 1.3723 0.2416 1.3723 1.1714
No log 0.4615 18 1.5349 0.0490 1.5349 1.2389
No log 0.5128 20 1.8348 0.1139 1.8348 1.3545
No log 0.5641 22 1.4962 0.0 1.4962 1.2232
No log 0.6154 24 1.2684 0.4310 1.2684 1.1262
No log 0.6667 26 1.4050 0.1873 1.4050 1.1853
No log 0.7179 28 1.4856 0.1348 1.4856 1.2189
No log 0.7692 30 1.3743 0.1391 1.3743 1.1723
No log 0.8205 32 1.4283 0.1119 1.4283 1.1951
No log 0.8718 34 1.6024 0.1370 1.6024 1.2659
No log 0.9231 36 1.5973 0.0300 1.5973 1.2638
No log 0.9744 38 1.5907 0.1500 1.5907 1.2612
No log 1.0256 40 1.6646 0.2611 1.6646 1.2902
No log 1.0769 42 1.6029 0.2711 1.6029 1.2661
No log 1.1282 44 1.4742 0.1500 1.4742 1.2142
No log 1.1795 46 1.3571 0.2518 1.3571 1.1649
No log 1.2308 48 1.2628 0.2776 1.2628 1.1238
No log 1.2821 50 1.1896 0.2776 1.1896 1.0907
No log 1.3333 52 1.1146 0.2435 1.1146 1.0558
No log 1.3846 54 1.0737 0.2435 1.0737 1.0362
No log 1.4359 56 1.0344 0.2454 1.0344 1.0171
No log 1.4872 58 1.0301 0.3546 1.0301 1.0150
No log 1.5385 60 1.0234 0.3546 1.0234 1.0117
No log 1.5897 62 1.0075 0.3546 1.0075 1.0038
No log 1.6410 64 1.0326 0.4555 1.0326 1.0162
No log 1.6923 66 1.0653 0.5484 1.0653 1.0321
No log 1.7436 68 1.1390 0.5243 1.1390 1.0673
No log 1.7949 70 1.1314 0.5243 1.1314 1.0637
No log 1.8462 72 1.1287 0.5011 1.1287 1.0624
No log 1.8974 74 1.0937 0.5011 1.0937 1.0458
No log 1.9487 76 1.1883 0.5204 1.1883 1.0901
No log 2.0 78 1.1590 0.5417 1.1590 1.0766
No log 2.0513 80 1.0537 0.5661 1.0537 1.0265
No log 2.1026 82 0.9748 0.4797 0.9748 0.9873
No log 2.1538 84 0.9595 0.4797 0.9595 0.9796
No log 2.2051 86 0.9454 0.5679 0.9454 0.9723
No log 2.2564 88 0.9212 0.5043 0.9212 0.9598
No log 2.3077 90 0.8703 0.5494 0.8703 0.9329
No log 2.3590 92 0.8984 0.6323 0.8984 0.9479
No log 2.4103 94 1.0262 0.5638 1.0262 1.0130
No log 2.4615 96 0.9829 0.4807 0.9829 0.9914
No log 2.5128 98 0.8922 0.5020 0.8922 0.9446
No log 2.5641 100 0.7934 0.6272 0.7934 0.8907
No log 2.6154 102 0.7458 0.6550 0.7458 0.8636
No log 2.6667 104 0.7198 0.7321 0.7198 0.8484
No log 2.7179 106 0.7429 0.7025 0.7429 0.8619
No log 2.7692 108 0.7712 0.6023 0.7712 0.8782
No log 2.8205 110 0.8170 0.5785 0.8170 0.9039
No log 2.8718 112 0.9368 0.5004 0.9368 0.9679
No log 2.9231 114 0.9925 0.5349 0.9925 0.9962
No log 2.9744 116 0.9268 0.5366 0.9268 0.9627
No log 3.0256 118 0.8453 0.5218 0.8453 0.9194
No log 3.0769 120 0.7938 0.5422 0.7938 0.8909
No log 3.1282 122 0.8459 0.5205 0.8459 0.9197
No log 3.1795 124 0.8118 0.5752 0.8118 0.9010
No log 3.2308 126 0.7268 0.6420 0.7268 0.8525
No log 3.2821 128 0.6940 0.6824 0.6940 0.8331
No log 3.3333 130 0.6935 0.6824 0.6935 0.8328
No log 3.3846 132 0.7148 0.6225 0.7148 0.8455
No log 3.4359 134 0.8006 0.6309 0.8006 0.8948
No log 3.4872 136 0.8836 0.5540 0.8836 0.9400
No log 3.5385 138 0.9300 0.5532 0.9300 0.9644
No log 3.5897 140 0.9075 0.5714 0.9075 0.9526
No log 3.6410 142 0.8476 0.6356 0.8476 0.9206
No log 3.6923 144 0.9058 0.6303 0.9058 0.9517
No log 3.7436 146 1.0258 0.5349 1.0258 1.0128
No log 3.7949 148 1.0290 0.5489 1.0290 1.0144
No log 3.8462 150 0.8547 0.6356 0.8547 0.9245
No log 3.8974 152 0.7195 0.6225 0.7195 0.8483
No log 3.9487 154 0.6912 0.6824 0.6912 0.8314
No log 4.0 156 0.7483 0.6934 0.7483 0.8651
No log 4.0513 158 0.8263 0.6253 0.8263 0.9090
No log 4.1026 160 0.7983 0.6791 0.7983 0.8935
No log 4.1538 162 0.7378 0.6860 0.7378 0.8589
No log 4.2051 164 0.6791 0.6225 0.6791 0.8241
No log 4.2564 166 0.6811 0.6818 0.6811 0.8253
No log 4.3077 168 0.7195 0.6309 0.7195 0.8482
No log 4.3590 170 0.7909 0.6260 0.7909 0.8893
No log 4.4103 172 0.8198 0.6732 0.8198 0.9054
No log 4.4615 174 0.7936 0.6732 0.7936 0.8909
No log 4.5128 176 0.7145 0.7333 0.7145 0.8453
No log 4.5641 178 0.6334 0.6757 0.6334 0.7959
No log 4.6154 180 0.6097 0.6757 0.6097 0.7808
No log 4.6667 182 0.6360 0.7516 0.6360 0.7975
No log 4.7179 184 0.7418 0.7764 0.7418 0.8613
No log 4.7692 186 0.7842 0.7670 0.7842 0.8856
No log 4.8205 188 0.7546 0.7670 0.7546 0.8687
No log 4.8718 190 0.7350 0.7764 0.7350 0.8573
No log 4.9231 192 0.7037 0.7864 0.7037 0.8389
No log 4.9744 194 0.6977 0.7864 0.6977 0.8353
No log 5.0256 196 0.7737 0.7169 0.7737 0.8796
No log 5.0769 198 0.8910 0.6769 0.8910 0.9439
No log 5.1282 200 0.8860 0.6769 0.8860 0.9413
No log 5.1795 202 0.8939 0.6811 0.8939 0.9455
No log 5.2308 204 0.8592 0.6965 0.8592 0.9270
No log 5.2821 206 0.8582 0.6965 0.8582 0.9264
No log 5.3333 208 0.8238 0.7422 0.8238 0.9077
No log 5.3846 210 0.7685 0.8171 0.7685 0.8766
No log 5.4359 212 0.6986 0.7864 0.6986 0.8358
No log 5.4872 214 0.6802 0.7864 0.6802 0.8248
No log 5.5385 216 0.7062 0.7328 0.7062 0.8404
No log 5.5897 218 0.8059 0.7422 0.8059 0.8977
No log 5.6410 220 0.8359 0.7422 0.8359 0.9143
No log 5.6923 222 0.7817 0.7422 0.7817 0.8841
No log 5.7436 224 0.7734 0.7422 0.7734 0.8794
No log 5.7949 226 0.8506 0.6965 0.8506 0.9223
No log 5.8462 228 0.9084 0.6811 0.9084 0.9531
No log 5.8974 230 0.9446 0.6811 0.9446 0.9719
No log 5.9487 232 0.9839 0.6811 0.9839 0.9919
No log 6.0 234 0.9530 0.6811 0.9530 0.9762
No log 6.0513 236 0.8703 0.6811 0.8703 0.9329
No log 6.1026 238 0.7702 0.6610 0.7702 0.8776
No log 6.1538 240 0.7159 0.6791 0.7159 0.8461
No log 6.2051 242 0.6992 0.7670 0.6992 0.8362
No log 6.2564 244 0.7328 0.7670 0.7328 0.8560
No log 6.3077 246 0.7680 0.7583 0.7680 0.8764
No log 6.3590 248 0.7468 0.7670 0.7468 0.8642
No log 6.4103 250 0.7023 0.7670 0.7023 0.8380
No log 6.4615 252 0.6552 0.8171 0.6552 0.8094
No log 6.5128 254 0.6357 0.8171 0.6357 0.7973
No log 6.5641 256 0.6466 0.8070 0.6466 0.8041
No log 6.6154 258 0.7147 0.7586 0.7147 0.8454
No log 6.6667 260 0.8009 0.7059 0.8009 0.8950
No log 6.7179 262 0.8445 0.6662 0.8445 0.9190
No log 6.7692 264 0.9002 0.6704 0.9002 0.9488
No log 6.8205 266 0.8649 0.6704 0.8649 0.9300
No log 6.8718 268 0.7567 0.7944 0.7567 0.8699
No log 6.9231 270 0.6455 0.8070 0.6455 0.8034
No log 6.9744 272 0.6223 0.8070 0.6223 0.7889
No log 7.0256 274 0.6446 0.8070 0.6446 0.8028
No log 7.0769 276 0.7030 0.7586 0.7030 0.8385
No log 7.1282 278 0.7739 0.7426 0.7739 0.8797
No log 7.1795 280 0.7819 0.7429 0.7819 0.8843
No log 7.2308 282 0.7344 0.7426 0.7344 0.8570
No log 7.2821 284 0.6858 0.8070 0.6858 0.8281
No log 7.3333 286 0.6722 0.8070 0.6722 0.8199
No log 7.3846 288 0.6435 0.8070 0.6435 0.8022
No log 7.4359 290 0.6610 0.8070 0.6610 0.8130
No log 7.4872 292 0.6953 0.7504 0.6953 0.8338
No log 7.5385 294 0.7188 0.7583 0.7188 0.8478
No log 7.5897 296 0.7005 0.7583 0.7005 0.8369
No log 7.6410 298 0.6973 0.7670 0.6973 0.8350
No log 7.6923 300 0.6909 0.7670 0.6909 0.8312
No log 7.7436 302 0.7029 0.7658 0.7029 0.8384
No log 7.7949 304 0.7340 0.7576 0.7340 0.8567
No log 7.8462 306 0.7512 0.7576 0.7512 0.8667
No log 7.8974 308 0.7323 0.7576 0.7323 0.8557
No log 7.9487 310 0.7341 0.7579 0.7341 0.8568
No log 8.0 312 0.7289 0.7579 0.7289 0.8537
No log 8.0513 314 0.7382 0.7579 0.7382 0.8592
No log 8.1026 316 0.7256 0.7579 0.7256 0.8518
No log 8.1538 318 0.7236 0.7579 0.7236 0.8506
No log 8.2051 320 0.7482 0.7502 0.7482 0.8650
No log 8.2564 322 0.7599 0.7502 0.7599 0.8717
No log 8.3077 324 0.7546 0.7502 0.7546 0.8687
No log 8.3590 326 0.7741 0.7502 0.7741 0.8798
No log 8.4103 328 0.7839 0.7502 0.7839 0.8854
No log 8.4615 330 0.8150 0.7143 0.8150 0.9028
No log 8.5128 332 0.8256 0.7055 0.8256 0.9086
No log 8.5641 334 0.8191 0.7422 0.8191 0.9050
No log 8.6154 336 0.8028 0.7422 0.8028 0.8960
No log 8.6667 338 0.7993 0.7422 0.7993 0.8940
No log 8.7179 340 0.7979 0.7422 0.7979 0.8932
No log 8.7692 342 0.8110 0.7422 0.8110 0.9006
No log 8.8205 344 0.8374 0.7055 0.8374 0.9151
No log 8.8718 346 0.8500 0.6998 0.8500 0.9220
No log 8.9231 348 0.8398 0.7143 0.8398 0.9164
No log 8.9744 350 0.8324 0.7143 0.8324 0.9124
No log 9.0256 352 0.8066 0.7143 0.8066 0.8981
No log 9.0769 354 0.7907 0.7502 0.7907 0.8892
No log 9.1282 356 0.7780 0.7502 0.7780 0.8821
No log 9.1795 358 0.7648 0.7502 0.7648 0.8745
No log 9.2308 360 0.7516 0.7579 0.7516 0.8670
No log 9.2821 362 0.7361 0.7579 0.7361 0.8579
No log 9.3333 364 0.7270 0.7579 0.7270 0.8526
No log 9.3846 366 0.7223 0.7579 0.7223 0.8499
No log 9.4359 368 0.7196 0.7579 0.7196 0.8483
No log 9.4872 370 0.7215 0.7579 0.7215 0.8494
No log 9.5385 372 0.7263 0.7579 0.7263 0.8522
No log 9.5897 374 0.7371 0.7579 0.7371 0.8585
No log 9.6410 376 0.7441 0.7579 0.7441 0.8626
No log 9.6923 378 0.7507 0.7502 0.7507 0.8664
No log 9.7436 380 0.7592 0.7502 0.7592 0.8713
No log 9.7949 382 0.7656 0.7502 0.7656 0.8750
No log 9.8462 384 0.7689 0.7502 0.7689 0.8769
No log 9.8974 386 0.7698 0.7502 0.7698 0.8774
No log 9.9487 388 0.7700 0.7502 0.7700 0.8775
No log 10.0 390 0.7707 0.7576 0.7707 0.8779
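
The Qwk, Mse, and Rmse columns above follow standard definitions; a minimal sketch of how they can be recomputed from predictions and gold scores is below. The card does not include the evaluation code, and rounding predictions to integer score bins before computing quadratic weighted kappa is an assumption.

```python
# Hedged sketch of the reported metrics; rounding to integer score bins
# before computing quadratic weighted kappa is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds, labels):
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

print(compute_metrics(preds=[1.2, 2.8, 3.1], labels=[1, 3, 3]))
```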

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
