---
base_model: aubmindlab/bert-base-arabertv02
tags:
  - generated_from_trainer
model-index:
  - name: arabert_cross_development_task1_fold0
    results: []
---

arabert_cross_development_task1_fold0

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are commonly computed follows the list):

  • Loss: 0.8306
  • Qwk: 0.2565
  • Mse: 0.8306
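
The card does not include its metric code, but Qwk conventionally denotes quadratic weighted kappa and Mse mean squared error; the identical Loss and Mse values suggest a regression head trained with an MSE objective. Below is a minimal sketch of how such metrics are commonly computed with scikit-learn; the labels and predictions are hypothetical, since the card does not specify the label scale.

```python
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and model predictions, rounded to integer
# score bands; the actual label scale is not documented in this card.
y_true = [2, 1, 3, 0, 2, 1]
y_pred = [2, 2, 3, 1, 2, 0]

# Qwk: Cohen's kappa with quadratic weights.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
# Mse: mean squared error between predictions and gold scores.
mse = mean_squared_error(y_true, y_pred)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}")
```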

Model description

More information needed

Intended uses & limitations

More information needed
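
Although this section is unfilled, the checkpoint can presumably be loaded with the standard Transformers API. A minimal sketch follows; the repository id is an assumption inferred from the model name on this card, and the scalar output matches the regression objective suggested by the MSE metric above.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repository id, inferred from this card's model name.
repo = "salbatarni/arabert_cross_development_task1_fold0"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Hypothetical Arabic input; the task and score scale are not documented.
inputs = tokenizer("نص تجريبي", return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```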

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a minimal reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
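
A minimal sketch of this configuration with the Transformers Trainer. The dataset loading and the regression head (num_labels=1) are assumptions, since the card names no dataset; the optimizer settings listed above are the Trainer defaults and need no explicit configuration.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# num_labels=1 assumes a regression objective, consistent with the
# MSE metric reported above.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=1)

args = TrainingArguments(
    output_dir="arabert_cross_development_task1_fold0",
    learning_rate=2e-5,
    per_device_train_batch_size=64,  # card's train_batch_size
    per_device_eval_batch_size=64,   # card's eval_batch_size
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer.
)

# train_dataset / eval_dataset are placeholders; the card does not
# name the dataset used for fine-tuning.
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```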

Training results

| Training Loss | Epoch   | Step | Validation Loss | Qwk     | Mse    |
|:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|
| No log        | 0.1333  | 2    | 4.0034          | -0.0039 | 4.0034 |
| No log        | 0.2667  | 4    | 1.7572          | 0.0385  | 1.7572 |
| No log        | 0.4     | 6    | 0.8795          | 0.1253  | 0.8795 |
| No log        | 0.5333  | 8    | 1.0722          | 0.1468  | 1.0722 |
| No log        | 0.6667  | 10   | 1.9981          | 0.0739  | 1.9981 |
| No log        | 0.8     | 12   | 1.4228          | 0.1298  | 1.4228 |
| No log        | 0.9333  | 14   | 0.7120          | 0.1720  | 0.7120 |
| No log        | 1.0667  | 16   | 0.5182          | 0.2951  | 0.5182 |
| No log        | 1.2     | 18   | 0.6133          | 0.2736  | 0.6133 |
| No log        | 1.3333  | 20   | 1.1336          | 0.1776  | 1.1336 |
| No log        | 1.4667  | 22   | 1.2656          | 0.1714  | 1.2656 |
| No log        | 1.6     | 24   | 0.8675          | 0.2380  | 0.8675 |
| No log        | 1.7333  | 26   | 0.5730          | 0.3050  | 0.5730 |
| No log        | 1.8667  | 28   | 0.5924          | 0.2784  | 0.5924 |
| No log        | 2.0     | 30   | 0.6861          | 0.2120  | 0.6861 |
| No log        | 2.1333  | 32   | 0.8867          | 0.2029  | 0.8867 |
| No log        | 2.2667  | 34   | 0.9172          | 0.2121  | 0.9172 |
| No log        | 2.4     | 36   | 0.7721          | 0.2159  | 0.7721 |
| No log        | 2.5333  | 38   | 0.8001          | 0.2277  | 0.8001 |
| No log        | 2.6667  | 40   | 0.8684          | 0.2498  | 0.8684 |
| No log        | 2.8     | 42   | 0.9570          | 0.2264  | 0.9570 |
| No log        | 2.9333  | 44   | 0.8803          | 0.2439  | 0.8803 |
| No log        | 3.0667  | 46   | 0.7435          | 0.2799  | 0.7435 |
| No log        | 3.2     | 48   | 0.6805          | 0.3082  | 0.6805 |
| No log        | 3.3333  | 50   | 0.8424          | 0.2730  | 0.8424 |
| No log        | 3.4667  | 52   | 0.8402          | 0.2670  | 0.8402 |
| No log        | 3.6     | 54   | 0.8115          | 0.2861  | 0.8115 |
| No log        | 3.7333  | 56   | 0.8179          | 0.2776  | 0.8179 |
| No log        | 3.8667  | 58   | 0.7692          | 0.2822  | 0.7692 |
| No log        | 4.0     | 60   | 0.6605          | 0.2944  | 0.6605 |
| No log        | 4.1333  | 62   | 0.6724          | 0.3033  | 0.6724 |
| No log        | 4.2667  | 64   | 0.7918          | 0.2700  | 0.7918 |
| No log        | 4.4     | 66   | 0.9373          | 0.2649  | 0.9373 |
| No log        | 4.5333  | 68   | 0.8734          | 0.2482  | 0.8734 |
| No log        | 4.6667  | 70   | 0.6994          | 0.2866  | 0.6994 |
| No log        | 4.8     | 72   | 0.5761          | 0.3307  | 0.5761 |
| No log        | 4.9333  | 74   | 0.6497          | 0.2879  | 0.6497 |
| No log        | 5.0667  | 76   | 0.7333          | 0.2628  | 0.7333 |
| No log        | 5.2     | 78   | 1.0341          | 0.2345  | 1.0341 |
| No log        | 5.3333  | 80   | 1.1378          | 0.2214  | 1.1378 |
| No log        | 5.4667  | 82   | 0.9255          | 0.2527  | 0.9255 |
| No log        | 5.6     | 84   | 0.6589          | 0.3108  | 0.6589 |
| No log        | 5.7333  | 86   | 0.5709          | 0.3576  | 0.5709 |
| No log        | 5.8667  | 88   | 0.6392          | 0.2982  | 0.6392 |
| No log        | 6.0     | 90   | 0.7919          | 0.2436  | 0.7919 |
| No log        | 6.1333  | 92   | 0.7855          | 0.2412  | 0.7855 |
| No log        | 6.2667  | 94   | 0.6800          | 0.2919  | 0.6800 |
| No log        | 6.4     | 96   | 0.6850          | 0.2998  | 0.6850 |
| No log        | 6.5333  | 98   | 0.8473          | 0.2545  | 0.8473 |
| No log        | 6.6667  | 100  | 0.9780          | 0.2417  | 0.9780 |
| No log        | 6.8     | 102  | 0.9663          | 0.2331  | 0.9663 |
| No log        | 6.9333  | 104  | 0.8213          | 0.2337  | 0.8213 |
| No log        | 7.0667  | 106  | 0.6942          | 0.2789  | 0.6942 |
| No log        | 7.2     | 108  | 0.7119          | 0.2738  | 0.7119 |
| No log        | 7.3333  | 110  | 0.8067          | 0.2517  | 0.8067 |
| No log        | 7.4667  | 112  | 0.8998          | 0.2447  | 0.8998 |
| No log        | 7.6     | 114  | 0.9907          | 0.2199  | 0.9907 |
| No log        | 7.7333  | 116  | 0.9290          | 0.2364  | 0.9290 |
| No log        | 7.8667  | 118  | 0.7925          | 0.2551  | 0.7925 |
| No log        | 8.0     | 120  | 0.7137          | 0.2738  | 0.7137 |
| No log        | 8.1333  | 122  | 0.7202          | 0.2753  | 0.7202 |
| No log        | 8.2667  | 124  | 0.7738          | 0.2663  | 0.7738 |
| No log        | 8.4     | 126  | 0.8078          | 0.2539  | 0.8078 |
| No log        | 8.5333  | 128  | 0.8794          | 0.2516  | 0.8794 |
| No log        | 8.6667  | 130  | 0.9809          | 0.2362  | 0.9809 |
| No log        | 8.8     | 132  | 1.0066          | 0.2362  | 1.0066 |
| No log        | 8.9333  | 134  | 0.9524          | 0.2397  | 0.9524 |
| No log        | 9.0667  | 136  | 0.8849          | 0.2464  | 0.8849 |
| No log        | 9.2     | 138  | 0.8285          | 0.2565  | 0.8285 |
| No log        | 9.3333  | 140  | 0.7806          | 0.2545  | 0.7806 |
| No log        | 9.4667  | 142  | 0.7707          | 0.2569  | 0.7707 |
| No log        | 9.6     | 144  | 0.7836          | 0.2545  | 0.7836 |
| No log        | 9.7333  | 146  | 0.8063          | 0.2550  | 0.8063 |
| No log        | 9.8667  | 148  | 0.8258          | 0.2565  | 0.8258 |
| No log        | 10.0    | 150  | 0.8306          | 0.2565  | 0.8306 |

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.4.0
  • Datasets 2.21.0
  • Tokenizers 0.19.1
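
To recreate this environment, the versions above map directly onto pip pins (standard PyPI package names; selecting a CUDA-enabled torch build is left to pip). A minimal requirements.txt sketch:

```
transformers==4.44.0
torch==2.4.0
datasets==2.21.0
tokenizers==0.19.1
```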