---
base_model: aubmindlab/bert-base-arabertv02
tags:
  - generated_from_trainer
model-index:
  - name: arabert_cross_relevance_task1_fold2
    results: []
---

arabert_cross_relevance_task1_fold2

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics could be reproduced follows the list):

  • Loss: 0.4917
  • Qwk: -0.0345
  • Mse: 0.4917
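
The card does not define the metric names; Qwk is presumably a quadratically weighted Cohen's kappa and Mse the mean squared error between predicted and gold scores. The sketch below shows one way such metrics could be computed for this checkpoint. The Hub id, the single-logit regression-style head, and the rounding of scores to integer bins before computing kappa are illustrative assumptions, not details taken from this card.

```python
# Hedged sketch: assumes a single-logit (regression-style) head and integer score bins.
import torch
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "salbatarni/arabert_cross_relevance_task1_fold2"  # assumed Hub id for this checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

texts = ["نص عربي للتقييم", "نص آخر"]  # placeholder evaluation texts
gold = [0.0, 1.0]                      # placeholder gold relevance scores

inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    preds = model(**inputs).logits.squeeze(-1).tolist()

mse = mean_squared_error(gold, preds)
qwk = cohen_kappa_score([round(g) for g in gold],
                        [round(p) for p in preds],
                        weights="quadratic")
print(f"Mse: {mse:.4f}  Qwk: {qwk:.4f}")
```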

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative Trainer sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
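
As referenced above, the sketch below shows how these hyperparameters might map onto a `transformers` Trainer setup. Only the hyperparameter values come from this card; the tiny in-memory dataset, the single-logit head (`num_labels=1`), and the output directory name are placeholders, and the 2-step evaluation cadence is inferred from the results table below.

```python
# Illustrative sketch only: mirrors the hyperparameters listed above.
# The dataset and model head configuration are assumptions, not the actual setup.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)

# Placeholder data; the real relevance-scoring corpus is not described in this card.
raw = Dataset.from_dict({"text": ["نص تدريبي", "نص آخر"], "label": [0.0, 1.0]})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

data = raw.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="arabert_cross_relevance_task1_fold2",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",  # adam_beta1/adam_beta2/adam_epsilon defaults match the listed values
    eval_strategy="steps",
    eval_steps=2,                # inferred from the 2-step cadence in the results table
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=data,
    eval_dataset=data,           # placeholder: the held-out fold is not available here
    tokenizer=tokenizer,
)
trainer.train()
```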

Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk     | Mse    |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|
| No log        | 0.1176 | 2    | 0.5402          | 0.0504  | 0.5402 |
| No log        | 0.2353 | 4    | 0.5522          | 0.1129  | 0.5522 |
| No log        | 0.3529 | 6    | 0.4760          | 0.1129  | 0.4760 |
| No log        | 0.4706 | 8    | 0.2772          | 0.0     | 0.2772 |
| No log        | 0.5882 | 10   | 0.2693          | 0.0     | 0.2693 |
| No log        | 0.7059 | 12   | 0.2659          | -0.0235 | 0.2659 |
| No log        | 0.8235 | 14   | 0.3218          | -0.0448 | 0.3218 |
| No log        | 0.9412 | 16   | 0.3400          | -0.1260 | 0.3400 |
| No log        | 1.0588 | 18   | 0.3031          | -0.0764 | 0.3031 |
| No log        | 1.1765 | 20   | 0.3006          | -0.0235 | 0.3006 |
| No log        | 1.2941 | 22   | 0.3269          | 0.0     | 0.3269 |
| No log        | 1.4118 | 24   | 0.3005          | 0.0     | 0.3005 |
| No log        | 1.5294 | 26   | 0.2883          | -0.0374 | 0.2883 |
| No log        | 1.6471 | 28   | 0.3040          | -0.0764 | 0.3040 |
| No log        | 1.7647 | 30   | 0.2964          | -0.0374 | 0.2964 |
| No log        | 1.8824 | 32   | 0.2940          | -0.0235 | 0.2940 |
| No log        | 2.0    | 34   | 0.2905          | -0.0473 | 0.2905 |
| No log        | 2.1176 | 36   | 0.2906          | -0.0473 | 0.2906 |
| No log        | 2.2353 | 38   | 0.2973          | 0.0     | 0.2973 |
| No log        | 2.3529 | 40   | 0.3635          | 0.0     | 0.3635 |
| No log        | 2.4706 | 42   | 0.4742          | -0.0268 | 0.4742 |
| No log        | 2.5882 | 44   | 0.4561          | -0.0185 | 0.4561 |
| No log        | 2.7059 | 46   | 0.3327          | 0.0     | 0.3327 |
| No log        | 2.8235 | 48   | 0.2840          | 0.0     | 0.2840 |
| No log        | 2.9412 | 50   | 0.2878          | -0.0235 | 0.2878 |
| No log        | 3.0588 | 52   | 0.2902          | -0.0235 | 0.2902 |
| No log        | 3.1765 | 54   | 0.2821          | -0.0473 | 0.2821 |
| No log        | 3.2941 | 56   | 0.2813          | 0.0     | 0.2813 |
| No log        | 3.4118 | 58   | 0.3125          | 0.0     | 0.3125 |
| No log        | 3.5294 | 60   | 0.3580          | 0.0     | 0.3580 |
| No log        | 3.6471 | 62   | 0.3412          | 0.0     | 0.3412 |
| No log        | 3.7647 | 64   | 0.3136          | 0.0     | 0.3136 |
| No log        | 3.8824 | 66   | 0.2971          | -0.0235 | 0.2971 |
| No log        | 4.0    | 68   | 0.2918          | -0.0135 | 0.2918 |
| No log        | 4.1176 | 70   | 0.2894          | -0.0235 | 0.2894 |
| No log        | 4.2353 | 72   | 0.2973          | 0.0     | 0.2973 |
| No log        | 4.3529 | 74   | 0.3060          | 0.0     | 0.3060 |
| No log        | 4.4706 | 76   | 0.3066          | 0.0     | 0.3066 |
| No log        | 4.5882 | 78   | 0.3086          | 0.0     | 0.3086 |
| No log        | 4.7059 | 80   | 0.3207          | 0.0     | 0.3207 |
| No log        | 4.8235 | 82   | 0.3228          | 0.0     | 0.3228 |
| No log        | 4.9412 | 84   | 0.3232          | 0.0     | 0.3232 |
| No log        | 5.0588 | 86   | 0.3306          | 0.0     | 0.3306 |
| No log        | 5.1765 | 88   | 0.3322          | 0.0     | 0.3322 |
| No log        | 5.2941 | 90   | 0.3595          | 0.0     | 0.3595 |
| No log        | 5.4118 | 92   | 0.3659          | 0.0     | 0.3659 |
| No log        | 5.5294 | 94   | 0.3968          | 0.0     | 0.3968 |
| No log        | 5.6471 | 96   | 0.4325          | -0.0185 | 0.4325 |
| No log        | 5.7647 | 98   | 0.4240          | -0.0185 | 0.4240 |
| No log        | 5.8824 | 100  | 0.3886          | -0.0096 | 0.3886 |
| No log        | 6.0    | 102  | 0.3688          | -0.0096 | 0.3688 |
| No log        | 6.1176 | 104  | 0.3669          | -0.0096 | 0.3669 |
| No log        | 6.2353 | 106  | 0.3824          | -0.0096 | 0.3824 |
| No log        | 6.3529 | 108  | 0.3765          | -0.0323 | 0.3765 |
| No log        | 6.4706 | 110  | 0.3534          | -0.0235 | 0.3534 |
| No log        | 6.5882 | 112  | 0.3724          | -0.0235 | 0.3724 |
| No log        | 6.7059 | 114  | 0.4165          | -0.0323 | 0.4165 |
| No log        | 6.8235 | 116  | 0.4596          | -0.0533 | 0.4596 |
| No log        | 6.9412 | 118  | 0.4708          | -0.0591 | 0.4708 |
| No log        | 7.0588 | 120  | 0.4502          | -0.0405 | 0.4502 |
| No log        | 7.1765 | 122  | 0.4102          | -0.0323 | 0.4102 |
| No log        | 7.2941 | 124  | 0.4035          | -0.0323 | 0.4035 |
| No log        | 7.4118 | 126  | 0.4253          | -0.0323 | 0.4253 |
| No log        | 7.5294 | 128  | 0.4225          | -0.0323 | 0.4225 |
| No log        | 7.6471 | 130  | 0.4086          | -0.0323 | 0.4086 |
| No log        | 7.7647 | 132  | 0.4280          | -0.0260 | 0.4280 |
| No log        | 7.8824 | 134  | 0.4752          | -0.0233 | 0.4752 |
| No log        | 8.0    | 136  | 0.4963          | -0.0193 | 0.4963 |
| No log        | 8.1176 | 138  | 0.5295          | -0.0104 | 0.5295 |
| No log        | 8.2353 | 140  | 0.5401          | -0.0104 | 0.5401 |
| No log        | 8.3529 | 142  | 0.5096          | -0.0193 | 0.5096 |
| No log        | 8.4706 | 144  | 0.4677          | -0.0279 | 0.4677 |
| No log        | 8.5882 | 146  | 0.4416          | -0.0135 | 0.4416 |
| No log        | 8.7059 | 148  | 0.4452          | -0.0135 | 0.4452 |
| No log        | 8.8235 | 150  | 0.4725          | -0.0279 | 0.4725 |
| No log        | 8.9412 | 152  | 0.4991          | -0.0233 | 0.4991 |
| No log        | 9.0588 | 154  | 0.5151          | -0.0251 | 0.5151 |
| No log        | 9.1765 | 156  | 0.5160          | -0.0251 | 0.5160 |
| No log        | 9.2941 | 158  | 0.5242          | -0.0251 | 0.5242 |
| No log        | 9.4118 | 160  | 0.5189          | -0.0251 | 0.5189 |
| No log        | 9.5294 | 162  | 0.5080          | -0.0193 | 0.5080 |
| No log        | 9.6471 | 164  | 0.4972          | -0.0233 | 0.4972 |
| No log        | 9.7647 | 166  | 0.4929          | -0.0167 | 0.4929 |
| No log        | 9.8824 | 168  | 0.4916          | -0.0345 | 0.4916 |
| No log        | 10.0   | 170  | 0.4917          | -0.0345 | 0.4917 |

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.4.0
  • Datasets 2.21.0
  • Tokenizers 0.19.1
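
A quick way to check that a local environment matches these pins:

```python
# Print the installed versions of the libraries pinned above.
from importlib.metadata import version

for pkg in ("transformers", "torch", "datasets", "tokenizers"):
    print(f"{pkg}: {version(pkg)}")
```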