---
base_model: aubmindlab/bert-base-arabertv02
tags:
  - generated_from_trainer
model-index:
  - name: arabert_baseline_development_task5_fold0
    results: []
---

arabert_baseline_development_task5_fold0

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0419
  • Qwk: 0.5183
  • Mse: 1.0419
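Qwk above denotes quadratically weighted kappa, an agreement metric for ordinal predictions that penalizes large disagreements more than adjacent ones. As an illustration only (this is not the training code, and the label values are hypothetical), it can be computed from scratch like this:

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratically weighted Cohen's kappa for integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix O[i][j]: true label i, predicted label j.
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal label histograms.
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = 0.0  # weighted observed disagreement
    den = 0.0  # weighted disagreement expected under independence
    for i in range(n_classes):
        for j in range(n_classes):
            w = ((i - j) ** 2) / ((n_classes - 1) ** 2)
            expected = hist_t[i] * hist_p[j] / n
            num += w * O[i][j]
            den += w * expected
    return 1.0 - num / den

# Hypothetical ordinal scores; perfect agreement yields kappa = 1.0.
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], n_classes=4))  # 1.0
```

The same value is available as `sklearn.metrics.cohen_kappa_score(y_true, y_pred, weights="quadratic")`.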

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
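The linear scheduler listed above decays the learning rate from its base value to zero over the total number of optimizer steps. A minimal sketch of that decay, assuming no warmup steps (none are listed) and the 60 total steps shown in the results table:

```python
# Sketch of linear learning-rate decay without warmup, using the
# hyperparameters above (assumption: warmup_steps = 0).
BASE_LR = 2e-05
TOTAL_STEPS = 60  # 10 epochs x 6 optimizer steps per epoch

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Learning rate after `step` optimizer steps under linear decay to zero."""
    remaining = max(0.0, (total_steps - step) / total_steps)
    return base_lr * remaining

print(linear_lr(0))   # full base rate at the start
print(linear_lr(30))  # half the base rate midway through training
```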

Training results

| Training Loss | Epoch   | Step | Validation Loss | Qwk    | Mse    |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|
| No log        | 0.3333  | 2    | 1.6380          | 0.0929 | 1.6380 |
| No log        | 0.6667  | 4    | 1.5106          | 0.0000 | 1.5106 |
| No log        | 1.0     | 6    | 1.3471          | 0.0000 | 1.3471 |
| No log        | 1.3333  | 8    | 1.2335          | 0.0000 | 1.2335 |
| No log        | 1.6667  | 10   | 1.2261          | 0.1576 | 1.2261 |
| No log        | 2.0     | 12   | 1.2476          | 0.2560 | 1.2476 |
| No log        | 2.3333  | 14   | 1.2739          | 0.2628 | 1.2739 |
| No log        | 2.6667  | 16   | 1.2494          | 0.2628 | 1.2494 |
| No log        | 3.0     | 18   | 1.1857          | 0.4062 | 1.1857 |
| No log        | 3.3333  | 20   | 1.1005          | 0.3598 | 1.1005 |
| No log        | 3.6667  | 22   | 1.1228          | 0.3976 | 1.1228 |
| No log        | 4.0     | 24   | 1.0985          | 0.3976 | 1.0985 |
| No log        | 4.3333  | 26   | 1.0341          | 0.3750 | 1.0341 |
| No log        | 4.6667  | 28   | 0.9989          | 0.5789 | 0.9989 |
| No log        | 5.0     | 30   | 0.9963          | 0.5789 | 0.9963 |
| No log        | 5.3333  | 32   | 0.9852          | 0.5361 | 0.9852 |
| No log        | 5.6667  | 34   | 0.9961          | 0.5991 | 0.9961 |
| No log        | 6.0     | 36   | 1.0348          | 0.4725 | 1.0348 |
| No log        | 6.3333  | 38   | 1.0147          | 0.5238 | 1.0147 |
| No log        | 6.6667  | 40   | 1.0123          | 0.5238 | 1.0123 |
| No log        | 7.0     | 42   | 0.9900          | 0.5833 | 0.9900 |
| No log        | 7.3333  | 44   | 0.9764          | 0.4872 | 0.9764 |
| No log        | 7.6667  | 46   | 0.9850          | 0.4872 | 0.9850 |
| No log        | 8.0     | 48   | 0.9775          | 0.5327 | 0.9775 |
| No log        | 8.3333  | 50   | 0.9860          | 0.5047 | 0.9860 |
| No log        | 8.6667  | 52   | 0.9983          | 0.5183 | 0.9983 |
| No log        | 9.0     | 54   | 1.0128          | 0.5183 | 1.0128 |
| No log        | 9.3333  | 56   | 1.0321          | 0.5183 | 1.0321 |
| No log        | 9.6667  | 58   | 1.0407          | 0.5183 | 1.0407 |
| No log        | 10.0    | 60   | 1.0419          | 0.5183 | 1.0419 |
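The schedule above runs 60 optimizer steps over 10 epochs, i.e. 6 batches per epoch. Combined with the batch size listed earlier, this puts a rough upper bound on the training-set size; the arithmetic below is a back-of-the-envelope check, not a documented figure:

```python
# Back-of-the-envelope check on the training schedule in the table above.
total_steps = 60
num_epochs = 10
train_batch_size = 16

steps_per_epoch = total_steps // num_epochs              # 6 batches per epoch
max_train_examples = steps_per_epoch * train_batch_size  # at most 96 examples
                                                         # (the last batch may be partial)
print(steps_per_epoch, max_train_examples)
```

The "No log" entries in the training-loss column are likely because the Trainer's default logging interval exceeds the 60 total steps, so no training loss was ever recorded.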

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.4.0
  • Datasets 2.21.0
  • Tokenizers 0.19.1