---
base_model: aubmindlab/bert-base-arabertv02
tags:
  - generated_from_trainer
model-index:
  - name: arabert_cross_relevance_task6_fold6
    results: []
---

# arabert_cross_relevance_task6_fold6

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset. It achieves the following results on the evaluation set (a hedged inference sketch follows the list):

- Loss: 0.2708
- Qwk: 0.2181 (quadratic weighted kappa)
- Mse: 0.2714 (mean squared error)
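
The card does not include usage instructions, so here is a minimal inference sketch with the `transformers` library. The repository id `salbatarni/arabert_cross_relevance_task6_fold6` and the single-score regression head are assumptions inferred from the model name and the Mse/Qwk metrics, not stated in the card.

```python
# Minimal inference sketch; the repo id and the single-logit regression
# head are assumptions inferred from the card's metadata, not confirmed.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "salbatarni/arabert_cross_relevance_task6_fold6"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("مثال على نص عربي", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().item())  # predicted relevance score
```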

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
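
For reference, the list above maps onto `transformers` `TrainingArguments` roughly as follows. The output directory is a placeholder, and the data loading, model, and metric code are omitted because the card does not document them.

```python
# Hedged sketch of the reported hyperparameters as TrainingArguments;
# only values listed in the card are set, everything else stays at default.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_cross_relevance_task6_fold6",  # placeholder path
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```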

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk    | Mse    |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| No log        | 0.0308 | 2    | 0.3997          | 0.1409 | 0.4004 |
| No log        | 0.0615 | 4    | 0.3224          | 0.1635 | 0.3230 |
| No log        | 0.0923 | 6    | 0.3114          | 0.1336 | 0.3118 |
| No log        | 0.1231 | 8    | 0.2792          | 0.1434 | 0.2797 |
| No log        | 0.1538 | 10   | 0.5126          | 0.1528 | 0.5116 |
| No log        | 0.1846 | 12   | 0.5725          | 0.1322 | 0.5710 |
| No log        | 0.2154 | 14   | 0.3291          | 0.2313 | 0.3295 |
| No log        | 0.2462 | 16   | 0.2709          | 0.2220 | 0.2712 |
| No log        | 0.2769 | 18   | 0.2704          | 0.1746 | 0.2703 |
| No log        | 0.3077 | 20   | 0.2714          | 0.1858 | 0.2712 |
| No log        | 0.3385 | 22   | 0.2684          | 0.2083 | 0.2682 |
| No log        | 0.3692 | 24   | 0.2725          | 0.2135 | 0.2725 |
| No log        | 0.4    | 26   | 0.2759          | 0.2135 | 0.2760 |
| No log        | 0.4308 | 28   | 0.2785          | 0.2173 | 0.2789 |
| No log        | 0.4615 | 30   | 0.2798          | 0.1719 | 0.2803 |
| No log        | 0.4923 | 32   | 0.2835          | 0.1711 | 0.2841 |
| No log        | 0.5231 | 34   | 0.2859          | 0.1750 | 0.2866 |
| No log        | 0.5538 | 36   | 0.2843          | 0.1681 | 0.2850 |
| No log        | 0.5846 | 38   | 0.2824          | 0.1738 | 0.2833 |
| No log        | 0.6154 | 40   | 0.2828          | 0.2186 | 0.2837 |
| No log        | 0.6462 | 42   | 0.2811          | 0.2161 | 0.2821 |
| No log        | 0.6769 | 44   | 0.2826          | 0.2239 | 0.2836 |
| No log        | 0.7077 | 46   | 0.2875          | 0.2181 | 0.2884 |
| No log        | 0.7385 | 48   | 0.2931          | 0.2181 | 0.2940 |
| No log        | 0.7692 | 50   | 0.2938          | 0.2181 | 0.2947 |
| No log        | 0.8    | 52   | 0.2942          | 0.2266 | 0.2950 |
| No log        | 0.8308 | 54   | 0.2868          | 0.2266 | 0.2876 |
| No log        | 0.8615 | 56   | 0.2800          | 0.2224 | 0.2808 |
| No log        | 0.8923 | 58   | 0.2751          | 0.2181 | 0.2758 |
| No log        | 0.9231 | 60   | 0.2725          | 0.2181 | 0.2732 |
| No log        | 0.9538 | 62   | 0.2712          | 0.2181 | 0.2718 |
| No log        | 0.9846 | 64   | 0.2708          | 0.2181 | 0.2714 |
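
The card omits the evaluation code behind the Qwk and Mse columns. A plausible scikit-learn sketch follows; rounding continuous predictions to integer labels before the kappa computation is an assumption, not documented behavior.

```python
# Hypothetical metric computation; rounding to integer labels for kappa
# is an assumption, since the card omits the actual evaluation code.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def qwk_and_mse(predictions, labels):
    """Return (quadratic weighted kappa, mean squared error)."""
    mse = mean_squared_error(labels, predictions)
    qwk = cohen_kappa_score(
        [round(x) for x in labels],
        [round(x) for x in predictions],
        weights="quadratic",
    )
    return qwk, mse

print(qwk_and_mse([0.1, 0.9, 2.2], [0, 1, 2]))
```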

### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1