---
base_model: aubmindlab/bert-base-arabertv02
tags:
  - generated_from_trainer
model-index:
  - name: arabert_baseline_organization_task5_fold1
    results: []
---

arabert_baseline_organization_task5_fold1

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8569
  • QWK (quadratic weighted kappa): 0.4778
  • MSE (mean squared error): 0.8569
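
The card does not include the evaluation script, but both metrics can be computed with scikit-learn as in the minimal sketch below; the gold and predicted scores shown are illustrative placeholders, since the actual evaluation data and label range are not documented here.

```python
# Illustrative computation of the two reported metrics with scikit-learn.
# The scores below are hypothetical; the real evaluation data and label
# range are not documented in this card.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [3, 2, 4, 1, 3]  # hypothetical gold scores
y_pred = [3, 2, 3, 1, 4]  # hypothetical model predictions, rounded to ints

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # QWK
mse = mean_squared_error(y_true, y_pred)                      # MSE
print(f"QWK: {qwk:.4f}  MSE: {mse:.4f}")
```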

Model description

More information needed

Intended uses & limitations

More information needed
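
As a starting point, the sketch below shows one plausible way to load and query the model. It assumes the checkpoint is hosted on the Hub as salbatarni/arabert_baseline_organization_task5_fold1 (inferred from the card title and uploader) and carries a single-output regression-style head, which the reported MSE/QWK metrics suggest but the card does not confirm.

```python
# A plausible loading sketch, not confirmed by this card. The repo id is
# inferred from the title/uploader; the single-logit regression head is an
# assumption based on the reported MSE/QWK metrics.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "salbatarni/arabert_baseline_organization_task5_fold1"  # assumed
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("هذا نص تجريبي للتقييم.", return_tensors="pt")  # sample Arabic text
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().item())  # predicted score, assuming one regression output
```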

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
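
These settings map directly onto the transformers Trainer API; a minimal sketch follows, assuming the standard Trainer was used. The output directory is a placeholder, and the Adam betas/epsilon listed above are the library defaults.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir is a placeholder; model and dataset wiring are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_baseline_organization_task5_fold1",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",  # linear LR decay, as listed
    num_train_epochs=10,
)
```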

Training results

| Training Loss | Epoch  | Step | Validation Loss | QWK     | MSE    |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|
| No log        | 0.3333 | 2    | 2.4863          | -0.0641 | 2.4863 |
| No log        | 0.6667 | 4    | 1.3863          | 0.0     | 1.3863 |
| No log        | 1.0    | 6    | 1.0547          | 0.1389  | 1.0547 |
| No log        | 1.3333 | 8    | 1.0296          | 0.2049  | 1.0296 |
| No log        | 1.6667 | 10   | 0.9224          | 0.2049  | 0.9224 |
| No log        | 2.0    | 12   | 0.8332          | 0.3787  | 0.8332 |
| No log        | 2.3333 | 14   | 0.8196          | 0.2895  | 0.8196 |
| No log        | 2.6667 | 16   | 0.8152          | 0.2895  | 0.8152 |
| No log        | 3.0    | 18   | 0.8909          | 0.2895  | 0.8909 |
| No log        | 3.3333 | 20   | 0.9518          | 0.2736  | 0.9518 |
| No log        | 3.6667 | 22   | 0.8848          | 0.2895  | 0.8848 |
| No log        | 4.0    | 24   | 0.9478          | 0.4663  | 0.9478 |
| No log        | 4.3333 | 26   | 1.0547          | 0.3316  | 1.0547 |
| No log        | 4.6667 | 28   | 1.0817          | 0.3516  | 1.0817 |
| No log        | 5.0    | 30   | 1.0055          | 0.4643  | 1.0055 |
| No log        | 5.3333 | 32   | 0.8978          | 0.5068  | 0.8978 |
| No log        | 5.6667 | 34   | 0.8372          | 0.5479  | 0.8372 |
| No log        | 6.0    | 36   | 0.8407          | 0.5455  | 0.8407 |
| No log        | 6.3333 | 38   | 0.9061          | 0.4778  | 0.9061 |
| No log        | 6.6667 | 40   | 0.9725          | 0.4643  | 0.9725 |
| No log        | 7.0    | 42   | 0.9463          | 0.4778  | 0.9463 |
| No log        | 7.3333 | 44   | 0.8661          | 0.5455  | 0.8661 |
| No log        | 7.6667 | 46   | 0.8220          | 0.5844  | 0.8220 |
| No log        | 8.0    | 48   | 0.8110          | 0.5844  | 0.8110 |
| No log        | 8.3333 | 50   | 0.8224          | 0.5844  | 0.8224 |
| No log        | 8.6667 | 52   | 0.8540          | 0.4778  | 0.8540 |
| No log        | 9.0    | 54   | 0.8631          | 0.4778  | 0.8631 |
| No log        | 9.3333 | 56   | 0.8567          | 0.4778  | 0.8567 |
| No log        | 9.6667 | 58   | 0.8579          | 0.4778  | 0.8579 |
| No log        | 10.0   | 60   | 0.8569          | 0.4778  | 0.8569 |

Framework versions

  • Transformers 4.44.0
  • PyTorch 2.4.0
  • Datasets 2.21.0
  • Tokenizers 0.19.1