---
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: arabert_cross_vocabulary_task3_fold2
  results: []
---

# arabert_cross_vocabulary_task3_fold2

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.2889
- Qwk: 0.1455
- Mse: 1.2889
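
For reference, a minimal loading sketch, assuming the checkpoint exposes a single-logit regression head (consistent with the MSE/QWK metrics above) and lives under the repo id formed from the uploader and model name on this card:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repo id; adjust if the checkpoint is hosted elsewhere.
model_id = "salbatarni/arabert_cross_vocabulary_task3_fold2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # an Arabic passage to score
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    # Assumes a single regression logit; squeeze it to a scalar score.
    score = model(**inputs).logits.squeeze().item()
print(score)
```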

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
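
The list above maps onto Hugging Face `TrainingArguments` roughly as follows; `output_dir` is an illustrative placeholder, not taken from the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_cross_vocabulary_task3_fold2",  # placeholder
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=1,
    lr_scheduler_type="linear",
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's
# default optimizer settings, so no extra optimizer configuration is needed.
```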

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk     | Mse    |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|
| No log        | 0.0282 | 2    | 8.9247          | 0.0     | 8.9247 |
| No log        | 0.0563 | 4    | 6.0585          | -0.0018 | 6.0585 |
| No log        | 0.0845 | 6    | 3.3683          | 0.0     | 3.3683 |
| No log        | 0.1127 | 8    | 1.9295          | 0.0353  | 1.9295 |
| No log        | 0.1408 | 10   | 1.1575          | 0.0     | 1.1575 |
| No log        | 0.1690 | 12   | 0.8179          | 0.0531  | 0.8179 |
| No log        | 0.1972 | 14   | 0.7602          | -0.0155 | 0.7602 |
| No log        | 0.2254 | 16   | 0.7582          | -0.0014 | 0.7582 |
| No log        | 0.2535 | 18   | 0.7660          | 0.0643  | 0.7660 |
| No log        | 0.2817 | 20   | 0.7464          | 0.0434  | 0.7464 |
| No log        | 0.3099 | 22   | 0.7568          | 0.0     | 0.7568 |
| No log        | 0.3380 | 24   | 0.7822          | 0.0     | 0.7822 |
| No log        | 0.3662 | 26   | 0.8311          | 0.0     | 0.8311 |
| No log        | 0.3944 | 28   | 0.8980          | 0.0     | 0.8980 |
| No log        | 0.4225 | 30   | 0.9093          | 0.0     | 0.9093 |
| No log        | 0.4507 | 32   | 0.8669          | 0.0     | 0.8669 |
| No log        | 0.4789 | 34   | 0.8507          | 0.0     | 0.8507 |
| No log        | 0.5070 | 36   | 0.8627          | 0.0     | 0.8627 |
| No log        | 0.5352 | 38   | 0.8443          | 0.0     | 0.8443 |
| No log        | 0.5634 | 40   | 0.8722          | 0.0     | 0.8722 |
| No log        | 0.5915 | 42   | 0.9229          | 0.0     | 0.9229 |
| No log        | 0.6197 | 44   | 0.9827          | 0.0     | 0.9827 |
| No log        | 0.6479 | 46   | 1.0396          | 0.0     | 1.0396 |
| No log        | 0.6761 | 48   | 1.1196          | 0.0     | 1.1196 |
| No log        | 0.7042 | 50   | 1.1654          | 0.0     | 1.1654 |
| No log        | 0.7324 | 52   | 1.1934          | 0.0086  | 1.1934 |
| No log        | 0.7606 | 54   | 1.2497          | 0.0603  | 1.2497 |
| No log        | 0.7887 | 56   | 1.2723          | 0.0272  | 1.2723 |
| No log        | 0.8169 | 58   | 1.2595          | 0.1040  | 1.2595 |
| No log        | 0.8451 | 60   | 1.2516          | 0.1696  | 1.2516 |
| No log        | 0.8732 | 62   | 1.2442          | 0.1807  | 1.2442 |
| No log        | 0.9014 | 64   | 1.2514          | 0.1696  | 1.2514 |
| No log        | 0.9296 | 66   | 1.2627          | 0.1524  | 1.2627 |
| No log        | 0.9577 | 68   | 1.2780          | 0.1455  | 1.2780 |
| No log        | 0.9859 | 70   | 1.2889          | 0.1455  | 1.2889 |
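
The Qwk column is quadratic weighted kappa. The exact implementation used for this card is not stated, but the metric is commonly computed with scikit-learn, as in this sketch with illustrative labels:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical integer labels and rounded model predictions.
y_true = [0, 1, 2, 2, 3]
y_pred = [0, 1, 1, 2, 3]

# weights="quadratic" gives quadratic weighted kappa (QWK).
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
print(f"QWK: {qwk:.4f}")
```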

### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1