
arabert_cross_vocabulary_task2_fold4

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not named in this card. It achieves the following results on the evaluation set:

  • Loss: 0.4047
  • Qwk: 0.8288
  • Mse: 0.4047
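
The card does not describe the downstream task, but the combination of Qwk and Mse suggests an ordinal scoring objective. A minimal loading sketch, assuming the checkpoint at salbatarni/arabert_cross_vocabulary_task2_fold4 exposes a standard sequence-classification head (adjust if the actual head differs):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: the fine-tuned head is a sequence-classification/regression head;
# the model card itself does not state the task.
model_id = "salbatarni/arabert_cross_vocabulary_task2_fold4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "نص عربي للتقييم"  # placeholder Arabic input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```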

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
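
A minimal sketch of how these settings map onto transformers.TrainingArguments; output_dir is a placeholder, and the Trainer/dataset wiring is omitted because the card does not describe the data:

```python
from transformers import TrainingArguments

# Hyperparameters as reported above; output_dir is a placeholder, not from the card.
training_args = TrainingArguments(
    output_dir="arabert_cross_vocabulary_task2_fold4",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```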

Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk    | Mse    |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| No log        | 0.0328 | 2    | 1.7452          | 0.0006 | 1.7452 |
| No log        | 0.0656 | 4    | 1.2370          | 0.1274 | 1.2370 |
| No log        | 0.0984 | 6    | 0.8329          | 0.4478 | 0.8329 |
| No log        | 0.1311 | 8    | 0.6416          | 0.5311 | 0.6416 |
| No log        | 0.1639 | 10   | 0.5045          | 0.6702 | 0.5045 |
| No log        | 0.1967 | 12   | 0.7393          | 0.7456 | 0.7393 |
| No log        | 0.2295 | 14   | 0.6764          | 0.7611 | 0.6764 |
| No log        | 0.2623 | 16   | 0.4405          | 0.7742 | 0.4405 |
| No log        | 0.2951 | 18   | 0.4529          | 0.6406 | 0.4529 |
| No log        | 0.3279 | 20   | 0.4250          | 0.6970 | 0.4250 |
| No log        | 0.3607 | 22   | 0.4454          | 0.8071 | 0.4454 |
| No log        | 0.3934 | 24   | 0.6139          | 0.7900 | 0.6139 |
| No log        | 0.4262 | 26   | 0.5738          | 0.7676 | 0.5738 |
| No log        | 0.4590 | 28   | 0.4153          | 0.7533 | 0.4153 |
| No log        | 0.4918 | 30   | 0.3615          | 0.7288 | 0.3615 |
| No log        | 0.5246 | 32   | 0.3546          | 0.7617 | 0.3546 |
| No log        | 0.5574 | 34   | 0.3710          | 0.7951 | 0.3710 |
| No log        | 0.5902 | 36   | 0.4183          | 0.8076 | 0.4183 |
| No log        | 0.6230 | 38   | 0.4794          | 0.8180 | 0.4794 |
| No log        | 0.6557 | 40   | 0.4977          | 0.8093 | 0.4977 |
| No log        | 0.6885 | 42   | 0.4901          | 0.8202 | 0.4901 |
| No log        | 0.7213 | 44   | 0.4251          | 0.8233 | 0.4251 |
| No log        | 0.7541 | 46   | 0.3869          | 0.8256 | 0.3869 |
| No log        | 0.7869 | 48   | 0.3765          | 0.8231 | 0.3765 |
| No log        | 0.8197 | 50   | 0.3956          | 0.8234 | 0.3956 |
| No log        | 0.8525 | 52   | 0.4092          | 0.8176 | 0.4092 |
| No log        | 0.8852 | 54   | 0.4118          | 0.8198 | 0.4118 |
| No log        | 0.9180 | 56   | 0.4170          | 0.8220 | 0.4170 |
| No log        | 0.9508 | 58   | 0.4094          | 0.8265 | 0.4094 |
| No log        | 0.9836 | 60   | 0.4047          | 0.8288 | 0.4047 |
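
The metric names are not defined in the card; a plausible reading is that Qwk is quadratic weighted Cohen's kappa and Mse is mean squared error over discrete scores. A minimal sketch under that assumption:

```python
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Assumption: "Qwk" = quadratic weighted Cohen's kappa, "Mse" = mean squared
# error, both computed on integer-valued references and predictions.
def compute_metrics(labels, preds):
    return {
        "qwk": cohen_kappa_score(labels, preds, weights="quadratic"),
        "mse": mean_squared_error(labels, preds),
    }

# Hypothetical example values, not taken from the card:
print(compute_metrics([0, 1, 2, 3], [0, 1, 2, 2]))
```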

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.4.0
  • Datasets 2.21.0
  • Tokenizers 0.19.1