---
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: arabert_cross_organization_task1_fold2
  results: []
---

arabert_cross_organization_task1_fold2

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 1.1151
  • Qwk: 0.0397
  • Mse: 1.1151
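
The card does not include the metric code itself. As a minimal sketch (not the author's implementation), Qwk (quadratic weighted kappa) and Mse could be computed with scikit-learn, assuming the labels are integer-valued scores and the model outputs a single continuous score per example:

```python
# Sketch only: computing QWK and MSE for regression-style score predictions.
# Assumes integer gold scores; continuous predictions are rounded for QWK.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    # Quadratic weighted kappa needs discrete categories, so round first.
    qwk = cohen_kappa_score(
        labels.round().astype(int),
        preds.round().astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse}
```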

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of an equivalent TrainingArguments configuration follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
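
The training script is not part of this card. The listed hyperparameters map onto Hugging Face TrainingArguments roughly as follows; the output directory is an assumption, and the Adam settings shown are the defaults named in the list above:

```python
# Sketch only: one plausible TrainingArguments setup matching the listed hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_cross_organization_task1_fold2",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    # Adam betas and epsilon as listed in the hyperparameters above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```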

Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk     | Mse    |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|
| No log        | 0.0308 | 2    | 8.1063          | 0.0     | 8.1063 |
| No log        | 0.0615 | 4    | 4.3611          | 0.0014  | 4.3611 |
| No log        | 0.0923 | 6    | 2.2521          | 0.0584  | 2.2521 |
| No log        | 0.1231 | 8    | 1.4384          | 0.0131  | 1.4384 |
| No log        | 0.1538 | 10   | 1.0665          | 0.1602  | 1.0665 |
| No log        | 0.1846 | 12   | 1.0608          | 0.0513  | 1.0608 |
| No log        | 0.2154 | 14   | 1.0636          | 0.0036  | 1.0636 |
| No log        | 0.2462 | 16   | 1.0686          | 0.0036  | 1.0686 |
| No log        | 0.2769 | 18   | 1.0668          | 0.0036  | 1.0668 |
| No log        | 0.3077 | 20   | 1.0565          | 0.0036  | 1.0565 |
| No log        | 0.3385 | 22   | 1.0769          | 0.0289  | 1.0769 |
| No log        | 0.3692 | 24   | 1.0972          | 0.0487  | 1.0972 |
| No log        | 0.4    | 26   | 1.1159          | 0.0549  | 1.1159 |
| No log        | 0.4308 | 28   | 1.1270          | 0.0151  | 1.1270 |
| No log        | 0.4615 | 30   | 1.1101          | 0.0506  | 1.1101 |
| No log        | 0.4923 | 32   | 1.0734          | 0.0639  | 1.0734 |
| No log        | 0.5231 | 34   | 1.0767          | 0.0     | 1.0767 |
| No log        | 0.5538 | 36   | 1.0893          | -0.0185 | 1.0893 |
| No log        | 0.5846 | 38   | 1.0994          | -0.0366 | 1.0994 |
| No log        | 0.6154 | 40   | 1.1021          | -0.0710 | 1.1021 |
| No log        | 0.6462 | 42   | 1.1002          | -0.0366 | 1.1002 |
| No log        | 0.6769 | 44   | 1.1002          | -0.0366 | 1.1002 |
| No log        | 0.7077 | 46   | 1.0992          | -0.0366 | 1.0992 |
| No log        | 0.7385 | 48   | 1.0933          | -0.0366 | 1.0933 |
| No log        | 0.7692 | 50   | 1.0949          | 0.0170  | 1.0949 |
| No log        | 0.8    | 52   | 1.1043          | 0.0338  | 1.1043 |
| No log        | 0.8308 | 54   | 1.1170          | 0.0169  | 1.1170 |
| No log        | 0.8615 | 56   | 1.1228          | 0.0651  | 1.1228 |
| No log        | 0.8923 | 58   | 1.1212          | 0.0474  | 1.1212 |
| No log        | 0.9231 | 60   | 1.1220          | 0.0474  | 1.1220 |
| No log        | 0.9538 | 62   | 1.1176          | 0.0397  | 1.1176 |
| No log        | 0.9846 | 64   | 1.1151          | 0.0397  | 1.1151 |

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.4.0
  • Datasets 2.21.0
  • Tokenizers 0.19.1
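
To try the checkpoint, something like the following should work, assuming it is published as salbatarni/arabert_cross_organization_task1_fold2 and was fine-tuned with a single-output regression head (the MSE/QWK metrics suggest a scoring setup); both of these are assumptions, not stated in the card:

```python
# Sketch only: loading the fine-tuned checkpoint for inference.
# The repository id and the single-label regression head are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "salbatarni/arabert_cross_organization_task1_fold2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# Example Arabic input text ("a sample text for scoring").
inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```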