# xlm-roberta-large-xnli-v5.0
This model is a fine-tuned version of [joeddav/xlm-roberta-large-xnli](https://huggingface.co/joeddav/xlm-roberta-large-xnli) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.4987
- F1 Macro: 0.8279
- F1 Micro: 0.8288
- Accuracy Balanced: 0.8278
- Accuracy: 0.8288
- Precision Macro: 0.8281
- Recall Macro: 0.8278
- Precision Micro: 0.8288
- Recall Micro: 0.8288
## Model description
More information needed
## Intended uses & limitations
More information needed
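Like its base model, this checkpoint is an NLI classifier and can be loaded with the `zero-shot-classification` pipeline. Below is a minimal usage sketch; the example premise and candidate labels are illustrative assumptions, not taken from this card.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for zero-shot classification, the same
# task as the base model joeddav/xlm-roberta-large-xnli.
classifier = pipeline(
    "zero-shot-classification",
    model="61347023S/xlm-roberta-large-xnli-v5.0",
)

# Illustrative inputs (assumptions); Traditional Chinese matches the
# asadfgglie/nli-zh-tw-all evaluation data referenced below.
sequence = "我今天非常開心！"  # "I am very happy today!"
candidate_labels = ["happy", "sad", "angry"]

result = classifier(sequence, candidate_labels)
print(result["labels"][0], result["scores"][0])
```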
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 9e-06
- train_batch_size: 8
- eval_batch_size: 64
- seed: 40
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 3
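A minimal sketch of how the values above map onto Hugging Face `TrainingArguments`; the output directory and any arguments not listed above are assumptions, and Adam's betas/epsilon match the Trainer defaults, so they need no override.

```python
from transformers import TrainingArguments

# Sketch reproducing the hyperparameters listed in this card.
training_args = TrainingArguments(
    output_dir="./xlm-roberta-large-xnli-v5.0",  # assumed path, not from the card
    learning_rate=9e-6,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=64,
    seed=40,
    gradient_accumulation_steps=4,  # effective train batch size: 8 * 4 = 32
    lr_scheduler_type="linear",
    warmup_ratio=0.06,
    num_train_epochs=3,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default
    # (adam_beta1 / adam_beta2 / adam_epsilon), so it is not set explicitly.
)
```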
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Micro | Accuracy Balanced | Accuracy | Precision Macro | Recall Macro | Precision Micro | Recall Micro |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.3851 | 0.85 | 200 | 0.4586 | 0.8017 | 0.8025 | 0.8029 | 0.8025 | 0.8012 | 0.8029 | 0.8025 | 0.8025 |
| 0.2689 | 1.69 | 400 | 0.4498 | 0.8137 | 0.8147 | 0.8145 | 0.8147 | 0.8133 | 0.8145 | 0.8147 | 0.8147 |
| 0.194 | 2.54 | 600 | 0.5334 | 0.8244 | 0.8253 | 0.8252 | 0.8253 | 0.8239 | 0.8252 | 0.8253 | 0.8253 |
### Evaluation results
| Metric | asadfgglie/nli-zh-tw-all/test | asadfgglie/BanBan_2024-10-17-facial_expressions-nli/test | eval_dataset | test_dataset |
|---|---|---|---|---|
| eval_loss | 0.535 | 0.278 | 0.552 | 0.499 |
| eval_f1_macro | 0.817 | 0.916 | 0.823 | 0.828 |
| eval_f1_micro | 0.818 | 0.916 | 0.824 | 0.829 |
| eval_accuracy_balanced | 0.817 | 0.917 | 0.823 | 0.828 |
| eval_accuracy | 0.818 | 0.916 | 0.824 | 0.829 |
| eval_precision_macro | 0.817 | 0.917 | 0.823 | 0.828 |
| eval_recall_macro | 0.817 | 0.917 | 0.823 | 0.828 |
| eval_precision_micro | 0.818 | 0.916 | 0.824 | 0.829 |
| eval_recall_micro | 0.818 | 0.916 | 0.824 | 0.829 |
| eval_runtime (s) | 50.89 | 0.639 | 11.177 | 44.352 |
| eval_samples_per_second | 167.026 | 1480.253 | 169.012 | 170.387 |
| eval_steps_per_second | 2.613 | 23.471 | 2.684 | 2.683 |
| Dataset size (examples) | 8500 | 946 | 1889 | 7557 |
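The metric names above follow scikit-learn conventions. As a reference, here is a small sketch of how these scores are conventionally computed; `y_true` and `y_pred` are placeholder label arrays, not data from this card.

```python
from sklearn.metrics import accuracy_score, balanced_accuracy_score, f1_score

# Placeholder gold and predicted labels for illustration only.
y_true = [0, 1, 2, 1, 0]
y_pred = [0, 1, 1, 1, 0]

metrics = {
    "f1_macro": f1_score(y_true, y_pred, average="macro"),
    "f1_micro": f1_score(y_true, y_pred, average="micro"),  # equals accuracy for single-label tasks
    "accuracy_balanced": balanced_accuracy_score(y_true, y_pred),
    "accuracy": accuracy_score(y_true, y_pred),
}
print(metrics)
```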
### Framework versions
- Transformers 4.33.3
- Pytorch 2.5.1+cu121
- Datasets 2.14.7
- Tokenizers 0.13.3