---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: xlm-roberta-large-finetuned-augument-visquad2-27-3-2023-3
  results: []
---

# xlm-roberta-large-finetuned-augument-visquad2-27-3-2023-3

This model is a fine-tuned version of [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) on an unspecified dataset. It achieves the following results on the evaluation set:

- Best F1: 75.3631
- Loss: 2.0450
- Exact: 38.9165
- F1: 56.3720
- Total: 3821
- HasAns Exact: 55.9744
- HasAns F1: 81.1148
- HasAns Total: 2653
- NoAns Exact: 0.1712
- NoAns F1: 0.1712
- NoAns Total: 1168
- Best Exact: 59.7749
- Best Exact Thresh: 0.5183
- Best F1 Thresh: 0.8690
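
These metrics follow the SQuAD 2.0 scheme (separate HasAns/NoAns splits plus a swept no-answer threshold), which indicates an extractive question-answering model with unanswerable-question support. A minimal usage sketch, assuming the checkpoint is published under the hypothetical repo id `jluckyboyj/xlm-roberta-large-finetuned-augument-visquad2-27-3-2023-3`:

```python
# Minimal usage sketch, not an official example. The repo id below is an
# assumption based on this card's title; substitute the actual checkpoint path.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="jluckyboyj/xlm-roberta-large-finetuned-augument-visquad2-27-3-2023-3",
)

result = qa(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
    handle_impossible_answer=True,  # SQuAD 2.0 style: an empty answer means "unanswerable"
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```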

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
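
These settings map directly onto the transformers `Trainer` API. A sketch of the equivalent `TrainingArguments`, assuming the standard `Trainer` workflow (model and data loading omitted; this is illustrative, not the authors' actual training script):

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="xlm-roberta-large-finetuned-augument-visquad2-27-3-2023-3",
    learning_rate=2e-05,
    per_device_train_batch_size=8,  # train_batch_size: 8
    per_device_eval_batch_size=8,   # eval_batch_size: 8
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    num_train_epochs=5,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
)
```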

### Training results

| Training Loss | Epoch | Step  | Best F1 | Validation Loss | Exact   | F1      | Total | HasAns Exact | HasAns F1 | HasAns Total | NoAns Exact | NoAns F1 | NoAns Total | Best Exact | Best Exact Thresh | Best F1 Thresh |
|:-------------:|:-----:|:-----:|:-------:|:---------------:|:-------:|:-------:|:-----:|:------------:|:---------:|:------------:|:-----------:|:--------:|:-----------:|:----------:|:-----------------:|:--------------:|
| 0.8597        | 1.0   | 4221  | 66.4890 | 1.2255          | 36.1947 | 54.1414 | 3821  | 52.1297      | 77.9775   | 2653         | 0.0         | 0.0      | 1168        | 52.9704    | 0.8158            | 0.9074         |
| 0.4623        | 2.0   | 8443  | 70.0050 | 1.1813          | 37.8173 | 55.5970 | 3821  | 54.4666      | 80.0740   | 2653         | 0.0         | 0.0      | 1168        | 55.1950    | 0.7529            | 0.8275         |
| 0.2999        | 3.0   | 12664 | 75.0810 | 1.2417          | 39.8587 | 56.3329 | 3821  | 57.3690      | 81.0961   | 2653         | 0.0856      | 0.0856   | 1168        | 60.4030    | 0.9294            | 0.9459         |
| 0.1915        | 4.0   | 16886 | 74.7037 | 1.6500          | 38.7333 | 56.2476 | 3821  | 55.7482      | 80.9733   | 2653         | 0.0856      | 0.0856   | 1168        | 58.6496    | 0.7690            | 0.9767         |
| 0.1185        | 5.0   | 21105 | 75.3631 | 2.0450          | 38.9165 | 56.3720 | 3821  | 55.9744      | 81.1148   | 2653         | 0.1712      | 0.1712   | 1168        | 59.7749    | 0.5183            | 0.8690         |
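
The HasAns/NoAns splits and best-threshold columns match the output of the `squad_v2` metric, which sweeps the predicted no-answer probability to find the threshold that maximizes exact match and F1. A hedged sketch of computing these numbers with the `evaluate` library (the two examples below are illustrative, not from the evaluation set):

```python
import evaluate

squad_v2 = evaluate.load("squad_v2")

# Each prediction carries a no-answer probability so the metric can sweep
# thresholds and report best_exact / best_f1 alongside the raw scores.
predictions = [
    {"id": "q1", "prediction_text": "Paris", "no_answer_probability": 0.02},
    {"id": "q2", "prediction_text": "", "no_answer_probability": 0.97},
]
references = [
    {"id": "q1", "answers": {"text": ["Paris"], "answer_start": [32]}},
    {"id": "q2", "answers": {"text": [], "answer_start": []}},  # unanswerable
]

results = squad_v2.compute(predictions=predictions, references=references)
# results also contains HasAns_exact, HasAns_f1, NoAns_exact, NoAns_f1, totals, ...
print(results["best_f1"], results["best_f1_thresh"])
```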

### Framework versions

- Transformers 4.27.3
- Pytorch 1.13.1+cu117
- Datasets 2.10.1
- Tokenizers 0.13.2