xlm-roberta-base-finetuned

This model is a fine-tuned version of FacebookAI/xlm-roberta-base on an unknown dataset. It achieves the following results on the evaluation set (an inference sketch follows the list):

  • Loss: 0.2266
  • Accuracy: 0.9541
  • F1: 0.9542
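
The task and label names are not documented in this card. Given the classification metrics above and the "URL" suffix in the repository name, a minimal inference sketch with the Transformers pipeline might look like the following; the repository ID is copied from the hub page, and the example input and label output are hypothetical.

```python
from transformers import pipeline

# Assumed repository ID; the task type ("text-classification") is inferred
# from the accuracy/F1 metrics reported on this card.
clf = pipeline(
    "text-classification",
    model="RonTon05/xml-roberta-base-finetuned-70kURL",
)

# Hypothetical input: the repo name suggests URL classification.
print(clf("http://example.com/login-update-account"))
# -> e.g. [{'label': 'LABEL_1', 'score': 0.99}]  (labels depend on the training config)
```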

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
  • mixed_precision_training: Native AMP
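
For reproducibility, the hyperparameters above map onto transformers.TrainingArguments roughly as follows. This is a sketch only: the output directory is a placeholder, the eval cadence is inferred from the 200-step intervals in the results table, and the Trainer/dataset wiring is omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuned",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=32,    # effective batch = 32 * 2 accumulation = 64
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=2,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,                    # matches "Adam with betas=(0.9,0.999)"
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                         # "Native AMP" mixed precision
    evaluation_strategy="steps",       # inferred from the 200-step eval cadence below
    eval_steps=200,
)
```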

Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|
| No log        | 0.2327 | 200  | 0.2630          | 0.8904   | 0.8872 |
| No log        | 0.4654 | 400  | 0.2128          | 0.9117   | 0.9129 |
| No log        | 0.6981 | 600  | 0.1844          | 0.9308   | 0.9311 |
| No log        | 0.9308 | 800  | 0.1673          | 0.9346   | 0.9346 |
| No log        | 1.1635 | 1000 | 0.1735          | 0.9346   | 0.9338 |
| No log        | 1.3962 | 1200 | 0.1440          | 0.9433   | 0.9429 |
| No log        | 1.6289 | 1400 | 0.1443          | 0.9467   | 0.9469 |
| No log        | 1.8615 | 1600 | 0.1357          | 0.9504   | 0.9507 |
| 0.2197        | 2.0942 | 1800 | 0.1532          | 0.9469   | 0.9473 |
| 0.2197        | 2.3269 | 2000 | 0.1478          | 0.9496   | 0.9500 |
| 0.2197        | 2.5596 | 2200 | 0.1379          | 0.9501   | 0.9504 |
| 0.2197        | 2.7923 | 2400 | 0.1381          | 0.9511   | 0.9514 |
| 0.2197        | 3.0250 | 2600 | 0.1627          | 0.9493   | 0.9496 |
| 0.2197        | 3.2577 | 2800 | 0.1596          | 0.9546   | 0.9546 |
| 0.2197        | 3.4904 | 3000 | 0.1421          | 0.9526   | 0.9527 |
| 0.2197        | 3.7231 | 3200 | 0.1459          | 0.9539   | 0.9539 |
| 0.2197        | 3.9558 | 3400 | 0.1348          | 0.9495   | 0.9499 |
| 0.1176        | 4.1885 | 3600 | 0.1519          | 0.9501   | 0.9507 |
| 0.1176        | 4.4212 | 3800 | 0.1570          | 0.9525   | 0.9529 |
| 0.1176        | 4.6539 | 4000 | 0.1367          | 0.9511   | 0.9514 |
| 0.1176        | 4.8866 | 4200 | 0.1409          | 0.9540   | 0.9541 |
| 0.1176        | 5.1193 | 4400 | 0.1690          | 0.9539   | 0.9541 |
| 0.1176        | 5.3519 | 4600 | 0.1757          | 0.9544   | 0.9545 |
| 0.1176        | 5.5846 | 4800 | 0.1508          | 0.9513   | 0.9518 |
| 0.1176        | 5.8173 | 5000 | 0.1537          | 0.9545   | 0.9546 |
| 0.0849        | 6.0500 | 5200 | 0.1814          | 0.9540   | 0.9540 |
| 0.0849        | 6.2827 | 5400 | 0.1674          | 0.9543   | 0.9546 |
| 0.0849        | 6.5154 | 5600 | 0.1923          | 0.9538   | 0.9539 |
| 0.0849        | 6.7481 | 5800 | 0.1750          | 0.9543   | 0.9545 |
| 0.0849        | 6.9808 | 6000 | 0.1890          | 0.9527   | 0.9529 |
| 0.0849        | 7.2135 | 6200 | 0.1999          | 0.9547   | 0.9548 |
| 0.0849        | 7.4462 | 6400 | 0.1722          | 0.9547   | 0.9549 |
| 0.0849        | 7.6789 | 6600 | 0.1693          | 0.9524   | 0.9528 |
| 0.0849        | 7.9116 | 6800 | 0.1848          | 0.9548   | 0.9549 |
| 0.0614        | 8.1443 | 7000 | 0.2067          | 0.9554   | 0.9554 |
| 0.0614        | 8.3770 | 7200 | 0.2073          | 0.9544   | 0.9546 |
| 0.0614        | 8.6097 | 7400 | 0.1929          | 0.9547   | 0.9548 |
| 0.0614        | 8.8424 | 7600 | 0.2081          | 0.9544   | 0.9545 |
| 0.0614        | 9.0750 | 7800 | 0.2031          | 0.9541   | 0.9542 |
| 0.0614        | 9.3077 | 8000 | 0.2232          | 0.9549   | 0.9552 |
| 0.0614        | 9.5404 | 8200 | 0.2238          | 0.9544   | 0.9545 |
| 0.0614        | 9.7731 | 8400 | 0.2266          | 0.9541   | 0.9542 |
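
The card does not state how F1 was averaged. If the metrics were computed with the evaluate library, a compute_metrics hook along these lines would produce the Accuracy and F1 columns above; the "weighted" averaging mode is an assumption.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    """Assumed metric hook: the card reports Accuracy and F1 but not the
    F1 averaging mode; 'weighted' is a guess."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        "f1": f1.compute(predictions=preds, references=labels, average="weighted")["f1"],
    }
```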

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
