---
library_name: transformers
base_model: FacebookAI/xlm-roberta-large-finetuned-conll03-english
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: xlm-roberta-large-clinical-ner-breast-cancer-sp
  results: []
---

# xlm-roberta-large-clinical-ner-breast-cancer-sp

This model is a fine-tuned version of FacebookAI/xlm-roberta-large-finetuned-conll03-english on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.2999
- Precision: 0.8965
- Recall: 0.8959
- F1: 0.8962
- Accuracy: 0.9474
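
As a quick consistency check, the reported F1 is the harmonic mean of the reported precision and recall:

```python
# Verify that the reported F1 follows from the reported precision and recall.
precision = 0.8965
recall = 0.8959
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.8962
```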

## Model description

More information needed

## Intended uses & limitations

More information needed
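
Since the card does not yet document usage, here is a minimal inference sketch using the `transformers` token-classification pipeline. The repo id, the `run_ner` helper, and the example entity labels are assumptions for illustration, not taken from this card; `run_ner` is defined but not called below, because calling it downloads the checkpoint.

```python
def run_ner(text, model_id="xlm-roberta-large-clinical-ner-breast-cancer-sp"):
    # Hypothetical helper; the model_id default is an assumption based on the
    # model name in this card. Imported lazily so the rest of the sketch runs
    # without downloading the checkpoint.
    from transformers import pipeline
    ner = pipeline("token-classification", model=model_id,
                   aggregation_strategy="simple")
    return ner(text)

def format_entities(results):
    # Pure post-processing: turn pipeline output dicts into readable lines.
    return [f"{r['word']} -> {r['entity_group']} ({r['score']:.2f})"
            for r in results]

# Illustrative (made-up) pipeline output for a Spanish clinical sentence;
# the real label set depends on the (undocumented) training data.
sample = [
    {"entity_group": "DIAGNOSIS", "word": "carcinoma ductal",
     "score": 0.97, "start": 0, "end": 16},
]
print(format_entities(sample))  # → ['carcinoma ductal -> DIAGNOSIS (0.97)']
```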

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: AdamW (torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 20
- mixed_precision_training: Native AMP
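
The effective batch size and the warmup length follow from the values above; the total step count (4240) is taken from the final row of the training results:

```python
# Effective batch size = per-device batch size x gradient accumulation steps.
train_batch_size = 8
gradient_accumulation_steps = 8
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # → 64

# Sketch of how the warmup ratio translates into optimizer steps.
total_steps = 4240  # final step count in the training results table
warmup_ratio = 0.2
warmup_steps = int(total_steps * warmup_ratio)
print(warmup_steps)  # → 848
```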

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 1.2687        | 1.0     | 213  | 1.3859          | 0.4556    | 0.3389 | 0.3887 | 0.6763   |
| 0.4857        | 2.0     | 426  | 0.5022          | 0.7673    | 0.7919 | 0.7794 | 0.8970   |
| 0.2519        | 3.0     | 639  | 0.3412          | 0.8407    | 0.8452 | 0.8430 | 0.9259   |
| 0.1671        | 4.0     | 852  | 0.3058          | 0.8711    | 0.8659 | 0.8685 | 0.9355   |
| 0.1423        | 5.0     | 1065 | 0.2983          | 0.8585    | 0.8659 | 0.8622 | 0.9340   |
| 0.0973        | 6.0     | 1278 | 0.2795          | 0.8773    | 0.8732 | 0.8753 | 0.9397   |
| 0.0655        | 7.0     | 1491 | 0.2775          | 0.8755    | 0.8726 | 0.8740 | 0.9393   |
| 0.0734        | 8.0     | 1704 | 0.2755          | 0.8799    | 0.8846 | 0.8822 | 0.9422   |
| 0.0575        | 9.0     | 1917 | 0.2900          | 0.8828    | 0.8793 | 0.8810 | 0.9409   |
| 0.0522        | 10.0    | 2130 | 0.2852          | 0.8864    | 0.8846 | 0.8855 | 0.9417   |
| 0.0559        | 11.0    | 2343 | 0.2735          | 0.8863    | 0.8893 | 0.8878 | 0.9441   |
| 0.0401        | 12.0    | 2556 | 0.2845          | 0.8833    | 0.8939 | 0.8886 | 0.9434   |
| 0.0326        | 13.0    | 2769 | 0.2845          | 0.8951    | 0.8933 | 0.8942 | 0.9462   |
| 0.0513        | 14.0    | 2982 | 0.2864          | 0.8886    | 0.8886 | 0.8886 | 0.9453   |
| 0.0223        | 15.0    | 3195 | 0.2920          | 0.8923    | 0.8899 | 0.8911 | 0.9455   |
| 0.0332        | 16.0    | 3408 | 0.2956          | 0.8906    | 0.8906 | 0.8906 | 0.9470   |
| 0.0262        | 17.0    | 3621 | 0.2987          | 0.8953    | 0.8959 | 0.8956 | 0.9469   |
| 0.0180        | 18.0    | 3834 | 0.2999          | 0.8965    | 0.8959 | 0.8962 | 0.9474   |
| 0.0200        | 19.0    | 4047 | 0.3023          | 0.8965    | 0.8959 | 0.8962 | 0.9472   |
| 0.0222        | 19.9088 | 4240 | 0.3023          | 0.8965    | 0.8959 | 0.8962 | 0.9474   |
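
The published checkpoint is the one with the best validation F1 (epoch 18, matching the headline metrics above). Selecting it from the log is a one-liner; only the last few epochs are reproduced here:

```python
# (epoch, validation F1, accuracy) for the last few logged evaluations.
log = [
    (17.0, 0.8956, 0.9469),
    (18.0, 0.8962, 0.9474),
    (19.0, 0.8962, 0.9472),
    (19.9088, 0.8962, 0.9474),
]

# max() returns the first row with the highest F1, i.e. the earliest
# epoch that reached the best score.
best = max(log, key=lambda row: row[1])
print(best[0])  # → 18.0
```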

### Framework versions

- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0