detect-femicide-news-xlmr-nl-mono-freeze2

This model is a fine-tuned version of xlm-roberta-base on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.6487
  • Accuracy: 0.6429
  • Precision Neg: 0.6429
  • Precision Pos: 0.0
  • Recall Neg: 1.0
  • Recall Pos: 0.0
  • F1 Score Neg: 0.7826
  • F1 Score Pos: 0.0
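
The card does not include a usage example, so below is a minimal inference sketch using the transformers pipeline API. The repository id and the mapping from label indices to the negative/positive (femicide-related) classes are assumptions and should be checked against the actual model files.

```python
# Minimal inference sketch (not part of the original card).
# MODEL_ID is a placeholder: point it at this checkpoint on the Hub or on disk.
from transformers import pipeline

MODEL_ID = "detect-femicide-news-xlmr-nl-mono-freeze2"  # assumed repo id / local path

classifier = pipeline("text-classification", model=MODEL_ID)

# Dutch news sentence (the model name suggests Dutch monolingual fine-tuning).
print(classifier("Een vrouw is om het leven gebracht door haar ex-partner."))
# -> e.g. [{'label': 'LABEL_0', 'score': ...}]  # label-to-class mapping is undocumented
```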

Model description

More information needed

Intended uses & limitations

More information needed. Note that on the reported evaluation set the model predicts the negative class for every example (positive-class precision, recall and F1 are all 0.0), so in its current state it does not recover the positive class.

Training and evaluation data

More information needed

Training procedure
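
The "freeze2" suffix in the model name suggests that part of the xlm-roberta-base encoder may have been kept frozen during fine-tuning. This is not documented in the card, so the snippet below is only an illustrative guess at how such partial freezing is commonly done, not a record of the actual procedure.

```python
# Illustrative guess only: freeze the embeddings and the first two encoder
# layers of an XLM-R classifier before fine-tuning. Whether and how layers
# were frozen for this checkpoint is inferred from the "freeze2" name.
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=2
)

for param in model.roberta.embeddings.parameters():
    param.requires_grad = False
for layer in model.roberta.encoder.layer[:2]:
    for param in layer.parameters():
        param.requires_grad = False
```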

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 24
  • eval_batch_size: 8
  • seed: 1996
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
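
As a rough guide, the listed hyperparameters map onto the transformers TrainingArguments as sketched below. The output directory and the per-epoch evaluation strategy are assumptions, and the train/eval datasets are not documented in this card.

```python
# Sketch of the listed hyperparameters expressed as TrainingArguments
# (Transformers 4.16.x). Adam with betas=(0.9, 0.999) and epsilon=1e-08 and
# the linear scheduler are the Trainer defaults, shown here explicitly.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detect-femicide-news-xlmr-nl-mono-freeze2",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=8,
    seed=1996,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption: the results table reports per-epoch validation metrics
)
# training_args would then be passed to a Trainer together with the
# (undocumented) train and eval datasets.
```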

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision Neg | Precision Pos | Recall Neg | Recall Pos | F1 Score Neg | F1 Score Pos |
|:--------------|:------|:-----|:----------------|:---------|:--------------|:--------------|:-----------|:-----------|:-------------|:-------------|
| 0.7312 | 1.0 | 23 | 0.7413 | 0.3571 | 0.0 | 0.3571 | 0.0 | 1.0 | 0.0 | 0.5263 |
| 0.7151 | 2.0 | 46 | 0.7177 | 0.3571 | 0.0 | 0.3571 | 0.0 | 1.0 | 0.0 | 0.5263 |
| 0.7049 | 3.0 | 69 | 0.6988 | 0.3571 | 0.0 | 0.3571 | 0.0 | 1.0 | 0.0 | 0.5263 |
| 0.6934 | 4.0 | 92 | 0.6945 | 0.3571 | 0.0 | 0.3571 | 0.0 | 1.0 | 0.0 | 0.5263 |
| 0.6886 | 5.0 | 115 | 0.6903 | 0.6071 | 0.8182 | 0.4706 | 0.5 | 0.8 | 0.6207 | 0.5926 |
| 0.6911 | 6.0 | 138 | 0.6846 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6856 | 7.0 | 161 | 0.6786 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6888 | 8.0 | 184 | 0.6783 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6862 | 9.0 | 207 | 0.6819 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6807 | 10.0 | 230 | 0.6758 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6839 | 11.0 | 253 | 0.6721 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6878 | 12.0 | 276 | 0.6708 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6799 | 13.0 | 299 | 0.6692 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6813 | 14.0 | 322 | 0.6673 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6792 | 15.0 | 345 | 0.6676 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6774 | 16.0 | 368 | 0.6683 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6807 | 17.0 | 391 | 0.6679 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6834 | 18.0 | 414 | 0.6693 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6784 | 19.0 | 437 | 0.6679 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.676 | 20.0 | 460 | 0.6698 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6791 | 21.0 | 483 | 0.6661 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6775 | 22.0 | 506 | 0.6633 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6688 | 23.0 | 529 | 0.6589 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6748 | 24.0 | 552 | 0.6580 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6771 | 25.0 | 575 | 0.6619 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6761 | 26.0 | 598 | 0.6639 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6773 | 27.0 | 621 | 0.6651 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6737 | 28.0 | 644 | 0.6656 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6721 | 29.0 | 667 | 0.6650 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6683 | 30.0 | 690 | 0.6612 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6663 | 31.0 | 713 | 0.6592 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6724 | 32.0 | 736 | 0.6576 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6739 | 33.0 | 759 | 0.6601 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6691 | 34.0 | 782 | 0.6602 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6652 | 35.0 | 805 | 0.6588 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6717 | 36.0 | 828 | 0.6596 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6637 | 37.0 | 851 | 0.6587 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6704 | 38.0 | 874 | 0.6579 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6608 | 39.0 | 897 | 0.6599 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6615 | 40.0 | 920 | 0.6580 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6662 | 41.0 | 943 | 0.6592 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6622 | 42.0 | 966 | 0.6616 | 0.6071 | 0.6296 | 0.0 | 0.9444 | 0.0 | 0.7556 | 0.0 |
| 0.664 | 43.0 | 989 | 0.6610 | 0.6071 | 0.6296 | 0.0 | 0.9444 | 0.0 | 0.7556 | 0.0 |
| 0.6695 | 44.0 | 1012 | 0.6570 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6677 | 45.0 | 1035 | 0.6557 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6705 | 46.0 | 1058 | 0.6546 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6591 | 47.0 | 1081 | 0.6547 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6675 | 48.0 | 1104 | 0.6532 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6622 | 49.0 | 1127 | 0.6544 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6571 | 50.0 | 1150 | 0.6552 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6678 | 51.0 | 1173 | 0.6555 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6596 | 52.0 | 1196 | 0.6544 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6583 | 53.0 | 1219 | 0.6517 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6641 | 54.0 | 1242 | 0.6508 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.671 | 55.0 | 1265 | 0.6502 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6645 | 56.0 | 1288 | 0.6513 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6604 | 57.0 | 1311 | 0.6510 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6644 | 58.0 | 1334 | 0.6509 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6617 | 59.0 | 1357 | 0.6528 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6608 | 60.0 | 1380 | 0.6536 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6533 | 61.0 | 1403 | 0.6533 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6596 | 62.0 | 1426 | 0.6518 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6607 | 63.0 | 1449 | 0.6511 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.658 | 64.0 | 1472 | 0.6509 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6546 | 65.0 | 1495 | 0.6514 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6613 | 66.0 | 1518 | 0.6516 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.662 | 67.0 | 1541 | 0.6506 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.661 | 68.0 | 1564 | 0.6503 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6571 | 69.0 | 1587 | 0.6497 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6656 | 70.0 | 1610 | 0.6500 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6637 | 71.0 | 1633 | 0.6508 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6519 | 72.0 | 1656 | 0.6518 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6593 | 73.0 | 1679 | 0.6516 | 0.6071 | 0.6296 | 0.0 | 0.9444 | 0.0 | 0.7556 | 0.0 |
| 0.6539 | 74.0 | 1702 | 0.6514 | 0.6071 | 0.6296 | 0.0 | 0.9444 | 0.0 | 0.7556 | 0.0 |
| 0.6568 | 75.0 | 1725 | 0.6506 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6581 | 76.0 | 1748 | 0.6504 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6557 | 77.0 | 1771 | 0.6499 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6542 | 78.0 | 1794 | 0.6500 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6557 | 79.0 | 1817 | 0.6498 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6637 | 80.0 | 1840 | 0.6493 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6603 | 81.0 | 1863 | 0.6490 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6568 | 82.0 | 1886 | 0.6485 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6561 | 83.0 | 1909 | 0.6490 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6665 | 84.0 | 1932 | 0.6499 | 0.6071 | 0.6296 | 0.0 | 0.9444 | 0.0 | 0.7556 | 0.0 |
| 0.655 | 85.0 | 1955 | 0.6492 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6509 | 86.0 | 1978 | 0.6493 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6549 | 87.0 | 2001 | 0.6493 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.655 | 88.0 | 2024 | 0.6489 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6576 | 89.0 | 2047 | 0.6493 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6612 | 90.0 | 2070 | 0.6492 | 0.6071 | 0.6296 | 0.0 | 0.9444 | 0.0 | 0.7556 | 0.0 |
| 0.6641 | 91.0 | 2093 | 0.6492 | 0.6071 | 0.6296 | 0.0 | 0.9444 | 0.0 | 0.7556 | 0.0 |
| 0.654 | 92.0 | 2116 | 0.6487 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6556 | 93.0 | 2139 | 0.6488 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6566 | 94.0 | 2162 | 0.6486 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6565 | 95.0 | 2185 | 0.6487 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6516 | 96.0 | 2208 | 0.6488 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6509 | 97.0 | 2231 | 0.6487 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6588 | 98.0 | 2254 | 0.6487 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6532 | 99.0 | 2277 | 0.6487 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
| 0.6548 | 100.0 | 2300 | 0.6487 | 0.6429 | 0.6429 | 0.0 | 1.0 | 0.0 | 0.7826 | 0.0 |
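
The per-class columns above (and in the evaluation summary at the top of the card) are standard precision, recall and F1 computed separately for the negative and positive classes. A small sketch with scikit-learn is shown below; the assumption that label 0 is the negative class and label 1 the positive class is not confirmed by the card.

```python
# Sketch of the per-class metrics used in the results table, assuming
# label 0 = negative class and label 1 = positive class (toy data only).
from sklearn.metrics import precision_recall_fscore_support

y_true = [0, 0, 1, 0, 1]  # placeholder gold labels
y_pred = [0, 0, 0, 0, 0]  # placeholder predictions (always "negative")

precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, labels=[0, 1], zero_division=0
)
print("Precision Neg/Pos:", precision)  # negative: 0.6, positive: 0.0
print("Recall    Neg/Pos:", recall)     # negative: 1.0, positive: 0.0
print("F1        Neg/Pos:", f1)         # negative: 0.75, positive: 0.0
```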

Framework versions

  • Transformers 4.16.2
  • Pytorch 1.10.2+cu113
  • Datasets 1.18.3
  • Tokenizers 0.11.0