
anilerkul/results

This model is a fine-tuned version of microsoft/SportsBERT on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.8218
  • Accuracy: 0.7533
  • F1: 0.5308
  • Precision: 0.5341
  • Recall: 0.5609
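
Since the dataset and label set are not documented, predictions should be validated before use. As a minimal usage sketch (assuming a standard sequence-classification head; the example input is purely illustrative):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint via the text-classification pipeline.
# The label names come from the (undocumented) fine-tuning dataset, so
# inspect the returned labels before relying on them.
classifier = pipeline("text-classification", model="anilerkul/results")

print(classifier("The striker scored twice in the second half."))
```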

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 5
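
For reference, this configuration can be expressed with `TrainingArguments` (a hedged sketch; `output_dir` and the surrounding Trainer setup are not published, so those parts are illustrative only):

```python
from transformers import TrainingArguments

# Reconstruction of the reported hyperparameters. The Adam betas and
# epsilon listed above are the transformers defaults, spelled out
# explicitly here for clarity.
training_args = TrainingArguments(
    output_dir="results",  # assumed; matches the repo name
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5,
)
```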

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall
6.3122 0.03 10 6.3037 0.0 0.0 0.0 0.0
6.275 0.05 20 6.2851 0.0 0.0 0.0 0.0
6.2656 0.08 30 6.2558 0.0 0.0 0.0 0.0
6.2088 0.1 40 6.1995 0.0015 0.0001 0.0001 0.0008
6.1868 0.13 50 6.1330 0.1196 0.0016 0.0009 0.0058
6.1126 0.16 60 6.0544 0.1315 0.0013 0.0007 0.0056
6.0563 0.18 70 5.9539 0.1315 0.0013 0.0007 0.0056
5.9162 0.21 80 5.8702 0.1315 0.0013 0.0007 0.0056
5.9922 0.23 90 5.8097 0.1315 0.0013 0.0007 0.0056
5.7775 0.26 100 5.7482 0.1315 0.0013 0.0007 0.0056
5.7196 0.29 110 5.6948 0.1315 0.0013 0.0007 0.0056
5.6704 0.31 120 5.6474 0.1315 0.0013 0.0007 0.0056
5.732 0.34 130 5.6125 0.1315 0.0013 0.0007 0.0056
5.5617 0.37 140 5.5776 0.1315 0.0013 0.0007 0.0056
5.5267 0.39 150 5.5558 0.1315 0.0013 0.0007 0.0056
5.7006 0.42 160 5.5266 0.1315 0.0013 0.0007 0.0056
5.4703 0.44 170 5.4877 0.1315 0.0013 0.0007 0.0056
5.5078 0.47 180 5.4431 0.1315 0.0013 0.0007 0.0056
5.4349 0.5 190 5.4124 0.1315 0.0013 0.0007 0.0056
5.6199 0.52 200 5.3824 0.1315 0.0013 0.0007 0.0056
5.002 0.55 210 5.3414 0.1315 0.0013 0.0007 0.0056
5.2497 0.57 220 5.3030 0.1315 0.0013 0.0007 0.0056
4.8403 0.6 230 5.2723 0.1315 0.0013 0.0007 0.0056
5.2636 0.63 240 5.2070 0.1448 0.0092 0.0131 0.0122
5.3921 0.65 250 5.2232 0.1713 0.0191 0.0192 0.0285
5.3127 0.68 260 5.1045 0.1462 0.0118 0.0120 0.0158
5.0857 0.7 270 5.0514 0.1787 0.0243 0.0304 0.0313
5.0425 0.73 280 4.9835 0.1802 0.0230 0.0238 0.0332
5.0088 0.76 290 4.9355 0.1905 0.0269 0.0283 0.0338
5.0987 0.78 300 4.8663 0.2112 0.0317 0.0360 0.0434
5.0445 0.81 310 4.8652 0.2378 0.0414 0.0369 0.0562
4.928 0.84 320 4.7639 0.2230 0.0369 0.0361 0.0487
4.7661 0.86 330 4.6684 0.2482 0.0491 0.0495 0.0606
4.7903 0.89 340 4.6012 0.2806 0.0713 0.0804 0.0929
4.5534 0.91 350 4.4948 0.3087 0.0853 0.0846 0.1086
4.6951 0.94 360 4.4281 0.3146 0.0922 0.0886 0.1136
4.379 0.97 370 4.4067 0.2984 0.0881 0.0950 0.1084
4.5588 0.99 380 4.3093 0.3176 0.0865 0.0944 0.1062
4.4383 1.02 390 4.2825 0.3486 0.1070 0.1039 0.1341
4.505 1.04 400 4.2196 0.3604 0.1125 0.1124 0.1386
3.7169 1.07 410 4.1832 0.3250 0.1008 0.1166 0.1180
3.7538 1.1 420 4.2051 0.4062 0.1526 0.1565 0.2016
3.5894 1.12 430 4.1349 0.3294 0.1042 0.1156 0.1206
4.045 1.15 440 4.0583 0.3855 0.1330 0.1388 0.1624
3.1886 1.17 450 3.9559 0.3944 0.1403 0.1485 0.1799
3.8633 1.2 460 3.9481 0.4092 0.1523 0.1468 0.2027
3.7127 1.23 470 3.8654 0.4151 0.1574 0.1591 0.1974
3.6555 1.25 480 3.8571 0.4210 0.1615 0.1603 0.2015
3.854 1.28 490 3.8096 0.4151 0.1542 0.1488 0.1897
3.7229 1.31 500 3.7690 0.4328 0.1716 0.1653 0.2164
3.9952 1.33 510 3.7769 0.4092 0.1638 0.1533 0.1937
3.4301 1.36 520 3.7061 0.4756 0.1968 0.1962 0.2515
4.1302 1.38 530 3.6265 0.4549 0.1919 0.1898 0.2297
3.5395 1.41 540 3.5806 0.4564 0.1917 0.1845 0.2368
3.2723 1.44 550 3.4968 0.4904 0.2195 0.2175 0.2682
3.1159 1.46 560 3.4794 0.4461 0.1967 0.2059 0.2205
3.3653 1.49 570 3.4419 0.4963 0.2216 0.2162 0.2864
3.1493 1.51 580 3.3549 0.4682 0.2123 0.2014 0.2564
2.8797 1.54 590 3.3212 0.5066 0.2594 0.2696 0.2992
3.3974 1.57 600 3.4043 0.4934 0.2324 0.2481 0.2751
2.9646 1.59 610 3.2956 0.4815 0.2238 0.2297 0.2535
2.5164 1.62 620 3.2025 0.4978 0.2191 0.2216 0.2596
3.1884 1.64 630 3.2710 0.5258 0.2607 0.2536 0.3152
3.5247 1.67 640 3.2109 0.5022 0.2348 0.2314 0.2739
3.2349 1.7 650 3.1718 0.5022 0.2391 0.2310 0.2804
3.0547 1.72 660 3.1540 0.5244 0.2693 0.2701 0.3103
2.6583 1.75 670 3.1089 0.5244 0.2491 0.2536 0.2864
2.7558 1.78 680 3.0492 0.5539 0.2897 0.2936 0.3351
2.289 1.8 690 3.1638 0.5022 0.2647 0.2837 0.2901
2.993 1.83 700 3.1440 0.5598 0.3071 0.3153 0.3659
3.1635 1.85 710 3.1380 0.4712 0.2289 0.2331 0.2569
3.1843 1.88 720 3.1009 0.4919 0.2453 0.2614 0.2718
2.6742 1.91 730 2.9769 0.5598 0.2931 0.2973 0.3383
2.9256 1.93 740 2.9560 0.5554 0.2834 0.2840 0.3348
2.4105 1.96 750 2.9320 0.5495 0.3003 0.3150 0.3446
2.9523 1.98 760 2.8481 0.5539 0.3025 0.3149 0.3452
2.6287 2.01 770 2.8092 0.5746 0.3231 0.3314 0.3631
2.6051 2.04 780 2.7894 0.5761 0.3248 0.3368 0.3639
1.9671 2.06 790 2.8455 0.5716 0.3203 0.3400 0.3522
2.2805 2.09 800 2.8596 0.5687 0.3086 0.3227 0.3452
2.5447 2.11 810 2.7331 0.6086 0.3369 0.3323 0.3932
2.7499 2.14 820 2.7018 0.5997 0.3358 0.3367 0.3867
2.1172 2.17 830 2.6629 0.5982 0.3320 0.3368 0.3762
2.143 2.19 840 2.6428 0.6278 0.3565 0.3612 0.4012
2.4473 2.22 850 2.5931 0.6233 0.3622 0.3769 0.4044
1.8206 2.25 860 2.5862 0.6145 0.3554 0.3525 0.4019
2.4884 2.27 870 2.5626 0.6263 0.3631 0.3640 0.4066
1.9498 2.3 880 2.5493 0.6189 0.3530 0.3655 0.3908
2.4184 2.32 890 2.5093 0.6381 0.3699 0.3801 0.4069
1.995 2.35 900 2.5456 0.6499 0.3831 0.3924 0.4172
2.4355 2.38 910 2.5306 0.6425 0.3787 0.3879 0.4126
1.8367 2.4 920 2.4947 0.6484 0.3799 0.3811 0.4207
1.6993 2.43 930 2.4694 0.6588 0.3971 0.3966 0.4462
1.6198 2.45 940 2.4768 0.6691 0.4087 0.4091 0.4498
2.0119 2.48 950 2.4319 0.6514 0.3892 0.3983 0.4202
1.3796 2.51 960 2.4279 0.6647 0.4042 0.4188 0.4332
1.5978 2.53 970 2.4716 0.6677 0.4062 0.4048 0.4537
2.26 2.56 980 2.4160 0.6736 0.4150 0.4216 0.4459
1.9445 2.58 990 2.4038 0.6750 0.4178 0.4259 0.4529
1.9551 2.61 1000 2.3866 0.6736 0.4056 0.4077 0.4442
2.052 2.64 1010 2.3938 0.6765 0.4157 0.4100 0.4654
2.0671 2.66 1020 2.4113 0.6736 0.4158 0.4119 0.4687
1.7332 2.69 1030 2.3930 0.6706 0.3984 0.3972 0.4432
1.9113 2.72 1040 2.3661 0.6780 0.4061 0.4035 0.4553
1.7881 2.74 1050 2.3104 0.6765 0.4149 0.4181 0.4588
1.6475 2.77 1060 2.2779 0.6824 0.4282 0.4316 0.4735
1.9959 2.79 1070 2.2720 0.6898 0.4332 0.4354 0.4746
1.5039 2.82 1080 2.2858 0.6839 0.4273 0.4330 0.4693
1.9764 2.85 1090 2.3054 0.6780 0.4123 0.4199 0.4555
1.7056 2.87 1100 2.2503 0.6809 0.4100 0.4185 0.4408
1.4112 2.9 1110 2.2162 0.7046 0.4379 0.4441 0.4758
2.1521 2.92 1120 2.2133 0.7046 0.4444 0.4507 0.4827
1.4928 2.95 1130 2.1953 0.7046 0.4498 0.4584 0.4882
1.5147 2.98 1140 2.1814 0.7061 0.4485 0.4544 0.4882
2.0173 3.0 1150 2.1921 0.6957 0.4344 0.4373 0.4792
1.4601 3.03 1160 2.1690 0.6957 0.4434 0.4473 0.4838
1.3261 3.05 1170 2.1156 0.7149 0.4656 0.4750 0.4958
1.6506 3.08 1180 2.0940 0.7149 0.4542 0.4632 0.4857
1.1869 3.11 1190 2.0919 0.7134 0.4597 0.4590 0.5002
1.4337 3.13 1200 2.1363 0.7090 0.4560 0.4518 0.5073
1.2734 3.16 1210 2.1231 0.7090 0.4585 0.4522 0.5095
1.6794 3.19 1220 2.0523 0.7238 0.4643 0.4660 0.4985
1.5335 3.21 1230 2.0347 0.7282 0.4611 0.4578 0.4994
0.9728 3.24 1240 2.0415 0.7253 0.4643 0.4549 0.5103
1.4616 3.26 1250 2.0451 0.7164 0.4612 0.4522 0.5098
1.2002 3.29 1260 2.0137 0.7253 0.4793 0.4808 0.5211
1.2331 3.32 1270 2.0234 0.7267 0.4794 0.4782 0.5223
1.3334 3.34 1280 2.0507 0.7134 0.4694 0.4721 0.5150
1.2774 3.37 1290 2.0493 0.7208 0.4813 0.4814 0.5217
1.0036 3.39 1300 2.0843 0.7179 0.4879 0.4945 0.5166
1.4289 3.42 1310 2.0544 0.7179 0.4843 0.4859 0.5262
1.0987 3.45 1320 2.0389 0.7164 0.4822 0.4803 0.5251
0.9749 3.47 1330 2.0509 0.7194 0.4927 0.4955 0.5334
1.23 3.5 1340 2.0327 0.7297 0.4963 0.5032 0.5260
1.2873 3.52 1350 2.0175 0.7312 0.4935 0.4969 0.5285
1.3335 3.55 1360 1.9970 0.7282 0.4955 0.4976 0.5331
1.1209 3.58 1370 1.9708 0.7326 0.4930 0.4929 0.5307
1.2895 3.6 1380 1.9480 0.7356 0.4967 0.4926 0.5358
1.3765 3.63 1390 1.9850 0.7297 0.5002 0.4913 0.5492
1.0298 3.66 1400 1.9649 0.7312 0.5022 0.5002 0.5443
1.4707 3.68 1410 1.9589 0.7326 0.4968 0.4995 0.5341
1.5404 3.71 1420 1.9712 0.7386 0.5056 0.5069 0.5395
1.2394 3.73 1430 1.9733 0.7386 0.5127 0.5142 0.5469
1.0191 3.76 1440 1.9696 0.7326 0.5005 0.5031 0.5342
0.8809 3.79 1450 1.9569 0.7386 0.5076 0.5022 0.5463
0.8113 3.81 1460 1.9445 0.7386 0.4960 0.4922 0.5313
0.8888 3.84 1470 1.9434 0.7415 0.4944 0.4934 0.5263
0.9775 3.86 1480 1.9311 0.7430 0.5001 0.4967 0.5343
1.5036 3.89 1490 1.9008 0.7445 0.5008 0.4964 0.5368
1.1425 3.92 1500 1.8969 0.7459 0.5113 0.5075 0.5461
1.0492 3.94 1510 1.8869 0.7459 0.5057 0.5027 0.5424
1.0938 3.97 1520 1.8864 0.7489 0.5111 0.5114 0.5432
1.3599 3.99 1530 1.8901 0.7459 0.5102 0.5071 0.5457
1.0393 4.02 1540 1.8817 0.7459 0.5145 0.5100 0.5515
0.8796 4.05 1550 1.8760 0.7430 0.5076 0.5043 0.5456
0.6769 4.07 1560 1.8813 0.7445 0.5094 0.5078 0.5466
1.1151 4.1 1570 1.8843 0.7430 0.5127 0.5083 0.5548
0.8389 4.13 1580 1.8787 0.7489 0.5175 0.5165 0.5556
0.8193 4.15 1590 1.8831 0.7415 0.5091 0.5064 0.5475
0.9354 4.18 1600 1.8832 0.7445 0.5086 0.5046 0.5456
0.7061 4.2 1610 1.8696 0.7445 0.5079 0.5039 0.5455
0.8033 4.23 1620 1.8655 0.7430 0.5037 0.4979 0.5458
1.0084 4.26 1630 1.8592 0.7459 0.5053 0.4996 0.5474
0.9944 4.28 1640 1.8578 0.7474 0.5107 0.5089 0.5464
0.9228 4.31 1650 1.8606 0.7459 0.5153 0.5178 0.5443
0.9574 4.33 1660 1.8575 0.7489 0.5194 0.5242 0.5483
0.7753 4.36 1670 1.8569 0.7489 0.5182 0.5223 0.5483
0.7223 4.39 1680 1.8525 0.7489 0.5188 0.5194 0.5515
0.8973 4.41 1690 1.8519 0.7518 0.5194 0.5184 0.5527
0.771 4.44 1700 1.8503 0.7533 0.5282 0.5339 0.5570
0.9367 4.46 1710 1.8546 0.7533 0.5260 0.5348 0.5527
1.1453 4.49 1720 1.8503 0.7533 0.5252 0.5335 0.5520
1.1738 4.52 1730 1.8443 0.7504 0.5263 0.5315 0.5552
1.0122 4.54 1740 1.8402 0.7548 0.5274 0.5296 0.5578
1.0207 4.57 1750 1.8371 0.7548 0.5251 0.5273 0.5582
0.8991 4.6 1760 1.8367 0.7504 0.5225 0.5213 0.5580
0.8017 4.62 1770 1.8375 0.7489 0.5213 0.5208 0.5579
0.9423 4.65 1780 1.8384 0.7518 0.5220 0.5217 0.5584
0.8043 4.67 1790 1.8366 0.7518 0.5241 0.5245 0.5588
0.7625 4.7 1800 1.8358 0.7504 0.5216 0.5203 0.5566
0.9742 4.73 1810 1.8344 0.7533 0.5278 0.5299 0.5606
0.8809 4.75 1820 1.8313 0.7504 0.5292 0.5307 0.5629
0.8433 4.78 1830 1.8299 0.7504 0.5292 0.5307 0.5629
0.7195 4.8 1840 1.8282 0.7533 0.5319 0.5342 0.5639
0.7989 4.83 1850 1.8270 0.7533 0.5316 0.5342 0.5635
0.7612 4.86 1860 1.8253 0.7563 0.5348 0.5372 0.5666
0.9571 4.88 1870 1.8240 0.7563 0.5351 0.5373 0.5666
0.7009 4.91 1880 1.8232 0.7563 0.5351 0.5373 0.5666
0.7424 4.93 1890 1.8224 0.7533 0.5293 0.5314 0.5609
1.0661 4.96 1900 1.8218 0.7533 0.5291 0.5315 0.5609
0.9666 4.99 1910 1.8217 0.7533 0.5291 0.5315 0.5609
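
The averaging scheme behind the F1, precision, and recall columns is not documented; macro averaging is a plausible guess, since the reported F1 sits well below accuracy, as is typical for macro scores on imbalanced multi-class data. A hedged sketch of a compatible `compute_metrics` function:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes in.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # average="macro" is an assumption; the card does not state it.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```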

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.1.0+cu121
  • Tokenizers 0.15.2