---
license: mit
tags:
- generated_from_trainer
model-index:
- name: mlcovid19-classifier
  results: []
---

# mlcovid19-classifier
This model is a fine-tuned version of [oscarwu/mlcovid19-classifier](https://huggingface.co/oscarwu/mlcovid19-classifier) on an unknown dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the metrics):
- Loss: 0.2879
- F1 Macro: 0.7978
- F1 Misinformation: 0.9347
- F1 Factual: 0.9423
- F1 Other: 0.5166
- Prec Macro: 0.8156
- Prec Misinformation: 0.9277
- Prec Factual: 0.9345
- Prec Other: 0.5846
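
The checkpoint can be loaded like any Hugging Face sequence-classification model. A minimal inference sketch, assuming the checkpoint is a standard text-classification model hosted on the Hub; the example input is illustrative, and the label names returned depend on the checkpoint's `id2label` mapping, which this card does not document:

```python
from transformers import pipeline

# Minimal sketch: load the checkpoint as a text-classification pipeline.
# Returned labels (e.g. misinformation / factual / other) follow the
# checkpoint's own id2label mapping, which is not documented in this card.
classifier = pipeline("text-classification", model="oscarwu/mlcovid19-classifier")

# Illustrative input; the score is a model output, not a value from this card.
print(classifier("Drinking hot water cures COVID-19."))
```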
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows this list):
- learning_rate: 2e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 4096
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2607
- num_epochs: 400
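
For reference, a hedged sketch of the corresponding `TrainingArguments`, assuming single-device training so that 128 × 32 gradient-accumulation steps give the reported total train batch size of 4096; `output_dir` and any evaluation/saving settings are assumptions, not taken from this card:

```python
from transformers import TrainingArguments

# Sketch only: reproduces the listed hyperparameters; other fields are assumptions.
# The optimizer in the card (Adam, betas=(0.9, 0.999), epsilon=1e-08) matches the
# library's default AdamW settings, so no extra optimizer arguments are needed here.
training_args = TrainingArguments(
    output_dir="mlcovid19-classifier",   # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    gradient_accumulation_steps=32,      # 128 * 32 = 4096 effective train batch size
    lr_scheduler_type="linear",
    warmup_steps=2607,
    num_train_epochs=400,
)
```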

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Misinformation | F1 Factual | F1 Other | Prec Macro | Prec Misinformation | Prec Factual | Prec Other |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------------:|:----------:|:--------:|:----------:|:-------------------:|:------------:|:----------:|
0.4535 | 1.98 | 10 | 0.4122 | 0.6809 | 0.8906 | 0.8993 | 0.2529 | 0.7749 | 0.8433 | 0.9169 | 0.5646 |
0.4445 | 3.98 | 20 | 0.4056 | 0.6844 | 0.8918 | 0.9004 | 0.2611 | 0.7706 | 0.8461 | 0.9171 | 0.5487 |
0.4362 | 5.98 | 30 | 0.3966 | 0.6870 | 0.8930 | 0.9020 | 0.2658 | 0.7672 | 0.8490 | 0.9171 | 0.5356 |
0.4229 | 7.98 | 40 | 0.3864 | 0.6885 | 0.8955 | 0.9055 | 0.2645 | 0.7652 | 0.8531 | 0.9179 | 0.5246 |
0.4134 | 9.98 | 50 | 0.3774 | 0.6889 | 0.8983 | 0.9091 | 0.2594 | 0.7697 | 0.8573 | 0.9173 | 0.5345 |
0.4004 | 11.98 | 60 | 0.3682 | 0.6907 | 0.8996 | 0.9111 | 0.2616 | 0.7763 | 0.8605 | 0.9148 | 0.5536 |
0.3893 | 13.98 | 70 | 0.3583 | 0.6960 | 0.9014 | 0.9124 | 0.2740 | 0.7853 | 0.8629 | 0.9152 | 0.5778 |
0.3853 | 15.98 | 80 | 0.3483 | 0.7036 | 0.9031 | 0.9157 | 0.2920 | 0.7749 | 0.8683 | 0.9172 | 0.5390 |
0.369 | 17.98 | 90 | 0.3399 | 0.7011 | 0.9037 | 0.9167 | 0.2828 | 0.7775 | 0.8690 | 0.9159 | 0.5476 |
0.36 | 19.98 | 100 | 0.3312 | 0.7102 | 0.9056 | 0.9194 | 0.3055 | 0.7836 | 0.8733 | 0.9167 | 0.5609 |
0.3445 | 21.98 | 110 | 0.3237 | 0.7116 | 0.9065 | 0.9204 | 0.3078 | 0.7860 | 0.8749 | 0.9165 | 0.5667 |
0.3406 | 23.98 | 120 | 0.3181 | 0.7058 | 0.9068 | 0.9212 | 0.2893 | 0.7880 | 0.8740 | 0.9162 | 0.5738 |
0.3286 | 25.98 | 130 | 0.3094 | 0.7183 | 0.9099 | 0.9250 | 0.32 | 0.7932 | 0.8782 | 0.9216 | 0.5797 |
0.3213 | 27.98 | 140 | 0.3049 | 0.7187 | 0.9111 | 0.9254 | 0.3196 | 0.7957 | 0.8800 | 0.9204 | 0.5867 |
0.3111 | 29.98 | 150 | 0.3017 | 0.7219 | 0.9129 | 0.9264 | 0.3263 | 0.7983 | 0.8843 | 0.9178 | 0.5927 |
0.3087 | 31.98 | 160 | 0.2970 | 0.7231 | 0.9132 | 0.9276 | 0.3287 | 0.7977 | 0.8850 | 0.9188 | 0.5893 |
0.2992 | 33.98 | 170 | 0.2926 | 0.7243 | 0.9141 | 0.9293 | 0.3293 | 0.8003 | 0.8839 | 0.9235 | 0.5935 |
0.2924 | 35.98 | 180 | 0.2892 | 0.7312 | 0.9150 | 0.9303 | 0.3482 | 0.7971 | 0.8889 | 0.9218 | 0.5806 |
0.2878 | 37.98 | 190 | 0.2870 | 0.7356 | 0.9173 | 0.9324 | 0.3571 | 0.8027 | 0.8906 | 0.9246 | 0.5929 |
0.2811 | 39.98 | 200 | 0.2844 | 0.7439 | 0.9188 | 0.9328 | 0.3801 | 0.8109 | 0.8954 | 0.9213 | 0.6161 |
0.2751 | 41.98 | 210 | 0.2816 | 0.7500 | 0.9197 | 0.9340 | 0.3963 | 0.8060 | 0.8973 | 0.9250 | 0.5956 |
0.2683 | 43.98 | 220 | 0.2798 | 0.7517 | 0.9210 | 0.9339 | 0.4000 | 0.8068 | 0.8976 | 0.9272 | 0.5956 |
0.2643 | 45.98 | 230 | 0.2766 | 0.7544 | 0.9221 | 0.9349 | 0.4062 | 0.8064 | 0.8990 | 0.9290 | 0.5910 |
0.2619 | 47.98 | 240 | 0.2736 | 0.7579 | 0.9227 | 0.9356 | 0.4155 | 0.8085 | 0.9002 | 0.9298 | 0.5954 |
0.2539 | 49.98 | 250 | 0.2733 | 0.7567 | 0.9231 | 0.9357 | 0.4111 | 0.8060 | 0.9006 | 0.9302 | 0.5872 |
0.2496 | 51.98 | 260 | 0.2713 | 0.7600 | 0.9235 | 0.9360 | 0.4206 | 0.8070 | 0.9009 | 0.9320 | 0.5881 |
0.2455 | 53.98 | 270 | 0.2697 | 0.7575 | 0.9231 | 0.9356 | 0.4139 | 0.8052 | 0.9009 | 0.9304 | 0.5844 |
0.2371 | 55.98 | 280 | 0.2686 | 0.7652 | 0.9239 | 0.9356 | 0.4360 | 0.8058 | 0.9058 | 0.9283 | 0.5833 |
0.2316 | 57.98 | 290 | 0.2686 | 0.7664 | 0.9243 | 0.9361 | 0.4389 | 0.8037 | 0.9073 | 0.9288 | 0.5749 |
0.2258 | 59.98 | 300 | 0.2664 | 0.7680 | 0.9247 | 0.9360 | 0.4431 | 0.8018 | 0.9095 | 0.9279 | 0.5680 |
0.2207 | 61.98 | 310 | 0.2663 | 0.7736 | 0.9262 | 0.9373 | 0.4574 | 0.8015 | 0.9145 | 0.9276 | 0.5625 |
0.2167 | 63.98 | 320 | 0.2643 | 0.7715 | 0.9268 | 0.9380 | 0.4498 | 0.8003 | 0.9127 | 0.9312 | 0.5571 |
0.2131 | 65.98 | 330 | 0.2627 | 0.7753 | 0.9287 | 0.9398 | 0.4573 | 0.8064 | 0.9123 | 0.9356 | 0.5714 |
0.2075 | 67.98 | 340 | 0.2644 | 0.7760 | 0.9290 | 0.9397 | 0.4593 | 0.8056 | 0.9136 | 0.9349 | 0.5682 |
0.2049 | 69.98 | 350 | 0.2648 | 0.7768 | 0.9290 | 0.9390 | 0.4623 | 0.8050 | 0.9174 | 0.9292 | 0.5685 |
0.2016 | 71.98 | 360 | 0.2631 | 0.7771 | 0.9295 | 0.9394 | 0.4623 | 0.8055 | 0.9165 | 0.9316 | 0.5685 |
0.1979 | 73.98 | 370 | 0.2644 | 0.7793 | 0.9305 | 0.9397 | 0.4677 | 0.8041 | 0.9208 | 0.9295 | 0.5620 |
0.1939 | 75.98 | 380 | 0.2671 | 0.7909 | 0.9312 | 0.9392 | 0.5023 | 0.8099 | 0.9272 | 0.9256 | 0.5771 |
0.1932 | 77.98 | 390 | 0.2648 | 0.7927 | 0.9325 | 0.9422 | 0.5035 | 0.8104 | 0.9242 | 0.9361 | 0.5709 |
0.1856 | 79.98 | 400 | 0.2615 | 0.7922 | 0.9331 | 0.9431 | 0.5004 | 0.8111 | 0.9235 | 0.9379 | 0.5719 |
0.1837 | 81.98 | 410 | 0.2624 | 0.7898 | 0.9328 | 0.9447 | 0.4920 | 0.8141 | 0.9183 | 0.9432 | 0.5808 |
0.1781 | 83.98 | 420 | 0.2660 | 0.7988 | 0.9334 | 0.9432 | 0.5196 | 0.8128 | 0.9263 | 0.9388 | 0.5733 |
0.172 | 85.98 | 430 | 0.2642 | 0.7909 | 0.9335 | 0.9428 | 0.4964 | 0.8139 | 0.9234 | 0.9353 | 0.5829 |
0.172 | 87.98 | 440 | 0.2695 | 0.7880 | 0.9321 | 0.9430 | 0.4889 | 0.8121 | 0.9172 | 0.9422 | 0.5771 |
0.1656 | 89.98 | 450 | 0.2671 | 0.7928 | 0.9337 | 0.9436 | 0.5012 | 0.8145 | 0.9212 | 0.9411 | 0.5811 |
0.163 | 91.98 | 460 | 0.2693 | 0.7949 | 0.9331 | 0.9429 | 0.5088 | 0.8111 | 0.9232 | 0.9408 | 0.5692 |
0.1555 | 93.98 | 470 | 0.2696 | 0.7967 | 0.9332 | 0.9431 | 0.5138 | 0.8142 | 0.9203 | 0.9449 | 0.5776 |
0.1513 | 95.98 | 480 | 0.2710 | 0.7985 | 0.9340 | 0.9443 | 0.5172 | 0.8156 | 0.9220 | 0.9450 | 0.5798 |
0.1478 | 97.98 | 490 | 0.2722 | 0.7991 | 0.9342 | 0.9442 | 0.5189 | 0.8138 | 0.9243 | 0.9436 | 0.5736 |
0.1435 | 99.98 | 500 | 0.2725 | 0.7981 | 0.9343 | 0.9432 | 0.5166 | 0.8124 | 0.9248 | 0.9424 | 0.57 |
0.1409 | 101.98 | 510 | 0.2754 | 0.7994 | 0.9345 | 0.9432 | 0.5206 | 0.8161 | 0.9231 | 0.9433 | 0.5819 |
0.1384 | 103.98 | 520 | 0.2817 | 0.7991 | 0.9347 | 0.9441 | 0.5184 | 0.8166 | 0.9233 | 0.9436 | 0.5828 |
0.1333 | 105.98 | 530 | 0.2833 | 0.7934 | 0.9351 | 0.9434 | 0.5016 | 0.8178 | 0.9232 | 0.9380 | 0.5921 |
0.1267 | 107.98 | 540 | 0.2929 | 0.7884 | 0.9337 | 0.9429 | 0.4886 | 0.8167 | 0.9198 | 0.9377 | 0.5925 |
0.1234 | 109.98 | 550 | 0.2879 | 0.7978 | 0.9347 | 0.9423 | 0.5166 | 0.8156 | 0.9277 | 0.9345 | 0.5846 |

### Framework versions

- Transformers 4.23.1
- Pytorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.1