# lora-roberta-large-fine-emo
This model is a LoRA fine-tuned version of [roberta-large](https://huggingface.co/FacebookAI/roberta-large) on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how such metrics can be computed follows the list):
- Loss: 0.7876
- Accuracy: 0.7175
- Prec: 0.6298
- Recall: 0.5922
- F1: 0.6045
- B Acc: 0.5922
- Prec Joy: 0.6497
- Recall Joy: 0.7540
- F1 Joy: 0.6980
- Prec Anger: 0.6146
- Recall Anger: 0.6435
- F1 Anger: 0.6287
- Prec Disgust: 0.4805
- Recall Disgust: 0.4393
- F1 Disgust: 0.4590
- Prec Fear: 0.6954
- Recall Fear: 0.5953
- F1 Fear: 0.6415
- Prec Neutral: 0.8410
- Recall Neutral: 0.8250
- F1 Neutral: 0.8329
- Prec Sadness: 0.6719
- Recall Sadness: 0.6124
- F1 Sadness: 0.6408
- Prec Surprise: 0.5377
- Recall Surprise: 0.4215
- F1 Surprise: 0.4726
- Prec Amusement: 0.7297
- Recall Amusement: 0.8284
- F1 Amusement: 0.7759
- Prec Anxiety: 0.4420
- Recall Anxiety: 0.4896
- F1 Anxiety: 0.4646
- Prec Guilt: 0.7911
- Recall Guilt: 0.7740
- F1 Guilt: 0.7824
- Prec Love: 0.6545
- Recall Love: 0.5996
- F1 Love: 0.6259
- Prec Optimism: 0.7228
- Recall Optimism: 0.5915
- F1 Optimism: 0.6506
- Prec Pessimism: 0.3571
- Recall Pessimism: 0.125
- F1 Pessimism: 0.1852
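The overall Prec, Recall, and F1 above appear to be macro-averaged over the 13 emotion classes, and "B Acc" matches the macro recall, i.e. balanced accuracy. The evaluation code is not included in this card, so the scikit-learn sketch below is only an assumption of how such numbers could be produced; the label names and their order are assumptions as well.

```python
from sklearn.metrics import (accuracy_score, balanced_accuracy_score,
                             precision_recall_fscore_support)

# Hypothetical label set and order; the model's actual id2label mapping is not documented.
LABELS = ["joy", "anger", "disgust", "fear", "neutral", "sadness", "surprise",
          "amusement", "anxiety", "guilt", "love", "optimism", "pessimism"]

def compute_metrics(y_true, y_pred):
    """Compute metrics in the style reported above (macro averaging assumed)."""
    metrics = {
        "accuracy": accuracy_score(y_true, y_pred),
        "b_acc": balanced_accuracy_score(y_true, y_pred),  # equals macro-averaged recall
    }
    prec, rec, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="macro", zero_division=0)
    metrics.update({"prec": prec, "recall": rec, "f1": f1})
    # Per-class scores: one precision/recall/F1 triple per emotion.
    p, r, f, _ = precision_recall_fscore_support(
        y_true, y_pred, labels=range(len(LABELS)), zero_division=0)
    for i, name in enumerate(LABELS):
        metrics[f"prec_{name}"] = p[i]
        metrics[f"recall_{name}"] = r[i]
        metrics[f"f1_{name}"] = f[i]
    return metrics
```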
## Model description
More information needed
## Intended uses & limitations
More information needed
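No usage example is provided, but since this is a LoRA adapter for roberta-large, it can presumably be loaded with the peft library. The sketch below is an assumption rather than a documented recipe: the repository id, the number of labels (13, matching the classes reported above), and the label order are all unverified.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import PeftModel

BASE = "FacebookAI/roberta-large"
ADAPTER = "anniew666/lora-roberta-large-fine-emo"  # this repository

tokenizer = AutoTokenizer.from_pretrained(BASE)
# 13 emotion classes are reported above; the id-to-label order is an assumption.
base_model = AutoModelForSequenceClassification.from_pretrained(BASE, num_labels=13)
model = PeftModel.from_pretrained(base_model, ADAPTER)
model.eval()

inputs = tokenizer("I can't believe we actually won!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_id = logits.argmax(dim=-1).item()
print(pred_id)  # map to an emotion name once the id2label mapping is confirmed
```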
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the TrainingArguments sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 20.0
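The training script is not included in this card. As a rough sketch, the hyperparameters above would correspond to a transformers `TrainingArguments` configuration along the following lines; the output directory and the per-device interpretation of the reported batch sizes are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lora-roberta-large-fine-emo",  # assumed output directory
    learning_rate=1e-4,
    per_device_train_batch_size=128,  # reported batch sizes assumed to be per device
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=20.0,
)
```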
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Prec | Recall | F1 | B Acc | Prec Joy | Recall Joy | F1 Joy | Prec Anger | Recall Anger | F1 Anger | Prec Disgust | Recall Disgust | F1 Disgust | Prec Fear | Recall Fear | F1 Fear | Prec Neutral | Recall Neutral | F1 Neutral | Prec Sadness | Recall Sadness | F1 Sadness | Prec Surprise | Recall Surprise | F1 Surprise | Prec Amusement | Recall Amusement | F1 Amusement | Prec Anxiety | Recall Anxiety | F1 Anxiety | Prec Guilt | Recall Guilt | F1 Guilt | Prec Love | Recall Love | F1 Love | Prec Optimism | Recall Optimism | F1 Optimism | Prec Pessimism | Recall Pessimism | F1 Pessimism |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.0157 | 1.0 | 1756 | 0.9691 | 0.6663 | 0.5337 | 0.5038 | 0.5122 | 0.5038 | 0.6173 | 0.6805 | 0.6474 | 0.5681 | 0.4796 | 0.5201 | 0.4619 | 0.3893 | 0.4225 | 0.6567 | 0.4117 | 0.5061 | 0.8134 | 0.8269 | 0.8201 | 0.5633 | 0.56 | 0.5616 | 0.4613 | 0.3204 | 0.3782 | 0.6554 | 0.7657 | 0.7062 | 0.3637 | 0.5292 | 0.4311 | 0.7048 | 0.6130 | 0.6557 | 0.4626 | 0.5186 | 0.4890 | 0.6092 | 0.4538 | 0.5202 | 0.0 | 0.0 | 0.0 |
0.9186 | 2.0 | 3512 | 0.8979 | 0.6873 | 0.5497 | 0.5388 | 0.5376 | 0.5388 | 0.6239 | 0.7230 | 0.6698 | 0.5644 | 0.5817 | 0.5729 | 0.4373 | 0.4357 | 0.4365 | 0.5923 | 0.5743 | 0.5831 | 0.8235 | 0.8298 | 0.8266 | 0.6103 | 0.5373 | 0.5715 | 0.5451 | 0.2796 | 0.3696 | 0.5699 | 0.8746 | 0.6901 | 0.4437 | 0.4174 | 0.4302 | 0.7510 | 0.6872 | 0.7177 | 0.5687 | 0.5129 | 0.5393 | 0.6165 | 0.5509 | 0.5818 | 0.0 | 0.0 | 0.0 |
0.8686 | 3.0 | 5268 | 0.8550 | 0.6993 | 0.5880 | 0.5603 | 0.5700 | 0.5603 | 0.6758 | 0.6879 | 0.6818 | 0.5462 | 0.6339 | 0.5868 | 0.4386 | 0.4339 | 0.4363 | 0.6578 | 0.5485 | 0.5982 | 0.8159 | 0.8431 | 0.8293 | 0.6360 | 0.5707 | 0.6015 | 0.4874 | 0.3753 | 0.4241 | 0.6702 | 0.8251 | 0.7396 | 0.4647 | 0.4431 | 0.4537 | 0.8147 | 0.6600 | 0.7293 | 0.5840 | 0.5634 | 0.5735 | 0.6426 | 0.5994 | 0.6202 | 0.2105 | 0.1 | 0.1356 |
0.8386 | 4.0 | 7024 | 0.8276 | 0.7080 | 0.6131 | 0.5607 | 0.5805 | 0.5607 | 0.6812 | 0.6885 | 0.6848 | 0.6102 | 0.5716 | 0.5903 | 0.4691 | 0.4339 | 0.4508 | 0.6514 | 0.5813 | 0.6143 | 0.7868 | 0.8752 | 0.8286 | 0.6617 | 0.5702 | 0.6126 | 0.5489 | 0.3441 | 0.4230 | 0.7143 | 0.8251 | 0.7657 | 0.4824 | 0.4481 | 0.4646 | 0.7817 | 0.7251 | 0.7523 | 0.6292 | 0.5500 | 0.5870 | 0.6866 | 0.5759 | 0.6264 | 0.2667 | 0.1 | 0.1455 |
0.8125 | 5.0 | 8780 | 0.8200 | 0.7102 | 0.6011 | 0.5845 | 0.5887 | 0.5845 | 0.6745 | 0.7044 | 0.6891 | 0.6341 | 0.5779 | 0.6047 | 0.4914 | 0.4071 | 0.4453 | 0.7038 | 0.5836 | 0.6381 | 0.8196 | 0.8475 | 0.8333 | 0.6499 | 0.5858 | 0.6162 | 0.4950 | 0.4237 | 0.4565 | 0.6650 | 0.8845 | 0.7592 | 0.4218 | 0.5361 | 0.4721 | 0.7771 | 0.7251 | 0.7502 | 0.6509 | 0.5777 | 0.6121 | 0.6140 | 0.6197 | 0.6168 | 0.2174 | 0.125 | 0.1587 |
0.7981 | 6.0 | 10536 | 0.8111 | 0.7131 | 0.5989 | 0.5719 | 0.5793 | 0.5719 | 0.6839 | 0.6987 | 0.6912 | 0.6277 | 0.6090 | 0.6182 | 0.5151 | 0.3964 | 0.4480 | 0.6894 | 0.6023 | 0.6429 | 0.8023 | 0.8626 | 0.8313 | 0.7311 | 0.5231 | 0.6098 | 0.5577 | 0.3688 | 0.4440 | 0.6710 | 0.8548 | 0.7518 | 0.4138 | 0.5697 | 0.4794 | 0.8109 | 0.7288 | 0.7676 | 0.5881 | 0.6301 | 0.6084 | 0.6943 | 0.5900 | 0.6379 | 0.0 | 0.0 | 0.0 |
0.7883 | 7.0 | 12292 | 0.8049 | 0.7133 | 0.6065 | 0.5998 | 0.6008 | 0.5998 | 0.6620 | 0.7362 | 0.6971 | 0.6193 | 0.6080 | 0.6136 | 0.4283 | 0.4911 | 0.4576 | 0.6456 | 0.6199 | 0.6325 | 0.8404 | 0.8292 | 0.8348 | 0.6721 | 0.5849 | 0.6255 | 0.5047 | 0.4075 | 0.4509 | 0.7096 | 0.8548 | 0.7754 | 0.4352 | 0.5114 | 0.4702 | 0.7722 | 0.7541 | 0.7630 | 0.6345 | 0.6072 | 0.6206 | 0.7111 | 0.5931 | 0.6468 | 0.25 | 0.2 | 0.2222 |
0.7724 | 8.0 | 14048 | 0.7981 | 0.7174 | 0.6296 | 0.5832 | 0.5941 | 0.5832 | 0.6869 | 0.7086 | 0.6976 | 0.6292 | 0.6277 | 0.6284 | 0.5897 | 0.3696 | 0.4544 | 0.6136 | 0.6538 | 0.6331 | 0.8224 | 0.8514 | 0.8367 | 0.6859 | 0.5716 | 0.6235 | 0.5257 | 0.4075 | 0.4591 | 0.7333 | 0.7987 | 0.7646 | 0.4078 | 0.5618 | 0.4725 | 0.7782 | 0.7740 | 0.7761 | 0.6324 | 0.6101 | 0.6211 | 0.7046 | 0.5712 | 0.6309 | 0.375 | 0.075 | 0.1250 |
0.7644 | 9.0 | 15804 | 0.7984 | 0.7155 | 0.6223 | 0.5727 | 0.5888 | 0.5727 | 0.6514 | 0.7481 | 0.6964 | 0.6069 | 0.6473 | 0.6265 | 0.5131 | 0.4196 | 0.4617 | 0.6629 | 0.6234 | 0.6426 | 0.8348 | 0.8311 | 0.8329 | 0.6112 | 0.6533 | 0.6316 | 0.5514 | 0.3806 | 0.4504 | 0.7335 | 0.8086 | 0.7692 | 0.4865 | 0.3749 | 0.4235 | 0.8020 | 0.7396 | 0.7695 | 0.6567 | 0.5891 | 0.6211 | 0.7298 | 0.5790 | 0.6457 | 0.25 | 0.05 | 0.0833 |
0.7531 | 10.0 | 17560 | 0.7929 | 0.7181 | 0.6020 | 0.5761 | 0.5853 | 0.5761 | 0.6524 | 0.7454 | 0.6958 | 0.6170 | 0.6291 | 0.6230 | 0.5325 | 0.3804 | 0.4437 | 0.6675 | 0.6222 | 0.6441 | 0.8269 | 0.8428 | 0.8348 | 0.6892 | 0.5924 | 0.6372 | 0.5144 | 0.4215 | 0.4634 | 0.6939 | 0.8680 | 0.7713 | 0.4655 | 0.4402 | 0.4525 | 0.7802 | 0.7703 | 0.7753 | 0.6405 | 0.6149 | 0.6274 | 0.7464 | 0.5618 | 0.6411 | 0.0 | 0.0 | 0.0 |
0.7462 | 11.0 | 19316 | 0.7868 | 0.7187 | 0.6001 | 0.5776 | 0.5861 | 0.5776 | 0.6634 | 0.7412 | 0.7001 | 0.6197 | 0.6425 | 0.6309 | 0.4802 | 0.4125 | 0.4438 | 0.6403 | 0.6246 | 0.6323 | 0.8266 | 0.8413 | 0.8339 | 0.6714 | 0.6067 | 0.6374 | 0.5649 | 0.3978 | 0.4669 | 0.7096 | 0.8548 | 0.7754 | 0.4643 | 0.4688 | 0.4665 | 0.7973 | 0.7468 | 0.7712 | 0.6703 | 0.5815 | 0.6228 | 0.6930 | 0.5900 | 0.6374 | 0.0 | 0.0 | 0.0 |
0.7357 | 12.0 | 21072 | 0.7876 | 0.7175 | 0.6298 | 0.5922 | 0.6045 | 0.5922 | 0.6497 | 0.7540 | 0.6980 | 0.6146 | 0.6435 | 0.6287 | 0.4805 | 0.4393 | 0.4590 | 0.6954 | 0.5953 | 0.6415 | 0.8410 | 0.8250 | 0.8329 | 0.6719 | 0.6124 | 0.6408 | 0.5377 | 0.4215 | 0.4726 | 0.7297 | 0.8284 | 0.7759 | 0.4420 | 0.4896 | 0.4646 | 0.7911 | 0.7740 | 0.7824 | 0.6545 | 0.5996 | 0.6259 | 0.7228 | 0.5915 | 0.6506 | 0.3571 | 0.125 | 0.1852 |
0.7242 | 13.0 | 22828 | 0.7810 | 0.7205 | 0.6004 | 0.5846 | 0.5903 | 0.5846 | 0.6679 | 0.7310 | 0.6980 | 0.6422 | 0.6277 | 0.6348 | 0.4782 | 0.4304 | 0.4530 | 0.6712 | 0.6304 | 0.6502 | 0.8237 | 0.8440 | 0.8337 | 0.6598 | 0.6231 | 0.6409 | 0.5565 | 0.4022 | 0.4669 | 0.7213 | 0.8713 | 0.7892 | 0.4847 | 0.4243 | 0.4525 | 0.7698 | 0.7740 | 0.7719 | 0.6276 | 0.6330 | 0.6303 | 0.7022 | 0.6088 | 0.6521 | 0.0 | 0.0 | 0.0 |
0.7216 | 14.0 | 24584 | 0.7839 | 0.7200 | 0.6543 | 0.5946 | 0.6044 | 0.5946 | 0.6680 | 0.7423 | 0.7032 | 0.5927 | 0.6814 | 0.6340 | 0.5271 | 0.4 | 0.4548 | 0.7023 | 0.6070 | 0.6512 | 0.8409 | 0.8267 | 0.8337 | 0.6973 | 0.5907 | 0.6396 | 0.5560 | 0.4108 | 0.4725 | 0.7330 | 0.8515 | 0.7878 | 0.4193 | 0.5707 | 0.4835 | 0.7890 | 0.7776 | 0.7832 | 0.6516 | 0.6044 | 0.6271 | 0.7283 | 0.5915 | 0.6528 | 0.6 | 0.075 | 0.1333 |
0.7136 | 15.0 | 26340 | 0.7812 | 0.7228 | 0.6390 | 0.5889 | 0.6012 | 0.5889 | 0.6691 | 0.7383 | 0.7020 | 0.6679 | 0.5898 | 0.6265 | 0.5154 | 0.4482 | 0.4795 | 0.6839 | 0.6175 | 0.6490 | 0.8221 | 0.8489 | 0.8353 | 0.6613 | 0.6213 | 0.6407 | 0.5508 | 0.4140 | 0.4727 | 0.7260 | 0.8482 | 0.7823 | 0.4718 | 0.4718 | 0.4718 | 0.7932 | 0.7631 | 0.7779 | 0.6205 | 0.6454 | 0.6327 | 0.7254 | 0.5994 | 0.6564 | 0.4 | 0.05 | 0.0889 |
0.7092 | 16.0 | 28096 | 0.7793 | 0.7225 | 0.6047 | 0.5858 | 0.5928 | 0.5858 | 0.6690 | 0.7391 | 0.7023 | 0.6432 | 0.6296 | 0.6363 | 0.5098 | 0.4179 | 0.4593 | 0.6622 | 0.6398 | 0.6508 | 0.8315 | 0.8426 | 0.8370 | 0.6647 | 0.616 | 0.6394 | 0.5688 | 0.4 | 0.4697 | 0.7301 | 0.8482 | 0.7847 | 0.4468 | 0.4896 | 0.4672 | 0.7833 | 0.7649 | 0.7740 | 0.6341 | 0.6177 | 0.6258 | 0.7182 | 0.6103 | 0.6599 | 0.0 | 0.0 | 0.0 |
0.7052 | 17.0 | 29852 | 0.7800 | 0.7224 | 0.6070 | 0.5875 | 0.5949 | 0.5875 | 0.6625 | 0.7469 | 0.7021 | 0.6365 | 0.6344 | 0.6355 | 0.5312 | 0.4107 | 0.4632 | 0.6874 | 0.6199 | 0.6519 | 0.8338 | 0.8367 | 0.8353 | 0.6665 | 0.62 | 0.6424 | 0.5529 | 0.4161 | 0.4748 | 0.7270 | 0.8614 | 0.7885 | 0.4559 | 0.4906 | 0.4726 | 0.7996 | 0.7649 | 0.7819 | 0.6444 | 0.6168 | 0.6303 | 0.6935 | 0.6197 | 0.6545 | 0.0 | 0.0 | 0.0 |
0.6988 | 18.0 | 31608 | 0.7770 | 0.7233 | 0.6089 | 0.5873 | 0.5950 | 0.5873 | 0.6674 | 0.7446 | 0.7038 | 0.6192 | 0.6483 | 0.6334 | 0.5072 | 0.4393 | 0.4708 | 0.6937 | 0.6199 | 0.6547 | 0.8287 | 0.8427 | 0.8356 | 0.6801 | 0.6084 | 0.6423 | 0.5853 | 0.3946 | 0.4714 | 0.7216 | 0.8812 | 0.7935 | 0.4646 | 0.4738 | 0.4691 | 0.7885 | 0.7685 | 0.7784 | 0.6457 | 0.6063 | 0.6254 | 0.7132 | 0.6072 | 0.6560 | 0.0 | 0.0 | 0.0 |
0.7021 | 19.0 | 33364 | 0.7778 | 0.7238 | 0.6356 | 0.5900 | 0.6000 | 0.5900 | 0.6731 | 0.7389 | 0.7045 | 0.6348 | 0.6306 | 0.6327 | 0.5010 | 0.4482 | 0.4731 | 0.6972 | 0.6222 | 0.6576 | 0.8285 | 0.8431 | 0.8358 | 0.6651 | 0.6249 | 0.6444 | 0.5852 | 0.3989 | 0.4744 | 0.7421 | 0.8548 | 0.7945 | 0.4534 | 0.4817 | 0.4671 | 0.7896 | 0.7667 | 0.7780 | 0.6365 | 0.6311 | 0.6338 | 0.7228 | 0.6041 | 0.6581 | 0.3333 | 0.025 | 0.0465 |
0.703 | 20.0 | 35120 | 0.7772 | 0.7230 | 0.6333 | 0.5894 | 0.5987 | 0.5894 | 0.6716 | 0.7385 | 0.7035 | 0.6341 | 0.6310 | 0.6326 | 0.5083 | 0.4375 | 0.4702 | 0.6816 | 0.6234 | 0.6512 | 0.8272 | 0.8432 | 0.8351 | 0.6761 | 0.6133 | 0.6432 | 0.5690 | 0.4032 | 0.4720 | 0.7358 | 0.8548 | 0.7908 | 0.4536 | 0.4886 | 0.4705 | 0.7860 | 0.7703 | 0.7781 | 0.6407 | 0.6273 | 0.6339 | 0.7153 | 0.6056 | 0.6559 | 0.3333 | 0.025 | 0.0465 |
### Framework versions
- Transformers 4.32.0.dev0
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.11.0