
ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold0

This model is a fine-tuned version of bert-base-uncased on an unspecified dataset (the auto-generated card reports it as "None"). It achieves the following results on the evaluation set:

  • Loss: 0.6338
  • QWK (quadratic weighted kappa): 0.4989
  • MSE: 0.6338
  • RMSE: 0.7961
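These metrics can be reproduced from a set of predictions. A minimal sketch using scikit-learn, assuming integer score labels as is typical for ASAP essay-scoring tasks (the example scores below are hypothetical, not from the actual evaluation set):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate(y_true, y_pred):
    """Compute QWK, MSE, and RMSE for integer score predictions."""
    # QWK penalizes disagreements by the squared distance between scores.
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    rmse = np.sqrt(mse)
    return qwk, mse, rmse

# Hypothetical example scores, for illustration only:
y_true = [1, 2, 3, 4, 2]
y_pred = [1, 2, 2, 4, 3]
qwk, mse, rmse = evaluate(y_true, y_pred)
```

Note that RMSE is simply the square root of MSE, which is why the card's Loss and MSE values coincide (the model is trained with an MSE objective) and RMSE ≈ √0.6338 ≈ 0.7961.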

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
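These settings correspond to a standard Hugging Face Trainer run. A sketch of the equivalent `TrainingArguments` kwargs; the actual training script is not published, so the mapping of batch-size and optimizer settings to argument names is an assumption:

```python
# Hyperparameters from the list above, expressed as TrainingArguments kwargs.
# The original training script is not published; this mapping is an assumption.
hyperparams = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 32,
    "per_device_eval_batch_size": 32,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 10,
}

# With transformers installed, these would be passed as:
# from transformers import TrainingArguments
# args = TrainingArguments(output_dir="out", **hyperparams)
```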

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0048 2 9.7884 0.0036 9.7884 3.1286
No log 0.0097 4 8.3457 0.0 8.3457 2.8889
No log 0.0145 6 7.3232 0.0 7.3232 2.7061
No log 0.0193 8 6.6114 0.0 6.6114 2.5713
No log 0.0242 10 5.9011 0.0090 5.9011 2.4292
No log 0.0290 12 5.0913 0.0112 5.0913 2.2564
No log 0.0338 14 4.3007 0.0077 4.3007 2.0738
No log 0.0386 16 3.5755 0.0039 3.5755 1.8909
No log 0.0435 18 2.8540 0.0 2.8540 1.6894
No log 0.0483 20 2.2425 0.0742 2.2425 1.4975
No log 0.0531 22 1.7255 0.0382 1.7255 1.3136
No log 0.0580 24 1.3306 0.0316 1.3306 1.1535
No log 0.0628 26 1.0734 0.0316 1.0734 1.0361
No log 0.0676 28 0.8781 0.1273 0.8781 0.9370
No log 0.0725 30 0.8005 0.1169 0.8005 0.8947
No log 0.0773 32 0.7497 0.1076 0.7497 0.8659
No log 0.0821 34 0.8244 0.0689 0.8244 0.9080
No log 0.0870 36 0.8958 0.0521 0.8958 0.9465
No log 0.0918 38 0.9542 0.0348 0.9542 0.9769
No log 0.0966 40 1.1204 0.0348 1.1204 1.0585
No log 0.1014 42 1.1989 0.0174 1.1989 1.0949
No log 0.1063 44 1.0374 0.0174 1.0374 1.0185
No log 0.1111 46 0.9324 0.0174 0.9324 0.9656
No log 0.1159 48 0.9645 0.0174 0.9645 0.9821
No log 0.1208 50 1.1371 0.0174 1.1371 1.0663
No log 0.1256 52 1.0818 0.0174 1.0818 1.0401
No log 0.1304 54 1.1180 0.0174 1.1180 1.0573
No log 0.1353 56 1.0966 0.0 1.0966 1.0472
No log 0.1401 58 0.9594 0.0 0.9594 0.9795
No log 0.1449 60 0.8625 0.0 0.8625 0.9287
No log 0.1498 62 0.8699 0.0 0.8699 0.9327
No log 0.1546 64 0.9776 0.0 0.9776 0.9887
No log 0.1594 66 1.0314 0.0 1.0314 1.0156
No log 0.1643 68 0.8824 0.0174 0.8824 0.9393
No log 0.1691 70 0.8842 0.0174 0.8842 0.9403
No log 0.1739 72 1.0974 0.0 1.0974 1.0475
No log 0.1787 74 1.1630 0.0 1.1630 1.0784
No log 0.1836 76 0.9696 0.0 0.9696 0.9847
No log 0.1884 78 0.9355 0.0 0.9355 0.9672
No log 0.1932 80 1.0228 0.0 1.0228 1.0113
No log 0.1981 82 1.0875 0.0 1.0875 1.0428
No log 0.2029 84 1.0401 0.0 1.0401 1.0198
No log 0.2077 86 0.9557 0.0 0.9557 0.9776
No log 0.2126 88 0.9466 0.0 0.9466 0.9729
No log 0.2174 90 0.9541 0.0 0.9541 0.9768
No log 0.2222 92 0.9751 0.0 0.9751 0.9875
No log 0.2271 94 0.9317 0.0 0.9317 0.9652
No log 0.2319 96 0.8861 0.0 0.8861 0.9413
No log 0.2367 98 0.8846 0.0 0.8846 0.9405
No log 0.2415 100 0.9770 0.0 0.9770 0.9885
No log 0.2464 102 1.0583 0.0 1.0583 1.0287
No log 0.2512 104 0.9901 0.0 0.9901 0.9950
No log 0.2560 106 0.9516 0.0174 0.9516 0.9755
No log 0.2609 108 1.0214 0.0 1.0214 1.0106
No log 0.2657 110 0.9706 0.0 0.9706 0.9852
No log 0.2705 112 0.8449 0.0 0.8449 0.9192
No log 0.2754 114 0.7743 0.0049 0.7743 0.8799
No log 0.2802 116 0.7784 0.0147 0.7784 0.8823
No log 0.2850 118 0.8229 0.0 0.8229 0.9071
No log 0.2899 120 1.0002 0.1325 1.0002 1.0001
No log 0.2947 122 1.0582 0.1573 1.0582 1.0287
No log 0.2995 124 1.0952 0.1566 1.0952 1.0465
No log 0.3043 126 1.2585 0.1007 1.2585 1.1218
No log 0.3092 128 1.0667 0.2120 1.0667 1.0328
No log 0.3140 130 0.7665 0.1629 0.7665 0.8755
No log 0.3188 132 0.9435 0.2251 0.9435 0.9714
No log 0.3237 134 1.2567 0.1784 1.2567 1.1210
No log 0.3285 136 0.9381 0.2307 0.9381 0.9685
No log 0.3333 138 0.7142 0.2795 0.7142 0.8451
No log 0.3382 140 0.6960 0.2368 0.6960 0.8343
No log 0.3430 142 0.7194 0.1643 0.7194 0.8482
No log 0.3478 144 0.7069 0.2029 0.7069 0.8408
No log 0.3527 146 0.8457 0.2275 0.8457 0.9196
No log 0.3575 148 0.7414 0.1941 0.7414 0.8610
No log 0.3623 150 0.7162 0.2938 0.7162 0.8463
No log 0.3671 152 0.7151 0.2805 0.7151 0.8457
No log 0.3720 154 0.7294 0.1767 0.7294 0.8541
No log 0.3768 156 1.0364 0.2084 1.0364 1.0181
No log 0.3816 158 1.1460 0.1853 1.1460 1.0705
No log 0.3865 160 1.0650 0.1970 1.0650 1.0320
No log 0.3913 162 0.9547 0.2146 0.9547 0.9771
No log 0.3961 164 0.7515 0.1664 0.7515 0.8669
No log 0.4010 166 0.7146 0.2398 0.7146 0.8453
No log 0.4058 168 0.7202 0.2807 0.7202 0.8486
No log 0.4106 170 0.9297 0.3111 0.9297 0.9642
No log 0.4155 172 1.1130 0.2237 1.1130 1.0550
No log 0.4203 174 0.8935 0.3526 0.8935 0.9452
No log 0.4251 176 0.7485 0.3612 0.7485 0.8652
No log 0.4300 178 0.7487 0.3652 0.7487 0.8653
No log 0.4348 180 0.7959 0.3743 0.7959 0.8921
No log 0.4396 182 0.8790 0.3561 0.8790 0.9375
No log 0.4444 184 0.8131 0.3858 0.8131 0.9017
No log 0.4493 186 0.6981 0.3787 0.6981 0.8355
No log 0.4541 188 0.6664 0.3902 0.6664 0.8164
No log 0.4589 190 0.6466 0.3808 0.6466 0.8041
No log 0.4638 192 0.6441 0.3945 0.6441 0.8026
No log 0.4686 194 0.7083 0.3771 0.7083 0.8416
No log 0.4734 196 0.8337 0.3348 0.8337 0.9131
No log 0.4783 198 0.7679 0.3700 0.7679 0.8763
No log 0.4831 200 0.6496 0.4087 0.6496 0.8060
No log 0.4879 202 0.6448 0.3172 0.6448 0.8030
No log 0.4928 204 0.6885 0.2440 0.6885 0.8298
No log 0.4976 206 0.8107 0.3165 0.8107 0.9004
No log 0.5024 208 1.0132 0.2603 1.0132 1.0066
No log 0.5072 210 1.0492 0.2428 1.0492 1.0243
No log 0.5121 212 0.8933 0.2702 0.8933 0.9451
No log 0.5169 214 0.6569 0.3295 0.6569 0.8105
No log 0.5217 216 0.6409 0.3527 0.6409 0.8005
No log 0.5266 218 0.6300 0.3866 0.6300 0.7937
No log 0.5314 220 0.8547 0.3163 0.8547 0.9245
No log 0.5362 222 0.9754 0.2782 0.9754 0.9876
No log 0.5411 224 0.8529 0.3246 0.8529 0.9235
No log 0.5459 226 0.6840 0.2933 0.6840 0.8270
No log 0.5507 228 0.6217 0.3660 0.6217 0.7885
No log 0.5556 230 0.6445 0.3616 0.6445 0.8028
No log 0.5604 232 0.6875 0.3841 0.6875 0.8291
No log 0.5652 234 0.6002 0.4674 0.6002 0.7747
No log 0.5700 236 0.6340 0.4779 0.6340 0.7962
No log 0.5749 238 0.6832 0.4418 0.6832 0.8266
No log 0.5797 240 0.5909 0.5007 0.5909 0.7687
No log 0.5845 242 0.5939 0.4093 0.5939 0.7706
No log 0.5894 244 0.5803 0.4322 0.5803 0.7618
No log 0.5942 246 0.6677 0.4329 0.6677 0.8172
No log 0.5990 248 0.9902 0.3306 0.9902 0.9951
No log 0.6039 250 0.9869 0.3203 0.9869 0.9934
No log 0.6087 252 0.7199 0.3776 0.7199 0.8484
No log 0.6135 254 0.6199 0.4368 0.6199 0.7874
No log 0.6184 256 0.6016 0.4359 0.6016 0.7756
No log 0.6232 258 0.6023 0.4257 0.6023 0.7761
No log 0.6280 260 0.6339 0.4306 0.6339 0.7962
No log 0.6329 262 0.7260 0.4105 0.7260 0.8520
No log 0.6377 264 0.7948 0.3844 0.7948 0.8915
No log 0.6425 266 0.6656 0.4497 0.6656 0.8158
No log 0.6473 268 0.5675 0.4684 0.5675 0.7533
No log 0.6522 270 0.5618 0.4300 0.5618 0.7495
No log 0.6570 272 0.5491 0.4477 0.5491 0.7410
No log 0.6618 274 0.5510 0.4729 0.5510 0.7423
No log 0.6667 276 0.5425 0.4794 0.5425 0.7366
No log 0.6715 278 0.6099 0.4791 0.6099 0.7810
No log 0.6763 280 0.6061 0.4839 0.6061 0.7786
No log 0.6812 282 0.5278 0.4896 0.5278 0.7265
No log 0.6860 284 0.5469 0.4460 0.5469 0.7395
No log 0.6908 286 0.5380 0.4797 0.5380 0.7335
No log 0.6957 288 0.5378 0.5305 0.5378 0.7334
No log 0.7005 290 0.6036 0.5275 0.6036 0.7769
No log 0.7053 292 0.6518 0.5202 0.6518 0.8073
No log 0.7101 294 0.5815 0.5521 0.5815 0.7626
No log 0.7150 296 0.6317 0.5175 0.6317 0.7948
No log 0.7198 298 0.5793 0.5322 0.5793 0.7611
No log 0.7246 300 0.7073 0.4870 0.7073 0.8410
No log 0.7295 302 0.6423 0.4907 0.6423 0.8014
No log 0.7343 304 0.5469 0.4514 0.5469 0.7396
No log 0.7391 306 0.6290 0.4562 0.6290 0.7931
No log 0.7440 308 0.5726 0.4594 0.5726 0.7567
No log 0.7488 310 0.5502 0.4640 0.5502 0.7418
No log 0.7536 312 0.5479 0.4382 0.5479 0.7402
No log 0.7585 314 0.5798 0.4434 0.5798 0.7615
No log 0.7633 316 0.5601 0.4432 0.5601 0.7484
No log 0.7681 318 0.5422 0.4379 0.5422 0.7364
No log 0.7729 320 0.5739 0.4597 0.5739 0.7575
No log 0.7778 322 0.5998 0.4597 0.5998 0.7744
No log 0.7826 324 0.5428 0.4874 0.5428 0.7367
No log 0.7874 326 0.5459 0.4665 0.5459 0.7388
No log 0.7923 328 0.6149 0.4466 0.6149 0.7841
No log 0.7971 330 0.6008 0.4300 0.6008 0.7751
No log 0.8019 332 0.5639 0.4420 0.5639 0.7509
No log 0.8068 334 0.5663 0.4335 0.5663 0.7525
No log 0.8116 336 0.6073 0.4322 0.6073 0.7793
No log 0.8164 338 0.7503 0.4317 0.7503 0.8662
No log 0.8213 340 0.7104 0.4243 0.7104 0.8428
No log 0.8261 342 0.5794 0.4798 0.5794 0.7612
No log 0.8309 344 0.7353 0.4168 0.7353 0.8575
No log 0.8357 346 0.6988 0.4094 0.6988 0.8359
No log 0.8406 348 0.5819 0.4614 0.5819 0.7628
No log 0.8454 350 0.6966 0.4274 0.6966 0.8346
No log 0.8502 352 0.6598 0.4491 0.6598 0.8123
No log 0.8551 354 0.5731 0.4695 0.5731 0.7570
No log 0.8599 356 0.6861 0.4152 0.6861 0.8283
No log 0.8647 358 0.6613 0.4218 0.6613 0.8132
No log 0.8696 360 0.5721 0.4670 0.5721 0.7564
No log 0.8744 362 0.7264 0.4363 0.7264 0.8523
No log 0.8792 364 0.8739 0.3913 0.8739 0.9349
No log 0.8841 366 0.7313 0.4293 0.7313 0.8552
No log 0.8889 368 0.5945 0.5008 0.5945 0.7710
No log 0.8937 370 0.6306 0.4690 0.6306 0.7941
No log 0.8986 372 0.6189 0.4811 0.6189 0.7867
No log 0.9034 374 0.6407 0.5069 0.6407 0.8005
No log 0.9082 376 0.8693 0.4153 0.8693 0.9324
No log 0.9130 378 0.9686 0.3969 0.9686 0.9842
No log 0.9179 380 0.7622 0.4090 0.7622 0.8730
No log 0.9227 382 0.5952 0.4832 0.5952 0.7715
No log 0.9275 384 0.5749 0.4479 0.5749 0.7582
No log 0.9324 386 0.5869 0.4698 0.5869 0.7661
No log 0.9372 388 0.6879 0.4415 0.6879 0.8294
No log 0.9420 390 0.6862 0.4525 0.6862 0.8283
No log 0.9469 392 0.6271 0.4711 0.6271 0.7919
No log 0.9517 394 0.6621 0.4657 0.6621 0.8137
No log 0.9565 396 0.7243 0.4701 0.7243 0.8511
No log 0.9614 398 0.6952 0.4560 0.6952 0.8338
No log 0.9662 400 0.5920 0.4401 0.5920 0.7694
No log 0.9710 402 0.5406 0.4735 0.5406 0.7352
No log 0.9758 404 0.5606 0.4797 0.5606 0.7488
No log 0.9807 406 0.5542 0.4880 0.5542 0.7445
No log 0.9855 408 0.5656 0.4641 0.5656 0.7521
No log 0.9903 410 0.5598 0.4875 0.5598 0.7482
No log 0.9952 412 0.5654 0.4879 0.5654 0.7519
No log 1.0 414 0.5967 0.5004 0.5967 0.7724
No log 1.0048 416 0.6055 0.5170 0.6055 0.7781
No log 1.0097 418 0.5804 0.5221 0.5804 0.7618
No log 1.0145 420 0.5666 0.5118 0.5666 0.7527
No log 1.0193 422 0.5895 0.5255 0.5895 0.7678
No log 1.0242 424 0.5697 0.5319 0.5697 0.7548
No log 1.0290 426 0.6259 0.5038 0.6259 0.7912
No log 1.0338 428 0.6697 0.4944 0.6697 0.8183
No log 1.0386 430 0.5692 0.5551 0.5692 0.7544
No log 1.0435 432 0.6608 0.4884 0.6608 0.8129
No log 1.0483 434 0.6204 0.5025 0.6204 0.7876
No log 1.0531 436 0.5643 0.5175 0.5643 0.7512
No log 1.0580 438 0.6181 0.4652 0.6181 0.7862
No log 1.0628 440 0.5918 0.4781 0.5918 0.7693
No log 1.0676 442 0.5837 0.4610 0.5837 0.7640
No log 1.0725 444 0.6268 0.4404 0.6268 0.7917
No log 1.0773 446 0.7865 0.4318 0.7865 0.8869
No log 1.0821 448 0.8325 0.4030 0.8325 0.9124
No log 1.0870 450 0.7097 0.4009 0.7097 0.8424
No log 1.0918 452 0.6230 0.4214 0.6230 0.7893
No log 1.0966 454 0.6219 0.4450 0.6219 0.7886
No log 1.1014 456 0.6602 0.4060 0.6602 0.8125
No log 1.1063 458 0.8361 0.4141 0.8361 0.9144
No log 1.1111 460 0.7944 0.4204 0.7944 0.8913
No log 1.1159 462 0.6283 0.4107 0.6283 0.7926
No log 1.1208 464 0.6113 0.4390 0.6113 0.7819
No log 1.1256 466 0.6580 0.4379 0.6580 0.8111
No log 1.1304 468 0.6096 0.4402 0.6096 0.7807
No log 1.1353 470 0.5697 0.4985 0.5697 0.7548
No log 1.1401 472 0.5775 0.5157 0.5775 0.7599
No log 1.1449 474 0.5896 0.5394 0.5896 0.7678
No log 1.1498 476 0.6675 0.5300 0.6675 0.8170
No log 1.1546 478 0.7546 0.4850 0.7546 0.8687
No log 1.1594 480 0.6185 0.5061 0.6185 0.7865
No log 1.1643 482 0.5779 0.5398 0.5779 0.7602
No log 1.1691 484 0.6439 0.5168 0.6439 0.8024
No log 1.1739 486 0.5688 0.5326 0.5688 0.7542
No log 1.1787 488 0.5815 0.4819 0.5815 0.7626
No log 1.1836 490 0.6205 0.4913 0.6205 0.7877
No log 1.1884 492 0.5510 0.4939 0.5510 0.7423
No log 1.1932 494 0.5598 0.5283 0.5598 0.7482
No log 1.1981 496 0.5677 0.5509 0.5677 0.7535
No log 1.2029 498 0.5941 0.5419 0.5941 0.7708
1.006 1.2077 500 0.6758 0.5151 0.6758 0.8220
1.006 1.2126 502 0.6052 0.5619 0.6052 0.7780
1.006 1.2174 504 0.6164 0.5571 0.6164 0.7851
1.006 1.2222 506 0.7070 0.4966 0.7070 0.8409
1.006 1.2271 508 0.6159 0.5705 0.6159 0.7848
1.006 1.2319 510 0.6100 0.5811 0.6100 0.7810
1.006 1.2367 512 0.5981 0.5895 0.5981 0.7734
1.006 1.2415 514 0.5755 0.5539 0.5755 0.7586
1.006 1.2464 516 0.6028 0.5150 0.6028 0.7764
1.006 1.2512 518 0.5452 0.5514 0.5452 0.7384
1.006 1.2560 520 0.5842 0.5165 0.5842 0.7643
1.006 1.2609 522 0.6252 0.4805 0.6252 0.7907
1.006 1.2657 524 0.5512 0.4616 0.5512 0.7424
1.006 1.2705 526 0.5600 0.5282 0.5600 0.7483
1.006 1.2754 528 0.5831 0.4820 0.5831 0.7636
1.006 1.2802 530 0.5445 0.4965 0.5445 0.7379
1.006 1.2850 532 0.6660 0.4704 0.6660 0.8161
1.006 1.2899 534 0.7488 0.4750 0.7488 0.8654
1.006 1.2947 536 0.6338 0.4989 0.6338 0.7961

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
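The checkpoint can be loaded with transformers. A sketch, assuming the head is a single-output regression head (the card reports MSE/RMSE rather than classification accuracy); the example essay text is hypothetical, and running this downloads the model from the Hub:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical input text; real inputs would be ASAP task-1 essays.
text = "An example essay to be scored for organization."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
```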

Model tree for MayBashendy/ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold0

This model is finetuned from bert-base-uncased.