
ASAP_FineTuningBERT_AugV3_k15_task1_organization_fold1

This model is a fine-tuned version of bert-base-uncased; the training dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.8505
  • Qwk: 0.0676
  • Mse: 0.8505
  • Rmse: 0.9222
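Qwk here denotes quadratic weighted kappa. As a hedged sketch of how these metrics can be reproduced from predictions with scikit-learn (the card does not include the actual evaluation code, and the gold/predicted scores below are made-up placeholders):

```python
# Sketch: computing QWK, MSE, and RMSE from integer essay scores.
# y_true / y_pred are hypothetical values, not from this model's evaluation.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 3, 4, 2, 3])   # hypothetical gold scores
y_pred = np.array([3, 3, 2, 4, 2, 2])   # hypothetical predicted scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(qwk, mse, rmse)
```

Note that when predictions are produced by a regression head, they must be rounded to integer scores before computing kappa.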

Model description

More information needed

Intended uses & limitations

More information needed
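Although the card documents no usage details, the checkpoint follows the standard transformers sequence-classification layout, so loading it would look roughly like the sketch below. The score/label mapping is undocumented; nothing here beyond the repo id is specific to this model.

```python
# Hedged usage sketch: load the checkpoint and return raw logits for an essay.
# The mapping from logits to organization scores is not documented in the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

REPO = "MayBashendy/ASAP_FineTuningBERT_AugV3_k15_task1_organization_fold1"

def score_essay(text: str) -> torch.Tensor:
    """Tokenize an essay and return the model's raw logits."""
    tokenizer = AutoTokenizer.from_pretrained(REPO)
    model = AutoModelForSequenceClassification.from_pretrained(REPO)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        return model(**inputs).logits
```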

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
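The settings above map onto the standard `transformers.TrainingArguments` API. A sketch of that mapping (the original training script is not included in this card, and `output_dir` is a placeholder):

```python
# Sketch: the hyperparameters above as keyword arguments for
# transformers.TrainingArguments. Only the values listed in the card are real;
# output_dir is a hypothetical placeholder.
training_kwargs = dict(
    output_dir="asap_bert_fold1",        # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
# Usage: TrainingArguments(**training_kwargs), then pass to Trainer.
print(training_kwargs["learning_rate"])
```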

Training results

("No log" in the Training Loss column means the running training loss had not yet been logged at that evaluation step.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0011 2 10.0047 0.0 10.0047 3.1630
No log 0.0023 4 8.9454 0.0 8.9454 2.9909
No log 0.0034 6 7.6345 0.0082 7.6345 2.7631
No log 0.0045 8 6.4765 0.0069 6.4765 2.5449
No log 0.0056 10 5.1755 0.0037 5.1755 2.2750
No log 0.0068 12 4.2966 0.0 4.2966 2.0728
No log 0.0079 14 3.4764 0.0614 3.4764 1.8645
No log 0.0090 16 2.7420 0.0253 2.7420 1.6559
No log 0.0101 18 2.1923 0.0079 2.1923 1.4806
No log 0.0113 20 2.6331 0.0340 2.6331 1.6227
No log 0.0124 22 2.5181 0.0253 2.5181 1.5868
No log 0.0135 24 1.4859 0.0039 1.4859 1.2190
No log 0.0147 26 1.4553 0.0024 1.4553 1.2063
No log 0.0158 28 2.0731 0.0 2.0731 1.4398
No log 0.0169 30 2.3882 0.0 2.3882 1.5454
No log 0.0180 32 2.8720 -0.0009 2.8720 1.6947
No log 0.0192 34 2.2022 0.0 2.2022 1.4840
No log 0.0203 36 2.1363 0.0 2.1363 1.4616
No log 0.0214 38 2.3261 0.0 2.3261 1.5252
No log 0.0225 40 2.1542 0.0 2.1542 1.4677
No log 0.0237 42 1.7925 0.0 1.7925 1.3388
No log 0.0248 44 1.7598 0.0 1.7598 1.3266
No log 0.0259 46 1.7037 0.0 1.7037 1.3053
No log 0.0271 48 1.8133 0.0 1.8133 1.3466
No log 0.0282 50 2.4095 0.0 2.4095 1.5523
No log 0.0293 52 2.2750 0.0 2.2750 1.5083
No log 0.0304 54 1.8373 0.0 1.8373 1.3555
No log 0.0316 56 1.7890 0.0 1.7890 1.3375
No log 0.0327 58 1.7752 0.0 1.7752 1.3324
No log 0.0338 60 1.9470 0.0 1.9470 1.3954
No log 0.0349 62 2.3749 0.0 2.3749 1.5411
No log 0.0361 64 2.5639 0.0 2.5639 1.6012
No log 0.0372 66 2.3363 0.0 2.3363 1.5285
No log 0.0383 68 2.0901 0.0 2.0901 1.4457
No log 0.0395 70 1.8142 0.0 1.8142 1.3469
No log 0.0406 72 1.5811 0.0 1.5811 1.2574
No log 0.0417 74 1.6051 0.0 1.6051 1.2669
No log 0.0428 76 1.9364 0.0 1.9364 1.3915
No log 0.0440 78 2.7479 0.0 2.7479 1.6577
No log 0.0451 80 3.1647 0.0 3.1647 1.7790
No log 0.0462 82 2.9378 0.0 2.9378 1.7140
No log 0.0474 84 2.3250 0.0 2.3250 1.5248
No log 0.0485 86 1.9022 0.0 1.9022 1.3792
No log 0.0496 88 1.9596 0.0 1.9596 1.3999
No log 0.0507 90 2.0232 0.0 2.0232 1.4224
No log 0.0519 92 2.1708 0.0 2.1708 1.4734
No log 0.0530 94 2.1052 0.0 2.1052 1.4509
No log 0.0541 96 1.7352 0.0 1.7352 1.3173
No log 0.0552 98 1.6309 0.0 1.6309 1.2771
No log 0.0564 100 1.5875 -0.0010 1.5875 1.2600
No log 0.0575 102 1.5852 0.0 1.5852 1.2590
No log 0.0586 104 1.6687 0.0 1.6687 1.2918
No log 0.0598 106 1.7366 0.0 1.7366 1.3178
No log 0.0609 108 1.8360 0.0 1.8360 1.3550
No log 0.0620 110 1.8293 0.0 1.8293 1.3525
No log 0.0631 112 2.0746 0.0 2.0746 1.4403
No log 0.0643 114 2.4498 0.0 2.4498 1.5652
No log 0.0654 116 2.5572 0.0 2.5572 1.5991
No log 0.0665 118 2.1713 0.0 2.1713 1.4735
No log 0.0676 120 2.0996 0.0 2.0996 1.4490
No log 0.0688 122 2.0345 0.0 2.0345 1.4263
No log 0.0699 124 2.0377 0.0 2.0377 1.4275
No log 0.0710 126 2.0398 0.0 2.0398 1.4282
No log 0.0722 128 1.8206 0.0 1.8206 1.3493
No log 0.0733 130 1.6920 0.0 1.6920 1.3008
No log 0.0744 132 1.6918 0.0 1.6918 1.3007
No log 0.0755 134 1.7606 0.0 1.7606 1.3269
No log 0.0767 136 1.8214 0.0 1.8214 1.3496
No log 0.0778 138 1.9290 0.0 1.9290 1.3889
No log 0.0789 140 2.0099 0.0 2.0099 1.4177
No log 0.0800 142 2.2602 0.0 2.2602 1.5034
No log 0.0812 144 2.3509 0.0 2.3509 1.5333
No log 0.0823 146 2.2748 0.0 2.2748 1.5082
No log 0.0834 148 2.2192 0.0 2.2192 1.4897
No log 0.0846 150 2.2036 0.0 2.2036 1.4845
No log 0.0857 152 2.0851 0.0 2.0851 1.4440
No log 0.0868 154 1.9277 0.0 1.9277 1.3884
No log 0.0879 156 1.9318 0.0 1.9318 1.3899
No log 0.0891 158 1.9771 0.0 1.9771 1.4061
No log 0.0902 160 2.0224 0.0 2.0224 1.4221
No log 0.0913 162 1.9576 0.0 1.9576 1.3991
No log 0.0924 164 1.7850 0.0 1.7850 1.3360
No log 0.0936 166 1.7255 0.0 1.7255 1.3136
No log 0.0947 168 1.8145 0.0 1.8145 1.3470
No log 0.0958 170 1.9883 0.0 1.9883 1.4101
No log 0.0970 172 2.1711 0.0 2.1711 1.4735
No log 0.0981 174 2.2092 0.0 2.2092 1.4863
No log 0.0992 176 2.2816 0.0 2.2816 1.5105
No log 0.1003 178 2.3667 0.0 2.3667 1.5384
No log 0.1015 180 2.4460 0.0 2.4460 1.5640
No log 0.1026 182 2.3247 0.0 2.3247 1.5247
No log 0.1037 184 2.1933 0.0 2.1933 1.4810
No log 0.1048 186 2.1068 0.0 2.1068 1.4515
No log 0.1060 188 1.8973 0.0 1.8973 1.3774
No log 0.1071 190 1.8338 0.0 1.8338 1.3542
No log 0.1082 192 1.9527 0.0 1.9527 1.3974
No log 0.1094 194 2.0163 0.0 2.0163 1.4200
No log 0.1105 196 2.1862 0.0 2.1862 1.4786
No log 0.1116 198 2.4934 0.0 2.4934 1.5791
No log 0.1127 200 2.3538 0.0 2.3538 1.5342
No log 0.1139 202 2.1314 0.0 2.1314 1.4599
No log 0.1150 204 2.0276 0.0 2.0276 1.4239
No log 0.1161 206 2.1921 0.0 2.1921 1.4806
No log 0.1172 208 2.3520 0.0 2.3520 1.5336
No log 0.1184 210 2.6467 0.0 2.6467 1.6269
No log 0.1195 212 2.4306 0.0 2.4306 1.5590
No log 0.1206 214 1.9017 0.0 1.9017 1.3790
No log 0.1218 216 1.4978 -0.0139 1.4978 1.2238
No log 0.1229 218 1.4043 -0.0359 1.4043 1.1851
No log 0.1240 220 1.5060 -0.0076 1.5060 1.2272
No log 0.1251 222 1.8336 0.0 1.8336 1.3541
No log 0.1263 224 2.3565 0.0 2.3565 1.5351
No log 0.1274 226 2.6620 0.0 2.6620 1.6316
No log 0.1285 228 2.7252 0.0 2.7252 1.6508
No log 0.1297 230 2.3642 0.0 2.3642 1.5376
No log 0.1308 232 2.0187 0.0 2.0187 1.4208
No log 0.1319 234 1.8010 0.0 1.8009 1.3420
No log 0.1330 236 1.6997 0.0 1.6997 1.3037
No log 0.1342 238 1.7753 0.0 1.7753 1.3324
No log 0.1353 240 1.9195 0.0 1.9195 1.3855
No log 0.1364 242 1.9717 0.0 1.9717 1.4042
No log 0.1375 244 2.0455 0.0 2.0455 1.4302
No log 0.1387 246 1.9000 0.0 1.9000 1.3784
No log 0.1398 248 1.7286 0.0 1.7286 1.3148
No log 0.1409 250 1.6931 0.0 1.6931 1.3012
No log 0.1421 252 1.7974 0.0 1.7974 1.3407
No log 0.1432 254 1.9091 0.0 1.9091 1.3817
No log 0.1443 256 1.9390 0.0 1.9390 1.3925
No log 0.1454 258 1.8604 0.0 1.8604 1.3640
No log 0.1466 260 1.7518 0.0 1.7518 1.3235
No log 0.1477 262 1.7268 0.0 1.7268 1.3141
No log 0.1488 264 1.8238 0.0 1.8238 1.3505
No log 0.1499 266 1.7855 0.0 1.7855 1.3362
No log 0.1511 268 1.5833 0.0029 1.5833 1.2583
No log 0.1522 270 1.3473 0.0710 1.3473 1.1607
No log 0.1533 272 1.2441 0.0647 1.2441 1.1154
No log 0.1545 274 1.2623 0.0819 1.2623 1.1235
No log 0.1556 276 1.4108 0.0665 1.4108 1.1878
No log 0.1567 278 1.5740 0.0034 1.5740 1.2546
No log 0.1578 280 1.6837 0.0 1.6837 1.2976
No log 0.1590 282 1.6330 0.0005 1.6330 1.2779
No log 0.1601 284 1.4988 0.0325 1.4988 1.2242
No log 0.1612 286 1.3547 0.0838 1.3547 1.1639
No log 0.1623 288 1.3265 0.0914 1.3265 1.1517
No log 0.1635 290 1.4357 0.0428 1.4357 1.1982
No log 0.1646 292 1.4758 0.0331 1.4758 1.2148
No log 0.1657 294 1.4023 0.0635 1.4023 1.1842
No log 0.1669 296 1.3553 0.0747 1.3553 1.1642
No log 0.1680 298 1.3463 0.0827 1.3463 1.1603
No log 0.1691 300 1.3060 0.0862 1.3060 1.1428
No log 0.1702 302 1.3442 0.0716 1.3442 1.1594
No log 0.1714 304 1.4507 0.0430 1.4507 1.2045
No log 0.1725 306 1.5142 0.0119 1.5142 1.2305
No log 0.1736 308 1.6687 0.0 1.6687 1.2918
No log 0.1747 310 1.8485 0.0 1.8485 1.3596
No log 0.1759 312 1.9351 0.0 1.9351 1.3911
No log 0.1770 314 1.8345 0.0 1.8345 1.3544
No log 0.1781 316 1.5906 0.0 1.5906 1.2612
No log 0.1793 318 1.3406 0.1048 1.3406 1.1579
No log 0.1804 320 1.2384 0.0921 1.2384 1.1128
No log 0.1815 322 1.2328 0.0921 1.2328 1.1103
No log 0.1826 324 1.2713 0.1197 1.2713 1.1275
No log 0.1838 326 1.3517 0.0968 1.3517 1.1626
No log 0.1849 328 1.3799 0.0757 1.3799 1.1747
No log 0.1860 330 1.4759 0.0031 1.4759 1.2149
No log 0.1871 332 1.5380 -0.0005 1.5380 1.2402
No log 0.1883 334 1.5357 -0.0005 1.5357 1.2392
No log 0.1894 336 1.5078 0.0104 1.5078 1.2279
No log 0.1905 338 1.4953 0.0244 1.4953 1.2228
No log 0.1917 340 1.3762 0.1089 1.3762 1.1731
No log 0.1928 342 1.2525 0.1360 1.2525 1.1191
No log 0.1939 344 1.1876 0.1121 1.1876 1.0898
No log 0.1950 346 1.1763 0.1041 1.1763 1.0846
No log 0.1962 348 1.1523 0.0868 1.1523 1.0735
No log 0.1973 350 1.1415 0.0977 1.1415 1.0684
No log 0.1984 352 1.1644 0.1094 1.1644 1.0791
No log 0.1995 354 1.2059 0.1110 1.2059 1.0981
No log 0.2007 356 1.2484 0.1179 1.2484 1.1173
No log 0.2018 358 1.1463 0.1075 1.1463 1.0707
No log 0.2029 360 1.0586 0.1045 1.0586 1.0289
No log 0.2041 362 1.0035 0.0744 1.0035 1.0018
No log 0.2052 364 0.9864 0.0467 0.9864 0.9932
No log 0.2063 366 0.9696 0.0467 0.9696 0.9847
No log 0.2074 368 0.9763 0.0600 0.9763 0.9881
No log 0.2086 370 1.0467 0.1014 1.0467 1.0231
No log 0.2097 372 1.1056 0.1060 1.1056 1.0515
No log 0.2108 374 1.0847 0.1060 1.0847 1.0415
No log 0.2120 376 1.0379 0.1053 1.0379 1.0188
No log 0.2131 378 0.9986 0.0995 0.9986 0.9993
No log 0.2142 380 0.9677 0.0766 0.9677 0.9837
No log 0.2153 382 0.9596 0.0766 0.9596 0.9796
No log 0.2165 384 0.9779 0.0905 0.9779 0.9889
No log 0.2176 386 0.9829 0.0981 0.9829 0.9914
No log 0.2187 388 0.9945 0.1014 0.9945 0.9972
No log 0.2198 390 1.0299 0.1053 1.0299 1.0148
No log 0.2210 392 1.1083 0.1082 1.1083 1.0528
No log 0.2221 394 1.1971 0.1177 1.1971 1.0941
No log 0.2232 396 1.2446 0.1237 1.2446 1.1156
No log 0.2244 398 1.1894 0.1229 1.1894 1.0906
No log 0.2255 400 1.1429 0.1082 1.1429 1.0691
No log 0.2266 402 1.1032 0.1056 1.1032 1.0504
No log 0.2277 404 1.0770 0.1034 1.0770 1.0378
No log 0.2289 406 1.0586 0.1010 1.0586 1.0289
No log 0.2300 408 1.0146 0.1010 1.0146 1.0073
No log 0.2311 410 0.9658 0.0860 0.9658 0.9827
No log 0.2322 412 0.9344 0.0843 0.9344 0.9666
No log 0.2334 414 0.9266 0.0860 0.9266 0.9626
No log 0.2345 416 0.9202 0.0881 0.9202 0.9593
No log 0.2356 418 0.9121 0.0743 0.9121 0.9550
No log 0.2368 420 0.9162 0.0600 0.9162 0.9572
No log 0.2379 422 0.9287 0.0365 0.9287 0.9637
No log 0.2390 424 0.9378 0.0365 0.9378 0.9684
No log 0.2401 426 0.9444 0.0390 0.9444 0.9718
No log 0.2413 428 0.9574 0.0390 0.9574 0.9785
No log 0.2424 430 0.9799 0.0676 0.9799 0.9899
No log 0.2435 432 1.0149 0.0896 1.0149 1.0074
No log 0.2446 434 1.0818 0.1010 1.0818 1.0401
No log 0.2458 436 1.1287 0.1029 1.1287 1.0624
No log 0.2469 438 1.1481 0.1014 1.1481 1.0715
No log 0.2480 440 1.0856 0.1010 1.0856 1.0419
No log 0.2492 442 1.0272 0.1010 1.0272 1.0135
No log 0.2503 444 0.9721 0.1010 0.9721 0.9860
No log 0.2514 446 0.9233 0.0881 0.9233 0.9609
No log 0.2525 448 0.9006 0.0870 0.9006 0.9490
No log 0.2537 450 0.8903 0.0991 0.8903 0.9435
No log 0.2548 452 0.8821 0.1025 0.8821 0.9392
No log 0.2559 454 0.8696 0.0910 0.8696 0.9325
No log 0.2570 456 0.8643 0.0773 0.8643 0.9297
No log 0.2582 458 0.8634 0.0594 0.8634 0.9292
No log 0.2593 460 0.8694 0.0816 0.8694 0.9324
No log 0.2604 462 0.8815 0.0816 0.8815 0.9389
No log 0.2616 464 0.8725 0.0837 0.8725 0.9341
No log 0.2627 466 0.8627 0.0843 0.8627 0.9288
No log 0.2638 468 0.8663 0.1010 0.8663 0.9307
No log 0.2649 470 0.8768 0.1010 0.8768 0.9364
No log 0.2661 472 0.8781 0.1010 0.8781 0.9371
No log 0.2672 474 0.8797 0.0751 0.8797 0.9379
No log 0.2683 476 0.8855 0.0262 0.8855 0.9410
No log 0.2694 478 0.8942 0.0262 0.8942 0.9456
No log 0.2706 480 0.8976 0.0262 0.8976 0.9474
No log 0.2717 482 0.9092 0.0262 0.9092 0.9535
No log 0.2728 484 0.9344 0.0875 0.9344 0.9667
No log 0.2740 486 0.9357 0.1029 0.9357 0.9673
No log 0.2751 488 0.9024 0.0868 0.9024 0.9499
No log 0.2762 490 0.8768 0.0586 0.8768 0.9364
No log 0.2773 492 0.8709 0.0477 0.8709 0.9332
No log 0.2785 494 0.8631 0.0390 0.8631 0.9290
No log 0.2796 496 0.8588 0.0262 0.8588 0.9267
No log 0.2807 498 0.8553 0.0400 0.8553 0.9248
1.4812 0.2818 500 0.8579 0.0766 0.8579 0.9262
1.4812 0.2830 502 0.8671 0.0936 0.8671 0.9312
1.4812 0.2841 504 0.8662 0.1001 0.8662 0.9307
1.4812 0.2852 506 0.8568 0.0915 0.8568 0.9256
1.4812 0.2864 508 0.8493 0.0810 0.8493 0.9216
1.4812 0.2875 510 0.8484 0.0810 0.8484 0.9211
1.4812 0.2886 512 0.8479 0.0794 0.8479 0.9208
1.4812 0.2897 514 0.8465 0.0895 0.8465 0.9200
1.4812 0.2909 516 0.8435 0.0801 0.8435 0.9184
1.4812 0.2920 518 0.8416 0.0900 0.8416 0.9174
1.4812 0.2931 520 0.8502 0.1010 0.8502 0.9221
1.4812 0.2943 522 0.8573 0.1010 0.8573 0.9259
1.4812 0.2954 524 0.8657 0.1010 0.8657 0.9304
1.4812 0.2965 526 0.8681 0.1010 0.8681 0.9317
1.4812 0.2976 528 0.8639 0.0991 0.8639 0.9294
1.4812 0.2988 530 0.8535 0.0822 0.8535 0.9238
1.4812 0.2999 532 0.8371 0.0728 0.8371 0.9149
1.4812 0.3010 534 0.8221 0.0822 0.8221 0.9067
1.4812 0.3021 536 0.8138 0.0721 0.8138 0.9021
1.4812 0.3033 538 0.8160 0.0721 0.8160 0.9033
1.4812 0.3044 540 0.8282 0.0721 0.8282 0.9100
1.4812 0.3055 542 0.8334 0.0721 0.8334 0.9129
1.4812 0.3067 544 0.8242 0.0890 0.8242 0.9078
1.4812 0.3078 546 0.8199 0.0916 0.8199 0.9055
1.4812 0.3089 548 0.8210 0.0756 0.8210 0.9061
1.4812 0.3100 550 0.8246 0.0706 0.8246 0.9081
1.4812 0.3112 552 0.8281 0.0706 0.8281 0.9100
1.4812 0.3123 554 0.8334 0.0916 0.8334 0.9129
1.4812 0.3134 556 0.8293 0.0890 0.8293 0.9107
1.4812 0.3145 558 0.8222 0.0896 0.8222 0.9067
1.4812 0.3157 560 0.8224 0.0890 0.8224 0.9069
1.4812 0.3168 562 0.8322 0.0885 0.8322 0.9122
1.4812 0.3179 564 0.8381 0.0916 0.8381 0.9155
1.4812 0.3191 566 0.8413 0.0901 0.8413 0.9172
1.4812 0.3202 568 0.8561 0.0901 0.8561 0.9253
1.4812 0.3213 570 0.8643 0.0783 0.8643 0.9297
1.4812 0.3224 572 0.8744 0.0706 0.8744 0.9351
1.4812 0.3236 574 0.8842 0.0684 0.8842 0.9403
1.4812 0.3247 576 0.8750 0.0539 0.8750 0.9354
1.4812 0.3258 578 0.8662 0.0571 0.8662 0.9307
1.4812 0.3269 580 0.8505 0.0676 0.8505 0.9222

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
The checkpoint contains ~109M parameters, stored in F32 safetensors format.

Model repository: MayBashendy/ASAP_FineTuningBERT_AugV3_k15_task1_organization_fold1 (fine-tuned from bert-base-uncased).