
ASAP_FineTuningBERT_AugV3_k10_task1_organization_fold1

This model is a fine-tuned version of bert-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set (a hedged usage sketch follows the metrics):

  • Loss: 1.5235
  • Qwk (quadratic weighted kappa): 0.0080
  • Mse (mean squared error): 1.5235
  • Rmse (root mean squared error): 1.2343
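
The card does not include usage code, so below is a minimal, hedged sketch of how this checkpoint could be loaded for inference with the Transformers AutoClasses. The assumption that the model carries a single-output regression head (consistent with the MSE/RMSE metrics above) is not confirmed by the card.

```python
# Hedged sketch: loading the fine-tuned checkpoint for inference.
# Assumption: a single-output regression head (num_labels=1), consistent
# with the MSE/RMSE metrics reported above.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ASAP_FineTuningBERT_AugV3_k10_task1_organization_fold1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "An example essay whose organization is to be scored."
inputs = tokenizer(essay, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# For a regression head, the single logit is the predicted organization score.
print(logits.squeeze(-1).item())
```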

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
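
For reproducibility, the listed hyperparameters correspond roughly to the Trainer configuration sketched below. This is a hedged reconstruction, not the author's training script; the dataset, tokenization, model instantiation, and metric functions are omitted and would need to be supplied.

```python
# Hedged sketch of a TrainingArguments setup matching the hyperparameters above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ASAP_FineTuningBERT_AugV3_k10_task1_organization_fold1",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",   # linear decay of the learning rate
    adam_beta1=0.9,               # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```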

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0017 2 10.0555 0.0 10.0555 3.1710
No log 0.0034 4 8.9500 0.0 8.9500 2.9917
No log 0.0050 6 7.7060 0.0073 7.7060 2.7760
No log 0.0067 8 6.3491 0.0041 6.3491 2.5197
No log 0.0084 10 5.1677 0.0037 5.1677 2.2733
No log 0.0101 12 4.1917 0.0 4.1917 2.0474
No log 0.0117 14 3.3442 0.0505 3.3442 1.8287
No log 0.0134 16 2.6858 0.0118 2.6858 1.6388
No log 0.0151 18 2.2369 0.0079 2.2369 1.4956
No log 0.0168 20 1.8389 0.0040 1.8389 1.3561
No log 0.0184 22 1.5408 0.0 1.5408 1.2413
No log 0.0201 24 1.7519 0.0040 1.7519 1.3236
No log 0.0218 26 2.3248 0.0040 2.3248 1.5247
No log 0.0235 28 1.9044 0.0 1.9044 1.3800
No log 0.0251 30 1.3197 0.1273 1.3197 1.1488
No log 0.0268 32 1.7843 0.0 1.7843 1.3358
No log 0.0285 34 2.3587 0.0 2.3587 1.5358
No log 0.0302 36 2.0872 0.0 2.0872 1.4447
No log 0.0318 38 2.3104 0.0 2.3104 1.5200
No log 0.0335 40 2.5450 0.0 2.5450 1.5953
No log 0.0352 42 2.0354 0.0 2.0354 1.4267
No log 0.0369 44 1.7991 0.0 1.7991 1.3413
No log 0.0385 46 2.0412 0.0 2.0412 1.4287
No log 0.0402 48 1.9410 0.0 1.9410 1.3932
No log 0.0419 50 1.9814 0.0 1.9814 1.4076
No log 0.0436 52 2.0285 0.0 2.0285 1.4243
No log 0.0452 54 2.4956 0.0 2.4956 1.5797
No log 0.0469 56 2.6470 0.0 2.6470 1.6270
No log 0.0486 58 2.3212 0.0 2.3212 1.5236
No log 0.0503 60 2.0602 0.0 2.0602 1.4353
No log 0.0519 62 2.0794 0.0 2.0794 1.4420
No log 0.0536 64 2.0714 0.0 2.0714 1.4392
No log 0.0553 66 2.0642 0.0 2.0642 1.4367
No log 0.0570 68 1.8589 -0.0040 1.8589 1.3634
No log 0.0586 70 1.5681 -0.0207 1.5681 1.2522
No log 0.0603 72 1.6149 -0.0105 1.6149 1.2708
No log 0.0620 74 1.8833 -0.0040 1.8833 1.3724
No log 0.0637 76 1.9984 0.0 1.9984 1.4137
No log 0.0653 78 2.2538 0.0 2.2538 1.5013
No log 0.0670 80 2.3498 0.0 2.3498 1.5329
No log 0.0687 82 2.3131 0.0 2.3131 1.5209
No log 0.0704 84 2.1694 0.0 2.1694 1.4729
No log 0.0720 86 2.0945 0.0 2.0945 1.4473
No log 0.0737 88 1.8931 0.0 1.8931 1.3759
No log 0.0754 90 1.8745 0.0 1.8745 1.3691
No log 0.0771 92 2.0462 0.0 2.0462 1.4305
No log 0.0787 94 2.1037 0.0 2.1037 1.4504
No log 0.0804 96 2.2291 0.0 2.2291 1.4930
No log 0.0821 98 2.1640 0.0 2.1640 1.4711
No log 0.0838 100 2.0158 0.0 2.0158 1.4198
No log 0.0854 102 1.9552 0.0 1.9552 1.3983
No log 0.0871 104 1.8525 0.0 1.8525 1.3611
No log 0.0888 106 1.7328 -0.0040 1.7328 1.3163
No log 0.0905 108 1.8211 -0.0040 1.8211 1.3495
No log 0.0921 110 2.0300 0.0 2.0300 1.4248
No log 0.0938 112 2.2221 0.0 2.2221 1.4907
No log 0.0955 114 2.3728 0.0 2.3728 1.5404
No log 0.0972 116 2.2960 0.0 2.2960 1.5153
No log 0.0988 118 2.1523 0.0 2.1523 1.4671
No log 0.1005 120 2.1029 0.0 2.1029 1.4502
No log 0.1022 122 2.1378 0.0 2.1378 1.4621
No log 0.1039 124 2.0389 0.0 2.0389 1.4279
No log 0.1055 126 1.8037 0.0 1.8037 1.3430
No log 0.1072 128 1.5679 -0.0065 1.5679 1.2521
No log 0.1089 130 1.5321 -0.0106 1.5321 1.2378
No log 0.1106 132 1.6342 -0.0040 1.6342 1.2783
No log 0.1122 134 1.6479 -0.0040 1.6479 1.2837
No log 0.1139 136 1.6935 -0.0040 1.6935 1.3013
No log 0.1156 138 1.7934 -0.0040 1.7934 1.3392
No log 0.1173 140 1.8763 -0.0040 1.8763 1.3698
No log 0.1189 142 1.9728 0.0 1.9728 1.4046
No log 0.1206 144 1.9880 0.0 1.9880 1.4100
No log 0.1223 146 1.8702 -0.0040 1.8702 1.3675
No log 0.1240 148 1.7224 -0.0040 1.7224 1.3124
No log 0.1256 150 1.5117 -0.0305 1.5117 1.2295
No log 0.1273 152 1.4140 -0.0080 1.4140 1.1891
No log 0.1290 154 1.4415 -0.0080 1.4415 1.2006
No log 0.1307 156 1.5120 -0.0337 1.5120 1.2296
No log 0.1323 158 1.4612 -0.0259 1.4612 1.2088
No log 0.1340 160 1.4112 0.0453 1.4112 1.1879
No log 0.1357 162 1.4389 0.0191 1.4389 1.1996
No log 0.1374 164 1.4237 0.0261 1.4237 1.1932
No log 0.1390 166 1.3797 0.0968 1.3797 1.1746
No log 0.1407 168 1.2775 0.1010 1.2775 1.1303
No log 0.1424 170 1.2435 0.0930 1.2435 1.1151
No log 0.1441 172 1.2517 0.0921 1.2517 1.1188
No log 0.1457 174 1.3275 0.0754 1.3275 1.1522
No log 0.1474 176 1.3802 0.0291 1.3802 1.1748
No log 0.1491 178 1.3775 0.0204 1.3775 1.1737
No log 0.1508 180 1.3245 0.0723 1.3245 1.1508
No log 0.1524 182 1.1718 0.0959 1.1718 1.0825
No log 0.1541 184 1.0081 0.0921 1.0081 1.0040
No log 0.1558 186 0.9317 0.0831 0.9317 0.9652
No log 0.1575 188 0.9568 0.0458 0.9568 0.9781
No log 0.1591 190 1.0340 0.0556 1.0340 1.0169
No log 0.1608 192 1.0765 0.0791 1.0765 1.0375
No log 0.1625 194 1.1330 0.0890 1.1330 1.0644
No log 0.1642 196 1.0680 0.0940 1.0680 1.0335
No log 0.1658 198 0.9995 0.0995 0.9995 0.9998
No log 0.1675 200 0.9614 0.0921 0.9614 0.9805
No log 0.1692 202 0.9360 0.0921 0.9360 0.9675
No log 0.1709 204 0.9100 0.0921 0.9100 0.9539
No log 0.1725 206 0.8990 0.0663 0.8990 0.9482
No log 0.1742 208 0.9088 0.0458 0.9088 0.9533
No log 0.1759 210 0.9352 0.0606 0.9352 0.9671
No log 0.1776 212 0.9730 0.0761 0.9730 0.9864
No log 0.1792 214 0.9774 0.0886 0.9774 0.9886
No log 0.1809 216 0.9787 0.0921 0.9787 0.9893
No log 0.1826 218 0.9968 0.0921 0.9968 0.9984
No log 0.1843 220 0.9889 0.0921 0.9889 0.9944
No log 0.1859 222 0.9725 0.0921 0.9725 0.9862
No log 0.1876 224 0.9477 0.0921 0.9477 0.9735
No log 0.1893 226 0.9297 0.0975 0.9297 0.9642
No log 0.1910 228 0.8964 0.0831 0.8964 0.9468
No log 0.1926 230 0.8912 0.0831 0.8912 0.9440
No log 0.1943 232 0.8517 0.0684 0.8517 0.9229
No log 0.1960 234 0.8501 0.0831 0.8501 0.9220
No log 0.1977 236 0.8715 0.0921 0.8715 0.9335
No log 0.1993 238 0.9101 0.0921 0.9101 0.9540
No log 0.2010 240 0.9152 0.0921 0.9152 0.9566
No log 0.2027 242 0.9024 0.0921 0.9024 0.9499
No log 0.2044 244 0.8908 0.0921 0.8908 0.9438
No log 0.2060 246 0.8681 0.0921 0.8681 0.9317
No log 0.2077 248 0.8495 0.0921 0.8495 0.9217
No log 0.2094 250 0.8465 0.0831 0.8465 0.9200
No log 0.2111 252 0.8493 0.0921 0.8493 0.9216
No log 0.2127 254 0.8600 0.0921 0.8600 0.9274
No log 0.2144 256 0.8584 0.0921 0.8584 0.9265
No log 0.2161 258 0.8497 0.0921 0.8497 0.9218
No log 0.2178 260 0.8471 0.0921 0.8471 0.9204
No log 0.2194 262 0.8506 0.0871 0.8506 0.9223
No log 0.2211 264 0.8546 0.0960 0.8546 0.9244
No log 0.2228 266 0.8627 0.0960 0.8627 0.9288
No log 0.2245 268 0.8836 0.0921 0.8836 0.9400
No log 0.2261 270 0.9030 0.0921 0.9030 0.9502
No log 0.2278 272 0.9093 0.0921 0.9093 0.9536
No log 0.2295 274 0.8816 0.0921 0.8816 0.9389
No log 0.2312 276 0.8290 0.0921 0.8290 0.9105
No log 0.2328 278 0.8854 0.1038 0.8854 0.9409
No log 0.2345 280 1.0781 0.1545 1.0781 1.0383
No log 0.2362 282 1.0917 0.1398 1.0917 1.0449
No log 0.2379 284 1.0859 0.1653 1.0859 1.0421
No log 0.2395 286 1.0903 0.1562 1.0903 1.0442
No log 0.2412 288 1.1324 0.1608 1.1324 1.0641
No log 0.2429 290 1.0693 0.1174 1.0693 1.0341
No log 0.2446 292 1.0112 0.0921 1.0112 1.0056
No log 0.2462 294 0.9777 0.0921 0.9777 0.9888
No log 0.2479 296 0.8584 0.0921 0.8584 0.9265
No log 0.2496 298 0.9209 0.0782 0.9209 0.9597
No log 0.2513 300 0.9311 0.0619 0.9311 0.9649
No log 0.2529 302 0.9451 0.0393 0.9451 0.9721
No log 0.2546 304 0.9700 0.0663 0.9700 0.9849
No log 0.2563 306 0.9821 0.0921 0.9821 0.9910
No log 0.2580 308 0.9676 0.0921 0.9676 0.9837
No log 0.2596 310 0.9318 0.0702 0.9318 0.9653
No log 0.2613 312 0.9177 0.0105 0.9177 0.9580
No log 0.2630 314 0.9110 0.0134 0.9110 0.9545
No log 0.2647 316 0.9028 0.0342 0.9028 0.9502
No log 0.2663 318 0.8987 0.0663 0.8987 0.9480
No log 0.2680 320 0.9055 0.0684 0.9055 0.9516
No log 0.2697 322 0.9135 0.0684 0.9135 0.9558
No log 0.2714 324 0.9247 0.0831 0.9247 0.9616
No log 0.2730 326 0.9194 0.0921 0.9194 0.9588
No log 0.2747 328 0.8888 0.0921 0.8888 0.9428
No log 0.2764 330 0.8628 0.0921 0.8628 0.9289
No log 0.2781 332 0.8604 0.0921 0.8604 0.9276
No log 0.2797 334 0.8570 0.0831 0.8570 0.9257
No log 0.2814 336 0.8721 0.0831 0.8721 0.9339
No log 0.2831 338 0.8766 0.0761 0.8766 0.9363
No log 0.2848 340 0.8828 0.0737 0.8828 0.9396
No log 0.2864 342 0.8849 0.0726 0.8849 0.9407
No log 0.2881 344 0.9223 0.2885 0.9223 0.9604
No log 0.2898 346 0.9211 0.2817 0.9211 0.9597
No log 0.2915 348 0.9147 0.1590 0.9147 0.9564
No log 0.2931 350 0.9095 0.1117 0.9095 0.9537
No log 0.2948 352 0.9041 0.0749 0.9041 0.9509
No log 0.2965 354 0.8915 0.0856 0.8915 0.9442
No log 0.2982 356 0.8757 0.0663 0.8757 0.9358
No log 0.2998 358 0.8659 0.0661 0.8659 0.9305
No log 0.3015 360 0.8611 0.0770 0.8611 0.9280
No log 0.3032 362 0.8612 0.1484 0.8612 0.9280
No log 0.3049 364 0.8483 0.1856 0.8483 0.9210
No log 0.3065 366 0.8371 0.2068 0.8371 0.9149
No log 0.3082 368 0.8506 0.2882 0.8506 0.9223
No log 0.3099 370 0.8847 0.2687 0.8847 0.9406
No log 0.3116 372 0.9336 0.2427 0.9336 0.9663
No log 0.3132 374 0.9405 0.2406 0.9405 0.9698
No log 0.3149 376 0.9048 0.2990 0.9048 0.9512
No log 0.3166 378 0.8551 0.3166 0.8551 0.9247
No log 0.3183 380 0.8422 0.3305 0.8422 0.9177
No log 0.3199 382 0.8545 0.3219 0.8545 0.9244
No log 0.3216 384 0.8791 0.3176 0.8791 0.9376
No log 0.3233 386 0.9081 0.3225 0.9081 0.9529
No log 0.3250 388 0.9142 0.3124 0.9142 0.9561
No log 0.3266 390 0.8819 0.3045 0.8819 0.9391
No log 0.3283 392 0.8255 0.2848 0.8255 0.9086
No log 0.3300 394 0.8067 0.0921 0.8067 0.8981
No log 0.3317 396 0.8112 0.0921 0.8112 0.9007
No log 0.3333 398 0.8170 0.1797 0.8170 0.9039
No log 0.3350 400 0.8328 0.2370 0.8328 0.9126
No log 0.3367 402 0.8694 0.2743 0.8694 0.9324
No log 0.3384 404 0.8974 0.3392 0.8974 0.9473
No log 0.3400 406 0.9276 0.2760 0.9276 0.9631
No log 0.3417 408 0.9244 0.2665 0.9244 0.9615
No log 0.3434 410 0.8760 0.2457 0.8760 0.9360
No log 0.3451 412 0.8319 0.1996 0.8319 0.9121
No log 0.3467 414 0.8130 0.0992 0.8130 0.9016
No log 0.3484 416 0.8092 0.0910 0.8092 0.8996
No log 0.3501 418 0.8107 0.0909 0.8107 0.9004
No log 0.3518 420 0.8119 0.1347 0.8119 0.9011
No log 0.3534 422 0.8224 0.2488 0.8224 0.9069
No log 0.3551 424 0.8466 0.2776 0.8466 0.9201
No log 0.3568 426 0.8617 0.3122 0.8617 0.9283
No log 0.3585 428 0.8337 0.3067 0.8337 0.9131
No log 0.3601 430 0.8131 0.3193 0.8131 0.9017
No log 0.3618 432 0.7971 0.2119 0.7971 0.8928
No log 0.3635 434 0.7875 0.0936 0.7875 0.8874
No log 0.3652 436 0.8357 0.0921 0.8357 0.9142
No log 0.3668 438 0.8246 0.0921 0.8246 0.9081
No log 0.3685 440 0.8432 0.0921 0.8432 0.9183
No log 0.3702 442 0.8147 0.0921 0.8147 0.9026
No log 0.3719 444 0.7823 0.0921 0.7823 0.8845
No log 0.3735 446 0.7830 0.0943 0.7830 0.8849
No log 0.3752 448 0.8182 0.2746 0.8182 0.9045
No log 0.3769 450 0.8706 0.2663 0.8706 0.9330
No log 0.3786 452 0.9060 0.2749 0.9060 0.9518
No log 0.3802 454 0.9050 0.2494 0.9050 0.9513
No log 0.3819 456 0.8830 0.2178 0.8830 0.9397
No log 0.3836 458 0.8631 0.2035 0.8631 0.9291
No log 0.3853 460 0.8598 0.1703 0.8598 0.9272
No log 0.3869 462 0.8552 0.1061 0.8552 0.9248
No log 0.3886 464 0.8568 0.0921 0.8568 0.9256
No log 0.3903 466 0.8629 0.1038 0.8629 0.9289
No log 0.3920 468 0.8626 0.1592 0.8626 0.9287
No log 0.3936 470 0.8602 0.2111 0.8602 0.9275
No log 0.3953 472 0.8618 0.2603 0.8618 0.9283
No log 0.3970 474 0.8621 0.2911 0.8621 0.9285
No log 0.3987 476 0.8660 0.3171 0.8660 0.9306
No log 0.4003 478 0.8707 0.3816 0.8707 0.9331
No log 0.4020 480 0.8740 0.3679 0.8740 0.9349
No log 0.4037 482 0.8815 0.3595 0.8815 0.9389
No log 0.4054 484 0.8931 0.3146 0.8931 0.9450
No log 0.4070 486 0.9050 0.3170 0.9050 0.9513
No log 0.4087 488 0.8999 0.2968 0.8999 0.9486
No log 0.4104 490 0.9497 0.2303 0.9497 0.9745
No log 0.4121 492 1.0265 0.1278 1.0265 1.0132
No log 0.4137 494 0.9092 0.1626 0.9092 0.9535
No log 0.4154 496 0.8898 0.1502 0.8898 0.9433
No log 0.4171 498 0.8596 0.0752 0.8596 0.9271
1.4861 0.4188 500 0.8407 0.0787 0.8407 0.9169
1.4861 0.4204 502 0.8308 0.0944 0.8308 0.9115
1.4861 0.4221 504 0.8412 0.2271 0.8412 0.9172
1.4861 0.4238 506 0.8555 0.2749 0.8555 0.9249
1.4861 0.4255 508 0.8715 0.3161 0.8715 0.9335
1.4861 0.4271 510 0.8917 0.3015 0.8917 0.9443
1.4861 0.4288 512 0.9132 0.2797 0.9132 0.9556
1.4861 0.4305 514 0.9366 0.2750 0.9366 0.9678
1.4861 0.4322 516 0.9593 0.2767 0.9593 0.9795
1.4861 0.4338 518 0.9357 0.2721 0.9357 0.9673
1.4861 0.4355 520 0.8875 0.2723 0.8875 0.9421
1.4861 0.4372 522 0.8809 0.2601 0.8809 0.9386
1.4861 0.4389 524 0.9106 0.2993 0.9106 0.9542
1.4861 0.4405 526 0.9234 0.2889 0.9234 0.9610
1.4861 0.4422 528 0.9034 0.3159 0.9034 0.9505
1.4861 0.4439 530 0.8945 0.3156 0.8945 0.9458
1.4861 0.4456 532 0.8869 0.3046 0.8869 0.9418
1.4861 0.4472 534 0.8775 0.3199 0.8775 0.9368
1.4861 0.4489 536 0.8613 0.2975 0.8613 0.9281
1.4861 0.4506 538 0.8550 0.2573 0.8550 0.9247
1.4861 0.4523 540 0.8379 0.2040 0.8379 0.9154
1.4861 0.4539 542 0.8277 0.1245 0.8277 0.9098
1.4861 0.4556 544 0.8232 0.1028 0.8232 0.9073
1.4861 0.4573 546 0.8127 0.0970 0.8127 0.9015
1.4861 0.4590 548 0.8099 0.1023 0.8099 0.8999
1.4861 0.4606 550 0.8137 0.1131 0.8137 0.9021
1.4861 0.4623 552 0.8123 0.1143 0.8123 0.9013
1.4861 0.4640 554 0.8045 0.0828 0.8045 0.8969
1.4861 0.4657 556 0.8022 0.0679 0.8022 0.8957
1.4861 0.4673 558 0.8021 0.0508 0.8021 0.8956
1.4861 0.4690 560 0.8178 0.0849 0.8178 0.9043
1.4861 0.4707 562 0.8815 0.1010 0.8815 0.9389
1.4861 0.4724 564 0.9269 0.1106 0.9269 0.9628
1.4861 0.4740 566 0.9675 0.1089 0.9675 0.9836
1.4861 0.4757 568 0.9204 0.1131 0.9204 0.9594
1.4861 0.4774 570 0.8624 0.0960 0.8624 0.9287
1.4861 0.4791 572 0.8295 0.1379 0.8295 0.9108
1.4861 0.4807 574 0.8519 0.3193 0.8519 0.9230
1.4861 0.4824 576 0.8719 0.3279 0.8719 0.9338
1.4861 0.4841 578 0.8579 0.3598 0.8579 0.9262
1.4861 0.4858 580 0.8179 0.2273 0.8179 0.9044
1.4861 0.4874 582 0.8266 0.0952 0.8266 0.9092
1.4861 0.4891 584 0.8798 0.0999 0.8798 0.9380
1.4861 0.4908 586 0.8736 0.0941 0.8736 0.9346
1.4861 0.4925 588 0.8088 0.0921 0.8088 0.8993
1.4861 0.4941 590 0.8221 0.2345 0.8221 0.9067
1.4861 0.4958 592 0.8377 0.0921 0.8377 0.9153
1.4861 0.4975 594 0.8959 0.0921 0.8959 0.9465
1.4861 0.4992 596 1.1603 0.0921 1.1603 1.0772
1.4861 0.5008 598 1.4196 -0.0033 1.4196 1.1915
1.4861 0.5025 600 2.0839 -0.0062 2.0839 1.4436
1.4861 0.5042 602 3.4356 -0.0220 3.4356 1.8535
1.4861 0.5059 604 2.5910 -0.0015 2.5910 1.6097
1.4861 0.5075 606 1.5235 0.0080 1.5235 1.2343
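
The Qwk, Mse, and Rmse columns above can be reproduced from predictions and gold scores with standard scikit-learn utilities. The sketch below is illustrative only; the values are placeholders, and rounding regression outputs to integer scores before computing kappa is an assumption the card does not document.

```python
# Hedged sketch: computing QWK, MSE, and RMSE as reported in the results table.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 3, 4, 1])             # gold scores (placeholder values)
y_pred = np.array([2.2, 2.8, 3.4, 3.1, 1.6])   # model outputs (placeholder values)

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# Assumption: continuous predictions are rounded to the nearest integer score
# before computing quadratic weighted kappa.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```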

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1