
ASAP_FineTuningBERT_AugV4_k10_task1_organization_fold0

This model is a fine-tuned version of bert-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6073
  • Qwk (quadratic weighted kappa): 0.4163
  • Mse (mean squared error): 0.6073
  • Rmse (root mean squared error): 0.7793
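As a quick sanity check on the metrics above, the reported Rmse is simply the square root of the reported Mse:

```python
import math

# Reported evaluation metrics from the list above.
mse = 0.6073
rmse = 0.7793

# RMSE is defined as the square root of MSE, which is why the two
# values track each other exactly throughout this card.
assert math.isclose(math.sqrt(mse), rmse, abs_tol=5e-4)
print(round(math.sqrt(mse), 4))  # -> 0.7793
```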

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
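The listed hyperparameters map onto a standard Transformers Trainer setup. A hedged reconstruction follows; the dataset, preprocessing, regression head size, and the eval/logging step counts are assumptions not stated in the card (the step counts are only inferred from the results table below):

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumption: a single-output regression head for essay scoring;
# the card does not document the label setup.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1
)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

args = TrainingArguments(
    output_dir="ASAP_FineTuningBERT_AugV4_k10_task1_organization_fold0",
    learning_rate=2e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,            # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="steps",
    eval_steps=2,              # assumption: the table reports an eval every 2 steps
    logging_steps=500,         # assumption: matches the first logged training loss
)

# trainer = Trainer(
#     model=model,
#     args=args,
#     train_dataset=...,  # not documented in the card
#     eval_dataset=...,
#     tokenizer=tokenizer,
# )
# trainer.train()
```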

Training results

The model was evaluated every 2 training steps. A training loss of "No log" means no value had been logged yet at that step; the first logged training loss (1.0556) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0020 2 14.2524 0.0 14.2524 3.7752
No log 0.0041 4 12.4355 0.0 12.4355 3.5264
No log 0.0061 6 10.4416 0.0080 10.4416 3.2313
No log 0.0081 8 7.8681 0.0 7.8681 2.8050
No log 0.0102 10 6.0614 0.0405 6.0614 2.4620
No log 0.0122 12 4.7016 0.0234 4.7016 2.1683
No log 0.0142 14 3.4154 0.0115 3.4154 1.8481
No log 0.0163 16 2.5009 0.1257 2.5009 1.5814
No log 0.0183 18 1.9497 0.1155 1.9497 1.3963
No log 0.0203 20 1.3874 0.0484 1.3874 1.1779
No log 0.0224 22 1.0867 0.0316 1.0867 1.0424
No log 0.0244 24 0.8935 0.0397 0.8935 0.9453
No log 0.0264 26 0.7875 0.2115 0.7875 0.8874
No log 0.0285 28 0.7447 0.1547 0.7447 0.8630
No log 0.0305 30 0.8089 0.1185 0.8089 0.8994
No log 0.0326 32 0.9391 0.0689 0.9391 0.9691
No log 0.0346 34 0.9037 0.0689 0.9037 0.9506
No log 0.0366 36 0.8205 0.1255 0.8205 0.9058
No log 0.0387 38 0.9736 0.0521 0.9736 0.9867
No log 0.0407 40 1.1668 0.1608 1.1668 1.0802
No log 0.0427 42 0.9403 0.0521 0.9403 0.9697
No log 0.0448 44 0.9024 0.0521 0.9024 0.9500
No log 0.0468 46 1.0188 0.0521 1.0188 1.0094
No log 0.0488 48 0.8766 0.0521 0.8766 0.9362
No log 0.0509 50 0.8789 0.0521 0.8789 0.9375
No log 0.0529 52 1.1058 0.0419 1.1058 1.0516
No log 0.0549 54 1.3660 0.0417 1.3660 1.1688
No log 0.0570 56 1.1852 0.0124 1.1852 1.0887
No log 0.0590 58 0.8975 0.0506 0.8975 0.9474
No log 0.0610 60 0.8719 0.0506 0.8719 0.9337
No log 0.0631 62 0.8985 0.0521 0.8985 0.9479
No log 0.0651 64 1.0891 0.0532 1.0891 1.0436
No log 0.0671 66 1.2190 0.0523 1.2190 1.1041
No log 0.0692 68 0.9515 0.0521 0.9515 0.9754
No log 0.0712 70 0.8337 0.0726 0.8337 0.9131
No log 0.0732 72 0.8283 0.0521 0.8283 0.9101
No log 0.0753 74 0.9558 0.0521 0.9558 0.9776
No log 0.0773 76 1.1886 0.0356 1.1886 1.0902
No log 0.0793 78 1.3286 0.1173 1.3286 1.1526
No log 0.0814 80 1.2141 0.2806 1.2141 1.1019
No log 0.0834 82 0.9281 0.0348 0.9281 0.9634
No log 0.0855 84 0.8013 0.0348 0.8013 0.8951
No log 0.0875 86 0.7968 0.0174 0.7968 0.8926
No log 0.0895 88 0.7858 0.0348 0.7858 0.8864
No log 0.0916 90 0.8314 0.0348 0.8314 0.9118
No log 0.0936 92 0.9557 0.0348 0.9557 0.9776
No log 0.0956 94 1.0167 0.0348 1.0167 1.0083
No log 0.0977 96 0.9794 0.0348 0.9794 0.9897
No log 0.0997 98 0.9675 0.0348 0.9675 0.9836
No log 0.1017 100 0.8935 0.0521 0.8935 0.9453
No log 0.1038 102 0.7942 0.0521 0.7942 0.8912
No log 0.1058 104 0.7652 0.0521 0.7652 0.8747
No log 0.1078 106 0.7540 0.0521 0.7540 0.8683
No log 0.1099 108 0.7946 0.0521 0.7946 0.8914
No log 0.1119 110 0.8123 0.0348 0.8123 0.9013
No log 0.1139 112 0.7863 0.0348 0.7863 0.8867
No log 0.1160 114 0.7900 0.0348 0.7900 0.8888
No log 0.1180 116 0.7818 0.0174 0.7818 0.8842
No log 0.1200 118 0.8444 0.0174 0.8444 0.9189
No log 0.1221 120 0.9907 0.1248 0.9907 0.9953
No log 0.1241 122 1.0523 0.2664 1.0523 1.0258
No log 0.1261 124 1.0840 0.2782 1.0840 1.0412
No log 0.1282 126 0.9240 0.1057 0.9240 0.9613
No log 0.1302 128 0.8543 0.0200 0.8543 0.9243
No log 0.1322 130 0.8993 0.1809 0.8993 0.9483
No log 0.1343 132 1.1177 0.1666 1.1177 1.0572
No log 0.1363 134 1.2346 0.1118 1.2346 1.1111
No log 0.1384 136 1.1394 0.0431 1.1394 1.0675
No log 0.1404 138 1.2881 0.0772 1.2881 1.1349
No log 0.1424 140 1.3646 0.0869 1.3646 1.1681
No log 0.1445 142 1.2050 0.0798 1.2050 1.0977
No log 0.1465 144 1.1245 0.1208 1.1245 1.0604
No log 0.1485 146 1.1741 0.1267 1.1741 1.0836
No log 0.1506 148 1.2027 0.1231 1.2027 1.0967
No log 0.1526 150 0.9615 0.1997 0.9615 0.9805
No log 0.1546 152 0.9363 0.1842 0.9363 0.9676
No log 0.1567 154 1.2144 0.1447 1.2144 1.1020
No log 0.1587 156 1.6331 0.0834 1.6331 1.2779
No log 0.1607 158 1.6617 0.0635 1.6617 1.2891
No log 0.1628 160 1.4068 0.1029 1.4068 1.1861
No log 0.1648 162 1.2530 0.1126 1.2530 1.1194
No log 0.1668 164 1.1498 0.1524 1.1498 1.0723
No log 0.1689 166 1.2417 0.1511 1.2417 1.1143
No log 0.1709 168 1.1797 0.1832 1.1797 1.0861
No log 0.1729 170 1.2454 0.1660 1.2454 1.1160
No log 0.1750 172 1.0620 0.2258 1.0620 1.0305
No log 0.1770 174 0.9187 0.2552 0.9187 0.9585
No log 0.1790 176 1.0228 0.2500 1.0228 1.0113
No log 0.1811 178 1.2139 0.1965 1.2139 1.1018
No log 0.1831 180 1.0744 0.2235 1.0744 1.0365
No log 0.1851 182 0.8469 0.3123 0.8469 0.9203
No log 0.1872 184 0.7594 0.2231 0.7594 0.8714
No log 0.1892 186 0.8029 0.2100 0.8029 0.8960
No log 0.1913 188 0.9766 0.2680 0.9766 0.9882
No log 0.1933 190 0.9017 0.2362 0.9017 0.9496
No log 0.1953 192 0.7546 0.1702 0.7546 0.8687
No log 0.1974 194 0.6947 0.1623 0.6947 0.8335
No log 0.1994 196 0.7068 0.1856 0.7068 0.8407
No log 0.2014 198 0.8325 0.2454 0.8325 0.9124
No log 0.2035 200 1.2753 0.1797 1.2753 1.1293
No log 0.2055 202 1.4321 0.1525 1.4321 1.1967
No log 0.2075 204 1.1169 0.2236 1.1169 1.0568
No log 0.2096 206 0.8111 0.1988 0.8111 0.9006
No log 0.2116 208 0.7759 0.1293 0.7759 0.8808
No log 0.2136 210 0.8296 0.2287 0.8296 0.9108
No log 0.2157 212 1.0859 0.2387 1.0859 1.0421
No log 0.2177 214 1.3781 0.1422 1.3781 1.1739
No log 0.2197 216 1.3072 0.1524 1.3072 1.1433
No log 0.2218 218 1.0086 0.2753 1.0086 1.0043
No log 0.2238 220 0.7956 0.2332 0.7956 0.8920
No log 0.2258 222 0.7944 0.2511 0.7944 0.8913
No log 0.2279 224 0.9285 0.2923 0.9285 0.9636
No log 0.2299 226 1.2129 0.1833 1.2129 1.1013
No log 0.2319 228 1.2641 0.1668 1.2641 1.1243
No log 0.2340 230 1.0853 0.2767 1.0853 1.0418
No log 0.2360 232 0.9418 0.3350 0.9418 0.9705
No log 0.2380 234 0.9214 0.3267 0.9214 0.9599
No log 0.2401 236 0.8887 0.3415 0.8887 0.9427
No log 0.2421 238 0.8320 0.3196 0.8320 0.9121
No log 0.2442 240 0.7991 0.3268 0.7991 0.8939
No log 0.2462 242 0.7630 0.3526 0.7630 0.8735
No log 0.2482 244 0.7726 0.3838 0.7726 0.8790
No log 0.2503 246 0.8695 0.3573 0.8695 0.9325
No log 0.2523 248 1.1970 0.2493 1.1970 1.0941
No log 0.2543 250 1.2131 0.2300 1.2131 1.1014
No log 0.2564 252 0.9347 0.3020 0.9347 0.9668
No log 0.2584 254 0.7301 0.3449 0.7301 0.8545
No log 0.2604 256 0.7240 0.3536 0.7240 0.8509
No log 0.2625 258 0.7886 0.3632 0.7886 0.8880
No log 0.2645 260 0.7821 0.3490 0.7821 0.8844
No log 0.2665 262 0.8392 0.3562 0.8392 0.9161
No log 0.2686 264 0.7377 0.3767 0.7377 0.8589
No log 0.2706 266 0.6526 0.4354 0.6526 0.8079
No log 0.2726 268 0.6136 0.4634 0.6136 0.7833
No log 0.2747 270 0.6233 0.4561 0.6233 0.7895
No log 0.2767 272 0.6340 0.4532 0.6340 0.7962
No log 0.2787 274 0.6458 0.4613 0.6458 0.8036
No log 0.2808 276 0.6249 0.4796 0.6249 0.7905
No log 0.2828 278 0.5922 0.4940 0.5922 0.7695
No log 0.2848 280 0.5898 0.4875 0.5898 0.7680
No log 0.2869 282 0.6014 0.4327 0.6014 0.7755
No log 0.2889 284 0.5898 0.4485 0.5898 0.7680
No log 0.2909 286 0.6508 0.4408 0.6508 0.8067
No log 0.2930 288 0.7140 0.4352 0.7140 0.8450
No log 0.2950 290 0.6432 0.4524 0.6432 0.8020
No log 0.2970 292 0.6601 0.4622 0.6601 0.8125
No log 0.2991 294 0.6945 0.4394 0.6945 0.8333
No log 0.3011 296 0.9330 0.3413 0.9330 0.9659
No log 0.3032 298 0.8364 0.3624 0.8364 0.9145
No log 0.3052 300 0.6684 0.4309 0.6684 0.8175
No log 0.3072 302 0.6631 0.4301 0.6631 0.8143
No log 0.3093 304 0.6487 0.4362 0.6487 0.8054
No log 0.3113 306 0.7637 0.3553 0.7637 0.8739
No log 0.3133 308 0.8431 0.2993 0.8431 0.9182
No log 0.3154 310 0.7241 0.3513 0.7241 0.8509
No log 0.3174 312 0.6358 0.3828 0.6358 0.7974
No log 0.3194 314 0.7444 0.2926 0.7444 0.8628
No log 0.3215 316 0.7039 0.3169 0.7039 0.8390
No log 0.3235 318 0.6119 0.4647 0.6119 0.7822
No log 0.3255 320 0.7097 0.3611 0.7097 0.8424
No log 0.3276 322 0.7211 0.3566 0.7211 0.8492
No log 0.3296 324 0.6242 0.4539 0.6242 0.7901
No log 0.3316 326 0.6826 0.3977 0.6826 0.8262
No log 0.3337 328 0.8104 0.2971 0.8104 0.9002
No log 0.3357 330 0.7201 0.3698 0.7201 0.8486
No log 0.3377 332 0.6255 0.4575 0.6255 0.7909
No log 0.3398 334 0.7740 0.3624 0.7740 0.8797
No log 0.3418 336 0.8368 0.3493 0.8368 0.9148
No log 0.3438 338 0.7049 0.3372 0.7049 0.8396
No log 0.3459 340 0.6016 0.4258 0.6016 0.7756
No log 0.3479 342 0.5956 0.4550 0.5956 0.7718
No log 0.3499 344 0.5910 0.4834 0.5910 0.7687
No log 0.3520 346 0.6334 0.4749 0.6334 0.7958
No log 0.3540 348 0.6347 0.4713 0.6347 0.7967
No log 0.3561 350 0.5916 0.4821 0.5916 0.7691
No log 0.3581 352 0.5980 0.4636 0.5980 0.7733
No log 0.3601 354 0.6013 0.4939 0.6013 0.7754
No log 0.3622 356 0.6365 0.5244 0.6365 0.7978
No log 0.3642 358 0.6629 0.5073 0.6629 0.8142
No log 0.3662 360 0.6258 0.5027 0.6258 0.7911
No log 0.3683 362 0.6111 0.4828 0.6111 0.7817
No log 0.3703 364 0.6044 0.4962 0.6044 0.7774
No log 0.3723 366 0.5941 0.4996 0.5941 0.7708
No log 0.3744 368 0.6032 0.5027 0.6032 0.7766
No log 0.3764 370 0.5909 0.5003 0.5909 0.7687
No log 0.3784 372 0.5972 0.4688 0.5972 0.7728
No log 0.3805 374 0.5919 0.4855 0.5919 0.7694
No log 0.3825 376 0.5970 0.4243 0.5970 0.7727
No log 0.3845 378 0.5977 0.4485 0.5977 0.7731
No log 0.3866 380 0.6073 0.4508 0.6073 0.7793
No log 0.3886 382 0.6441 0.4980 0.6441 0.8025
No log 0.3906 384 0.6384 0.4684 0.6384 0.7990
No log 0.3927 386 0.7185 0.4208 0.7185 0.8476
No log 0.3947 388 0.7171 0.4134 0.7171 0.8468
No log 0.3967 390 0.6906 0.4733 0.6906 0.8310
No log 0.3988 392 0.7191 0.4898 0.7191 0.8480
No log 0.4008 394 0.6986 0.4566 0.6986 0.8358
No log 0.4028 396 0.7147 0.3921 0.7147 0.8454
No log 0.4049 398 0.6481 0.4475 0.6481 0.8050
No log 0.4069 400 0.6870 0.4532 0.6870 0.8289
No log 0.4090 402 0.6674 0.4586 0.6674 0.8170
No log 0.4110 404 0.6029 0.4794 0.6029 0.7764
No log 0.4130 406 0.6205 0.3962 0.6205 0.7877
No log 0.4151 408 0.6764 0.3891 0.6764 0.8225
No log 0.4171 410 0.6025 0.3915 0.6025 0.7762
No log 0.4191 412 0.6819 0.3999 0.6819 0.8258
No log 0.4212 414 0.7267 0.3785 0.7267 0.8524
No log 0.4232 416 0.6667 0.4705 0.6667 0.8165
No log 0.4252 418 0.5916 0.4633 0.5916 0.7692
No log 0.4273 420 0.5976 0.4962 0.5976 0.7730
No log 0.4293 422 0.6359 0.5205 0.6359 0.7975
No log 0.4313 424 0.9656 0.3844 0.9656 0.9827
No log 0.4334 426 1.2683 0.3138 1.2683 1.1262
No log 0.4354 428 1.1435 0.3347 1.1435 1.0694
No log 0.4374 430 0.7352 0.4648 0.7352 0.8575
No log 0.4395 432 0.5771 0.4916 0.5771 0.7596
No log 0.4415 434 0.5809 0.4870 0.5809 0.7621
No log 0.4435 436 0.5932 0.5162 0.5932 0.7702
No log 0.4456 438 0.6494 0.4965 0.6494 0.8059
No log 0.4476 440 0.7660 0.4166 0.7660 0.8752
No log 0.4496 442 0.7850 0.4095 0.7850 0.8860
No log 0.4517 444 0.6352 0.4918 0.6352 0.7970
No log 0.4537 446 0.5906 0.5230 0.5906 0.7685
No log 0.4557 448 0.6299 0.4880 0.6299 0.7937
No log 0.4578 450 0.8701 0.3860 0.8701 0.9328
No log 0.4598 452 0.9842 0.3595 0.9842 0.9921
No log 0.4619 454 0.7763 0.4043 0.7763 0.8811
No log 0.4639 456 0.6003 0.4594 0.6003 0.7748
No log 0.4659 458 0.5860 0.5009 0.5860 0.7655
No log 0.4680 460 0.6568 0.4595 0.6568 0.8104
No log 0.4700 462 0.6725 0.4575 0.6725 0.8201
No log 0.4720 464 0.6127 0.5014 0.6127 0.7827
No log 0.4741 466 0.6224 0.4920 0.6224 0.7889
No log 0.4761 468 0.7347 0.4548 0.7347 0.8572
No log 0.4781 470 1.0236 0.3584 1.0236 1.0117
No log 0.4802 472 0.9869 0.3368 0.9869 0.9934
No log 0.4822 474 0.7095 0.3538 0.7095 0.8423
No log 0.4842 476 0.6062 0.3906 0.6062 0.7786
No log 0.4863 478 0.6046 0.3666 0.6046 0.7776
No log 0.4883 480 0.6634 0.3503 0.6634 0.8145
No log 0.4903 482 0.7740 0.3890 0.7740 0.8798
No log 0.4924 484 0.7046 0.4006 0.7046 0.8394
No log 0.4944 486 0.7094 0.4278 0.7094 0.8422
No log 0.4964 488 0.8236 0.4176 0.8236 0.9075
No log 0.4985 490 0.7108 0.4590 0.7108 0.8431
No log 0.5005 492 0.6641 0.4734 0.6641 0.8149
No log 0.5025 494 0.6132 0.4669 0.6132 0.7831
No log 0.5046 496 0.6440 0.4816 0.6440 0.8025
No log 0.5066 498 0.8052 0.4396 0.8052 0.8973
1.0556 0.5086 500 0.8255 0.4385 0.8255 0.9086
1.0556 0.5107 502 0.7262 0.4366 0.7262 0.8522
1.0556 0.5127 504 0.6593 0.4391 0.6593 0.8120
1.0556 0.5148 506 0.7627 0.4260 0.7627 0.8733
1.0556 0.5168 508 0.8397 0.4043 0.8397 0.9164
1.0556 0.5188 510 0.7585 0.4378 0.7585 0.8709
1.0556 0.5209 512 0.6207 0.4179 0.6207 0.7879
1.0556 0.5229 514 0.6073 0.4163 0.6073 0.7793
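The Qwk column above is quadratic weighted kappa, i.e. Cohen's kappa with quadratic penalty weights — the standard agreement metric for essay scoring. To make it concrete, here is a minimal dependency-free implementation for integer ratings (equivalent to scikit-learn's `cohen_kappa_score(..., weights="quadratic")`):

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, num_ratings):
    """Quadratic weighted kappa for integer ratings in [0, num_ratings)."""
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0.0] * num_ratings for _ in range(num_ratings)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1.0
    hist_true = Counter(y_true)
    hist_pred = Counter(y_pred)
    numerator = 0.0
    denominator = 0.0
    for i in range(num_ratings):
        for j in range(num_ratings):
            # Quadratic penalty grows with the squared rating distance.
            weight = (i - j) ** 2 / (num_ratings - 1) ** 2
            # Expected count under chance agreement.
            expected = hist_true[i] * hist_pred[j] / n
            numerator += weight * observed[i][j]
            denominator += weight * expected
    return 1.0 - numerator / denominator

# Perfect agreement gives kappa = 1; chance-level agreement gives 0.
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # -> 1.0
print(quadratic_weighted_kappa([0, 0, 1, 1], [0, 1, 0, 1], 2))  # -> 0.0
```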

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
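With the versions above installed, the checkpoint can be loaded for inference. This is a hedged sketch: the single-score regression output is an assumption consistent with the MSE/RMSE metrics, not something the card documents.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ASAP_FineTuningBERT_AugV4_k10_task1_organization_fold0"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Assumption: the model emits one regression logit per essay.
essay = "The writer organizes ideas into clear, well-linked paragraphs."
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.3f}")
```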
Model size

  • 109M parameters
  • Tensor type: F32 (stored as Safetensors)

Model tree for MayBashendy/ASAP_FineTuningBERT_AugV4_k10_task1_organization_fold0

Fine-tuned from bert-base-uncased.