ASAP_FineTuningBERT_AugV4_k10_task1_organization_fold2

This model is a fine-tuned version of bert-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6026
  • QWK: 0.4742
  • MSE: 0.6026
  • RMSE: 0.7763
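As a point of reference, the reported metrics (QWK, MSE, RMSE) can be computed with scikit-learn. This is a minimal sketch with toy scores, not values from this model's evaluation set; rounding continuous predictions before computing QWK is an assumption about the evaluation recipe:

```python
# Sketch: computing QWK, MSE, and RMSE for an essay-scoring model.
# The toy labels below are illustrative only.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([1, 2, 3, 4, 2, 3])                   # gold scores (toy data)
y_pred_cont = np.array([1.2, 2.4, 2.8, 3.9, 2.1, 3.3])  # raw model outputs

mse = mean_squared_error(y_true, y_pred_cont)
rmse = np.sqrt(mse)

# QWK needs discrete labels, so continuous predictions are rounded first.
y_pred = np.rint(y_pred_cont).astype(int)
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

print(f"MSE: {mse:.4f}  RMSE: {rmse:.4f}  QWK: {qwk:.4f}")
```

Note that Loss and MSE coincide in the table above, which suggests the model was trained with an MSE objective on continuous scores.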

Model description

More information needed

Intended uses & limitations

More information needed
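Pending more documentation, a plausible way to load this checkpoint for inference is sketched below. The single-output regression head is an assumption (suggested by the MSE/RMSE metrics, which are typical for ASAP essay scoring); the exact head configuration is not documented in this card:

```python
# Sketch: scoring one essay with this checkpoint (assumes a regression head).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ASAP_FineTuningBERT_AugV4_k10_task1_organization_fold2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("An example essay to score.", return_tensors="pt",
                   truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```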

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0020 2 10.3974 0.0013 10.3974 3.2245
No log 0.0040 4 9.3241 0.0 9.3241 3.0535
No log 0.0060 6 8.0124 0.0 8.0124 2.8306
No log 0.0079 8 6.7046 0.0 6.7046 2.5893
No log 0.0099 10 5.3548 0.0325 5.3548 2.3141
No log 0.0119 12 4.3935 0.0039 4.3935 2.0961
No log 0.0139 14 3.6172 0.0039 3.6172 1.9019
No log 0.0159 16 2.8162 0.0039 2.8162 1.6782
No log 0.0179 18 2.2725 0.1064 2.2725 1.5075
No log 0.0199 20 1.7897 0.0540 1.7897 1.3378
No log 0.0218 22 1.3306 0.0107 1.3306 1.1535
No log 0.0238 24 1.1460 0.0107 1.1460 1.0705
No log 0.0258 26 0.9364 0.0107 0.9364 0.9677
No log 0.0278 28 0.8445 0.2848 0.8445 0.9190
No log 0.0298 30 0.8380 0.0490 0.8380 0.9154
No log 0.0318 32 0.8553 0.0164 0.8553 0.9248
No log 0.0338 34 0.8926 0.0164 0.8926 0.9448
No log 0.0357 36 0.9438 0.0164 0.9438 0.9715
No log 0.0377 38 0.9613 0.0164 0.9613 0.9805
No log 0.0397 40 1.0887 0.0164 1.0887 1.0434
No log 0.0417 42 1.0892 0.0164 1.0892 1.0436
No log 0.0437 44 1.1870 0.0164 1.1870 1.0895
No log 0.0457 46 1.0759 0.0164 1.0759 1.0373
No log 0.0477 48 0.9395 0.0164 0.9395 0.9693
No log 0.0497 50 0.8951 0.0164 0.8951 0.9461
No log 0.0516 52 0.9726 0.0164 0.9726 0.9862
No log 0.0536 54 0.9359 0.0164 0.9359 0.9674
No log 0.0556 56 0.8868 0.0327 0.8868 0.9417
No log 0.0576 58 0.8409 0.0750 0.8409 0.9170
No log 0.0596 60 0.8297 0.0851 0.8297 0.9109
No log 0.0616 62 0.9194 0.0164 0.9194 0.9589
No log 0.0636 64 0.9837 0.0164 0.9837 0.9918
No log 0.0655 66 1.0615 0.0164 1.0615 1.0303
No log 0.0675 68 0.9240 0.0164 0.9240 0.9612
No log 0.0695 70 0.8718 0.0164 0.8718 0.9337
No log 0.0715 72 0.8533 0.0164 0.8533 0.9238
No log 0.0735 74 0.7063 0.2115 0.7063 0.8404
No log 0.0755 76 0.7269 0.3562 0.7269 0.8526
No log 0.0775 78 0.7177 0.0909 0.7177 0.8472
No log 0.0794 80 0.8605 0.0164 0.8605 0.9276
No log 0.0814 82 0.9737 0.0164 0.9737 0.9868
No log 0.0834 84 0.9632 0.0164 0.9632 0.9814
No log 0.0854 86 1.0019 0.0 1.0019 1.0010
No log 0.0874 88 1.0321 0.0164 1.0321 1.0159
No log 0.0894 90 1.0278 0.0164 1.0278 1.0138
No log 0.0914 92 1.0150 0.0164 1.0150 1.0075
No log 0.0933 94 0.8946 0.0164 0.8946 0.9459
No log 0.0953 96 0.7957 0.0474 0.7957 0.8920
No log 0.0973 98 0.7561 0.1078 0.7561 0.8695
No log 0.0993 100 0.7555 0.1120 0.7555 0.8692
No log 0.1013 102 0.9372 0.0297 0.9372 0.9681
No log 0.1033 104 1.0229 0.1536 1.0229 1.0114
No log 0.1053 106 0.9530 0.0815 0.9530 0.9762
No log 0.1072 108 0.9881 0.1216 0.9881 0.9940
No log 0.1092 110 0.9763 0.0163 0.9763 0.9881
No log 0.1112 112 0.9356 0.0089 0.9356 0.9672
No log 0.1132 114 0.8257 0.0280 0.8257 0.9087
No log 0.1152 116 0.8023 0.0296 0.8023 0.8957
No log 0.1172 118 0.8273 0.0164 0.8273 0.9095
No log 0.1192 120 0.8334 0.0164 0.8334 0.9129
No log 0.1212 122 0.7961 0.0576 0.7961 0.8923
No log 0.1231 124 0.7236 0.2255 0.7236 0.8507
No log 0.1251 126 0.8626 0.2398 0.8626 0.9287
No log 0.1271 128 0.7741 0.2557 0.7741 0.8798
No log 0.1291 130 0.7728 0.0946 0.7728 0.8791
No log 0.1311 132 0.9004 0.1897 0.9004 0.9489
No log 0.1331 134 0.7939 0.1358 0.7939 0.8910
No log 0.1351 136 0.6970 0.2546 0.6970 0.8349
No log 0.1370 138 0.6933 0.2566 0.6933 0.8326
No log 0.1390 140 0.7639 0.1891 0.7639 0.8740
No log 0.1410 142 0.7532 0.2058 0.7532 0.8679
No log 0.1430 144 0.8256 0.2619 0.8256 0.9086
No log 0.1450 146 1.0091 0.2972 1.0091 1.0046
No log 0.1470 148 0.8792 0.2927 0.8792 0.9377
No log 0.1490 150 0.6409 0.2950 0.6409 0.8005
No log 0.1509 152 0.6285 0.3240 0.6285 0.7928
No log 0.1529 154 0.6702 0.3649 0.6702 0.8187
No log 0.1549 156 0.8030 0.3118 0.8030 0.8961
No log 0.1569 158 0.7446 0.3493 0.7446 0.8629
No log 0.1589 160 0.6212 0.3933 0.6212 0.7881
No log 0.1609 162 0.6673 0.3439 0.6673 0.8169
No log 0.1629 164 0.6354 0.3642 0.6354 0.7971
No log 0.1648 166 0.7069 0.4087 0.7069 0.8408
No log 0.1668 168 0.7366 0.3949 0.7366 0.8583
No log 0.1688 170 0.6462 0.4324 0.6462 0.8039
No log 0.1708 172 0.5915 0.4213 0.5915 0.7691
No log 0.1728 174 0.6193 0.4215 0.6193 0.7870
No log 0.1748 176 0.6264 0.4597 0.6264 0.7915
No log 0.1768 178 0.8578 0.3514 0.8578 0.9262
No log 0.1787 180 0.8505 0.3526 0.8505 0.9222
No log 0.1807 182 0.6709 0.4304 0.6709 0.8191
No log 0.1827 184 0.6349 0.4467 0.6349 0.7968
No log 0.1847 186 0.7473 0.3882 0.7473 0.8645
No log 0.1867 188 0.8836 0.3485 0.8836 0.9400
No log 0.1887 190 0.8914 0.3484 0.8914 0.9441
No log 0.1907 192 0.7523 0.3985 0.7523 0.8673
No log 0.1927 194 0.6535 0.4282 0.6535 0.8084
No log 0.1946 196 0.5826 0.4481 0.5826 0.7633
No log 0.1966 198 0.5822 0.4763 0.5822 0.7630
No log 0.1986 200 0.6198 0.4533 0.6198 0.7872
No log 0.2006 202 0.7635 0.4209 0.7635 0.8738
No log 0.2026 204 0.7246 0.4261 0.7246 0.8512
No log 0.2046 206 0.5498 0.5072 0.5498 0.7415
No log 0.2066 208 0.5541 0.4848 0.5541 0.7444
No log 0.2085 210 0.5437 0.5153 0.5437 0.7373
No log 0.2105 212 0.6710 0.4757 0.6710 0.8191
No log 0.2125 214 0.7285 0.4625 0.7285 0.8535
No log 0.2145 216 0.6743 0.4762 0.6743 0.8212
No log 0.2165 218 0.5992 0.4926 0.5992 0.7741
No log 0.2185 220 0.7095 0.4422 0.7095 0.8423
No log 0.2205 222 0.7947 0.3867 0.7947 0.8915
No log 0.2224 224 0.5860 0.4868 0.5860 0.7655
No log 0.2244 226 0.5441 0.5228 0.5441 0.7376
No log 0.2264 228 0.6245 0.4612 0.6245 0.7903
No log 0.2284 230 0.6718 0.4620 0.6718 0.8197
No log 0.2304 232 0.7027 0.4224 0.7027 0.8383
No log 0.2324 234 0.5607 0.4715 0.5607 0.7488
No log 0.2344 236 0.5618 0.4801 0.5618 0.7495
No log 0.2363 238 0.6410 0.4641 0.6410 0.8006
No log 0.2383 240 0.7248 0.3803 0.7248 0.8513
No log 0.2403 242 0.8617 0.3559 0.8617 0.9283
No log 0.2423 244 0.7279 0.3990 0.7279 0.8532
No log 0.2443 246 0.6123 0.3973 0.6123 0.7825
No log 0.2463 248 0.6779 0.3438 0.6779 0.8234
No log 0.2483 250 0.6332 0.3669 0.6332 0.7957
No log 0.2502 252 0.6308 0.4346 0.6308 0.7942
No log 0.2522 254 0.6348 0.4637 0.6348 0.7968
No log 0.2542 256 0.5830 0.4491 0.5830 0.7636
No log 0.2562 258 0.5863 0.4845 0.5863 0.7657
No log 0.2582 260 0.6570 0.5055 0.6570 0.8105
No log 0.2602 262 0.7695 0.4303 0.7695 0.8772
No log 0.2622 264 0.7063 0.4625 0.7063 0.8404
No log 0.2642 266 0.5725 0.5164 0.5725 0.7567
No log 0.2661 268 0.5644 0.5004 0.5644 0.7513
No log 0.2681 270 0.6126 0.4590 0.6126 0.7827
No log 0.2701 272 0.6015 0.4822 0.6015 0.7755
No log 0.2721 274 0.5716 0.4692 0.5716 0.7560
No log 0.2741 276 0.5983 0.4245 0.5983 0.7735
No log 0.2761 278 0.5770 0.4267 0.5770 0.7596
No log 0.2781 280 0.5929 0.4780 0.5929 0.7700
No log 0.2800 282 0.6408 0.4794 0.6408 0.8005
No log 0.2820 284 0.6149 0.4912 0.6149 0.7842
No log 0.2840 286 0.6653 0.4831 0.6653 0.8157
No log 0.2860 288 0.6121 0.5031 0.6121 0.7823
No log 0.2880 290 0.7064 0.4622 0.7064 0.8405
No log 0.2900 292 0.7420 0.4545 0.7420 0.8614
No log 0.2920 294 0.6447 0.5107 0.6447 0.8030
No log 0.2939 296 0.5766 0.5481 0.5766 0.7594
No log 0.2959 298 0.5689 0.5403 0.5689 0.7542
No log 0.2979 300 0.6820 0.4781 0.6820 0.8258
No log 0.2999 302 0.6219 0.5061 0.6219 0.7886
No log 0.3019 304 0.5379 0.5164 0.5379 0.7334
No log 0.3039 306 0.5403 0.5010 0.5403 0.7350
No log 0.3059 308 0.6005 0.4918 0.6005 0.7749
No log 0.3078 310 0.7733 0.4161 0.7733 0.8794
No log 0.3098 312 0.6863 0.4507 0.6863 0.8284
No log 0.3118 314 0.5466 0.4909 0.5466 0.7393
No log 0.3138 316 0.5447 0.4869 0.5447 0.7381
No log 0.3158 318 0.6360 0.4646 0.6360 0.7975
No log 0.3178 320 0.7466 0.4310 0.7466 0.8641
No log 0.3198 322 0.7539 0.4182 0.7539 0.8683
No log 0.3217 324 0.8132 0.4204 0.8132 0.9018
No log 0.3237 326 0.6399 0.4816 0.6399 0.7999
No log 0.3257 328 0.6194 0.5013 0.6194 0.7870
No log 0.3277 330 0.7706 0.4354 0.7706 0.8779
No log 0.3297 332 0.7874 0.4351 0.7874 0.8874
No log 0.3317 334 0.6584 0.4751 0.6584 0.8114
No log 0.3337 336 0.6470 0.4841 0.6470 0.8044
No log 0.3357 338 0.8473 0.4150 0.8473 0.9205
No log 0.3376 340 0.9082 0.3854 0.9082 0.9530
No log 0.3396 342 0.7503 0.4297 0.7503 0.8662
No log 0.3416 344 0.8584 0.3993 0.8584 0.9265
No log 0.3436 346 0.7809 0.4243 0.7809 0.8837
No log 0.3456 348 0.5610 0.5243 0.5610 0.7490
No log 0.3476 350 0.5532 0.5435 0.5532 0.7438
No log 0.3496 352 0.6018 0.5127 0.6018 0.7757
No log 0.3515 354 0.8079 0.4430 0.8079 0.8988
No log 0.3535 356 0.7563 0.4499 0.7563 0.8697
No log 0.3555 358 0.5841 0.5035 0.5841 0.7643
No log 0.3575 360 0.5414 0.5676 0.5414 0.7358
No log 0.3595 362 0.5351 0.5620 0.5351 0.7315
No log 0.3615 364 0.5646 0.5453 0.5646 0.7514
No log 0.3635 366 0.5716 0.5446 0.5716 0.7560
No log 0.3654 368 0.6107 0.5194 0.6107 0.7815
No log 0.3674 370 0.6072 0.5248 0.6072 0.7792
No log 0.3694 372 0.5728 0.5181 0.5728 0.7569
No log 0.3714 374 0.6223 0.5149 0.6223 0.7889
No log 0.3734 376 0.6318 0.5045 0.6318 0.7948
No log 0.3754 378 0.5854 0.5012 0.5854 0.7651
No log 0.3774 380 0.6440 0.4982 0.6440 0.8025
No log 0.3793 382 0.7272 0.4373 0.7272 0.8528
No log 0.3813 384 0.8188 0.4072 0.8188 0.9049
No log 0.3833 386 0.6251 0.4875 0.6251 0.7906
No log 0.3853 388 0.6376 0.4324 0.6376 0.7985
No log 0.3873 390 0.6793 0.4208 0.6793 0.8242
No log 0.3893 392 0.6076 0.4585 0.6076 0.7795
No log 0.3913 394 0.7265 0.4442 0.7265 0.8523
No log 0.3932 396 0.7037 0.4379 0.7037 0.8389
No log 0.3952 398 0.6247 0.4347 0.6247 0.7904
No log 0.3972 400 0.6572 0.3708 0.6572 0.8107
No log 0.3992 402 0.6208 0.3792 0.6208 0.7879
No log 0.4012 404 0.6294 0.4114 0.6294 0.7934
No log 0.4032 406 0.7127 0.4002 0.7127 0.8442
No log 0.4052 408 0.7978 0.4070 0.7978 0.8932
No log 0.4071 410 0.8891 0.3645 0.8891 0.9429
No log 0.4091 412 0.9278 0.3493 0.9278 0.9632
No log 0.4111 414 0.9305 0.3594 0.9305 0.9646
No log 0.4131 416 1.2339 0.2962 1.2339 1.1108
No log 0.4151 418 1.1565 0.3167 1.1565 1.0754
No log 0.4171 420 0.8596 0.3881 0.8596 0.9272
No log 0.4191 422 0.8051 0.3963 0.8051 0.8973
No log 0.4211 424 0.7938 0.4108 0.7938 0.8910
No log 0.4230 426 0.8390 0.4019 0.8390 0.9160
No log 0.4250 428 0.6362 0.5174 0.6362 0.7976
No log 0.4270 430 0.5771 0.5244 0.5771 0.7597
No log 0.4290 432 0.5983 0.5304 0.5983 0.7735
No log 0.4310 434 0.7338 0.4628 0.7338 0.8566
No log 0.4330 436 0.6473 0.5168 0.6473 0.8045
No log 0.4350 438 0.6160 0.5112 0.6160 0.7848
No log 0.4369 440 0.9833 0.4090 0.9833 0.9916
No log 0.4389 442 0.9474 0.4146 0.9474 0.9733
No log 0.4409 444 0.6374 0.4908 0.6374 0.7984
No log 0.4429 446 0.6867 0.4887 0.6867 0.8286
No log 0.4449 448 0.8693 0.4417 0.8693 0.9324
No log 0.4469 450 0.7725 0.4748 0.7725 0.8789
No log 0.4489 452 0.5889 0.5110 0.5889 0.7674
No log 0.4508 454 0.6323 0.4795 0.6323 0.7952
No log 0.4528 456 0.7201 0.4325 0.7201 0.8486
No log 0.4548 458 0.6368 0.4816 0.6368 0.7980
No log 0.4568 460 0.5745 0.5349 0.5745 0.7580
No log 0.4588 462 0.6213 0.4867 0.6213 0.7882
No log 0.4608 464 0.5859 0.5131 0.5859 0.7655
No log 0.4628 466 0.5660 0.5211 0.5660 0.7523
No log 0.4647 468 0.5686 0.5035 0.5686 0.7541
No log 0.4667 470 0.5881 0.5169 0.5881 0.7668
No log 0.4687 472 0.5820 0.5364 0.5820 0.7629
No log 0.4707 474 0.5607 0.5641 0.5607 0.7488
No log 0.4727 476 0.5726 0.5359 0.5726 0.7567
No log 0.4747 478 0.6279 0.5058 0.6279 0.7924
No log 0.4767 480 0.7919 0.4596 0.7919 0.8899
No log 0.4786 482 0.8001 0.4605 0.8001 0.8945
No log 0.4806 484 0.6043 0.5401 0.6043 0.7774
No log 0.4826 486 0.5651 0.5524 0.5651 0.7517
No log 0.4846 488 0.5717 0.5623 0.5717 0.7561
No log 0.4866 490 0.7213 0.4822 0.7213 0.8493
No log 0.4886 492 1.1569 0.3919 1.1569 1.0756
No log 0.4906 494 1.1533 0.3937 1.1533 1.0739
No log 0.4926 496 0.7984 0.4827 0.7984 0.8936
No log 0.4945 498 0.5415 0.5748 0.5415 0.7359
0.9981 0.4965 500 0.5343 0.5240 0.5343 0.7310
0.9981 0.4985 502 0.5408 0.5231 0.5408 0.7354
0.9981 0.5005 504 0.7343 0.4676 0.7343 0.8569
0.9981 0.5025 506 0.9090 0.4405 0.9090 0.9534
0.9981 0.5045 508 0.8055 0.4367 0.8055 0.8975
0.9981 0.5065 510 0.6026 0.4742 0.6026 0.7763

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 109M parameters (Safetensors, F32)