
ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold1

This model is a fine-tuned version of bert-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set (a short metric-computation sketch follows the list):

  • Loss: 1.0233
  • Qwk: 0.4858
  • Mse: 1.0233
  • Rmse: 1.0116
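
The Qwk, Mse, and Rmse values above can be reproduced with standard tooling. The sketch below is a minimal example using hypothetical scores, not data from this card.

```python
# Minimal sketch of the evaluation metrics reported above.
# y_true / y_pred are hypothetical integer organization scores, not data from this card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 4, 3, 2])   # hypothetical gold scores
y_pred = np.array([2, 4, 4, 3, 3])   # hypothetical (rounded) model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Quadratic Weighted Kappa
mse = mean_squared_error(y_true, y_pred)                      # Mean Squared Error
rmse = np.sqrt(mse)                                           # Root Mean Squared Error
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```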

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged Trainer setup sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
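
The sketch below mirrors the hyperparameters above in a Hugging Face Trainer configuration. The single-output regression head, tokenizer choice, and datasets are assumptions, since the card does not describe them.

```python
# Hedged Trainer setup mirroring the hyperparameters above.
# The single-output head and the datasets are assumptions, not taken from this card.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=1)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

args = TrainingArguments(
    output_dir="ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold1",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)

# trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```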

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0047 2 9.3843 0.0016 9.3843 3.0634
No log 0.0093 4 7.6902 0.0 7.6902 2.7731
No log 0.0140 6 6.2906 0.0 6.2906 2.5081
No log 0.0186 8 4.9654 0.0320 4.9654 2.2283
No log 0.0233 10 3.6594 0.0 3.6594 1.9130
No log 0.0280 12 2.9328 0.0 2.9328 1.7126
No log 0.0326 14 2.1220 0.1345 2.1220 1.4567
No log 0.0373 16 1.6280 0.0482 1.6280 1.2759
No log 0.0420 18 1.2820 0.0 1.2820 1.1322
No log 0.0466 20 1.0755 0.0 1.0755 1.0371
No log 0.0513 22 0.9013 0.2768 0.9013 0.9494
No log 0.0559 24 0.8756 0.0943 0.8756 0.9357
No log 0.0606 26 0.9791 0.0286 0.9791 0.9895
No log 0.0653 28 0.9280 0.0429 0.9280 0.9633
No log 0.0699 30 0.9766 0.0286 0.9766 0.9882
No log 0.0746 32 1.3594 0.0 1.3594 1.1659
No log 0.0793 34 1.2967 0.0 1.2967 1.1387
No log 0.0839 36 1.2188 0.0 1.2188 1.1040
No log 0.0886 38 1.0623 0.0 1.0623 1.0307
No log 0.0932 40 1.1963 0.0 1.1963 1.0937
No log 0.0979 42 1.2359 0.0 1.2359 1.1117
No log 0.1026 44 1.1297 0.0 1.1297 1.0629
No log 0.1072 46 1.1432 0.0 1.1432 1.0692
No log 0.1119 48 1.1393 0.0 1.1393 1.0674
No log 0.1166 50 0.9199 0.0143 0.9199 0.9591
No log 0.1212 52 0.8848 0.0143 0.8848 0.9407
No log 0.1259 54 1.0594 0.0143 1.0594 1.0292
No log 0.1305 56 1.0446 0.0143 1.0446 1.0221
No log 0.1352 58 0.9308 0.0143 0.9308 0.9648
No log 0.1399 60 0.9824 0.0143 0.9824 0.9911
No log 0.1445 62 1.1525 0.0 1.1525 1.0736
No log 0.1492 64 1.1400 0.0 1.1400 1.0677
No log 0.1538 66 0.9839 0.0073 0.9839 0.9919
No log 0.1585 68 0.9852 0.0036 0.9852 0.9926
No log 0.1632 70 1.1657 0.0 1.1657 1.0797
No log 0.1678 72 1.2796 0.0 1.2796 1.1312
No log 0.1725 74 1.2845 0.0 1.2845 1.1333
No log 0.1772 76 1.0578 0.0143 1.0578 1.0285
No log 0.1818 78 0.8877 0.0179 0.8877 0.9422
No log 0.1865 80 0.8662 0.0358 0.8662 0.9307
No log 0.1911 82 0.9209 0.0143 0.9209 0.9596
No log 0.1958 84 1.1707 0.0 1.1707 1.0820
No log 0.2005 86 1.3210 0.0262 1.3210 1.1494
No log 0.2051 88 1.4341 0.2121 1.4341 1.1975
No log 0.2098 90 1.3311 0.0070 1.3311 1.1537
No log 0.2145 92 1.1069 0.0 1.1069 1.0521
No log 0.2191 94 0.9273 0.0 0.9273 0.9630
No log 0.2238 96 0.8827 0.0323 0.8827 0.9395
No log 0.2284 98 0.8738 0.0252 0.8738 0.9348
No log 0.2331 100 0.9496 0.0 0.9496 0.9745
No log 0.2378 102 1.0099 0.0 1.0099 1.0050
No log 0.2424 104 1.0647 0.0 1.0647 1.0319
No log 0.2471 106 0.9915 0.0 0.9915 0.9958
No log 0.2517 108 1.0097 0.0382 1.0097 1.0048
No log 0.2564 110 1.0028 0.1248 1.0028 1.0014
No log 0.2611 112 0.9278 0.1526 0.9278 0.9632
No log 0.2657 114 0.9710 0.3092 0.9710 0.9854
No log 0.2704 116 0.8717 0.2976 0.8717 0.9336
No log 0.2751 118 0.7836 0.2718 0.7836 0.8852
No log 0.2797 120 0.7954 0.3127 0.7954 0.8918
No log 0.2844 122 0.9252 0.3546 0.9252 0.9619
No log 0.2890 124 0.8700 0.3571 0.8700 0.9327
No log 0.2937 126 0.7403 0.3369 0.7403 0.8604
No log 0.2984 128 0.7369 0.3058 0.7369 0.8584
No log 0.3030 130 0.7438 0.2744 0.7438 0.8625
No log 0.3077 132 0.7503 0.2813 0.7503 0.8662
No log 0.3124 134 0.6996 0.3109 0.6996 0.8364
No log 0.3170 136 0.6582 0.3700 0.6582 0.8113
No log 0.3217 138 0.6768 0.3613 0.6768 0.8227
No log 0.3263 140 0.6670 0.3977 0.6670 0.8167
No log 0.3310 142 0.6407 0.4164 0.6407 0.8004
No log 0.3357 144 0.6279 0.4588 0.6279 0.7924
No log 0.3403 146 0.6372 0.4161 0.6372 0.7983
No log 0.3450 148 0.6649 0.4023 0.6649 0.8154
No log 0.3497 150 0.6939 0.3856 0.6939 0.8330
No log 0.3543 152 0.6779 0.3844 0.6779 0.8234
No log 0.3590 154 0.8422 0.3675 0.8422 0.9177
No log 0.3636 156 0.8417 0.3777 0.8417 0.9174
No log 0.3683 158 0.7826 0.3775 0.7826 0.8846
No log 0.3730 160 0.6487 0.3777 0.6487 0.8054
No log 0.3776 162 0.5992 0.4299 0.5992 0.7740
No log 0.3823 164 0.6050 0.4694 0.6050 0.7778
No log 0.3869 166 0.6142 0.4970 0.6142 0.7837
No log 0.3916 168 0.5750 0.4569 0.5750 0.7583
No log 0.3963 170 0.6277 0.5037 0.6277 0.7923
No log 0.4009 172 0.7736 0.4456 0.7736 0.8795
No log 0.4056 174 0.9673 0.4133 0.9673 0.9835
No log 0.4103 176 0.8049 0.4295 0.8049 0.8972
No log 0.4149 178 0.7364 0.4152 0.7364 0.8581
No log 0.4196 180 0.7654 0.4253 0.7654 0.8749
No log 0.4242 182 0.9275 0.4106 0.9275 0.9631
No log 0.4289 184 0.8539 0.4148 0.8539 0.9241
No log 0.4336 186 0.6760 0.4512 0.6760 0.8222
No log 0.4382 188 0.6397 0.4239 0.6397 0.7998
No log 0.4429 190 0.7127 0.4156 0.7127 0.8442
No log 0.4476 192 0.6885 0.4279 0.6885 0.8298
No log 0.4522 194 0.6358 0.4176 0.6358 0.7974
No log 0.4569 196 0.6393 0.4404 0.6393 0.7996
No log 0.4615 198 0.6466 0.4403 0.6466 0.8041
No log 0.4662 200 0.7371 0.4441 0.7371 0.8585
No log 0.4709 202 0.6930 0.4592 0.6930 0.8325
No log 0.4755 204 0.6651 0.4764 0.6651 0.8156
No log 0.4802 206 0.6216 0.4874 0.6216 0.7884
No log 0.4848 208 0.6288 0.4498 0.6288 0.7930
No log 0.4895 210 0.7087 0.3337 0.7087 0.8418
No log 0.4942 212 0.6133 0.4225 0.6133 0.7831
No log 0.4988 214 0.5628 0.5285 0.5628 0.7502
No log 0.5035 216 0.5629 0.5312 0.5629 0.7503
No log 0.5082 218 0.5692 0.5199 0.5692 0.7545
No log 0.5128 220 0.5704 0.5139 0.5704 0.7553
No log 0.5175 222 0.6406 0.4129 0.6406 0.8003
No log 0.5221 224 0.6120 0.4437 0.6120 0.7823
No log 0.5268 226 0.5876 0.5391 0.5876 0.7666
No log 0.5315 228 0.7083 0.5018 0.7083 0.8416
No log 0.5361 230 0.5759 0.5407 0.5759 0.7589
No log 0.5408 232 0.5650 0.4865 0.5650 0.7517
No log 0.5455 234 0.5471 0.5400 0.5471 0.7397
No log 0.5501 236 0.6479 0.4973 0.6479 0.8049
No log 0.5548 238 0.6382 0.5071 0.6382 0.7989
No log 0.5594 240 0.6395 0.5236 0.6395 0.7997
No log 0.5641 242 0.5608 0.5356 0.5608 0.7489
No log 0.5688 244 0.6164 0.5473 0.6164 0.7851
No log 0.5734 246 0.5722 0.5658 0.5722 0.7564
No log 0.5781 248 0.5177 0.5355 0.5177 0.7195
No log 0.5828 250 0.5329 0.5189 0.5329 0.7300
No log 0.5874 252 0.5397 0.5682 0.5397 0.7347
No log 0.5921 254 0.7793 0.4886 0.7793 0.8828
No log 0.5967 256 0.7562 0.4960 0.7562 0.8696
No log 0.6014 258 0.5510 0.5218 0.5510 0.7423
No log 0.6061 260 0.5833 0.4736 0.5833 0.7637
No log 0.6107 262 0.5789 0.4672 0.5789 0.7608
No log 0.6154 264 0.5632 0.5213 0.5632 0.7505
No log 0.6200 266 0.8972 0.4317 0.8972 0.9472
No log 0.6247 268 1.2352 0.3382 1.2352 1.1114
No log 0.6294 270 1.1639 0.3660 1.1639 1.0788
No log 0.6340 272 0.8008 0.4440 0.8008 0.8949
No log 0.6387 274 0.5685 0.5206 0.5685 0.7540
No log 0.6434 276 0.5656 0.5081 0.5656 0.7521
No log 0.6480 278 0.5751 0.5230 0.5751 0.7584
No log 0.6527 280 0.6808 0.4899 0.6808 0.8251
No log 0.6573 282 0.8706 0.4267 0.8706 0.9330
No log 0.6620 284 0.7656 0.4588 0.7656 0.8750
No log 0.6667 286 0.5443 0.5958 0.5443 0.7378
No log 0.6713 288 0.5226 0.5962 0.5226 0.7229
No log 0.6760 290 0.5095 0.6071 0.5095 0.7138
No log 0.6807 292 0.5768 0.5801 0.5768 0.7595
No log 0.6853 294 0.5851 0.5650 0.5851 0.7649
No log 0.6900 296 0.5128 0.6047 0.5128 0.7161
No log 0.6946 298 0.5089 0.5803 0.5089 0.7134
No log 0.6993 300 0.5085 0.5734 0.5085 0.7131
No log 0.7040 302 0.5283 0.5689 0.5283 0.7269
No log 0.7086 304 0.7961 0.4627 0.7961 0.8922
No log 0.7133 306 0.9707 0.4049 0.9707 0.9853
No log 0.7179 308 0.7990 0.4473 0.7990 0.8939
No log 0.7226 310 0.6139 0.4528 0.6139 0.7835
No log 0.7273 312 0.5635 0.5261 0.5635 0.7507
No log 0.7319 314 0.5683 0.5282 0.5683 0.7539
No log 0.7366 316 0.5644 0.5540 0.5644 0.7513
No log 0.7413 318 0.5096 0.5971 0.5096 0.7138
No log 0.7459 320 0.5512 0.5775 0.5512 0.7424
No log 0.7506 322 0.5975 0.5582 0.5975 0.7730
No log 0.7552 324 0.6028 0.5605 0.6028 0.7764
No log 0.7599 326 0.6276 0.5439 0.6276 0.7922
No log 0.7646 328 0.6967 0.5367 0.6967 0.8347
No log 0.7692 330 0.7977 0.5171 0.7977 0.8931
No log 0.7739 332 0.9367 0.4793 0.9367 0.9678
No log 0.7786 334 0.6564 0.5722 0.6564 0.8102
No log 0.7832 336 0.5104 0.6285 0.5104 0.7144
No log 0.7879 338 0.5155 0.6268 0.5155 0.7180
No log 0.7925 340 0.5791 0.6236 0.5791 0.7610
No log 0.7972 342 0.6672 0.5843 0.6672 0.8168
No log 0.8019 344 0.6547 0.5891 0.6547 0.8092
No log 0.8065 346 0.5056 0.6337 0.5056 0.7110
No log 0.8112 348 0.4891 0.6295 0.4891 0.6994
No log 0.8159 350 0.4856 0.6249 0.4856 0.6968
No log 0.8205 352 0.4916 0.6220 0.4916 0.7011
No log 0.8252 354 0.4930 0.6282 0.4930 0.7021
No log 0.8298 356 0.5044 0.6097 0.5044 0.7102
No log 0.8345 358 0.5103 0.6353 0.5103 0.7144
No log 0.8392 360 0.5378 0.6343 0.5378 0.7333
No log 0.8438 362 0.5636 0.6194 0.5636 0.7507
No log 0.8485 364 0.7472 0.5763 0.7472 0.8644
No log 0.8531 366 0.7252 0.5791 0.7252 0.8516
No log 0.8578 368 0.5898 0.6122 0.5898 0.7680
No log 0.8625 370 0.7453 0.5799 0.7453 0.8633
No log 0.8671 372 0.6918 0.5720 0.6918 0.8317
No log 0.8718 374 0.5452 0.6077 0.5452 0.7384
No log 0.8765 376 0.5438 0.5906 0.5438 0.7374
No log 0.8811 378 0.5310 0.5871 0.5310 0.7287
No log 0.8858 380 0.5805 0.5836 0.5805 0.7619
No log 0.8904 382 0.8195 0.5321 0.8195 0.9053
No log 0.8951 384 0.7126 0.5589 0.7126 0.8442
No log 0.8998 386 0.5401 0.6017 0.5401 0.7349
No log 0.9044 388 0.5559 0.6210 0.5559 0.7456
No log 0.9091 390 0.5439 0.5992 0.5439 0.7375
No log 0.9138 392 0.6774 0.5525 0.6774 0.8230
No log 0.9184 394 0.7939 0.4982 0.7939 0.8910
No log 0.9231 396 0.7069 0.5294 0.7069 0.8408
No log 0.9277 398 0.5890 0.5907 0.5890 0.7675
No log 0.9324 400 0.5975 0.5906 0.5975 0.7730
No log 0.9371 402 0.6703 0.5743 0.6703 0.8187
No log 0.9417 404 0.8185 0.5545 0.8185 0.9047
No log 0.9464 406 0.6894 0.5771 0.6894 0.8303
No log 0.9510 408 0.6050 0.6040 0.6050 0.7778
No log 0.9557 410 0.5090 0.6361 0.5090 0.7134
No log 0.9604 412 0.5219 0.6351 0.5219 0.7224
No log 0.9650 414 0.5180 0.6548 0.5180 0.7197
No log 0.9697 416 0.5676 0.6551 0.5676 0.7534
No log 0.9744 418 0.5886 0.6483 0.5886 0.7672
No log 0.9790 420 0.5261 0.6469 0.5261 0.7253
No log 0.9837 422 0.5870 0.6288 0.5870 0.7662
No log 0.9883 424 0.7002 0.5809 0.7002 0.8368
No log 0.9930 426 0.5667 0.6122 0.5667 0.7528
No log 0.9977 428 0.5623 0.6555 0.5623 0.7499
No log 1.0023 430 0.7268 0.5898 0.7268 0.8525
No log 1.0070 432 0.6592 0.5963 0.6592 0.8119
No log 1.0117 434 0.6531 0.5806 0.6531 0.8081
No log 1.0163 436 0.6722 0.5720 0.6722 0.8199
No log 1.0210 438 0.6533 0.5393 0.6533 0.8083
No log 1.0256 440 0.6333 0.5669 0.6333 0.7958
No log 1.0303 442 0.7562 0.5593 0.7562 0.8696
No log 1.0350 444 0.7881 0.5553 0.7881 0.8877
No log 1.0396 446 0.5655 0.6375 0.5655 0.7520
No log 1.0443 448 0.5010 0.6453 0.5010 0.7078
No log 1.0490 450 0.5040 0.6472 0.5040 0.7099
No log 1.0536 452 0.6296 0.6041 0.6296 0.7935
No log 1.0583 454 0.7043 0.5849 0.7043 0.8392
No log 1.0629 456 0.5669 0.6263 0.5669 0.7529
No log 1.0676 458 0.4899 0.6481 0.4899 0.7000
No log 1.0723 460 0.5168 0.6089 0.5168 0.7189
No log 1.0769 462 0.5086 0.6155 0.5086 0.7132
No log 1.0816 464 0.5141 0.6416 0.5141 0.7170
No log 1.0862 466 0.6181 0.6143 0.6181 0.7862
No log 1.0909 468 0.6788 0.6171 0.6788 0.8239
No log 1.0956 470 0.5832 0.6188 0.5832 0.7637
No log 1.1002 472 0.5803 0.6331 0.5803 0.7618
No log 1.1049 474 0.5935 0.6359 0.5935 0.7704
No log 1.1096 476 0.5957 0.6198 0.5957 0.7718
No log 1.1142 478 0.5229 0.6492 0.5229 0.7231
No log 1.1189 480 0.5061 0.6446 0.5061 0.7114
No log 1.1235 482 0.5001 0.6397 0.5001 0.7071
No log 1.1282 484 0.5478 0.6432 0.5478 0.7402
No log 1.1329 486 0.5905 0.6402 0.5905 0.7684
No log 1.1375 488 0.5022 0.6697 0.5022 0.7086
No log 1.1422 490 0.5098 0.6487 0.5098 0.7140
No log 1.1469 492 0.4983 0.6456 0.4983 0.7059
No log 1.1515 494 0.4993 0.6632 0.4993 0.7066
No log 1.1562 496 0.5403 0.6554 0.5403 0.7350
No log 1.1608 498 0.5965 0.6392 0.5965 0.7723
0.8772 1.1655 500 0.5477 0.6326 0.5477 0.7401
0.8772 1.1702 502 0.6351 0.6098 0.6351 0.7969
0.8772 1.1748 504 0.5885 0.6078 0.5885 0.7672
0.8772 1.1795 506 0.7234 0.5930 0.7234 0.8506
0.8772 1.1841 508 1.1260 0.4874 1.1260 1.0611
0.8772 1.1888 510 1.0233 0.4858 1.0233 1.0116

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
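
A small sketch for checking that a local environment matches the versions listed above:

```python
# Quick check that the local environment matches the framework versions listed above.
import datasets, tokenizers, torch, transformers

print("Transformers:", transformers.__version__)   # expected 4.44.2
print("PyTorch:", torch.__version__)               # expected 2.4.0+cu118
print("Datasets:", datasets.__version__)           # expected 2.21.0
print("Tokenizers:", tokenizers.__version__)       # expected 0.19.1
```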