ASAP_FineTuningBERT_AugV4_k3_task1_organization_fold4

This model is a fine-tuned version of bert-base-uncased on an unspecified dataset (recorded as "None" in the auto-generated training metadata). It achieves the following results on the evaluation set:

  • Loss: 0.5183
  • Qwk (quadratic weighted kappa): 0.6191
  • Mse (mean squared error): 0.5183
  • Rmse (root mean squared error): 0.7200
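
These numbers can be recomputed with scikit-learn; the sketch below is a minimal, hedged example, and rounding continuous predictions to integer score labels before computing Qwk is an assumption, since the card does not document the post-processing.

```python
# Hedged sketch for reproducing the reported metrics with scikit-learn.
# `labels` and `preds` hold placeholder values; substitute the real
# evaluation-set gold scores and model outputs.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

labels = np.array([2.0, 3.0, 4.0, 3.0])  # gold scores (placeholder)
preds = np.array([2.3, 2.8, 3.6, 3.1])   # model outputs (placeholder)

mse = mean_squared_error(labels, preds)   # "Mse"
rmse = float(np.sqrt(mse))                # "Rmse"
# Qwk is defined on discrete labels; rounding continuous outputs to integers
# is an assumption, as the exact post-processing is not documented here.
qwk = cohen_kappa_score(labels.astype(int), np.rint(preds).astype(int),
                        weights="quadratic")
print(f"Mse={mse:.4f}  Rmse={rmse:.4f}  Qwk={qwk:.4f}")
```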

Model description

More information needed

Intended uses & limitations

More information needed
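
Absent further documentation, a minimal usage sketch follows. It assumes the checkpoint exposes a single-output (regression-style) sequence-classification head, which is inferred from the Mse/Rmse metrics above rather than stated in the card.

```python
# Hedged usage sketch: loading the checkpoint from the Hub and scoring one
# essay. The single-output regression head (one logit) is an assumption.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ASAP_FineTuningBERT_AugV4_k3_task1_organization_fold4"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("An example essay response.", return_tensors="pt",
                   truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits
score = logits.squeeze().item()  # predicted organization score (assumed scale)
print(score)
```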

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
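
Expressed as Hugging Face TrainingArguments, the list above corresponds roughly to the sketch below; the dataset pipeline, model head, and evaluation/save strategy used for the run are not documented in this card and are omitted.

```python
# Rough, hedged reconstruction of the training configuration listed above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="ASAP_FineTuningBERT_AugV4_k3_task1_organization_fold4",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
# A Trainer built on bert-base-uncased would consume these arguments together
# with the (undocumented) train/eval datasets and a metric function for Qwk.
print(args.learning_rate, args.num_train_epochs)
```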

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0064 2 10.6976 0.0151 10.6976 3.2707
No log 0.0128 4 9.9961 0.0018 9.9961 3.1617
No log 0.0192 6 9.0365 0.0018 9.0365 3.0061
No log 0.0256 8 7.3828 0.0018 7.3828 2.7171
No log 0.0319 10 5.9928 0.0102 5.9928 2.4480
No log 0.0383 12 4.9253 0.0180 4.9253 2.2193
No log 0.0447 14 3.8175 0.0040 3.8175 1.9538
No log 0.0511 16 3.0563 0.0040 3.0563 1.7482
No log 0.0575 18 2.0004 0.1124 2.0004 1.4144
No log 0.0639 20 1.4535 0.0420 1.4535 1.2056
No log 0.0703 22 1.1248 0.0420 1.1248 1.0605
No log 0.0767 24 0.9114 0.0833 0.9114 0.9547
No log 0.0831 26 0.8260 0.1231 0.8260 0.9089
No log 0.0895 28 0.8795 0.0605 0.8795 0.9378
No log 0.0958 30 0.9493 0.0605 0.9493 0.9743
No log 0.1022 32 0.9037 0.0605 0.9037 0.9506
No log 0.1086 34 0.9315 0.0605 0.9315 0.9651
No log 0.1150 36 0.7498 0.1357 0.7498 0.8659
No log 0.1214 38 0.7274 0.2361 0.7274 0.8529
No log 0.1278 40 0.7870 0.1037 0.7870 0.8871
No log 0.1342 42 0.7136 0.2055 0.7136 0.8447
No log 0.1406 44 0.7068 0.2591 0.7068 0.8407
No log 0.1470 46 0.8590 0.2203 0.8590 0.9268
No log 0.1534 48 0.7828 0.2050 0.7828 0.8848
No log 0.1597 50 0.6309 0.3632 0.6309 0.7943
No log 0.1661 52 0.6086 0.4318 0.6086 0.7801
No log 0.1725 54 0.6145 0.4475 0.6145 0.7839
No log 0.1789 56 0.6889 0.3875 0.6889 0.8300
No log 0.1853 58 0.5929 0.4157 0.5929 0.7700
No log 0.1917 60 0.6896 0.4145 0.6896 0.8304
No log 0.1981 62 0.6523 0.3289 0.6523 0.8077
No log 0.2045 64 0.6290 0.4061 0.6290 0.7931
No log 0.2109 66 0.6864 0.3736 0.6864 0.8285
No log 0.2173 68 0.7471 0.3807 0.7471 0.8644
No log 0.2236 70 0.7549 0.4192 0.7549 0.8688
No log 0.2300 72 0.7381 0.5186 0.7381 0.8591
No log 0.2364 74 0.6931 0.4139 0.6931 0.8325
No log 0.2428 76 0.7369 0.3463 0.7369 0.8584
No log 0.2492 78 0.8489 0.3165 0.8489 0.9214
No log 0.2556 80 0.7489 0.4047 0.7489 0.8654
No log 0.2620 82 0.6745 0.4112 0.6745 0.8213
No log 0.2684 84 0.7131 0.5075 0.7131 0.8444
No log 0.2748 86 0.7908 0.4804 0.7908 0.8893
No log 0.2812 88 1.0346 0.4077 1.0346 1.0172
No log 0.2875 90 0.8698 0.4421 0.8698 0.9326
No log 0.2939 92 0.6438 0.5365 0.6438 0.8024
No log 0.3003 94 0.6090 0.5378 0.6090 0.7804
No log 0.3067 96 0.6244 0.5424 0.6244 0.7902
No log 0.3131 98 0.6923 0.5570 0.6923 0.8320
No log 0.3195 100 0.7009 0.5376 0.7009 0.8372
No log 0.3259 102 0.6552 0.5085 0.6552 0.8094
No log 0.3323 104 0.5452 0.4638 0.5452 0.7384
No log 0.3387 106 0.5407 0.4658 0.5407 0.7353
No log 0.3450 108 0.7330 0.4800 0.7330 0.8561
No log 0.3514 110 1.1506 0.3597 1.1506 1.0727
No log 0.3578 112 1.0139 0.3951 1.0139 1.0069
No log 0.3642 114 0.5955 0.4726 0.5955 0.7717
No log 0.3706 116 0.5616 0.4614 0.5616 0.7494
No log 0.3770 118 0.5556 0.4670 0.5556 0.7454
No log 0.3834 120 0.5376 0.5675 0.5376 0.7332
No log 0.3898 122 0.7907 0.4735 0.7907 0.8892
No log 0.3962 124 0.7590 0.4692 0.7590 0.8712
No log 0.4026 126 0.5981 0.5777 0.5981 0.7734
No log 0.4089 128 0.5535 0.5461 0.5535 0.7440
No log 0.4153 130 0.6093 0.5038 0.6093 0.7806
No log 0.4217 132 0.5694 0.5472 0.5694 0.7546
No log 0.4281 134 0.6110 0.5382 0.6110 0.7817
No log 0.4345 136 0.8399 0.4528 0.8399 0.9164
No log 0.4409 138 0.9017 0.4320 0.9017 0.9496
No log 0.4473 140 0.7277 0.4878 0.7277 0.8530
No log 0.4537 142 0.6339 0.4652 0.6339 0.7962
No log 0.4601 144 0.7151 0.4801 0.7151 0.8456
No log 0.4665 146 0.9164 0.4252 0.9164 0.9573
No log 0.4728 148 0.7634 0.4757 0.7634 0.8738
No log 0.4792 150 0.7318 0.4892 0.7318 0.8555
No log 0.4856 152 0.7747 0.4820 0.7747 0.8802
No log 0.4920 154 0.9758 0.4370 0.9758 0.9878
No log 0.4984 156 0.8152 0.4758 0.8152 0.9029
No log 0.5048 158 0.5220 0.5721 0.5220 0.7225
No log 0.5112 160 0.4944 0.5654 0.4944 0.7031
No log 0.5176 162 0.5132 0.5805 0.5132 0.7164
No log 0.5240 164 0.6965 0.5424 0.6965 0.8346
No log 0.5304 166 0.6770 0.5525 0.6770 0.8228
No log 0.5367 168 0.5627 0.5600 0.5627 0.7501
No log 0.5431 170 0.5510 0.5530 0.5510 0.7423
No log 0.5495 172 0.5958 0.5517 0.5958 0.7719
No log 0.5559 174 0.8923 0.4366 0.8923 0.9446
No log 0.5623 176 1.0900 0.3945 1.0900 1.0440
No log 0.5687 178 0.9399 0.4268 0.9399 0.9695
No log 0.5751 180 0.6555 0.5037 0.6555 0.8096
No log 0.5815 182 0.5297 0.5122 0.5297 0.7278
No log 0.5879 184 0.5230 0.5539 0.5230 0.7232
No log 0.5942 186 0.7054 0.5277 0.7054 0.8399
No log 0.6006 188 0.8842 0.4678 0.8842 0.9403
No log 0.6070 190 0.7241 0.5416 0.7241 0.8510
No log 0.6134 192 0.5087 0.6113 0.5087 0.7132
No log 0.6198 194 0.5255 0.5797 0.5255 0.7249
No log 0.6262 196 0.5003 0.6131 0.5003 0.7073
No log 0.6326 198 0.5129 0.6167 0.5129 0.7162
No log 0.6390 200 0.5193 0.6127 0.5193 0.7206
No log 0.6454 202 0.5125 0.6105 0.5125 0.7159
No log 0.6518 204 0.5372 0.5561 0.5372 0.7329
No log 0.6581 206 0.5307 0.5629 0.5307 0.7285
No log 0.6645 208 0.6072 0.5420 0.6072 0.7792
No log 0.6709 210 0.6902 0.5041 0.6902 0.8308
No log 0.6773 212 0.6275 0.5060 0.6275 0.7921
No log 0.6837 214 0.6659 0.4955 0.6659 0.8160
No log 0.6901 216 0.8504 0.4816 0.8504 0.9221
No log 0.6965 218 0.7728 0.4895 0.7728 0.8791
No log 0.7029 220 0.5888 0.5305 0.5888 0.7673
No log 0.7093 222 0.5723 0.5547 0.5723 0.7565
No log 0.7157 224 0.6764 0.5201 0.6764 0.8225
No log 0.7220 226 0.6787 0.5433 0.6787 0.8239
No log 0.7284 228 0.5720 0.5775 0.5720 0.7563
No log 0.7348 230 0.5406 0.5739 0.5406 0.7352
No log 0.7412 232 0.5786 0.5507 0.5786 0.7607
No log 0.7476 234 0.5417 0.6061 0.5417 0.7360
No log 0.7540 236 0.5549 0.6198 0.5549 0.7449
No log 0.7604 238 0.5894 0.6030 0.5894 0.7677
No log 0.7668 240 0.5628 0.6096 0.5628 0.7502
No log 0.7732 242 0.5680 0.5939 0.5680 0.7536
No log 0.7796 244 0.5545 0.6026 0.5545 0.7447
No log 0.7859 246 0.6209 0.5798 0.6209 0.7880
No log 0.7923 248 0.7483 0.5264 0.7483 0.8651
No log 0.7987 250 0.7413 0.5167 0.7413 0.8610
No log 0.8051 252 0.5843 0.5632 0.5843 0.7644
No log 0.8115 254 0.5770 0.5675 0.5770 0.7596
No log 0.8179 256 0.6677 0.5504 0.6677 0.8171
No log 0.8243 258 0.6314 0.5567 0.6314 0.7946
No log 0.8307 260 0.6593 0.5544 0.6593 0.8120
No log 0.8371 262 0.6755 0.5428 0.6755 0.8219
No log 0.8435 264 0.7745 0.5121 0.7745 0.8801
No log 0.8498 266 0.7679 0.5149 0.7679 0.8763
No log 0.8562 268 0.6124 0.5478 0.6124 0.7826
No log 0.8626 270 0.5935 0.5798 0.5935 0.7704
No log 0.8690 272 0.5883 0.5813 0.5883 0.7670
No log 0.8754 274 0.5864 0.5967 0.5864 0.7657
No log 0.8818 276 0.5265 0.6343 0.5265 0.7256
No log 0.8882 278 0.5375 0.6310 0.5375 0.7331
No log 0.8946 280 0.6137 0.5894 0.6137 0.7834
No log 0.9010 282 0.5380 0.6121 0.5380 0.7335
No log 0.9073 284 0.4957 0.6259 0.4957 0.7041
No log 0.9137 286 0.4976 0.6084 0.4976 0.7054
No log 0.9201 288 0.5236 0.5967 0.5236 0.7236
No log 0.9265 290 0.5187 0.6106 0.5187 0.7202
No log 0.9329 292 0.5303 0.6008 0.5303 0.7282
No log 0.9393 294 0.5534 0.5930 0.5534 0.7439
No log 0.9457 296 0.5842 0.5795 0.5842 0.7643
No log 0.9521 298 0.5277 0.6080 0.5277 0.7264
No log 0.9585 300 0.5998 0.5844 0.5998 0.7745
No log 0.9649 302 0.5526 0.5904 0.5526 0.7434
No log 0.9712 304 0.5529 0.5457 0.5529 0.7435
No log 0.9776 306 0.5998 0.5086 0.5998 0.7745
No log 0.9840 308 0.5176 0.5762 0.5176 0.7195
No log 0.9904 310 0.5566 0.6154 0.5566 0.7460
No log 0.9968 312 0.5388 0.6095 0.5388 0.7340
No log 1.0032 314 0.5417 0.6016 0.5417 0.7360
No log 1.0096 316 0.5253 0.5878 0.5253 0.7248
No log 1.0160 318 0.5322 0.5899 0.5322 0.7296
No log 1.0224 320 0.5501 0.5963 0.5501 0.7417
No log 1.0288 322 0.6302 0.5789 0.6302 0.7939
No log 1.0351 324 0.5568 0.6101 0.5568 0.7462
No log 1.0415 326 0.5858 0.6215 0.5858 0.7654
No log 1.0479 328 0.6811 0.5482 0.6811 0.8253
No log 1.0543 330 0.5708 0.6023 0.5708 0.7555
No log 1.0607 332 0.5379 0.6203 0.5379 0.7334
No log 1.0671 334 0.5669 0.6034 0.5669 0.7529
No log 1.0735 336 0.6221 0.5690 0.6221 0.7888
No log 1.0799 338 0.5613 0.5943 0.5613 0.7492
No log 1.0863 340 0.5605 0.5720 0.5605 0.7487
No log 1.0927 342 0.5771 0.5774 0.5771 0.7597
No log 1.0990 344 0.5707 0.5775 0.5707 0.7555
No log 1.1054 346 0.5464 0.5912 0.5464 0.7392
No log 1.1118 348 0.5536 0.5879 0.5536 0.7440
No log 1.1182 350 0.7107 0.5470 0.7107 0.8430
No log 1.1246 352 0.8504 0.5279 0.8504 0.9222
No log 1.1310 354 0.6671 0.5361 0.6671 0.8167
No log 1.1374 356 0.5682 0.5996 0.5682 0.7538
No log 1.1438 358 0.5568 0.5918 0.5568 0.7462
No log 1.1502 360 0.6755 0.5351 0.6755 0.8219
No log 1.1565 362 0.7539 0.5183 0.7539 0.8683
No log 1.1629 364 0.6043 0.5598 0.6043 0.7773
No log 1.1693 366 0.5314 0.6074 0.5314 0.7289
No log 1.1757 368 0.5330 0.6132 0.5330 0.7301
No log 1.1821 370 0.6910 0.5233 0.6910 0.8312
No log 1.1885 372 0.7723 0.5260 0.7723 0.8788
No log 1.1949 374 0.6120 0.5772 0.6120 0.7823
No log 1.2013 376 0.5166 0.6242 0.5166 0.7188
No log 1.2077 378 0.5251 0.6228 0.5251 0.7246
No log 1.2141 380 0.5431 0.6145 0.5431 0.7369
No log 1.2204 382 0.4983 0.6497 0.4983 0.7059
No log 1.2268 384 0.4982 0.6413 0.4982 0.7058
No log 1.2332 386 0.5007 0.6385 0.5007 0.7076
No log 1.2396 388 0.5311 0.6088 0.5311 0.7288
No log 1.2460 390 0.5182 0.6190 0.5182 0.7199
No log 1.2524 392 0.5226 0.6232 0.5226 0.7229
No log 1.2588 394 0.5408 0.6302 0.5408 0.7354
No log 1.2652 396 0.6128 0.5768 0.6128 0.7828
No log 1.2716 398 0.7495 0.5705 0.7495 0.8657
No log 1.2780 400 0.6667 0.5763 0.6667 0.8165
No log 1.2843 402 0.5745 0.6158 0.5745 0.7579
No log 1.2907 404 0.5674 0.6219 0.5674 0.7533
No log 1.2971 406 0.5665 0.6354 0.5665 0.7527
No log 1.3035 408 0.5966 0.5975 0.5966 0.7724
No log 1.3099 410 0.5241 0.6258 0.5241 0.7240
No log 1.3163 412 0.5306 0.5973 0.5306 0.7284
No log 1.3227 414 0.5108 0.6141 0.5108 0.7147
No log 1.3291 416 0.5241 0.6077 0.5241 0.7239
No log 1.3355 418 0.6375 0.5584 0.6375 0.7984
No log 1.3419 420 0.5985 0.5589 0.5985 0.7736
No log 1.3482 422 0.5688 0.5743 0.5688 0.7542
No log 1.3546 424 0.5825 0.5786 0.5825 0.7632
No log 1.3610 426 0.5940 0.5713 0.5940 0.7707
No log 1.3674 428 0.6371 0.5538 0.6371 0.7982
No log 1.3738 430 0.5536 0.5933 0.5536 0.7440
No log 1.3802 432 0.5361 0.6228 0.5361 0.7322
No log 1.3866 434 0.5363 0.6196 0.5363 0.7324
No log 1.3930 436 0.5986 0.5910 0.5986 0.7737
No log 1.3994 438 0.6832 0.5581 0.6832 0.8265
No log 1.4058 440 0.5976 0.5918 0.5976 0.7730
No log 1.4121 442 0.5662 0.6115 0.5662 0.7525
No log 1.4185 444 0.5450 0.6189 0.5450 0.7382
No log 1.4249 446 0.5688 0.5996 0.5688 0.7542
No log 1.4313 448 0.5659 0.6006 0.5659 0.7522
No log 1.4377 450 0.5032 0.6415 0.5032 0.7094
No log 1.4441 452 0.5190 0.5966 0.5190 0.7204
No log 1.4505 454 0.5279 0.5892 0.5279 0.7266
No log 1.4569 456 0.6419 0.5718 0.6419 0.8012
No log 1.4633 458 0.7961 0.5293 0.7961 0.8922
No log 1.4696 460 0.7711 0.5256 0.7711 0.8781
No log 1.4760 462 0.6106 0.5730 0.6106 0.7814
No log 1.4824 464 0.5586 0.6024 0.5586 0.7474
No log 1.4888 466 0.5769 0.5931 0.5769 0.7595
No log 1.4952 468 0.6146 0.5663 0.6146 0.7839
No log 1.5016 470 0.6625 0.5599 0.6625 0.8139
No log 1.5080 472 0.5637 0.5903 0.5637 0.7508
No log 1.5144 474 0.5309 0.6050 0.5309 0.7286
No log 1.5208 476 0.6172 0.5900 0.6172 0.7856
No log 1.5272 478 0.7000 0.5519 0.7000 0.8367
No log 1.5335 480 0.6096 0.5713 0.6096 0.7807
No log 1.5399 482 0.5232 0.6118 0.5232 0.7233
No log 1.5463 484 0.5266 0.6177 0.5266 0.7257
No log 1.5527 486 0.5599 0.6258 0.5599 0.7483
No log 1.5591 488 0.5965 0.5881 0.5965 0.7723
No log 1.5655 490 0.7104 0.5544 0.7104 0.8429
No log 1.5719 492 0.6976 0.5641 0.6976 0.8352
No log 1.5783 494 0.5298 0.6249 0.5298 0.7278
No log 1.5847 496 0.5669 0.6061 0.5669 0.7529
No log 1.5911 498 0.5585 0.6088 0.5585 0.7473
0.9486 1.5974 500 0.5040 0.6488 0.5040 0.7099
0.9486 1.6038 502 0.5498 0.6225 0.5498 0.7415
0.9486 1.6102 504 0.5270 0.6211 0.5270 0.7259
0.9486 1.6166 506 0.5168 0.6109 0.5168 0.7189
0.9486 1.6230 508 0.5160 0.6206 0.5160 0.7184
0.9486 1.6294 510 0.5183 0.6191 0.5183 0.7200

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1