output_nf_3

This model is a fine-tuned version of nferruz/ProtGPT2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 5.6642
  • Accuracy: 0.3099
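
Since ProtGPT2 is a causal language model, the reported loss is presumably the mean per-token cross-entropy, which can be converted to a perplexity. A minimal sketch, assuming that convention:

```python
import math

eval_loss = 5.6642            # reported validation loss (assumed mean per-token cross-entropy)
perplexity = math.exp(eval_loss)
print(f"Validation perplexity: {perplexity:.1f}")  # ~288
```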

Model description

More information needed

Intended uses & limitations

More information needed
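
No usage details are documented, so the sketch below only shows the generic way a ProtGPT2-derived causal language model is loaded and sampled with the transformers library. The repository id is taken from the model page; the prompt and generation settings are illustrative assumptions, not documented recommendations.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hub repository id as listed on the model page; the tokenizer is assumed to be
# included in the checkpoint (otherwise the base nferruz/ProtGPT2 tokenizer applies).
model_id = "aayush14/PeptideGPT_non_fouling"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# ProtGPT2-style sampling; these generation settings are placeholders, not tuned values.
inputs = tokenizer("<|endoftext|>", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=64,
    do_sample=True,
    top_k=950,
    repetition_penalty=1.2,
    num_return_sequences=4,
    pad_token_id=tokenizer.eos_token_id,
)
for seq in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(seq)
```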

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-06
  • train_batch_size: 1
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200.0
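
These values map directly onto the standard transformers Trainer configuration. The sketch below is a reconstruction under that assumption; output_dir and evaluation_strategy are placeholders inferred from the card name and the per-epoch evaluation table, not documented settings.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="output_nf_3",       # placeholder, taken from the card title
    learning_rate=1e-6,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=200.0,
    evaluation_strategy="epoch",    # metrics below are reported once per epoch
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```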

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
12.1762 1.0 9 11.8181 0.0029
11.6538 2.0 18 11.3468 0.0010
11.2876 3.0 27 10.9127 0.0010
10.8664 4.0 36 10.5162 0.0010
10.6184 5.0 45 10.1817 0.0020
10.3132 6.0 54 9.8765 0.0010
9.9087 7.0 63 9.6073 0.0010
9.6953 8.0 72 9.3244 0.0
9.3741 9.0 81 9.0318 0.0
9.2045 10.0 90 8.7461 0.0
8.9079 11.0 99 8.4755 0.0020
8.7047 12.0 108 8.2222 0.0117
8.4622 13.0 117 7.9951 0.0313
8.2649 14.0 126 7.8052 0.0655
8.043 15.0 135 7.6495 0.1065
7.9092 16.0 144 7.5160 0.1378
7.7103 17.0 153 7.3881 0.1691
7.5701 18.0 162 7.2693 0.1818
7.4483 19.0 171 7.1544 0.1994
7.314 20.0 180 7.0469 0.2160
7.2101 21.0 189 6.9440 0.2326
7.1026 22.0 198 6.8542 0.2366
6.9954 23.0 207 6.7684 0.2483
6.9206 24.0 216 6.6851 0.2512
6.8588 25.0 225 6.6217 0.2561
6.7975 26.0 234 6.5621 0.2590
6.7355 27.0 243 6.5111 0.2610
6.6928 28.0 252 6.4630 0.2639
6.6483 29.0 261 6.4137 0.2698
6.6169 30.0 270 6.3675 0.2717
6.5498 31.0 279 6.3292 0.2747
6.5385 32.0 288 6.2967 0.2766
6.5111 33.0 297 6.2592 0.2786
6.4748 34.0 306 6.2316 0.2815
6.4575 35.0 315 6.2096 0.2835
6.4251 36.0 324 6.1847 0.2845
6.4096 37.0 333 6.1613 0.2854
6.3741 38.0 342 6.1402 0.2854
6.3645 39.0 351 6.1285 0.2874
6.3511 40.0 360 6.1134 0.2874
6.3254 41.0 369 6.0957 0.2874
6.3077 42.0 378 6.0818 0.2874
6.301 43.0 387 6.0688 0.2874
6.2846 44.0 396 6.0511 0.2884
6.2739 45.0 405 6.0398 0.2893
6.2569 46.0 414 6.0301 0.2913
6.258 47.0 423 6.0176 0.2913
6.2273 48.0 432 6.0037 0.2933
6.2256 49.0 441 5.9962 0.2933
6.2065 50.0 450 5.9869 0.2933
6.1991 51.0 459 5.9756 0.2933
6.1895 52.0 468 5.9682 0.2942
6.1763 53.0 477 5.9611 0.2942
6.1734 54.0 486 5.9535 0.2942
6.1702 55.0 495 5.9457 0.2942
6.1556 56.0 504 5.9376 0.2952
6.1481 57.0 513 5.9306 0.2952
6.1425 58.0 522 5.9222 0.2942
6.1416 59.0 531 5.9170 0.2942
6.1328 60.0 540 5.9142 0.2942
6.1176 61.0 549 5.9064 0.2942
6.1091 62.0 558 5.9010 0.2942
6.104 63.0 567 5.8961 0.2942
6.0986 64.0 576 5.8915 0.2942
6.089 65.0 585 5.8854 0.2952
6.0734 66.0 594 5.8810 0.2942
6.0905 67.0 603 5.8762 0.2942
6.0701 68.0 612 5.8737 0.2962
6.0595 69.0 621 5.8694 0.2962
6.0616 70.0 630 5.8661 0.2962
6.0512 71.0 639 5.8635 0.2962
6.0415 72.0 648 5.8613 0.2962
6.0391 73.0 657 5.8583 0.2962
6.032 74.0 666 5.8548 0.2962
6.0317 75.0 675 5.8511 0.2962
6.0343 76.0 684 5.8479 0.2972
6.0156 77.0 693 5.8435 0.2981
6.0167 78.0 702 5.8403 0.2981
6.0052 79.0 711 5.8364 0.2981
6.0057 80.0 720 5.8336 0.2981
6.0001 81.0 729 5.8319 0.2981
6.001 82.0 738 5.8288 0.2991
5.9905 83.0 747 5.8266 0.3011
5.9906 84.0 756 5.8242 0.3021
5.9862 85.0 765 5.8216 0.3021
5.9829 86.0 774 5.8191 0.3050
5.9725 87.0 783 5.8173 0.3069
5.9795 88.0 792 5.8142 0.3089
5.9695 89.0 801 5.8115 0.3069
5.9607 90.0 810 5.8081 0.3089
5.9605 91.0 819 5.8057 0.3089
5.9591 92.0 828 5.8037 0.3079
5.9481 93.0 837 5.8011 0.3069
5.9501 94.0 846 5.7983 0.3079
5.948 95.0 855 5.7943 0.3089
5.9488 96.0 864 5.7907 0.3109
5.9449 97.0 873 5.7888 0.3118
5.9357 98.0 882 5.7872 0.3099
5.9363 99.0 891 5.7834 0.3079
5.9368 100.0 900 5.7817 0.3099
5.9215 101.0 909 5.7791 0.3099
5.9264 102.0 918 5.7749 0.3099
5.9126 103.0 927 5.7725 0.3089
5.9163 104.0 936 5.7698 0.3069
5.9183 105.0 945 5.7673 0.3069
5.9154 106.0 954 5.7639 0.3089
5.9149 107.0 963 5.7612 0.3079
5.9082 108.0 972 5.7579 0.3089
5.9051 109.0 981 5.7559 0.3079
5.908 110.0 990 5.7531 0.3079
5.901 111.0 999 5.7502 0.3079
5.9064 112.0 1008 5.7484 0.3089
5.895 113.0 1017 5.7460 0.3099
5.894 114.0 1026 5.7438 0.3089
5.8809 115.0 1035 5.7419 0.3079
5.8893 116.0 1044 5.7419 0.3079
5.8874 117.0 1053 5.7392 0.3069
5.8798 118.0 1062 5.7349 0.3089
5.8826 119.0 1071 5.7324 0.3099
5.8736 120.0 1080 5.7299 0.3099
5.8751 121.0 1089 5.7270 0.3099
5.8699 122.0 1098 5.7253 0.3099
5.8802 123.0 1107 5.7231 0.3099
5.8707 124.0 1116 5.7240 0.3099
5.8653 125.0 1125 5.7232 0.3109
5.8693 126.0 1134 5.7180 0.3109
5.8662 127.0 1143 5.7147 0.3109
5.8539 128.0 1152 5.7132 0.3099
5.8611 129.0 1161 5.7127 0.3099
5.8495 130.0 1170 5.7135 0.3079
5.8602 131.0 1179 5.7111 0.3089
5.8512 132.0 1188 5.7077 0.3109
5.8493 133.0 1197 5.7050 0.3109
5.8477 134.0 1206 5.7041 0.3099
5.8464 135.0 1215 5.7042 0.3089
5.8459 136.0 1224 5.7023 0.3099
5.8475 137.0 1233 5.7000 0.3099
5.8384 138.0 1242 5.6982 0.3099
5.8453 139.0 1251 5.6976 0.3089
5.8441 140.0 1260 5.6973 0.3089
5.838 141.0 1269 5.6971 0.3079
5.8463 142.0 1278 5.6950 0.3099
5.8385 143.0 1287 5.6921 0.3099
5.8354 144.0 1296 5.6909 0.3099
5.8283 145.0 1305 5.6908 0.3079
5.8363 146.0 1314 5.6902 0.3079
5.8433 147.0 1323 5.6890 0.3079
5.8302 148.0 1332 5.6890 0.3079
5.8276 149.0 1341 5.6881 0.3089
5.8366 150.0 1350 5.6864 0.3089
5.826 151.0 1359 5.6852 0.3099
5.8293 152.0 1368 5.6844 0.3099
5.8278 153.0 1377 5.6834 0.3099
5.8239 154.0 1386 5.6830 0.3089
5.8262 155.0 1395 5.6818 0.3089
5.8253 156.0 1404 5.6808 0.3099
5.8169 157.0 1413 5.6793 0.3109
5.8201 158.0 1422 5.6795 0.3109
5.8077 159.0 1431 5.6799 0.3099
5.8222 160.0 1440 5.6789 0.3099
5.8191 161.0 1449 5.6778 0.3099
5.83 162.0 1458 5.6768 0.3099
5.8183 163.0 1467 5.6757 0.3109
5.8124 164.0 1476 5.6747 0.3109
5.8119 165.0 1485 5.6745 0.3109
5.821 166.0 1494 5.6744 0.3099
5.807 167.0 1503 5.6737 0.3099
5.8177 168.0 1512 5.6725 0.3109
5.8046 169.0 1521 5.6710 0.3109
5.8093 170.0 1530 5.6708 0.3109
5.8145 171.0 1539 5.6710 0.3109
5.803 172.0 1548 5.6704 0.3109
5.8038 173.0 1557 5.6697 0.3109
5.807 174.0 1566 5.6689 0.3109
5.7974 175.0 1575 5.6683 0.3109
5.8089 176.0 1584 5.6688 0.3109
5.8067 177.0 1593 5.6693 0.3109
5.8092 178.0 1602 5.6695 0.3109
5.8047 179.0 1611 5.6687 0.3099
5.8007 180.0 1620 5.6679 0.3099
5.8041 181.0 1629 5.6672 0.3099
5.8072 182.0 1638 5.6667 0.3099
5.8093 183.0 1647 5.6662 0.3099
5.7948 184.0 1656 5.6658 0.3099
5.7968 185.0 1665 5.6656 0.3099
5.8033 186.0 1674 5.6653 0.3099
5.8031 187.0 1683 5.6651 0.3099
5.7953 188.0 1692 5.6650 0.3099
5.8085 189.0 1701 5.6647 0.3099
5.8021 190.0 1710 5.6646 0.3099
5.7995 191.0 1719 5.6643 0.3099
5.8057 192.0 1728 5.6642 0.3099
5.7989 193.0 1737 5.6642 0.3099
5.7977 194.0 1746 5.6642 0.3099
5.8009 195.0 1755 5.6644 0.3099
5.7988 196.0 1764 5.6645 0.3099
5.8016 197.0 1773 5.6645 0.3099
5.7929 198.0 1782 5.6645 0.3099
5.7973 199.0 1791 5.6645 0.3099
5.8022 200.0 1800 5.6645 0.3099

Framework versions

  • Transformers 4.38.0.dev0
  • Pytorch 2.2.0
  • Datasets 2.16.1
  • Tokenizers 0.15.1