---
library_name: transformers
license: apache-2.0
base_model: facebook/convnextv2-tiny-1k-224
tags:
  - image-classification
  - vision
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: convnextv2-tiny-1k-224-text
    results: []
---

# convnextv2-tiny-1k-224-text

This model is a fine-tuned version of [facebook/convnextv2-tiny-1k-224](https://huggingface.co/facebook/convnextv2-tiny-1k-224) on the [davanstrien/zenodo-presentations-open-labels](https://huggingface.co/datasets/davanstrien/zenodo-presentations-open-labels) dataset. It achieves the following results on the evaluation set:

- Loss: 0.4554
- Accuracy: 0.7874
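
The card does not include a usage snippet; below is a minimal sketch of running inference with the `transformers` image-classification pipeline, assuming the repository id `davanstrien/convnextv2-tiny-1k-224-text` (inferred from the model name above) and a local image file named `slide.png`:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; the repository id is inferred from the
# model name in this card and may need adjusting.
classifier = pipeline(
    "image-classification",
    model="davanstrien/convnextv2-tiny-1k-224-text",
)

# Classify a local image (hypothetical path); prints a list of
# {"label": ..., "score": ...} dicts, one per class.
print(classifier("slide.png"))
```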

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 1337
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 200.0
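
These map onto `transformers` `TrainingArguments` roughly as follows. This is a sketch only: `output_dir` and every option not listed above (evaluation and save strategy, logging, and so on) are assumptions, and `train_batch_size` is treated here as a per-device batch size.

```python
from transformers import TrainingArguments

# A sketch mirroring the hyperparameters listed above; output_dir and any
# option not stated in the card are assumptions, not the original config.
training_args = TrainingArguments(
    output_dir="convnextv2-tiny-1k-224-text",
    learning_rate=2e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=1337,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=200.0,
)
```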

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5242 | 1.0 | 23 | 0.4961 | 0.7559 |
| 0.459 | 2.0 | 46 | 0.5001 | 0.7638 |
| 0.4429 | 3.0 | 69 | 0.4554 | 0.7874 |
| 0.4308 | 4.0 | 92 | 0.4924 | 0.7638 |
| 0.4319 | 5.0 | 115 | 0.4673 | 0.7874 |
| 0.4047 | 6.0 | 138 | 0.4930 | 0.7756 |
| 0.425 | 7.0 | 161 | 0.4739 | 0.7795 |
| 0.4102 | 8.0 | 184 | 0.5118 | 0.7598 |
| 0.3959 | 9.0 | 207 | 0.5490 | 0.7480 |
| 0.365 | 10.0 | 230 | 0.5261 | 0.7638 |
| 0.4214 | 11.0 | 253 | 0.5089 | 0.7795 |
| 0.3798 | 12.0 | 276 | 0.4711 | 0.7992 |
| 0.3906 | 13.0 | 299 | 0.5035 | 0.7913 |
| 0.3706 | 14.0 | 322 | 0.4933 | 0.7953 |
| 0.3766 | 15.0 | 345 | 0.4973 | 0.7992 |
| 0.3213 | 16.0 | 368 | 0.5221 | 0.7874 |
| 0.329 | 17.0 | 391 | 0.5400 | 0.7835 |
| 0.3427 | 18.0 | 414 | 0.5252 | 0.7913 |
| 0.3472 | 19.0 | 437 | 0.6208 | 0.7441 |
| 0.3424 | 20.0 | 460 | 0.5320 | 0.7795 |
| 0.3016 | 21.0 | 483 | 0.5488 | 0.7795 |
| 0.3033 | 22.0 | 506 | 0.5889 | 0.7480 |
| 0.3083 | 23.0 | 529 | 0.6108 | 0.7638 |
| 0.2772 | 24.0 | 552 | 0.5845 | 0.7480 |
| 0.287 | 25.0 | 575 | 0.5242 | 0.8071 |
| 0.2651 | 26.0 | 598 | 0.6276 | 0.7598 |
| 0.2696 | 27.0 | 621 | 0.5649 | 0.7835 |
| 0.2701 | 28.0 | 644 | 0.6103 | 0.7756 |
| 0.2451 | 29.0 | 667 | 0.6207 | 0.7638 |
| 0.2705 | 30.0 | 690 | 0.5990 | 0.7756 |
| 0.2553 | 31.0 | 713 | 0.5962 | 0.7835 |
| 0.2559 | 32.0 | 736 | 0.6681 | 0.7717 |
| 0.2405 | 33.0 | 759 | 0.5917 | 0.7638 |
| 0.2707 | 34.0 | 782 | 0.5906 | 0.7638 |
| 0.3004 | 35.0 | 805 | 0.5905 | 0.7874 |
| 0.2404 | 36.0 | 828 | 0.5914 | 0.7677 |
| 0.242 | 37.0 | 851 | 0.7637 | 0.7638 |
| 0.2221 | 38.0 | 874 | 0.7117 | 0.7598 |
| 0.2196 | 39.0 | 897 | 0.6442 | 0.7835 |
| 0.23 | 40.0 | 920 | 0.7011 | 0.7717 |
| 0.2045 | 41.0 | 943 | 0.7822 | 0.7598 |
| 0.2043 | 42.0 | 966 | 0.7339 | 0.7520 |
| 0.2413 | 43.0 | 989 | 0.6917 | 0.7677 |
| 0.2135 | 44.0 | 1012 | 0.6954 | 0.7717 |
| 0.2194 | 45.0 | 1035 | 0.6729 | 0.7795 |
| 0.211 | 46.0 | 1058 | 0.6841 | 0.7835 |
| 0.2155 | 47.0 | 1081 | 0.7108 | 0.7677 |
| 0.2231 | 48.0 | 1104 | 0.6758 | 0.7677 |
| 0.2364 | 49.0 | 1127 | 0.7747 | 0.7520 |
| 0.222 | 50.0 | 1150 | 0.7104 | 0.7638 |
| 0.2018 | 51.0 | 1173 | 0.6885 | 0.7953 |
| 0.219 | 52.0 | 1196 | 0.7609 | 0.7520 |
| 0.1916 | 53.0 | 1219 | 0.8394 | 0.7677 |
| 0.1767 | 54.0 | 1242 | 0.7910 | 0.7717 |
| 0.236 | 55.0 | 1265 | 0.7601 | 0.7756 |
| 0.1898 | 56.0 | 1288 | 0.7501 | 0.7717 |
| 0.1876 | 57.0 | 1311 | 0.7492 | 0.7756 |
| 0.1592 | 58.0 | 1334 | 0.7905 | 0.7638 |
| 0.1772 | 59.0 | 1357 | 0.7411 | 0.7717 |
| 0.1787 | 60.0 | 1380 | 0.8145 | 0.7795 |
| 0.1782 | 61.0 | 1403 | 0.7721 | 0.7795 |
| 0.1781 | 62.0 | 1426 | 0.8022 | 0.7835 |
| 0.1884 | 63.0 | 1449 | 0.8630 | 0.7756 |
| 0.1905 | 64.0 | 1472 | 0.7472 | 0.7953 |
| 0.16 | 65.0 | 1495 | 0.7761 | 0.7874 |
| 0.1619 | 66.0 | 1518 | 0.8586 | 0.7795 |
| 0.1768 | 67.0 | 1541 | 0.7700 | 0.7835 |
| 0.1395 | 68.0 | 1564 | 0.8326 | 0.7717 |
| 0.1536 | 69.0 | 1587 | 0.8442 | 0.7756 |
| 0.208 | 70.0 | 1610 | 0.9289 | 0.7677 |
| 0.1783 | 71.0 | 1633 | 0.9022 | 0.7638 |
| 0.1572 | 72.0 | 1656 | 0.8510 | 0.7677 |
| 0.1349 | 73.0 | 1679 | 0.7962 | 0.7677 |
| 0.148 | 74.0 | 1702 | 0.8641 | 0.7756 |
| 0.1768 | 75.0 | 1725 | 0.9277 | 0.7677 |
| 0.1833 | 76.0 | 1748 | 0.8663 | 0.7638 |
| 0.1696 | 77.0 | 1771 | 0.8302 | 0.7756 |
| 0.1577 | 78.0 | 1794 | 0.8576 | 0.7638 |
| 0.1724 | 79.0 | 1817 | 0.8652 | 0.7598 |
| 0.1525 | 80.0 | 1840 | 0.8567 | 0.7717 |
| 0.158 | 81.0 | 1863 | 0.9139 | 0.7598 |
| 0.1639 | 82.0 | 1886 | 0.9689 | 0.7520 |
| 0.1424 | 83.0 | 1909 | 0.9698 | 0.7638 |
| 0.1224 | 84.0 | 1932 | 1.0239 | 0.7717 |
| 0.1765 | 85.0 | 1955 | 0.9072 | 0.7795 |
| 0.1726 | 86.0 | 1978 | 0.9436 | 0.7520 |
| 0.1584 | 87.0 | 2001 | 0.8775 | 0.7638 |
| 0.164 | 88.0 | 2024 | 0.8592 | 0.7717 |
| 0.1682 | 89.0 | 2047 | 0.9051 | 0.7638 |
| 0.1455 | 90.0 | 2070 | 1.0020 | 0.7717 |
| 0.1596 | 91.0 | 2093 | 0.9423 | 0.7677 |
| 0.1667 | 92.0 | 2116 | 0.9586 | 0.7638 |
| 0.132 | 93.0 | 2139 | 0.9890 | 0.7638 |
| 0.1335 | 94.0 | 2162 | 0.9922 | 0.7717 |
| 0.1538 | 95.0 | 2185 | 0.9534 | 0.7520 |
| 0.1288 | 96.0 | 2208 | 1.0714 | 0.7480 |
| 0.1661 | 97.0 | 2231 | 0.9950 | 0.7598 |
| 0.1392 | 98.0 | 2254 | 0.9866 | 0.7520 |
| 0.1413 | 99.0 | 2277 | 1.0638 | 0.7598 |
| 0.1619 | 100.0 | 2300 | 1.0178 | 0.7598 |
| 0.1537 | 101.0 | 2323 | 0.9892 | 0.7638 |
| 0.137 | 102.0 | 2346 | 0.9524 | 0.7559 |
| 0.1416 | 103.0 | 2369 | 1.0539 | 0.7402 |
| 0.1477 | 104.0 | 2392 | 1.0825 | 0.7283 |
| 0.1283 | 105.0 | 2415 | 1.0008 | 0.7520 |
| 0.1498 | 106.0 | 2438 | 0.9702 | 0.7638 |
| 0.1576 | 107.0 | 2461 | 1.0144 | 0.7677 |
| 0.1433 | 108.0 | 2484 | 0.9457 | 0.7638 |
| 0.1377 | 109.0 | 2507 | 0.9770 | 0.7677 |
| 0.1163 | 110.0 | 2530 | 1.1386 | 0.7559 |
| 0.1449 | 111.0 | 2553 | 1.0589 | 0.7559 |
| 0.1475 | 112.0 | 2576 | 1.0110 | 0.7480 |
| 0.1582 | 113.0 | 2599 | 0.9657 | 0.7677 |
| 0.1291 | 114.0 | 2622 | 0.9563 | 0.7756 |
| 0.1106 | 115.0 | 2645 | 1.1004 | 0.7480 |
| 0.1339 | 116.0 | 2668 | 1.0327 | 0.7520 |
| 0.1344 | 117.0 | 2691 | 1.0161 | 0.7520 |
| 0.1433 | 118.0 | 2714 | 1.0312 | 0.7559 |
| 0.1271 | 119.0 | 2737 | 1.0266 | 0.7598 |
| 0.1222 | 120.0 | 2760 | 1.0119 | 0.7638 |
| 0.1235 | 121.0 | 2783 | 1.0808 | 0.7520 |
| 0.1311 | 122.0 | 2806 | 1.0612 | 0.7520 |
| 0.1219 | 123.0 | 2829 | 1.1412 | 0.7520 |
| 0.148 | 124.0 | 2852 | 1.0836 | 0.7402 |
| 0.1076 | 125.0 | 2875 | 1.0629 | 0.7559 |
| 0.1306 | 126.0 | 2898 | 1.0791 | 0.7362 |
| 0.1153 | 127.0 | 2921 | 1.1495 | 0.7402 |
| 0.1239 | 128.0 | 2944 | 1.1446 | 0.7520 |
| 0.1533 | 129.0 | 2967 | 1.0818 | 0.7441 |
| 0.136 | 130.0 | 2990 | 1.0558 | 0.7520 |
| 0.1189 | 131.0 | 3013 | 1.0423 | 0.7520 |
| 0.1247 | 132.0 | 3036 | 1.0581 | 0.7638 |
| 0.1136 | 133.0 | 3059 | 1.0132 | 0.7717 |
| 0.1492 | 134.0 | 3082 | 1.1127 | 0.7441 |
| 0.1184 | 135.0 | 3105 | 1.1450 | 0.7402 |
| 0.1122 | 136.0 | 3128 | 1.1063 | 0.7520 |
| 0.1047 | 137.0 | 3151 | 1.1029 | 0.7441 |
| 0.1285 | 138.0 | 3174 | 1.1563 | 0.7402 |
| 0.1004 | 139.0 | 3197 | 1.1552 | 0.7362 |
| 0.1285 | 140.0 | 3220 | 1.1097 | 0.7480 |
| 0.1257 | 141.0 | 3243 | 1.1602 | 0.7402 |
| 0.1075 | 142.0 | 3266 | 1.1912 | 0.7559 |
| 0.1098 | 143.0 | 3289 | 1.1894 | 0.7520 |
| 0.1148 | 144.0 | 3312 | 1.1551 | 0.7441 |
| 0.1489 | 145.0 | 3335 | 1.1379 | 0.7441 |
| 0.1461 | 146.0 | 3358 | 1.1726 | 0.7480 |
| 0.1171 | 147.0 | 3381 | 1.1191 | 0.7441 |
| 0.1262 | 148.0 | 3404 | 1.1662 | 0.7441 |
| 0.1137 | 149.0 | 3427 | 1.1283 | 0.7480 |
| 0.1118 | 150.0 | 3450 | 1.1388 | 0.7480 |
| 0.1169 | 151.0 | 3473 | 1.1627 | 0.7520 |
| 0.1021 | 152.0 | 3496 | 1.1821 | 0.7323 |
| 0.1392 | 153.0 | 3519 | 1.1672 | 0.7323 |
| 0.1111 | 154.0 | 3542 | 1.2136 | 0.7402 |
| 0.1298 | 155.0 | 3565 | 1.1966 | 0.7402 |
| 0.1114 | 156.0 | 3588 | 1.1382 | 0.7362 |
| 0.09 | 157.0 | 3611 | 1.1460 | 0.7323 |
| 0.1294 | 158.0 | 3634 | 1.1612 | 0.7441 |
| 0.1186 | 159.0 | 3657 | 1.2204 | 0.7402 |
| 0.1096 | 160.0 | 3680 | 1.2096 | 0.7441 |
| 0.1107 | 161.0 | 3703 | 1.1822 | 0.7480 |
| 0.1094 | 162.0 | 3726 | 1.1908 | 0.7480 |
| 0.1112 | 163.0 | 3749 | 1.1647 | 0.7402 |
| 0.1042 | 164.0 | 3772 | 1.2523 | 0.7441 |
| 0.0993 | 165.0 | 3795 | 1.2040 | 0.7402 |
| 0.105 | 166.0 | 3818 | 1.2296 | 0.7402 |
| 0.1071 | 167.0 | 3841 | 1.2863 | 0.7480 |
| 0.108 | 168.0 | 3864 | 1.2372 | 0.7441 |
| 0.1076 | 169.0 | 3887 | 1.1872 | 0.7480 |
| 0.1107 | 170.0 | 3910 | 1.2354 | 0.7323 |
| 0.1012 | 171.0 | 3933 | 1.2105 | 0.7441 |
| 0.0918 | 172.0 | 3956 | 1.2026 | 0.7441 |
| 0.1043 | 173.0 | 3979 | 1.2925 | 0.7559 |
| 0.1035 | 174.0 | 4002 | 1.2314 | 0.7402 |
| 0.1101 | 175.0 | 4025 | 1.1943 | 0.7441 |
| 0.1084 | 176.0 | 4048 | 1.2069 | 0.7362 |
| 0.1247 | 177.0 | 4071 | 1.2303 | 0.7520 |
| 0.1278 | 178.0 | 4094 | 1.2118 | 0.7480 |
| 0.1117 | 179.0 | 4117 | 1.2213 | 0.7480 |
| 0.1123 | 180.0 | 4140 | 1.2403 | 0.7480 |
| 0.0918 | 181.0 | 4163 | 1.1987 | 0.7441 |
| 0.0827 | 182.0 | 4186 | 1.2358 | 0.7441 |
| 0.0814 | 183.0 | 4209 | 1.2608 | 0.7441 |
| 0.0897 | 184.0 | 4232 | 1.2370 | 0.7441 |
| 0.1321 | 185.0 | 4255 | 1.2317 | 0.7480 |
| 0.1194 | 186.0 | 4278 | 1.2289 | 0.7441 |
| 0.1154 | 187.0 | 4301 | 1.1964 | 0.7441 |
| 0.0964 | 188.0 | 4324 | 1.2009 | 0.7441 |
| 0.0903 | 189.0 | 4347 | 1.2123 | 0.7441 |
| 0.1174 | 190.0 | 4370 | 1.2335 | 0.7441 |
| 0.0846 | 191.0 | 4393 | 1.2399 | 0.7441 |
| 0.1073 | 192.0 | 4416 | 1.2432 | 0.7441 |
| 0.0892 | 193.0 | 4439 | 1.2604 | 0.7480 |
| 0.1158 | 194.0 | 4462 | 1.2473 | 0.7480 |
| 0.1153 | 195.0 | 4485 | 1.2267 | 0.7441 |
| 0.1208 | 196.0 | 4508 | 1.2178 | 0.7441 |
| 0.083 | 197.0 | 4531 | 1.2145 | 0.7480 |
| 0.1331 | 198.0 | 4554 | 1.2215 | 0.7441 |
| 0.0943 | 199.0 | 4577 | 1.2238 | 0.7441 |
| 0.0926 | 200.0 | 4600 | 1.2236 | 0.7441 |
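
Validation loss bottoms out at epoch 3 (0.4554, the figure reported at the top of this card) and then climbs steadily toward 1.22 while training loss keeps falling, so the run overfits long before 200 epochs. For a rerun, below is a hedged sketch of cutting training short with the Trainer's built-in `EarlyStoppingCallback`; the evaluation/save strategy and patience value are illustrative assumptions, not settings from the original run.

```python
from transformers import EarlyStoppingCallback, TrainingArguments

# Assumed settings: evaluate and checkpoint every epoch, keep the best
# checkpoint by validation loss; a patience of 5 epochs is an illustrative
# choice, not taken from the original run.
training_args = TrainingArguments(
    output_dir="convnextv2-tiny-1k-224-text",
    eval_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="eval_loss",
    greater_is_better=False,
)
early_stopping = EarlyStoppingCallback(early_stopping_patience=5)
# Pass callbacks=[early_stopping] when constructing the Trainer.
```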

### Framework versions

- Transformers 4.46.0.dev0
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.1