segformer-b0-finetuned-segments-sidewalk-3

This model is a fine-tuned version of nvidia/mit-b0 on the jhaberbe/lipid-droplets-v3 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1127
  • Mean Iou: 0.4462
  • Mean Accuracy: 0.8924
  • Overall Accuracy: 0.8924
  • Accuracy Unlabeled: nan
  • Accuracy Lipid: 0.8924
  • Iou Unlabeled: 0.0
  • Iou Lipid: 0.8924

Note that the mean values average the two classes: since the unlabeled-class IoU is 0.0, the Mean Iou (0.4462) is half the lipid IoU (0.8924), and the unlabeled-class accuracy is nan, which typically means no ground-truth pixels carry that label in the evaluation masks.

Model description

More information needed

Intended uses & limitations

More information needed
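
In the absence of a fuller description, the sketch below shows one plausible way to run inference with this checkpoint. It is an illustration rather than the authors' own script: the repo id is inferred from this card's title, "example.png" is a placeholder, and the label mapping (0 = unlabeled, 1 = lipid) is an assumption based on the metric names above.

```python
# Minimal inference sketch. Assumptions: repo id inferred from this card,
# placeholder image path, and label mapping (0 = unlabeled, 1 = lipid).
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

checkpoint = "jhaberbe/segformer-b0-finetuned-segments-sidewalk-3"  # assumed repo id
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) integer label map
```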

Training and evaluation data

More information needed
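
The summary above points to jhaberbe/lipid-droplets-v3, which suggests the data is hosted on the Hugging Face Hub. A minimal loading sketch follows; since the splits and column names are not documented here, the code only loads the dataset and inspects its structure:

```python
# Sketch: load the fine-tuning dataset from the Hub and inspect it.
# Assumption: the dataset id from this card is a public Hub dataset.
from datasets import load_dataset

ds = load_dataset("jhaberbe/lipid-droplets-v3")
print(ds)  # prints the available splits, column names, and row counts
```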

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 500
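
For context, these settings map onto Hugging Face TrainingArguments roughly as sketched below. This is an illustration under stated assumptions rather than the original training script: output_dir is a placeholder, model and dataset setup are omitted, and the Adam betas/epsilon listed above are the Trainer defaults, made explicit here.

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
# output_dir is a placeholder; model and dataset setup are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-sidewalk-3",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",  # linear decay of the learning rate
    num_train_epochs=500,
    adam_beta1=0.9,              # Trainer defaults, matching the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```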

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Lipid | Iou Unlabeled | Iou Lipid |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 0.4354 | 4.0 | 20 | 0.6338 | 0.3881 | 0.7762 | 0.7762 | nan | 0.7762 | 0.0 | 0.7762 |
| 0.3649 | 8.0 | 40 | 0.5519 | 0.4247 | 0.8494 | 0.8494 | nan | 0.8494 | 0.0 | 0.8494 |
| 0.249 | 12.0 | 60 | 0.3059 | 0.3220 | 0.6440 | 0.6440 | nan | 0.6440 | 0.0 | 0.6440 |
| 0.1959 | 16.0 | 80 | 0.3187 | 0.3624 | 0.7247 | 0.7247 | nan | 0.7247 | 0.0 | 0.7247 |
| 0.1541 | 20.0 | 100 | 0.2542 | 0.3600 | 0.7201 | 0.7201 | nan | 0.7201 | 0.0 | 0.7201 |
| 0.245 | 24.0 | 120 | 0.1705 | 0.0943 | 0.1886 | 0.1886 | nan | 0.1886 | 0.0 | 0.1886 |
| 0.1661 | 28.0 | 140 | 0.0909 | 0.2357 | 0.4713 | 0.4713 | nan | 0.4713 | 0.0 | 0.4713 |
| 0.0906 | 32.0 | 160 | 0.2466 | 0.3304 | 0.6608 | 0.6608 | nan | 0.6608 | 0.0 | 0.6608 |
| 0.0747 | 36.0 | 180 | 0.2400 | 0.3773 | 0.7546 | 0.7546 | nan | 0.7546 | 0.0 | 0.7546 |
| 0.077 | 40.0 | 200 | 0.3043 | 0.4416 | 0.8832 | 0.8832 | nan | 0.8832 | 0.0 | 0.8832 |
| 0.077 | 44.0 | 220 | 0.1589 | 0.2127 | 0.4254 | 0.4254 | nan | 0.4254 | 0.0 | 0.4254 |
| 0.0595 | 48.0 | 240 | 0.2388 | 0.4113 | 0.8226 | 0.8226 | nan | 0.8226 | 0.0 | 0.8226 |
| 0.0678 | 52.0 | 260 | 0.1919 | 0.3874 | 0.7747 | 0.7747 | nan | 0.7747 | 0.0 | 0.7747 |
| 0.0584 | 56.0 | 280 | 0.2784 | 0.4315 | 0.8631 | 0.8631 | nan | 0.8631 | 0.0 | 0.8631 |
| 0.0752 | 60.0 | 300 | 0.1283 | 0.3702 | 0.7404 | 0.7404 | nan | 0.7404 | 0.0 | 0.7404 |
| 0.058 | 64.0 | 320 | 0.0754 | 0.3292 | 0.6585 | 0.6585 | nan | 0.6585 | 0.0 | 0.6585 |
| 0.071 | 68.0 | 340 | 0.1524 | 0.3384 | 0.6768 | 0.6768 | nan | 0.6768 | 0.0 | 0.6768 |
| 0.1157 | 72.0 | 360 | 0.0839 | 0.2022 | 0.4043 | 0.4043 | nan | 0.4043 | 0.0 | 0.4043 |
| 0.0681 | 76.0 | 380 | 0.1706 | 0.3494 | 0.6989 | 0.6989 | nan | 0.6989 | 0.0 | 0.6989 |
| 0.0391 | 80.0 | 400 | 0.1079 | 0.3062 | 0.6123 | 0.6123 | nan | 0.6123 | 0.0 | 0.6123 |
| 0.0493 | 84.0 | 420 | 0.2278 | 0.4427 | 0.8854 | 0.8854 | nan | 0.8854 | 0.0 | 0.8854 |
| 0.0544 | 88.0 | 440 | 0.1832 | 0.3955 | 0.7911 | 0.7911 | nan | 0.7911 | 0.0 | 0.7911 |
| 0.029 | 92.0 | 460 | 0.1327 | 0.3851 | 0.7703 | 0.7703 | nan | 0.7703 | 0.0 | 0.7703 |
| 0.0926 | 96.0 | 480 | 0.1737 | 0.4200 | 0.8400 | 0.8400 | nan | 0.8400 | 0.0 | 0.8400 |
| 0.0437 | 100.0 | 500 | 0.1248 | 0.3420 | 0.6841 | 0.6841 | nan | 0.6841 | 0.0 | 0.6841 |
| 0.0474 | 104.0 | 520 | 0.1365 | 0.3171 | 0.6343 | 0.6343 | nan | 0.6343 | 0.0 | 0.6343 |
| 0.0267 | 108.0 | 540 | 0.1942 | 0.4441 | 0.8881 | 0.8881 | nan | 0.8881 | 0.0 | 0.8881 |
| 0.0495 | 112.0 | 560 | 0.1312 | 0.3710 | 0.7420 | 0.7420 | nan | 0.7420 | 0.0 | 0.7420 |
| 0.0257 | 116.0 | 580 | 0.1224 | 0.3549 | 0.7097 | 0.7097 | nan | 0.7097 | 0.0 | 0.7097 |
| 0.0276 | 120.0 | 600 | 0.1836 | 0.4088 | 0.8176 | 0.8176 | nan | 0.8176 | 0.0 | 0.8176 |
| 0.0278 | 124.0 | 620 | 0.1652 | 0.4134 | 0.8269 | 0.8269 | nan | 0.8269 | 0.0 | 0.8269 |
| 0.04 | 128.0 | 640 | 0.1380 | 0.4057 | 0.8115 | 0.8115 | nan | 0.8115 | 0.0 | 0.8115 |
| 0.0373 | 132.0 | 660 | 0.1566 | 0.4473 | 0.8947 | 0.8947 | nan | 0.8947 | 0.0 | 0.8947 |
| 0.024 | 136.0 | 680 | 0.0994 | 0.3670 | 0.7339 | 0.7339 | nan | 0.7339 | 0.0 | 0.7339 |
| 0.0301 | 140.0 | 700 | 0.1427 | 0.3591 | 0.7182 | 0.7182 | nan | 0.7182 | 0.0 | 0.7182 |
| 0.0372 | 144.0 | 720 | 0.1781 | 0.4245 | 0.8489 | 0.8489 | nan | 0.8489 | 0.0 | 0.8489 |
| 0.0364 | 148.0 | 740 | 0.1370 | 0.3996 | 0.7992 | 0.7992 | nan | 0.7992 | 0.0 | 0.7992 |
| 0.0497 | 152.0 | 760 | 0.1406 | 0.4023 | 0.8046 | 0.8046 | nan | 0.8046 | 0.0 | 0.8046 |
| 0.0442 | 156.0 | 780 | 0.1583 | 0.4168 | 0.8337 | 0.8337 | nan | 0.8337 | 0.0 | 0.8337 |
| 0.0358 | 160.0 | 800 | 0.1217 | 0.3739 | 0.7478 | 0.7478 | nan | 0.7478 | 0.0 | 0.7478 |
| 0.0408 | 164.0 | 820 | 0.0968 | 0.2903 | 0.5805 | 0.5805 | nan | 0.5805 | 0.0 | 0.5805 |
| 0.0225 | 168.0 | 840 | 0.1196 | 0.4256 | 0.8512 | 0.8512 | nan | 0.8512 | 0.0 | 0.8512 |
| 0.0399 | 172.0 | 860 | 0.1074 | 0.2849 | 0.5697 | 0.5697 | nan | 0.5697 | 0.0 | 0.5697 |
| 0.0436 | 176.0 | 880 | 0.0858 | 0.3241 | 0.6481 | 0.6481 | nan | 0.6481 | 0.0 | 0.6481 |
| 0.0351 | 180.0 | 900 | 0.1624 | 0.4129 | 0.8258 | 0.8258 | nan | 0.8258 | 0.0 | 0.8258 |
| 0.0291 | 184.0 | 920 | 0.1507 | 0.4153 | 0.8307 | 0.8307 | nan | 0.8307 | 0.0 | 0.8307 |
| 0.0417 | 188.0 | 940 | 0.1322 | 0.3823 | 0.7645 | 0.7645 | nan | 0.7645 | 0.0 | 0.7645 |
| 0.0277 | 192.0 | 960 | 0.1121 | 0.3679 | 0.7358 | 0.7358 | nan | 0.7358 | 0.0 | 0.7358 |
| 0.0289 | 196.0 | 980 | 0.1493 | 0.4154 | 0.8307 | 0.8307 | nan | 0.8307 | 0.0 | 0.8307 |
| 0.0366 | 200.0 | 1000 | 0.1342 | 0.4056 | 0.8111 | 0.8111 | nan | 0.8111 | 0.0 | 0.8111 |
| 0.0337 | 204.0 | 1020 | 0.1494 | 0.4110 | 0.8219 | 0.8219 | nan | 0.8219 | 0.0 | 0.8219 |
| 0.0287 | 208.0 | 1040 | 0.1065 | 0.4281 | 0.8561 | 0.8561 | nan | 0.8561 | 0.0 | 0.8561 |
| 0.0286 | 212.0 | 1060 | 0.1629 | 0.4310 | 0.8621 | 0.8621 | nan | 0.8621 | 0.0 | 0.8621 |
| 0.0319 | 216.0 | 1080 | 0.1547 | 0.3900 | 0.7799 | 0.7799 | nan | 0.7799 | 0.0 | 0.7799 |
| 0.038 | 220.0 | 1100 | 0.0830 | 0.4073 | 0.8146 | 0.8146 | nan | 0.8146 | 0.0 | 0.8146 |
| 0.0207 | 224.0 | 1120 | 0.1571 | 0.4229 | 0.8458 | 0.8458 | nan | 0.8458 | 0.0 | 0.8458 |
| 0.0406 | 228.0 | 1140 | 0.1477 | 0.4302 | 0.8604 | 0.8604 | nan | 0.8604 | 0.0 | 0.8604 |
| 0.0323 | 232.0 | 1160 | 0.1267 | 0.4136 | 0.8272 | 0.8272 | nan | 0.8272 | 0.0 | 0.8272 |
| 0.0269 | 236.0 | 1180 | 0.1262 | 0.4243 | 0.8485 | 0.8485 | nan | 0.8485 | 0.0 | 0.8485 |
| 0.0484 | 240.0 | 1200 | 0.1179 | 0.4161 | 0.8322 | 0.8322 | nan | 0.8322 | 0.0 | 0.8322 |
| 0.0366 | 244.0 | 1220 | 0.0954 | 0.4016 | 0.8031 | 0.8031 | nan | 0.8031 | 0.0 | 0.8031 |
| 0.0384 | 248.0 | 1240 | 0.0980 | 0.4147 | 0.8294 | 0.8294 | nan | 0.8294 | 0.0 | 0.8294 |
| 0.0205 | 252.0 | 1260 | 0.0946 | 0.3758 | 0.7516 | 0.7516 | nan | 0.7516 | 0.0 | 0.7516 |
| 0.0243 | 256.0 | 1280 | 0.1198 | 0.4322 | 0.8644 | 0.8644 | nan | 0.8644 | 0.0 | 0.8644 |
| 0.033 | 260.0 | 1300 | 0.1271 | 0.4357 | 0.8714 | 0.8714 | nan | 0.8714 | 0.0 | 0.8714 |
| 0.0219 | 264.0 | 1320 | 0.1041 | 0.4368 | 0.8736 | 0.8736 | nan | 0.8736 | 0.0 | 0.8736 |
| 0.0304 | 268.0 | 1340 | 0.1113 | 0.4122 | 0.8244 | 0.8244 | nan | 0.8244 | 0.0 | 0.8244 |
| 0.0241 | 272.0 | 1360 | 0.0802 | 0.4152 | 0.8303 | 0.8303 | nan | 0.8303 | 0.0 | 0.8303 |
| 0.0209 | 276.0 | 1380 | 0.1255 | 0.4476 | 0.8952 | 0.8952 | nan | 0.8952 | 0.0 | 0.8952 |
| 0.015 | 280.0 | 1400 | 0.1500 | 0.4440 | 0.8879 | 0.8879 | nan | 0.8879 | 0.0 | 0.8879 |
| 0.0209 | 284.0 | 1420 | 0.1275 | 0.4471 | 0.8941 | 0.8941 | nan | 0.8941 | 0.0 | 0.8941 |
| 0.0423 | 288.0 | 1440 | 0.1406 | 0.4135 | 0.8271 | 0.8271 | nan | 0.8271 | 0.0 | 0.8271 |
| 0.0179 | 292.0 | 1460 | 0.0999 | 0.4272 | 0.8544 | 0.8544 | nan | 0.8544 | 0.0 | 0.8544 |
| 0.028 | 296.0 | 1480 | 0.1374 | 0.4471 | 0.8942 | 0.8942 | nan | 0.8942 | 0.0 | 0.8942 |
| 0.0524 | 300.0 | 1500 | 0.1253 | 0.4339 | 0.8679 | 0.8679 | nan | 0.8679 | 0.0 | 0.8679 |
| 0.0182 | 304.0 | 1520 | 0.1077 | 0.4222 | 0.8444 | 0.8444 | nan | 0.8444 | 0.0 | 0.8444 |
| 0.0141 | 308.0 | 1540 | 0.1295 | 0.4478 | 0.8956 | 0.8956 | nan | 0.8956 | 0.0 | 0.8956 |
| 0.0255 | 312.0 | 1560 | 0.1309 | 0.4364 | 0.8728 | 0.8728 | nan | 0.8728 | 0.0 | 0.8728 |
| 0.0375 | 316.0 | 1580 | 0.0917 | 0.4369 | 0.8737 | 0.8737 | nan | 0.8737 | 0.0 | 0.8737 |
| 0.0312 | 320.0 | 1600 | 0.0967 | 0.3975 | 0.7951 | 0.7951 | nan | 0.7951 | 0.0 | 0.7951 |
| 0.0312 | 324.0 | 1620 | 0.1041 | 0.4184 | 0.8368 | 0.8368 | nan | 0.8368 | 0.0 | 0.8368 |
| 0.0294 | 328.0 | 1640 | 0.1041 | 0.4279 | 0.8558 | 0.8558 | nan | 0.8558 | 0.0 | 0.8558 |
| 0.0277 | 332.0 | 1660 | 0.1285 | 0.4322 | 0.8644 | 0.8644 | nan | 0.8644 | 0.0 | 0.8644 |
| 0.022 | 336.0 | 1680 | 0.0897 | 0.3872 | 0.7744 | 0.7744 | nan | 0.7744 | 0.0 | 0.7744 |
| 0.0185 | 340.0 | 1700 | 0.1148 | 0.4293 | 0.8586 | 0.8586 | nan | 0.8586 | 0.0 | 0.8586 |
| 0.0197 | 344.0 | 1720 | 0.1161 | 0.4448 | 0.8896 | 0.8896 | nan | 0.8896 | 0.0 | 0.8896 |
| 0.0243 | 348.0 | 1740 | 0.0981 | 0.4256 | 0.8511 | 0.8511 | nan | 0.8511 | 0.0 | 0.8511 |
| 0.0252 | 352.0 | 1760 | 0.0848 | 0.3893 | 0.7787 | 0.7787 | nan | 0.7787 | 0.0 | 0.7787 |
| 0.0273 | 356.0 | 1780 | 0.0852 | 0.4175 | 0.8350 | 0.8350 | nan | 0.8350 | 0.0 | 0.8350 |
| 0.0299 | 360.0 | 1800 | 0.0912 | 0.4016 | 0.8033 | 0.8033 | nan | 0.8033 | 0.0 | 0.8033 |
| 0.0223 | 364.0 | 1820 | 0.0877 | 0.4222 | 0.8444 | 0.8444 | nan | 0.8444 | 0.0 | 0.8444 |
| 0.0246 | 368.0 | 1840 | 0.1020 | 0.4455 | 0.8910 | 0.8910 | nan | 0.8910 | 0.0 | 0.8910 |
| 0.0354 | 372.0 | 1860 | 0.1018 | 0.4262 | 0.8524 | 0.8524 | nan | 0.8524 | 0.0 | 0.8524 |
| 0.0228 | 376.0 | 1880 | 0.1030 | 0.4422 | 0.8843 | 0.8843 | nan | 0.8843 | 0.0 | 0.8843 |
| 0.0208 | 380.0 | 1900 | 0.1135 | 0.4403 | 0.8805 | 0.8805 | nan | 0.8805 | 0.0 | 0.8805 |
| 0.0187 | 384.0 | 1920 | 0.1085 | 0.4384 | 0.8768 | 0.8768 | nan | 0.8768 | 0.0 | 0.8768 |
| 0.0211 | 388.0 | 1940 | 0.0929 | 0.4376 | 0.8752 | 0.8752 | nan | 0.8752 | 0.0 | 0.8752 |
| 0.0312 | 392.0 | 1960 | 0.0926 | 0.4450 | 0.8899 | 0.8899 | nan | 0.8899 | 0.0 | 0.8899 |
| 0.0211 | 396.0 | 1980 | 0.0946 | 0.4424 | 0.8847 | 0.8847 | nan | 0.8847 | 0.0 | 0.8847 |
| 0.0106 | 400.0 | 2000 | 0.1098 | 0.4487 | 0.8974 | 0.8974 | nan | 0.8974 | 0.0 | 0.8974 |
| 0.0244 | 404.0 | 2020 | 0.0960 | 0.4394 | 0.8788 | 0.8788 | nan | 0.8788 | 0.0 | 0.8788 |
| 0.0156 | 408.0 | 2040 | 0.0916 | 0.4257 | 0.8514 | 0.8514 | nan | 0.8514 | 0.0 | 0.8514 |
| 0.0186 | 412.0 | 2060 | 0.1090 | 0.4476 | 0.8953 | 0.8953 | nan | 0.8953 | 0.0 | 0.8953 |
| 0.0219 | 416.0 | 2080 | 0.1093 | 0.4297 | 0.8594 | 0.8594 | nan | 0.8594 | 0.0 | 0.8594 |
| 0.012 | 420.0 | 2100 | 0.0891 | 0.4347 | 0.8693 | 0.8693 | nan | 0.8693 | 0.0 | 0.8693 |
| 0.0367 | 424.0 | 2120 | 0.1247 | 0.4566 | 0.9132 | 0.9132 | nan | 0.9132 | 0.0 | 0.9132 |
| 0.0123 | 428.0 | 2140 | 0.0705 | 0.4162 | 0.8323 | 0.8323 | nan | 0.8323 | 0.0 | 0.8323 |
| 0.0246 | 432.0 | 2160 | 0.1167 | 0.4468 | 0.8936 | 0.8936 | nan | 0.8936 | 0.0 | 0.8936 |
| 0.0144 | 436.0 | 2180 | 0.1139 | 0.4523 | 0.9047 | 0.9047 | nan | 0.9047 | 0.0 | 0.9047 |
| 0.0278 | 440.0 | 2200 | 0.1015 | 0.4330 | 0.8660 | 0.8660 | nan | 0.8660 | 0.0 | 0.8660 |
| 0.0157 | 444.0 | 2220 | 0.0921 | 0.4196 | 0.8392 | 0.8392 | nan | 0.8392 | 0.0 | 0.8392 |
| 0.0207 | 448.0 | 2240 | 0.1109 | 0.4462 | 0.8924 | 0.8924 | nan | 0.8924 | 0.0 | 0.8924 |
| 0.0191 | 452.0 | 2260 | 0.1060 | 0.4421 | 0.8843 | 0.8843 | nan | 0.8843 | 0.0 | 0.8843 |
| 0.0214 | 456.0 | 2280 | 0.1137 | 0.4357 | 0.8714 | 0.8714 | nan | 0.8714 | 0.0 | 0.8714 |
| 0.0228 | 460.0 | 2300 | 0.0943 | 0.4340 | 0.8679 | 0.8679 | nan | 0.8679 | 0.0 | 0.8679 |
| 0.0213 | 464.0 | 2320 | 0.1003 | 0.4379 | 0.8759 | 0.8759 | nan | 0.8759 | 0.0 | 0.8759 |
| 0.0248 | 468.0 | 2340 | 0.1063 | 0.4525 | 0.9050 | 0.9050 | nan | 0.9050 | 0.0 | 0.9050 |
| 0.0186 | 472.0 | 2360 | 0.0915 | 0.4278 | 0.8556 | 0.8556 | nan | 0.8556 | 0.0 | 0.8556 |
| 0.0212 | 476.0 | 2380 | 0.0948 | 0.4375 | 0.8751 | 0.8751 | nan | 0.8751 | 0.0 | 0.8751 |
| 0.0201 | 480.0 | 2400 | 0.0961 | 0.4326 | 0.8652 | 0.8652 | nan | 0.8652 | 0.0 | 0.8652 |
| 0.0168 | 484.0 | 2420 | 0.1050 | 0.4372 | 0.8745 | 0.8745 | nan | 0.8745 | 0.0 | 0.8745 |
| 0.0242 | 488.0 | 2440 | 0.1120 | 0.4481 | 0.8963 | 0.8963 | nan | 0.8963 | 0.0 | 0.8963 |
| 0.017 | 492.0 | 2460 | 0.1035 | 0.4235 | 0.8470 | 0.8470 | nan | 0.8470 | 0.0 | 0.8470 |
| 0.0267 | 496.0 | 2480 | 0.1076 | 0.4465 | 0.8931 | 0.8931 | nan | 0.8931 | 0.0 | 0.8931 |
| 0.0148 | 500.0 | 2500 | 0.1127 | 0.4462 | 0.8924 | 0.8924 | nan | 0.8924 | 0.0 | 0.8924 |
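
The per-class columns in this table match the output format of the mean_iou metric from the evaluate library, the usual choice for SegFormer fine-tuning (an assumption; the evaluation code is not included in this card). Under that metric, a class that never appears in the reference masks gets nan accuracy while still contributing an IoU of 0.0 to the mean, which is consistent with the unlabeled-class columns above. A minimal sketch:

```python
# Sketch: computing mean IoU and per-class metrics with `evaluate`.
# The tiny dummy arrays stand in for real (H, W) label maps.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

predictions = [np.array([[1, 1], [0, 1]])]  # dummy predicted label map
references = [np.array([[1, 1], [1, 1]])]   # dummy ground-truth label map

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=2,        # unlabeled + lipid
    ignore_index=255,    # assumed ignore value
    reduce_labels=False,
)
# results includes mean_iou, mean_accuracy, overall_accuracy,
# per_category_iou, and per_category_accuracy -- the columns above.
print(results)
```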

Framework versions

  • Transformers 4.37.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2