
segformer-b0-finetuned-ade-512-512-8

This model is a fine-tuned version of nvidia/segformer-b0-finetuned-ade-512-512 on the scene_parse_150 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6428
  • Mean Iou: 0.2997
  • Mean Accuracy: 0.4375
  • Overall Accuracy: 0.8102
  • Per Category Iou: [0.7371388878666323, 0.5490897129978163, 0.9700247026342557, 0.7397814820989553, 0.7565577307676246, 0.538992633884133, 0.9684399031641162, 0.7770701963943443, 0.6458365228574053, 0.9120140171425866, 0.0, 0.3611738148984199, 0.7147295688868965, 0.844665496554283, 0.0, 0.267621753080869, 0.0, 0.5733383645086518, 0.7270281349414842, 0.5270223598747757, 0.840511972388006, 0.0, 0.8477572044270246, 0.0, 0.0, 0.021707969762536722, nan, 0.0, 0.829668750449091, 0.0, 0.0, 0.7105652748372308, 0.7026081948329768, nan, 0.0, nan, 0.017881217625771658, nan, nan, 0.1726423621572763, 0.0, 0.0, 0.0, 0.6915747515561865, 0.7621085594989562, 0.0, nan, nan, nan, 0.48211358596085235, nan, nan, nan, 0.15345104333868378, nan, nan, nan, 0.26001203923373817, nan, nan, nan, nan, nan, 0.0, 0.5937421357268685, nan, 0.7411266113367473, 0.0, 0.3576292854934532, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.5491386838136064, nan, 0.5732569245463228, nan, 0.007035647279549718, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.182496270512183, nan, nan, nan, nan, nan, nan, nan, 0.5035749578561879, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.493494057342421, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.451919770773639, nan, 0.1466372657111356, nan, 0.0, nan, 0.07390486428379468, nan, nan, 0.0, nan, 0.0, nan]
  • Per Category Accuracy: [0.8629099223585593, 0.9373502782623141, 0.9885021323164402, 0.9691669255828691, 0.8784799487066202, 0.5555336682097245, 0.9908501068807045, 0.9659944403197679, 0.8390174775625886, 0.9905262508229098, 0.0, 0.5128205128205128, 0.9000798115668374, 0.9465840018314591, 0.0, 0.30400464306442254, nan, 0.985910270671116, 0.7516968427085012, 0.7248142189534973, 0.9076719987575711, 0.0, 0.9155381845899353, 0.0, nan, 0.021726205087084827, nan, 0.0, 0.8356432060792474, nan, nan, 0.7952856913784809, 0.7707129963898917, nan, 0.0, nan, 0.018030910131654265, nan, nan, 0.178538044494298, 0.0, nan, 0.0, 0.7040691533715048, 0.9282898919262556, nan, nan, nan, nan, 0.48211358596085235, nan, nan, nan, 0.15544715447154472, nan, nan, nan, 0.28087824656695864, nan, nan, nan, nan, nan, nan, 0.7199674499033669, nan, 0.8616300554300965, 0.0, 0.36956446654923497, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.6710156174216704, nan, 0.8681439481601481, nan, 0.00926497838171711, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.182496270512183, nan, nan, nan, nan, nan, nan, nan, 0.6666410157752982, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5142941003815673, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.5008256065032389, nan, 0.14712389380530974, nan, nan, nan, 0.07550796265788029, nan, nan, nan, nan, 0.0, nan]
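The per-category arrays are indexed by SceneParse150 label id; the nan entries presumably correspond to categories with no pixels in the evaluation split, which is how the evaluate library's mean_iou metric reports absent classes. Below is a minimal sketch of computing these metrics with that library; the toy label maps and the reduce_labels setting are assumptions, not taken from this card:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy stand-ins for predicted and ground-truth label maps: HxW integer
# arrays with values in [0, 149]; 255 marks ignored pixels.
rng = np.random.default_rng(0)
label_maps = [rng.integers(0, 150, size=(512, 512), dtype=np.int64)]
pred_maps = [rng.integers(0, 150, size=(512, 512), dtype=np.int64)]

results = metric.compute(
    predictions=pred_maps,
    references=label_maps,
    num_labels=150,
    ignore_index=255,
    reduce_labels=False,  # assumes the masks are already 0-based
)

# Categories with no ground-truth pixels come back as nan in
# per_category_iou, which is where the nan entries above come from.
print(results["mean_iou"], results["per_category_iou"][:5])
```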

Model description

More information needed

Intended uses & limitations

More information needed
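Pending a fuller description, here is a minimal semantic-segmentation inference sketch. The checkpoint id follows this card's repository name; the image path is a placeholder:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "Hemg/segformer-b0-finetuned-ade-512-512-8"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg")  # placeholder path, not from the card
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, height/4, width/4)

# Upsample the logits to the input resolution and take a per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_map = upsampled.argmax(dim=1)[0]  # HxW map of label ids
```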

Training and evaluation data

More information needed
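The card only names the dataset. A hedged sketch of loading it with the datasets library; the "scene_parsing" configuration and column names follow the public Hub dataset card and are assumptions relative to this model card:

```python
from datasets import load_dataset

# "scene_parsing" is the semantic-segmentation configuration of
# scene_parse_150; each example carries an `image`, a pixel-level
# `annotation` mask, and a `scene_category` label.
ds = load_dataset("scene_parse_150", "scene_parsing")

example = ds["train"][0]
print(example["image"].size, example["annotation"].size)
```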

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch in code follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 2
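A sketch of these hyperparameters expressed as transformers TrainingArguments; the output_dir and the evaluation/saving cadence are placeholders, not taken from the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-ade-512-512-8",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    # The Adam betas/epsilon below are the library defaults and match
    # the optimizer settings listed above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```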

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 0.5476 | 0.5 | 20 | 0.6762 | 0.2906 | 0.4305 | 0.8058 | [0.7322215001865174, 0.5396033497347898, 0.9698159584004744, 0.7059408169041077, 0.7420634747803818, 0.5771483833123271, 0.9687929277263261, 0.798442726875497, 0.6575429789438036, 0.8923039936382886, 0.0, 0.28801524073432627, 0.7283594706781259, 0.8224761368155906, 0.0, 0.240371308735549, 0.0, 0.4796904357162029, 0.7300895666908739, 0.5389056168074076, 0.8471427562336653, 0.0, 0.8295389954228618, 0.0, 0.0, 0.025697483663158584, nan, 0.0, 0.774069227167401, 0.0, 0.0, 0.5734580589230532, 0.6845251053891979, nan, 0.0, nan, 0.016910615319027995, 0.0, nan, 0.28083523074359973, 0.0, 0.0, 0.0, 0.664271610355419, 0.8665254237288136, 0.0, nan, nan, nan, 0.37003824767781957, nan, nan, nan, 0.0003252032520325203, nan, nan, 0.0, 0.21958700381611326, nan, nan, nan, nan, nan, 0.0, 0.5700031801558276, nan, 0.7282428702851886, 0.0, 0.3606846837580464, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.5200032558707419, nan, 0.5948942023708379, nan, 0.019722097714029583, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.6463394480251198, nan, nan, nan, nan, nan, nan, nan, 0.22524599381501265, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.6390017101325353, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.3408957329258258, nan, 0.15244909190974132, nan, 0.0, nan, 0.01070840197693575, nan, nan, nan, nan, 0.0, nan] | [0.860411265924896, 0.9156574466317801, 0.9871186651022158, 0.9745072645221228, 0.9180006357792876, 0.5975984109786927, 0.9890819569536549, 0.9622822314691714, 0.8206250155383735, 0.9926349572086899, 0.0, 0.39482431149097813, 0.8721522519190464, 0.9275312212374496, 0.0, 0.2621009866511898, nan, 0.9939747868001483, 0.7741781431138333, 0.7294568298475447, 0.9312781487808666, 0.0, 0.8994015666304471, 0.0, nan, 0.02571907029515565, nan, 0.0, 0.7776732404559435, nan, nan, 0.7307000195984108, 0.7803925992779783, nan, 0.0, nan, 0.017029192902117917, nan, nan, 0.3151367856920297, 0.0, nan, 0.0, 0.6732447606870866, 0.9360457724094088, nan, nan, nan, nan, 0.37003824767781957, nan, nan, nan, 0.0003252032520325203, nan, nan, nan, 0.23551237424932103, nan, nan, nan, nan, nan, nan, 0.7292747431593938, nan, 0.8125641552042702, 0.0, 0.36334749421476187, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.6197012319332622, nan, 0.8507289979171488, nan, 0.027177269919703522, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.6482678601027682, nan, nan, nan, nan, nan, nan, nan, 0.3082724124663332, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.7019078368065746, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.36834751682967104, nan, 0.1532079646017699, nan, nan, nan, 0.01070840197693575, nan, nan, nan, nan, 0.0, nan] |
| 0.2556 | 1.0 | 40 | 0.6649 | 0.2953 | 0.4284 | 0.8018 | [0.726149854026478, 0.545394522192899, 0.9690015408834769, 0.7052044560128634, 0.7523889451341894, 0.5130005345417701, 0.9709761653425191, 0.8111412048898323, 0.6429130715141094, 0.9262670555762191, 0.0, 0.37778926509864735, 0.7268044032319289, 0.8257242480529214, 0.0, 0.1989299429551471, 0.0, 0.5826857830797855, 0.7106774154456954, 0.5197232223222322, 0.8256289417146025, 0.0, 0.8564725571943037, 0.0, 0.0, 0.03595642901168911, nan, 0.0, 0.7863390403973748, 0.0, 0.0, 0.6995377262399123, 0.689874057452443, 0.0, 0.0, nan, 0.015104240533257695, nan, nan, 0.23531112653250058, 0.0, 0.0, 0.0, 0.6178685004484793, 0.6380706287683032, 0.0, nan, nan, nan, 0.4431266673094848, nan, nan, nan, 0.2509101941747573, nan, nan, nan, 0.299855516791768, nan, nan, nan, nan, nan, 0.0, 0.5089397668694562, nan, 0.6995700465782874, 0.0, 0.202697152245345, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.5431913116123642, nan, 0.5516033623910336, nan, 0.014944491887275833, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.16111387369467928, nan, nan, nan, nan, nan, nan, nan, 0.4687278477561481, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.5924450024564659, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.38066535296163206, nan, 0.05807522123893805, nan, 0.0, nan, 0.06764314247669774, nan, nan, 0.0, nan, 0.0, nan] | [0.8601635542956966, 0.9386294542736108, 0.9826396723678866, 0.9767990574424822, 0.9129413412787785, 0.5277586263501757, 0.9883128682860256, 0.9566275877548907, 0.832866767769684, 0.9734302995391705, 0.0, 0.5205365622032289, 0.8696865287159422, 0.9470885874872101, 0.0, 0.24773070226349392, nan, 0.9870226177233964, 0.7351731466060161, 0.7077759901938252, 0.9250660040378941, 0.0, 0.9295340652924864, 0.0, nan, 0.03595760580820065, nan, 0.0, 0.7905313310415536, nan, nan, 0.7953569583266521, 0.7699909747292418, nan, 0.0, nan, 0.015240412135088723, nan, nan, 0.2535676450426871, 0.0, nan, 0.0, 0.6318305631219079, 0.9418944691671964, nan, nan, nan, nan, 0.4431266673094848, nan, nan, nan, 0.2689430894308943, nan, nan, nan, 0.3254790957426462, nan, nan, nan, nan, nan, nan, 0.5573695453158377, nan, 0.8016834325600493, 0.0, 0.20453838980416536, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.6307110291977883, nan, 0.8200647998148577, nan, 0.021618282890673256, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.16111387369467928, nan, nan, nan, nan, nan, nan, nan, 0.6614851866102347, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.6371000880540064, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.417121808713324, nan, 0.05807522123893805, nan, nan, nan, 0.06974190005491489, nan, nan, nan, nan, 0.0, nan] |
| 0.7837 | 1.5 | 60 | 0.6371 | 0.2984 | 0.4406 | 0.8103 | [0.7346863295923115, 0.559280550405539, 0.9700337139810269, 0.7324351478447735, 0.7537282395579585, 0.5269837485596217, 0.9699500828263524, 0.7902034846570013, 0.6447285740559098, 0.9098007998332102, 0.0, 0.3467465753424658, 0.7176319963508568, 0.8377835858192753, 0.0, 0.23577251477803965, 0.0, 0.6944118031074553, 0.7155893183871043, 0.5237096935210143, 0.8337736522670475, 0.0, 0.8526187213251505, 0.0, 0.0, 0.05102346100077786, nan, 0.0, 0.8391301743245672, 0.0, 0.0, 0.6882384574018591, 0.7126956666736144, nan, 0.0, nan, 0.014692313152104479, nan, nan, 0.1747445168230443, 0.0, 0.0, 0.0, 0.7538995266781411, 0.6427956619039421, 0.0, nan, nan, nan, 0.477163886478321, nan, nan, 0.0, 0.30625383200490497, nan, nan, nan, 0.28023153832660935, nan, nan, nan, nan, nan, 0.0, 0.5608163265306122, nan, 0.7223006351446718, 0.0, 0.3352164775115595, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.5687336945615091, nan, 0.5533441700035682, nan, 0.0029225523623964927, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.2680258577821979, nan, nan, nan, nan, nan, nan, nan, 0.573604365293518, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.4579870203802801, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.39445086705202315, nan, 0.10822749861954721, nan, 0.0, nan, 0.13947633434038267, nan, nan, 0.0, nan, 0.0, nan] | [0.8621840195842968, 0.9361275853476202, 0.9852021187779232, 0.9688008539620332, 0.8885715979073163, 0.5442808365343577, 0.9911592503647516, 0.9608318801042872, 0.8366805061780573, 0.9875329163923634, 0.0, 0.4807692307692308, 0.898347316579393, 0.9554796600587749, 0.0, 0.28334300638421356, nan, 0.9860029662588061, 0.7377639284371138, 0.7234199034704666, 0.9067401770461252, 0.0, 0.9282690841861144, 0.0, nan, 0.05116540389577094, nan, 0.0, 0.8447982630721911, nan, nan, 0.798082919094197, 0.7715027075812274, nan, 0.0, nan, 0.014811104751001718, nan, nan, 0.18221474418894498, 0.0, nan, 0.0, 0.7791706042581578, 0.9495232040686586, nan, nan, nan, nan, 0.477163886478321, nan, nan, nan, 0.3248780487804878, nan, nan, nan, 0.30555024289484756, nan, nan, nan, nan, nan, nan, 0.6638693927372596, nan, 0.8404845001026483, 0.0, 0.3442993817566401, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.6872635561160151, nan, 0.8075098356861837, nan, 0.0037059913526868438, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.2680258577821979, nan, nan, nan, nan, nan, nan, nan, 0.7361292804924972, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.47226298796595245, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.433379906007875, nan, 0.1084070796460177, nan, nan, nan, 0.15211422295442065, nan, nan, nan, nan, 0.0, nan] |
| 0.8532 | 2.0 | 80 | 0.6428 | 0.2997 | 0.4375 | 0.8102 | [0.7371388878666323, 0.5490897129978163, 0.9700247026342557, 0.7397814820989553, 0.7565577307676246, 0.538992633884133, 0.9684399031641162, 0.7770701963943443, 0.6458365228574053, 0.9120140171425866, 0.0, 0.3611738148984199, 0.7147295688868965, 0.844665496554283, 0.0, 0.267621753080869, 0.0, 0.5733383645086518, 0.7270281349414842, 0.5270223598747757, 0.840511972388006, 0.0, 0.8477572044270246, 0.0, 0.0, 0.021707969762536722, nan, 0.0, 0.829668750449091, 0.0, 0.0, 0.7105652748372308, 0.7026081948329768, nan, 0.0, nan, 0.017881217625771658, nan, nan, 0.1726423621572763, 0.0, 0.0, 0.0, 0.6915747515561865, 0.7621085594989562, 0.0, nan, nan, nan, 0.48211358596085235, nan, nan, nan, 0.15345104333868378, nan, nan, nan, 0.26001203923373817, nan, nan, nan, nan, nan, 0.0, 0.5937421357268685, nan, 0.7411266113367473, 0.0, 0.3576292854934532, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.5491386838136064, nan, 0.5732569245463228, nan, 0.007035647279549718, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.182496270512183, nan, nan, nan, nan, nan, nan, nan, 0.5035749578561879, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.493494057342421, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.451919770773639, nan, 0.1466372657111356, nan, 0.0, nan, 0.07390486428379468, nan, nan, 0.0, nan, 0.0, nan] | [0.8629099223585593, 0.9373502782623141, 0.9885021323164402, 0.9691669255828691, 0.8784799487066202, 0.5555336682097245, 0.9908501068807045, 0.9659944403197679, 0.8390174775625886, 0.9905262508229098, 0.0, 0.5128205128205128, 0.9000798115668374, 0.9465840018314591, 0.0, 0.30400464306442254, nan, 0.985910270671116, 0.7516968427085012, 0.7248142189534973, 0.9076719987575711, 0.0, 0.9155381845899353, 0.0, nan, 0.021726205087084827, nan, 0.0, 0.8356432060792474, nan, nan, 0.7952856913784809, 0.7707129963898917, nan, 0.0, nan, 0.018030910131654265, nan, nan, 0.178538044494298, 0.0, nan, 0.0, 0.7040691533715048, 0.9282898919262556, nan, nan, nan, nan, 0.48211358596085235, nan, nan, nan, 0.15544715447154472, nan, nan, nan, 0.28087824656695864, nan, nan, nan, nan, nan, nan, 0.7199674499033669, nan, 0.8616300554300965, 0.0, 0.36956446654923497, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.6710156174216704, nan, 0.8681439481601481, nan, 0.00926497838171711, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.182496270512183, nan, nan, nan, nan, nan, nan, nan, 0.6666410157752982, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5142941003815673, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.5008256065032389, nan, 0.14712389380530974, nan, nan, nan, 0.07550796265788029, nan, nan, nan, nan, 0.0, nan] |
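The per-category arrays above are indexed by label id, which is hard to read on its own. A small sketch for attaching class names from the checkpoint's config, assuming the fine-tuned model kept the base checkpoint's 150-entry id2label mapping:

```python
from transformers import SegformerForSemanticSegmentation

model = SegformerForSemanticSegmentation.from_pretrained(
    "Hemg/segformer-b0-finetuned-ade-512-512-8"
)

# First few entries of the final row's Per Category Iou, copied from the
# table above; the full array has 150 entries.
per_category_iou = [0.7371388878666323, 0.5490897129978163, 0.9700247026342557]

for idx, iou in enumerate(per_category_iou):
    print(f"{idx:3d}  {model.config.id2label[idx]:<20s}  IoU = {iou:.4f}")
```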

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.1.2+cpu
  • Datasets 2.18.0
  • Tokenizers 0.15.2
