
segformer-b0-finetuned-oldapp-oct-1

This model is a fine-tuned version of nvidia/mit-b0 on the PushkarA07/oldapptiles5 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0968
  • Mean Iou: 0.9990
  • Mean Accuracy: 1.0
  • Overall Accuracy: 1.0
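
The snippet below is a minimal inference sketch, not part of the original card; it assumes the model follows the standard SegFormer semantic-segmentation API in transformers, and `tile.png` is a placeholder for your own input image:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "PushkarA07/segformer-b0-finetuned-oldapp-oct-1"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("tile.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample the logits
# back to the image size before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```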

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
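
For reference, a hedged sketch of how these settings map onto the Hugging Face `TrainingArguments`; `output_dir` is an assumption, and dataset loading plus the `Trainer` call are omitted:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above. Adam with betas=(0.9, 0.999)
# and epsilon=1e-08 matches the Trainer's default AdamW settings
# (adam_beta1, adam_beta2, adam_epsilon), so no override is needed.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-oldapp-oct-1",  # assumed name
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```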

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|
| 0.6574 | 0.7143 | 10 | 0.6497 | 0.9990 | 1.0 | 1.0 |
| 0.6005 | 1.4286 | 20 | 0.5761 | 0.9990 | 1.0 | 1.0 |
| 0.5676 | 2.1429 | 30 | 0.4740 | 0.9990 | 1.0 | 1.0 |
| 0.6287 | 2.8571 | 40 | 0.4394 | 0.9990 | 1.0 | 1.0 |
| 0.5121 | 3.5714 | 50 | 0.4173 | 0.9990 | 1.0 | 1.0 |
| 0.4744 | 4.2857 | 60 | 0.3842 | 0.9990 | 1.0 | 1.0 |
| 0.4413 | 5.0 | 70 | 0.4107 | 0.9990 | 1.0 | 1.0 |
| 0.4134 | 5.7143 | 80 | 0.3737 | 0.9990 | 1.0 | 1.0 |
| 0.4139 | 6.4286 | 90 | 0.3424 | 0.9990 | 1.0 | 1.0 |
| 0.4097 | 7.1429 | 100 | 0.3248 | 0.9990 | 1.0 | 1.0 |
| 0.3645 | 7.8571 | 110 | 0.3218 | 0.9990 | 1.0 | 1.0 |
| 0.3287 | 8.5714 | 120 | 0.2928 | 0.9990 | 1.0 | 1.0 |
| 0.3113 | 9.2857 | 130 | 0.3021 | 0.9990 | 1.0 | 1.0 |
| 0.3085 | 10.0 | 140 | 0.2962 | 0.9990 | 1.0 | 1.0 |
| 0.2879 | 10.7143 | 150 | 0.2596 | 0.9990 | 1.0 | 1.0 |
| 0.2958 | 11.4286 | 160 | 0.2409 | 0.9990 | 1.0 | 1.0 |
| 0.2788 | 12.1429 | 170 | 0.2396 | 0.9990 | 1.0 | 1.0 |
| 0.2565 | 12.8571 | 180 | 0.2076 | 0.9990 | 1.0 | 1.0 |
| 0.2457 | 13.5714 | 190 | 0.2184 | 0.9990 | 1.0 | 1.0 |
| 0.2328 | 14.2857 | 200 | 0.1962 | 0.9990 | 1.0 | 1.0 |
| 0.1916 | 15.0 | 210 | 0.2003 | 0.9990 | 1.0 | 1.0 |
| 0.3277 | 15.7143 | 220 | 0.1875 | 0.9990 | 1.0 | 1.0 |
| 0.2053 | 16.4286 | 230 | 0.1718 | 0.9990 | 1.0 | 1.0 |
| 0.2555 | 17.1429 | 240 | 0.1571 | 0.9990 | 1.0 | 1.0 |
| 0.1863 | 17.8571 | 250 | 0.1546 | 0.9990 | 1.0 | 1.0 |
| 0.1944 | 18.5714 | 260 | 0.1503 | 0.9990 | 1.0 | 1.0 |
| 0.2652 | 19.2857 | 270 | 0.1456 | 0.9990 | 1.0 | 1.0 |
| 0.1614 | 20.0 | 280 | 0.1442 | 0.9990 | 1.0 | 1.0 |
| 0.139 | 20.7143 | 290 | 0.1413 | 0.9990 | 1.0 | 1.0 |
| 0.1631 | 21.4286 | 300 | 0.1308 | 0.9990 | 1.0 | 1.0 |
| 0.1988 | 22.1429 | 310 | 0.1256 | 0.9990 | 1.0 | 1.0 |
| 0.1294 | 22.8571 | 320 | 0.1190 | 0.9990 | 1.0 | 1.0 |
| 0.1174 | 23.5714 | 330 | 0.1185 | 0.9990 | 1.0 | 1.0 |
| 0.1287 | 24.2857 | 340 | 0.1251 | 0.9990 | 1.0 | 1.0 |
| 0.1322 | 25.0 | 350 | 0.1308 | 0.9990 | 1.0 | 1.0 |
| 0.1667 | 25.7143 | 360 | 0.1215 | 0.9990 | 1.0 | 1.0 |
| 0.1095 | 26.4286 | 370 | 0.1226 | 0.9990 | 1.0 | 1.0 |
| 0.1992 | 27.1429 | 380 | 0.1331 | 0.9990 | 1.0 | 1.0 |
| 0.1987 | 27.8571 | 390 | 0.1174 | 0.9990 | 1.0 | 1.0 |
| 0.1587 | 28.5714 | 400 | 0.1162 | 0.9990 | 1.0 | 1.0 |
| 0.1043 | 29.2857 | 410 | 0.1161 | 0.9990 | 1.0 | 1.0 |
| 0.1073 | 30.0 | 420 | 0.1112 | 0.9990 | 1.0 | 1.0 |
| 0.14 | 30.7143 | 430 | 0.1279 | 0.9990 | 1.0 | 1.0 |
| 0.1183 | 31.4286 | 440 | 0.1361 | 0.9990 | 1.0 | 1.0 |
| 0.1096 | 32.1429 | 450 | 0.1430 | 0.9990 | 1.0 | 1.0 |
| 0.0957 | 32.8571 | 460 | 0.1397 | 0.9990 | 1.0 | 1.0 |
| 0.1605 | 33.5714 | 470 | 0.1436 | 0.9990 | 1.0 | 1.0 |
| 0.0837 | 34.2857 | 480 | 0.1163 | 0.9990 | 1.0 | 1.0 |
| 0.1032 | 35.0 | 490 | 0.1223 | 0.9990 | 1.0 | 1.0 |
| 0.1815 | 35.7143 | 500 | 0.0829 | 0.9990 | 1.0 | 1.0 |
| 0.0762 | 36.4286 | 510 | 0.1128 | 0.9990 | 1.0 | 1.0 |
| 0.0754 | 37.1429 | 520 | 0.1676 | 0.9990 | 1.0 | 1.0 |
| 0.0883 | 37.8571 | 530 | 0.1639 | 0.9990 | 1.0 | 1.0 |
| 0.0721 | 38.5714 | 540 | 0.1843 | 0.9990 | 1.0 | 1.0 |
| 0.0703 | 39.2857 | 550 | 0.1493 | 0.9990 | 1.0 | 1.0 |
| 0.0766 | 40.0 | 560 | 0.1616 | 0.9990 | 1.0 | 1.0 |
| 0.0636 | 40.7143 | 570 | 0.1292 | 0.9990 | 1.0 | 1.0 |
| 0.0771 | 41.4286 | 580 | 0.1087 | 0.9990 | 1.0 | 1.0 |
| 0.1019 | 42.1429 | 590 | 0.1540 | 0.9990 | 1.0 | 1.0 |
| 0.0734 | 42.8571 | 600 | 0.1639 | 0.9990 | 1.0 | 1.0 |
| 0.0504 | 43.5714 | 610 | 0.1544 | 0.9990 | 1.0 | 1.0 |
| 0.0606 | 44.2857 | 620 | 0.1403 | 0.9990 | 1.0 | 1.0 |
| 0.0925 | 45.0 | 630 | 0.1664 | 0.9990 | 1.0 | 1.0 |
| 0.0584 | 45.7143 | 640 | 0.1589 | 0.9990 | 1.0 | 1.0 |
| 0.0662 | 46.4286 | 650 | 0.1696 | 0.9990 | 1.0 | 1.0 |
| 0.0537 | 47.1429 | 660 | 0.1487 | 0.9990 | 1.0 | 1.0 |
| 0.0772 | 47.8571 | 670 | 0.1688 | 0.9990 | 1.0 | 1.0 |
| 0.0529 | 48.5714 | 680 | 0.1637 | 0.9990 | 1.0 | 1.0 |
| 0.0538 | 49.2857 | 690 | 0.1573 | 0.9990 | 1.0 | 1.0 |
| 0.045 | 50.0 | 700 | 0.1661 | 0.9990 | 1.0 | 1.0 |
| 0.0588 | 50.7143 | 710 | 0.1824 | 0.9990 | 1.0 | 1.0 |
| 0.0482 | 51.4286 | 720 | 0.1653 | 0.9990 | 1.0 | 1.0 |
| 0.0811 | 52.1429 | 730 | 0.1579 | 0.9990 | 1.0 | 1.0 |
| -0.0544 | 52.8571 | 740 | 0.1355 | 0.9990 | 1.0 | 1.0 |
| 0.0463 | 53.5714 | 750 | 0.1514 | 0.9990 | 1.0 | 1.0 |
| 0.0465 | 54.2857 | 760 | 0.1259 | 0.9990 | 1.0 | 1.0 |
| 0.0798 | 55.0 | 770 | 0.1504 | 0.9990 | 1.0 | 1.0 |
| -0.042 | 55.7143 | 780 | 0.1638 | 0.9990 | 1.0 | 1.0 |
| -0.2192 | 56.4286 | 790 | 0.1666 | 0.9990 | 1.0 | 1.0 |
| 0.0365 | 57.1429 | 800 | 0.1834 | 0.9990 | 1.0 | 1.0 |
| 0.0765 | 57.8571 | 810 | 0.1456 | 0.9990 | 1.0 | 1.0 |
| 0.0552 | 58.5714 | 820 | 0.1491 | 0.9990 | 1.0 | 1.0 |
| 0.0375 | 59.2857 | 830 | 0.1515 | 0.9990 | 1.0 | 1.0 |
| 0.0499 | 60.0 | 840 | 0.1082 | 0.9990 | 1.0 | 1.0 |
| 0.0787 | 60.7143 | 850 | 0.1422 | 0.9990 | 1.0 | 1.0 |
| 0.0562 | 61.4286 | 860 | 0.1337 | 0.9990 | 1.0 | 1.0 |
| 0.0623 | 62.1429 | 870 | 0.1399 | 0.9990 | 1.0 | 1.0 |
| 0.0966 | 62.8571 | 880 | 0.1412 | 0.9990 | 1.0 | 1.0 |
| 0.0811 | 63.5714 | 890 | 0.1311 | 0.9990 | 1.0 | 1.0 |
| 0.0496 | 64.2857 | 900 | 0.1591 | 0.9990 | 1.0 | 1.0 |
| 0.0447 | 65.0 | 910 | 0.1587 | 0.9990 | 1.0 | 1.0 |
| 0.0345 | 65.7143 | 920 | 0.1637 | 0.9990 | 1.0 | 1.0 |
| 0.0637 | 66.4286 | 930 | 0.1427 | 0.9990 | 1.0 | 1.0 |
| 0.0644 | 67.1429 | 940 | 0.1611 | 0.9990 | 1.0 | 1.0 |
| 0.0779 | 67.8571 | 950 | 0.1566 | 0.9990 | 1.0 | 1.0 |
| 0.0417 | 68.5714 | 960 | 0.1488 | 0.9990 | 1.0 | 1.0 |
| 0.0969 | 69.2857 | 970 | 0.1577 | 0.9990 | 1.0 | 1.0 |
| 0.0452 | 70.0 | 980 | 0.1166 | 0.9990 | 1.0 | 1.0 |
| 0.0373 | 70.7143 | 990 | 0.1429 | 0.9990 | 1.0 | 1.0 |
| 0.0438 | 71.4286 | 1000 | 0.1425 | 0.9990 | 1.0 | 1.0 |
| 0.0606 | 72.1429 | 1010 | 0.1238 | 0.9990 | 1.0 | 1.0 |
| 0.0389 | 72.8571 | 1020 | 0.1284 | 0.9990 | 1.0 | 1.0 |
| 0.0402 | 73.5714 | 1030 | 0.1350 | 0.9990 | 1.0 | 1.0 |
| 0.0349 | 74.2857 | 1040 | 0.1583 | 0.9990 | 1.0 | 1.0 |
| 0.031 | 75.0 | 1050 | 0.1563 | 0.9990 | 1.0 | 1.0 |
| 0.0501 | 75.7143 | 1060 | 0.1501 | 0.9990 | 1.0 | 1.0 |
| 0.0412 | 76.4286 | 1070 | 0.1417 | 0.9990 | 1.0 | 1.0 |
| 0.0532 | 77.1429 | 1080 | 0.1456 | 0.9990 | 1.0 | 1.0 |
| 0.0378 | 77.8571 | 1090 | 0.1059 | 0.9990 | 1.0 | 1.0 |
| -0.3927 | 78.5714 | 1100 | 0.1200 | 0.9990 | 1.0 | 1.0 |
| 0.0499 | 79.2857 | 1110 | 0.1396 | 0.9990 | 1.0 | 1.0 |
| 0.0501 | 80.0 | 1120 | 0.1277 | 0.9990 | 1.0 | 1.0 |
| 0.0408 | 80.7143 | 1130 | 0.1494 | 0.9990 | 1.0 | 1.0 |
| 0.0369 | 81.4286 | 1140 | 0.1394 | 0.9990 | 1.0 | 1.0 |
| 0.0014 | 82.1429 | 1150 | 0.1306 | 0.9990 | 1.0 | 1.0 |
| 0.0359 | 82.8571 | 1160 | 0.1557 | 0.9990 | 1.0 | 1.0 |
| -0.4227 | 83.5714 | 1170 | 0.1380 | 0.9990 | 1.0 | 1.0 |
| 0.0307 | 84.2857 | 1180 | 0.1351 | 0.9990 | 1.0 | 1.0 |
| 0.0433 | 85.0 | 1190 | 0.1379 | 0.9990 | 1.0 | 1.0 |
| 0.0407 | 85.7143 | 1200 | 0.1346 | 0.9990 | 1.0 | 1.0 |
| 0.0247 | 86.4286 | 1210 | 0.1572 | 0.9990 | 1.0 | 1.0 |
| 0.0498 | 87.1429 | 1220 | 0.1398 | 0.9990 | 1.0 | 1.0 |
| 0.0399 | 87.8571 | 1230 | 0.1261 | 0.9990 | 1.0 | 1.0 |
| 0.0354 | 88.5714 | 1240 | 0.0936 | 0.9990 | 1.0 | 1.0 |
| 0.0336 | 89.2857 | 1250 | 0.1343 | 0.9990 | 1.0 | 1.0 |
| 0.0291 | 90.0 | 1260 | 0.1410 | 0.9990 | 1.0 | 1.0 |
| 0.0379 | 90.7143 | 1270 | 0.1520 | 0.9990 | 1.0 | 1.0 |
| 0.0331 | 91.4286 | 1280 | 0.1490 | 0.9990 | 1.0 | 1.0 |
| 0.034 | 92.1429 | 1290 | 0.1449 | 0.9990 | 1.0 | 1.0 |
| 0.0315 | 92.8571 | 1300 | 0.1447 | 0.9990 | 1.0 | 1.0 |
| 0.0474 | 93.5714 | 1310 | 0.0865 | 0.9990 | 1.0 | 1.0 |
| 0.0295 | 94.2857 | 1320 | 0.1292 | 0.9990 | 1.0 | 1.0 |
| 0.0347 | 95.0 | 1330 | 0.1097 | 0.9990 | 1.0 | 1.0 |
| 0.0315 | 95.7143 | 1340 | 0.1264 | 0.9990 | 1.0 | 1.0 |
| 0.0427 | 96.4286 | 1350 | 0.1458 | 0.9990 | 1.0 | 1.0 |
| 0.0072 | 97.1429 | 1360 | 0.1381 | 0.9990 | 1.0 | 1.0 |
| 0.0481 | 97.8571 | 1370 | 0.1120 | 0.9990 | 1.0 | 1.0 |
| 0.0312 | 98.5714 | 1380 | 0.1331 | 0.9990 | 1.0 | 1.0 |
| 0.0617 | 99.2857 | 1390 | 0.1368 | 0.9990 | 1.0 | 1.0 |
| -0.4803 | 100.0 | 1400 | 0.0968 | 0.9990 | 1.0 | 1.0 |
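
The Mean Iou, Mean Accuracy, and Overall Accuracy columns match the keys returned by the `evaluate` library's `mean_iou` metric. The sketch below shows the computation on placeholder masks; `num_labels=2` and `ignore_index=255` are assumptions about the dataset, not values from the original card:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

pred_mask = np.zeros((512, 512), dtype=np.int64)   # placeholder prediction
label_mask = np.zeros((512, 512), dtype=np.int64)  # placeholder ground truth

results = metric.compute(
    predictions=[pred_mask],
    references=[label_mask],
    num_labels=2,      # assumed: binary segmentation
    ignore_index=255,  # assumed: pixels labeled 255 are excluded
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```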

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.5.0+cu121
  • Datasets 3.1.0
  • Tokenizers 0.19.1
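
To confirm that a local environment matches these versions before reproducing the results, a quick check might look like:

```python
import transformers, torch, datasets, tokenizers

# Expected versions per the card: 4.44.2, 2.5.0+cu121, 3.1.0, 0.19.1
for name, module in [("Transformers", transformers), ("PyTorch", torch),
                     ("Datasets", datasets), ("Tokenizers", tokenizers)]:
    print(f"{name}: {module.__version__}")
```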
