# segformer-b0-finetuned-segments-SixrayGun8-14-2024

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the saad7489/SIXray_Gun dataset. It achieves the following results on the evaluation set:
- Loss: 0.1155
- Mean Iou: 0.1748
- Mean Accuracy: 0.2486
- Overall Accuracy: 0.5704
- Accuracy No-label: nan
- Accuracy Object1: 0.6638
- Accuracy Object2: 0.5791
- Accuracy Object3: 0.0
- Accuracy Object4: 0.0
- Accuracy Object5: 0.0
- Accuracy Object6: nan
- Iou No-label: 0.0
- Iou Object1: 0.5674
- Iou Object2: 0.4817
- Iou Object3: 0.0
- Iou Object4: 0.0
- Iou Object5: 0.0
- Iou Object6: nan
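The headline means follow directly from the per-class scores above: classes reported as `nan` (not present in the evaluation split) are excluded before averaging, which is also why the all-zero classes drag the means down. A small sketch reproducing the reported values:

```python
import numpy as np

# Per-class scores from the list above; order: No-label, Object1..Object6.
iou = np.array([0.0, 0.5674, 0.4817, 0.0, 0.0, 0.0, np.nan])
acc = np.array([np.nan, 0.6638, 0.5791, 0.0, 0.0, 0.0, np.nan])

# nan-aware means: absent classes do not count toward the average.
mean_iou = np.nanmean(iou)  # ≈ 0.1748 (reported Mean Iou)
mean_acc = np.nanmean(acc)  # ≈ 0.2486 (reported Mean Accuracy)
```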
## Model description
More information needed
## Intended uses & limitations
More information needed
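Pending those details, here is a hedged inference sketch for this checkpoint. The Hub repo id is assumed from the title; the snippet builds a tiny randomly initialised SegFormer with the same 7-label head so it runs offline, and notes where to load the real weights instead.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# Randomly initialised stand-in so this sketch runs offline; to use the real
# weights, replace the next two lines with
# model = SegformerForSemanticSegmentation.from_pretrained(
#     "saad7489/segformer-b0-finetuned-segments-SixrayGun8-14-2024")
config = SegformerConfig(num_labels=7)  # No-label + Object1..Object6
model = SegformerForSemanticSegmentation(config).eval()

pixel_values = torch.randn(1, 3, 128, 128)  # one normalised X-ray image
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # (1, 7, 32, 32)

# SegFormer predicts at 1/4 of the input resolution; upsample, then argmax.
mask = torch.nn.functional.interpolate(
    logits, size=(128, 128), mode="bilinear", align_corners=False
).argmax(dim=1)  # (1, 128, 128) tensor of class ids
```

In practice the raw image should be resized and normalised with `SegformerImageProcessor` before being passed as `pixel_values`.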
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 20
- eval_batch_size: 20
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
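The optimizer and schedule above can be sketched outside the `Trainer` in plain PyTorch. This is an illustrative reconstruction, not the original training script: the stand-in model is arbitrary, the 780 total steps come from the results table below (20 epochs at ~39 steps per epoch), and no warmup is assumed since none is listed.

```python
import torch

model = torch.nn.Conv2d(3, 7, kernel_size=1)  # stand-in for the SegFormer
total_steps = 780  # 20 epochs at ~39 steps per epoch, per the log below

optimizer = torch.optim.Adam(
    model.parameters(), lr=6e-5, betas=(0.9, 0.999), eps=1e-8)
# "linear" schedule: decay the LR linearly from 6e-5 to 0 over training.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda step: max(0.0, 1.0 - step / total_steps))

for _ in range(total_steps):
    optimizer.step()   # forward/backward pass omitted in this sketch
    scheduler.step()
```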
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy No-label | Accuracy Object1 | Accuracy Object2 | Accuracy Object3 | Accuracy Object4 | Accuracy Object5 | Accuracy Object6 | Iou No-label | Iou Object1 | Iou Object2 | Iou Object3 | Iou Object4 | Iou Object5 | Iou Object6 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.5955 | 0.5128 | 20 | 1.7043 | 0.0922 | 0.1760 | 0.4766 | nan | 0.6799 | 0.1943 | 0.0058 | 0.0 | 0.0 | nan | 0.0 | 0.4999 | 0.1400 | 0.0058 | 0.0 | 0.0 | 0.0 |
1.2095 | 1.0256 | 40 | 1.2385 | 0.0865 | 0.1398 | 0.4206 | nan | 0.6565 | 0.0425 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4787 | 0.0403 | 0.0 | 0.0 | 0.0 | nan |
1.0262 | 1.5385 | 60 | 1.0043 | 0.0795 | 0.1256 | 0.3824 | nan | 0.6026 | 0.0256 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4521 | 0.0250 | 0.0 | 0.0 | 0.0 | nan |
0.935 | 2.0513 | 80 | 0.8920 | 0.0771 | 0.1201 | 0.3655 | nan | 0.5759 | 0.0245 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4386 | 0.0240 | 0.0 | 0.0 | 0.0 | nan |
0.8168 | 2.5641 | 100 | 0.8136 | 0.0691 | 0.1079 | 0.3359 | nan | 0.5386 | 0.0011 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4137 | 0.0011 | 0.0 | 0.0 | 0.0 | nan |
0.721 | 3.0769 | 120 | 0.7429 | 0.0758 | 0.1222 | 0.3803 | nan | 0.6100 | 0.0009 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4538 | 0.0009 | 0.0 | 0.0 | 0.0 | nan |
0.6536 | 3.5897 | 140 | 0.6914 | 0.0828 | 0.1378 | 0.4292 | nan | 0.6886 | 0.0003 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4965 | 0.0003 | 0.0 | 0.0 | 0.0 | nan |
0.6535 | 4.1026 | 160 | 0.5916 | 0.0691 | 0.1091 | 0.3398 | nan | 0.5454 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4147 | 0.0000 | 0.0 | 0.0 | 0.0 | nan |
0.5396 | 4.6154 | 180 | 0.5436 | 0.0780 | 0.1283 | 0.3996 | nan | 0.6409 | 0.0007 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4674 | 0.0007 | 0.0 | 0.0 | 0.0 | nan |
0.4738 | 5.1282 | 200 | 0.4589 | 0.0777 | 0.1275 | 0.3973 | nan | 0.6376 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4664 | 0.0000 | 0.0 | 0.0 | 0.0 | nan |
0.4548 | 5.6410 | 220 | 0.3964 | 0.0739 | 0.1182 | 0.3679 | nan | 0.5902 | 0.0007 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4426 | 0.0007 | 0.0 | 0.0 | 0.0 | nan |
0.3918 | 6.1538 | 240 | 0.4124 | 0.0895 | 0.1507 | 0.4629 | nan | 0.7348 | 0.0185 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5187 | 0.0183 | 0.0 | 0.0 | 0.0 | nan |
0.3569 | 6.6667 | 260 | 0.3519 | 0.0859 | 0.1465 | 0.4560 | nan | 0.7314 | 0.0010 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5144 | 0.0010 | 0.0 | 0.0 | 0.0 | nan |
0.3143 | 7.1795 | 280 | 0.3189 | 0.0860 | 0.1464 | 0.4562 | nan | 0.7322 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5158 | 0.0 | 0.0 | 0.0 | 0.0 | nan |
0.3131 | 7.6923 | 300 | 0.3064 | 0.0940 | 0.1601 | 0.4880 | nan | 0.7698 | 0.0306 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5339 | 0.0302 | 0.0 | 0.0 | 0.0 | nan |
0.2623 | 8.2051 | 320 | 0.2687 | 0.0827 | 0.1392 | 0.4337 | nan | 0.6960 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4961 | 0.0001 | 0.0 | 0.0 | 0.0 | nan |
0.2205 | 8.7179 | 340 | 0.2293 | 0.0838 | 0.1377 | 0.4232 | nan | 0.6722 | 0.0162 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4865 | 0.0161 | 0.0 | 0.0 | 0.0 | nan |
0.2316 | 9.2308 | 360 | 0.2277 | 0.0865 | 0.1445 | 0.4444 | nan | 0.7062 | 0.0160 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5032 | 0.0160 | 0.0 | 0.0 | 0.0 | nan |
0.2626 | 9.7436 | 380 | 0.2061 | 0.0936 | 0.1452 | 0.4223 | nan | 0.6405 | 0.0856 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4777 | 0.0836 | 0.0 | 0.0 | 0.0 | nan |
0.1835 | 10.2564 | 400 | 0.1938 | 0.1100 | 0.1683 | 0.4696 | nan | 0.6863 | 0.1551 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5108 | 0.1491 | 0.0 | 0.0 | 0.0 | nan |
0.2101 | 10.7692 | 420 | 0.1763 | 0.1127 | 0.1703 | 0.4652 | nan | 0.6659 | 0.1858 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5027 | 0.1732 | 0.0 | 0.0 | 0.0 | nan |
0.173 | 11.2821 | 440 | 0.1608 | 0.1284 | 0.1867 | 0.4838 | nan | 0.6555 | 0.2783 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5130 | 0.2576 | 0.0 | 0.0 | 0.0 | nan |
0.1614 | 11.7949 | 460 | 0.1650 | 0.1464 | 0.2140 | 0.5146 | nan | 0.6384 | 0.4316 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5170 | 0.3616 | 0.0 | 0.0 | 0.0 | nan |
0.1575 | 12.3077 | 480 | 0.1562 | 0.1489 | 0.2184 | 0.5282 | nan | 0.6601 | 0.4318 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5311 | 0.3622 | 0.0 | 0.0 | 0.0 | nan |
0.1681 | 12.8205 | 500 | 0.1562 | 0.1657 | 0.2429 | 0.5762 | nan | 0.7023 | 0.5121 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5699 | 0.4241 | 0.0 | 0.0 | 0.0 | nan |
0.1325 | 13.3333 | 520 | 0.1430 | 0.1585 | 0.2268 | 0.5429 | nan | 0.6693 | 0.4649 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5489 | 0.4024 | 0.0 | 0.0 | 0.0 | nan |
0.1414 | 13.8462 | 540 | 0.1283 | 0.1519 | 0.2156 | 0.5188 | nan | 0.6441 | 0.4336 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5311 | 0.3801 | 0.0 | 0.0 | 0.0 | nan |
0.1276 | 14.3590 | 560 | 0.1296 | 0.1399 | 0.1953 | 0.4706 | nan | 0.5855 | 0.3909 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.4895 | 0.3501 | 0.0 | 0.0 | 0.0 | nan |
0.1384 | 14.8718 | 580 | 0.1291 | 0.1526 | 0.2160 | 0.5212 | nan | 0.6493 | 0.4307 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5334 | 0.3819 | 0.0 | 0.0 | 0.0 | nan |
0.1546 | 15.3846 | 600 | 0.1294 | 0.1692 | 0.2438 | 0.5627 | nan | 0.6604 | 0.5587 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5584 | 0.4571 | 0.0 | 0.0 | 0.0 | nan |
0.1308 | 15.8974 | 620 | 0.1219 | 0.1625 | 0.2305 | 0.5490 | nan | 0.6725 | 0.4798 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5564 | 0.4189 | 0.0 | 0.0 | 0.0 | nan |
0.1365 | 16.4103 | 640 | 0.1241 | 0.1686 | 0.2469 | 0.5603 | nan | 0.6418 | 0.5925 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5478 | 0.4640 | 0.0 | 0.0 | 0.0 | nan |
0.1361 | 16.9231 | 660 | 0.1191 | 0.1590 | 0.2255 | 0.5385 | nan | 0.6620 | 0.4653 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5464 | 0.4074 | 0.0 | 0.0 | 0.0 | nan |
0.1125 | 17.4359 | 680 | 0.1168 | 0.1693 | 0.2397 | 0.5549 | nan | 0.6537 | 0.5451 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5570 | 0.4589 | 0.0 | 0.0 | 0.0 | nan |
0.1134 | 17.9487 | 700 | 0.1182 | 0.1732 | 0.2470 | 0.5725 | nan | 0.6758 | 0.5592 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5705 | 0.4689 | 0.0 | 0.0 | 0.0 | nan |
0.1351 | 18.4615 | 720 | 0.1168 | 0.1751 | 0.2496 | 0.5698 | nan | 0.6580 | 0.5901 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5648 | 0.4859 | 0.0 | 0.0 | 0.0 | nan |
0.1382 | 18.9744 | 740 | 0.1177 | 0.1771 | 0.2530 | 0.5784 | nan | 0.6694 | 0.5956 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5724 | 0.4904 | 0.0 | 0.0 | 0.0 | nan |
0.1113 | 19.4872 | 760 | 0.1164 | 0.1789 | 0.2562 | 0.5859 | nan | 0.6784 | 0.6026 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5791 | 0.4943 | 0.0 | 0.0 | 0.0 | nan |
0.098 | 20.0 | 780 | 0.1155 | 0.1748 | 0.2486 | 0.5704 | nan | 0.6638 | 0.5791 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.5674 | 0.4817 | 0.0 | 0.0 | 0.0 | nan |
### Framework versions
- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1