Deepglobe_segformer_3
This model is a fine-tuned version of nvidia/mit-b0 for semantic segmentation. The training dataset is not specified in the card metadata (the model name suggests the DeepGlobe land-cover dataset). It achieves the following results on the evaluation set:
- Loss: 0.8406
- Mean Iou: 0.4620
- Mean Accuracy: 0.5756
- Overall Accuracy: 0.8155
- Per Category Iou: [0.6433628051500885, 0.8443710072700582, 0.05868084899551419, 0.6663364807256275, 0.5090198981201258, 0.5124226283624453, 0.0]
- Per Category Accuracy: [0.8883659557544211, 0.9159453906069126, 0.06770783080218444, 0.7777261715162402, 0.5203630889379521, 0.85919641595548, 0.0]
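The card does not include a usage example, so the following is a minimal inference sketch. It assumes the checkpoint is published on the Hugging Face Hub as prnv13/Deepglobe_segformer_3 and exposes the standard SegFormer semantic-segmentation interface; the input image path is a placeholder.

```python
# Minimal inference sketch (assumes the checkpoint is hosted as prnv13/Deepglobe_segformer_3
# and follows the standard SegFormer semantic-segmentation interface).
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "prnv13/Deepglobe_segformer_3"  # repo id taken from this card
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("satellite_tile.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```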
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
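The training script itself is not part of the card; the sketch below only shows how the hyperparameters listed above would map onto the transformers TrainingArguments API. The number of labels (7) is an assumption inferred from the length of the per-category metrics, and dataset loading is omitted because it is not documented here.

```python
# Sketch only: the listed hyperparameters expressed as TrainingArguments.
# Dataset loading, image processing, and the Trainer call are omitted (not documented in this card).
from transformers import SegformerForSemanticSegmentation, TrainingArguments

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=7,  # assumption: 7 classes, matching the length of the per-category metrics
)

training_args = TrainingArguments(
    output_dir="Deepglobe_segformer_3",
    learning_rate=6e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are already the Trainer defaults,
    # matching the optimizer settings listed above.
)
```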
Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|---|---|---|---|---|---|---|---|---|
| 1.4727 | 0.49 | 20 | 1.5749 | 0.2930 | 0.4344 | 0.7062 | [0.47121379062001395, 0.7670396913155827, 0.09819894521882991, 0.5227229243539695, 0.0031194487593101524, 0.18902045042309878, 0.0] | [0.9424700572966612, 0.8218753490741639, 0.17839715076899326, 0.8786984834251509, 0.003125783241778855, 0.21592899892395664, 0.0] |
| 1.1746 | 0.98 | 40 | 1.0074 | 0.3675 | 0.4978 | 0.7835 | [0.6335804347395366, 0.81926412147752, 0.042360636232568284, 0.5516872946875934, 0.06391535031461955, 0.46156523081466366, 0.0] | [0.8835907055203469, 0.8959329160865661, 0.04826409660176376, 0.9133718981931649, 0.0641168704492156, 0.6795223711292329, 0.0] |
| 0.9123 | 1.46 | 60 | 0.8961 | 0.4223 | 0.5476 | 0.8032 | [0.6331637378113184, 0.8287686244851504, 0.06308541030241316, 0.6147403096624029, 0.27778067188553224, 0.5387135123290364, 0.0] | [0.8993383411101392, 0.8976098372522392, 0.07531755948360704, 0.9001059260136011, 0.28091500200350256, 0.7797697919353069, 0.0] |
| 1.0879 | 1.95 | 80 | 0.8875 | 0.4412 | 0.5616 | 0.8054 | [0.6004683670971223, 0.8332655416128979, 0.04572839646818258, 0.6549740332584124, 0.44062035905035596, 0.5130410733489108, 0.0] | [0.9154682866219158, 0.9036612725247445, 0.052284546805349184, 0.8106196696766435, 0.44475361708037797, 0.8046768856202466, 0.0] |
| 1.2142 | 2.44 | 100 | 0.8355 | 0.4533 | 0.5722 | 0.8077 | [0.6298554391449935, 0.8369662549593841, 0.050673209170535985, 0.6649442110070505, 0.4958035295244105, 0.49513095742822216, 0.0] | [0.8994176115323517, 0.902362104312013, 0.05909947222168619, 0.790401335237105, 0.5042568442360589, 0.8498825640463182, 0.0] |
| 0.884 | 2.93 | 120 | 0.8406 | 0.4620 | 0.5756 | 0.8155 | [0.6433628051500885, 0.8443710072700582, 0.05868084899551419, 0.6663364807256275, 0.5090198981201258, 0.5124226283624453, 0.0] | [0.8883659557544211, 0.9159453906069126, 0.06770783080218444, 0.7777261715162402, 0.5203630889379521, 0.85919641595548, 0.0] |
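The Mean Iou and Mean Accuracy columns are the unweighted averages of the seven per-category values (the seventh category is never predicted correctly, which pulls both means down). A quick check against the final row:

```python
# Sanity check: Mean IoU / Mean Accuracy are unweighted averages over the 7 categories.
per_category_iou = [
    0.6433628051500885, 0.8443710072700582, 0.05868084899551419,
    0.6663364807256275, 0.5090198981201258, 0.5124226283624453, 0.0,
]
per_category_acc = [
    0.8883659557544211, 0.9159453906069126, 0.06770783080218444,
    0.7777261715162402, 0.5203630889379521, 0.85919641595548, 0.0,
]

print(sum(per_category_iou) / len(per_category_iou))  # ~0.4620 (reported Mean Iou)
print(sum(per_category_acc) / len(per_category_acc))  # ~0.5756 (reported Mean Accuracy)
```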
Framework versions
- Transformers 4.31.0
- Pytorch 2.0.0+cu117
- Datasets 2.10.1
- Tokenizers 0.13.3