---
license: other
base_model: nvidia/mit-b5
tags:
- generated_from_trainer
model-index:
- name: SegFormer_mit-b5_Clean-Set3-Grayscale_Augmented_Medium
  results: []
pipeline_tag: image-segmentation
---


# SegFormer_mit-b5_Clean-Set3-Grayscale_Augmented_Medium

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on a dataset that is not publicly documented (see *Training and evaluation data* below).
It achieves the following results on the evaluation set:
- Training Loss: 0.0088
- Validation Loss: 0.0134
- Mean Iou: 0.9793
- Mean Accuracy: 0.9903
- Overall Accuracy: 0.9947
- Accuracy Background: 0.9971
- Accuracy Melt: 0.9785
- Accuracy Substrate: 0.9952
- Iou Background: 0.9935
- Iou Melt: 0.9524
- Iou Substrate: 0.9920

## Model description

SegFormer semantic-segmentation model with an MiT-B5 encoder, fine-tuned from [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) to label each pixel as one of three classes: background, melt, or substrate. As the model name indicates, training used grayscale imagery with a medium level of data augmentation.

## Intended uses & limitations

The model is intended for segmenting background, melt, and substrate regions in images similar to its (grayscale, augmented) training data. Because that data is not published, accuracy on other imaging setups, resolutions, or materials has not been verified and should be checked before relying on the predictions. A minimal inference sketch follows.
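
The snippet below is a minimal inference sketch using the standard `transformers` SegFormer classes. The checkpoint path and the image filename are placeholders, not values taken from this repository.

```python
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

# Placeholder path: point this at the directory or Hub id of this checkpoint.
checkpoint = "SegFormer_mit-b5_Clean-Set3-Grayscale_Augmented_Medium"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint).eval()

# Grayscale inputs are converted to RGB so they match the 3-channel
# format the MiT-B5 backbone expects.
image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```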

## Training and evaluation data

The training and evaluation data are not included in this repository. The model name suggests a grayscale version of an internal "Clean-Set3" dataset with medium augmentation, annotated with the three classes listed above. Exact split sizes are not documented; the step counts in the results table below (about 126 optimizer steps per epoch at batch size 8) imply on the order of 1,000 training samples, assuming a single device.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- num_epochs: 25
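
The original training script is not part of this repository. The following is a minimal sketch of how the hyperparameters above map onto `transformers.TrainingArguments`; the class order (0=background, 1=melt, 2=substrate) and the 50-step evaluation interval are inferred from the metric names and the results table, and the datasets and metric function must be supplied by the caller (e.g. `build_trainer(train_ds, eval_ds, compute_metrics).train()`).

```python
from transformers import (
    SegformerForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)


def build_trainer(train_dataset, eval_dataset, compute_metrics):
    """Wire up the hyperparameters listed above; datasets come from the caller."""
    # num_labels=3 matches the background / melt / substrate classes reported above.
    # The segmentation head is freshly initialised on top of the pretrained MiT-B5 encoder.
    model = SegformerForSemanticSegmentation.from_pretrained(
        "nvidia/mit-b5",
        num_labels=3,
        id2label={0: "background", 1: "melt", 2: "substrate"},  # assumed class order
        label2id={"background": 0, "melt": 1, "substrate": 2},
    )

    args = TrainingArguments(
        output_dir="SegFormer_mit-b5_finetune",
        learning_rate=2e-4,
        per_device_train_batch_size=8,
        per_device_eval_batch_size=8,
        seed=42,
        lr_scheduler_type="cosine",
        warmup_steps=100,
        num_train_epochs=25,
        evaluation_strategy="steps",  # the results table logs every 50 steps
        eval_steps=50,
        logging_steps=50,
    )

    return Trainer(
        model=model,
        args=args,
        train_dataset=train_dataset,
        eval_dataset=eval_dataset,
        compute_metrics=compute_metrics,  # e.g. the mean-IoU sketch further below
    )
```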

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:|
| 0.1305        | 0.3968  | 50   | 0.1020          | 0.8694   | 0.9199        | 0.9644           | 0.9855              | 0.8016        | 0.9726             | 0.9651         | 0.6989   | 0.9443        |
| 0.0906        | 0.7937  | 100  | 0.0668          | 0.8972   | 0.9187        | 0.9757           | 0.9891              | 0.7703        | 0.9968             | 0.9818         | 0.7488   | 0.9609        |
| 0.0409        | 1.1905  | 150  | 0.0606          | 0.9231   | 0.9414        | 0.9814           | 0.9879              | 0.8379        | 0.9984             | 0.9840         | 0.8152   | 0.9702        |
| 0.0678        | 1.5873  | 200  | 0.0344          | 0.9524   | 0.9762        | 0.9879           | 0.9883              | 0.9463        | 0.9941             | 0.9848         | 0.8890   | 0.9834        |
| 0.0312        | 1.9841  | 250  | 0.0340          | 0.9489   | 0.9756        | 0.9874           | 0.9935              | 0.9442        | 0.9892             | 0.9869         | 0.8779   | 0.9818        |
| 0.0334        | 2.3810  | 300  | 0.0277          | 0.9576   | 0.9826        | 0.9895           | 0.9956              | 0.9637        | 0.9885             | 0.9908         | 0.8987   | 0.9833        |
| 0.0286        | 2.7778  | 350  | 0.0264          | 0.9581   | 0.9776        | 0.9898           | 0.9964              | 0.9452        | 0.9912             | 0.9896         | 0.9002   | 0.9846        |
| 0.0214        | 3.1746  | 400  | 0.0230          | 0.9661   | 0.9824        | 0.9915           | 0.9926              | 0.9587        | 0.9958             | 0.9903         | 0.9206   | 0.9875        |
| 0.0208        | 3.5714  | 450  | 0.0203          | 0.9692   | 0.9876        | 0.9922           | 0.9968              | 0.9751        | 0.9910             | 0.9916         | 0.9283   | 0.9878        |
| 0.0146        | 3.9683  | 500  | 0.0231          | 0.9667   | 0.9852        | 0.9915           | 0.9961              | 0.9680        | 0.9913             | 0.9904         | 0.9229   | 0.9870        |
| 0.0197        | 4.3651  | 550  | 0.0208          | 0.9662   | 0.9883        | 0.9916           | 0.9950              | 0.9790        | 0.9908             | 0.9914         | 0.9200   | 0.9873        |
| 0.0198        | 4.7619  | 600  | 0.0184          | 0.9722   | 0.9836        | 0.9930           | 0.9969              | 0.9587        | 0.9951             | 0.9916         | 0.9355   | 0.9896        |
| 0.019         | 5.1587  | 650  | 0.0211          | 0.9693   | 0.9889        | 0.9919           | 0.9970              | 0.9801        | 0.9896             | 0.9907         | 0.9298   | 0.9872        |
| 0.0115        | 5.5556  | 700  | 0.0193          | 0.9706   | 0.9833        | 0.9928           | 0.9963              | 0.9584        | 0.9953             | 0.9926         | 0.9304   | 0.9888        |
| 0.0135        | 5.9524  | 750  | 0.0166          | 0.9740   | 0.9867        | 0.9933           | 0.9965              | 0.9692        | 0.9945             | 0.9919         | 0.9401   | 0.9899        |
| 0.0127        | 6.3492  | 800  | 0.0182          | 0.9736   | 0.9866        | 0.9932           | 0.9969              | 0.9689        | 0.9939             | 0.9918         | 0.9395   | 0.9895        |
| 0.0129        | 6.7460  | 850  | 0.0194          | 0.9723   | 0.9853        | 0.9930           | 0.9958              | 0.9651        | 0.9951             | 0.9920         | 0.9354   | 0.9894        |
| 0.0124        | 7.1429  | 900  | 0.0145          | 0.9771   | 0.9900        | 0.9941           | 0.9972              | 0.9789        | 0.9940             | 0.9928         | 0.9472   | 0.9911        |
| 0.011         | 7.5397  | 950  | 0.0149          | 0.9774   | 0.9876        | 0.9941           | 0.9972              | 0.9704        | 0.9953             | 0.9923         | 0.9485   | 0.9914        |
| 0.0176        | 7.9365  | 1000 | 0.0212          | 0.9681   | 0.9890        | 0.9919           | 0.9972              | 0.9802        | 0.9895             | 0.9923         | 0.9251   | 0.9869        |
| 0.0205        | 8.3333  | 1050 | 0.0171          | 0.9724   | 0.9895        | 0.9930           | 0.9971              | 0.9797        | 0.9918             | 0.9924         | 0.9356   | 0.9893        |
| 0.0103        | 8.7302  | 1100 | 0.0141          | 0.9780   | 0.9891        | 0.9943           | 0.9968              | 0.9754        | 0.9953             | 0.9928         | 0.9497   | 0.9915        |
| 0.0093        | 9.1270  | 1150 | 0.0148          | 0.9769   | 0.9881        | 0.9941           | 0.9965              | 0.9723        | 0.9956             | 0.9930         | 0.9466   | 0.9911        |
| 0.0113        | 9.5238  | 1200 | 0.0136          | 0.9788   | 0.9881        | 0.9945           | 0.9977              | 0.9711        | 0.9955             | 0.9929         | 0.9517   | 0.9918        |
| 0.0132        | 9.9206  | 1250 | 0.0144          | 0.9783   | 0.9882        | 0.9944           | 0.9971              | 0.9720        | 0.9957             | 0.9930         | 0.9503   | 0.9915        |
| 0.0104        | 10.3175 | 1300 | 0.0135          | 0.9788   | 0.9882        | 0.9945           | 0.9976              | 0.9714        | 0.9957             | 0.9932         | 0.9515   | 0.9918        |
| 0.0153        | 10.7143 | 1350 | 0.0129          | 0.9796   | 0.9889        | 0.9947           | 0.9970              | 0.9734        | 0.9962             | 0.9932         | 0.9534   | 0.9922        |
| 0.0091        | 11.1111 | 1400 | 0.0142          | 0.9783   | 0.9900        | 0.9944           | 0.9968              | 0.9784        | 0.9950             | 0.9931         | 0.9500   | 0.9917        |
| 0.0098        | 11.5079 | 1450 | 0.0139          | 0.9789   | 0.9889        | 0.9946           | 0.9967              | 0.9740        | 0.9962             | 0.9933         | 0.9516   | 0.9920        |
| 0.0094        | 11.9048 | 1500 | 0.0136          | 0.9795   | 0.9887        | 0.9947           | 0.9977              | 0.9730        | 0.9956             | 0.9931         | 0.9533   | 0.9920        |
| 0.0088        | 12.3016 | 1550 | 0.0134          | 0.9793   | 0.9903        | 0.9947           | 0.9971              | 0.9785        | 0.9952             | 0.9935         | 0.9524   | 0.9920        |
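
The IoU and accuracy columns follow the output format of the `mean_iou` metric in the Hugging Face `evaluate` library. Below is a sketch of a `compute_metrics` callback that produces those columns; the class order and the `ignore_index=255` convention are assumptions, not settings documented in this repository.

```python
import evaluate
import numpy as np
import torch

metric = evaluate.load("mean_iou")
id2label = {0: "background", 1: "melt", 2: "substrate"}  # assumed class order


def compute_metrics(eval_pred):
    """Turn Trainer predictions into the columns reported in the table above."""
    logits, labels = eval_pred
    # SegFormer outputs logits at 1/4 of the input resolution; upsample first.
    logits = torch.from_numpy(logits)
    upsampled = torch.nn.functional.interpolate(
        logits, size=labels.shape[-2:], mode="bilinear", align_corners=False
    )
    preds = upsampled.argmax(dim=1).numpy().astype(np.int64)

    results = metric.compute(
        predictions=preds,
        references=labels,
        num_labels=len(id2label),
        ignore_index=255,  # common "void" label convention (an assumption here)
        reduce_labels=False,
    )
    # Expand the per-category arrays into the named scalar columns used above.
    per_acc = results.pop("per_category_accuracy")
    per_iou = results.pop("per_category_iou")
    for idx, name in id2label.items():
        results[f"accuracy_{name}"] = per_acc[idx]
        results[f"iou_{name}"] = per_iou[idx]
    return results
```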


### Framework versions

- Transformers 4.41.2
- Pytorch 2.0.1+cu117
- Datasets 2.19.2
- Tokenizers 0.19.1