---
license: other
base_model: nvidia/mit-b5
tags:
- generated_from_trainer
model-index:
- name: SegFormer_mit-b5_Clean-Set3-Grayscale
  results: []
---
|
|
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
|
|
|
# SegFormer_mit-b5_Clean-Set3-Grayscale
|
|
|
This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0156
- Mean Iou: 0.9776
- Mean Accuracy: 0.9882
- Overall Accuracy: 0.9952
- Accuracy Background: 0.9974
- Accuracy Melt: 0.9708
- Accuracy Substrate: 0.9963
- Iou Background: 0.9942
- Iou Melt: 0.9458
- Iou Substrate: 0.9927
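
As a quick usage sketch, the checkpoint can be loaded with the standard `transformers` SegFormer classes. The repository id, image path, and grayscale-to-RGB conversion below are illustrative assumptions, not details confirmed by the training setup:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Hypothetical repository id; point this at wherever the checkpoint is stored.
checkpoint = "SegFormer_mit-b5_Clean-Set3-Grayscale"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

# Grayscale frames are converted to 3 channels to match the MiT-B5 backbone input.
image = Image.open("frame.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, height/4, width/4)

# Upsample to the input resolution and take the per-pixel argmax to obtain the
# background / melt / substrate segmentation map.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
prediction = upsampled.argmax(dim=1)[0]
```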
|
|
|
## Model description
|
|
|
More information needed
|
|
|
## Intended uses & limitations
|
|
|
More information needed
|
|
|
## Training and evaluation data
|
|
|
More information needed
|
|
|
## Training procedure
|
|
|
### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 200
- num_epochs: 50
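
For reference, a minimal `transformers.TrainingArguments` sketch that mirrors the settings above; the output directory is a placeholder, and the Adam betas/epsilon are simply the values listed (which match the library defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer_mit-b5_clean-set3-grayscale",  # placeholder path
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=200,
    num_train_epochs=50,
    # Adam settings as listed above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```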
|
|
|
### Training results
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |
|
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| |
|
| 0.1206 | 1.8519 | 50 | 0.0898 | 0.8826 | 0.9277 | 0.9727 | 0.9809 | 0.8182 | 0.9840 | 0.9697 | 0.7209 | 0.9571 | |
|
| 0.0687 | 3.7037 | 100 | 0.0445 | 0.9291 | 0.9568 | 0.9845 | 0.9920 | 0.8888 | 0.9895 | 0.9833 | 0.8286 | 0.9754 | |
|
| 0.0457 | 5.5556 | 150 | 0.0413 | 0.9284 | 0.9428 | 0.9859 | 0.9938 | 0.8381 | 0.9966 | 0.9877 | 0.8204 | 0.9770 | |
|
| 0.0281 | 7.4074 | 200 | 0.0240 | 0.9592 | 0.9706 | 0.9914 | 0.9971 | 0.9198 | 0.9949 | 0.9900 | 0.9011 | 0.9865 | |
|
| 0.0234 | 9.2593 | 250 | 0.0179 | 0.9672 | 0.9810 | 0.9932 | 0.9960 | 0.9513 | 0.9957 | 0.9926 | 0.9195 | 0.9893 | |
|
| 0.0147 | 11.1111 | 300 | 0.0180 | 0.9672 | 0.9785 | 0.9932 | 0.9955 | 0.9429 | 0.9972 | 0.9925 | 0.9197 | 0.9893 | |
|
| 0.012 | 12.9630 | 350 | 0.0139 | 0.9748 | 0.9864 | 0.9946 | 0.9967 | 0.9664 | 0.9962 | 0.9936 | 0.9390 | 0.9918 | |
|
| 0.0104 | 14.8148 | 400 | 0.0138 | 0.9756 | 0.9890 | 0.9947 | 0.9972 | 0.9748 | 0.9949 | 0.9935 | 0.9413 | 0.9919 | |
|
| 0.0094 | 16.6667 | 450 | 0.0136 | 0.9767 | 0.9862 | 0.9950 | 0.9965 | 0.9646 | 0.9974 | 0.9940 | 0.9436 | 0.9924 | |
|
| 0.0101 | 18.5185 | 500 | 0.0135 | 0.9767 | 0.9867 | 0.9950 | 0.9974 | 0.9663 | 0.9964 | 0.9940 | 0.9438 | 0.9924 | |
|
| 0.0087 | 20.3704 | 550 | 0.0144 | 0.9764 | 0.9887 | 0.9949 | 0.9954 | 0.9736 | 0.9970 | 0.9935 | 0.9435 | 0.9923 | |
|
| 0.0078 | 22.2222 | 600 | 0.0145 | 0.9760 | 0.9885 | 0.9949 | 0.9967 | 0.9727 | 0.9960 | 0.9938 | 0.9417 | 0.9924 | |
|
| 0.0095 | 24.0741 | 650 | 0.0145 | 0.9753 | 0.9855 | 0.9948 | 0.9971 | 0.9626 | 0.9967 | 0.9939 | 0.9398 | 0.9921 | |
|
| 0.0073 | 25.9259 | 700 | 0.0145 | 0.9761 | 0.9892 | 0.9949 | 0.9965 | 0.9752 | 0.9960 | 0.9938 | 0.9419 | 0.9925 | |
|
| 0.009 | 27.7778 | 750 | 0.0143 | 0.9772 | 0.9891 | 0.9951 | 0.9958 | 0.9745 | 0.9970 | 0.9938 | 0.9451 | 0.9929 | |
|
| 0.0049 | 29.6296 | 800 | 0.0143 | 0.9782 | 0.9883 | 0.9953 | 0.9966 | 0.9713 | 0.9971 | 0.9942 | 0.9474 | 0.9929 | |
|
| 0.0075 | 31.4815 | 850 | 0.0153 | 0.9767 | 0.9886 | 0.9951 | 0.9967 | 0.9727 | 0.9963 | 0.9941 | 0.9434 | 0.9925 | |
|
| 0.008 | 33.3333 | 900 | 0.0155 | 0.9772 | 0.9876 | 0.9952 | 0.9970 | 0.9690 | 0.9968 | 0.9943 | 0.9447 | 0.9927 | |
|
| 0.0061 | 35.1852 | 950 | 0.0150 | 0.9777 | 0.9877 | 0.9953 | 0.9973 | 0.9691 | 0.9967 | 0.9943 | 0.9461 | 0.9928 | |
|
| 0.0053 | 37.0370 | 1000 | 0.0156 | 0.9776 | 0.9882 | 0.9952 | 0.9974 | 0.9708 | 0.9963 | 0.9942 | 0.9458 | 0.9927 | |
|
|
|
|
|
### Framework versions

- Transformers 4.41.2
- Pytorch 2.0.1+cu117
- Datasets 2.19.2
- Tokenizers 0.19.1
|