---
license: other
base_model: nvidia/mit-b5
tags:
- generated_from_trainer
model-index:
- name: ecc_segformerv2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ecc_segformerv2

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3478
- Mean Iou: 0.0862
- Mean Accuracy: 0.1924
- Overall Accuracy: 0.1924
- Accuracy Background: nan
- Accuracy Crack: 0.1924
- Iou Background: 0.0
- Iou Crack: 0.1723
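The per-class Accuracy and IoU values above follow the standard semantic-segmentation definitions; the sketch below is a minimal, dependency-free illustration (assuming the conventions of the `evaluate` library's `mean_iou` metric, where a class with no labeled pixels scores `nan` — consistent with `Accuracy Background: nan` above).

```python
def class_iou_and_accuracy(pred, label, cls):
    """Per-class IoU and accuracy for flat lists of class ids."""
    tp = sum(1 for p, l in zip(pred, label) if p == cls and l == cls)
    fp = sum(1 for p, l in zip(pred, label) if p == cls and l != cls)
    fn = sum(1 for p, l in zip(pred, label) if p != cls and l == cls)
    # IoU = intersection / union; accuracy = recall over labeled pixels.
    iou = tp / (tp + fp + fn) if tp + fp + fn else float("nan")
    acc = tp / (tp + fn) if tp + fn else float("nan")  # nan when the class never appears in the label
    return iou, acc

pred  = [1, 1, 0, 0, 1]
label = [1, 0, 0, 1, 1]
print(class_iou_and_accuracy(pred, label, 1))  # IoU 0.5, accuracy 2/3
```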
## Model description

More information needed

## Intended uses & limitations

More information needed
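Usage is not documented yet; as a hedged sketch, the checkpoint should load like any SegFormer segmentation model via `SegformerForSemanticSegmentation.from_pretrained(...)` with the repo id or local path of these weights. To stay runnable without downloading anything, the snippet below instead builds a small random-weight model with the same two-label head (background, crack) — the shapes and post-processing are what matter.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# In practice, load the fine-tuned weights instead (path/repo id is a placeholder):
# model = SegformerForSemanticSegmentation.from_pretrained("<repo-or-path>")
config = SegformerConfig(
    num_labels=2,  # matches this card: background, crack
    id2label={0: "background", 1: "crack"},
    label2id={"background": 0, "crack": 1},
)
model = SegformerForSemanticSegmentation(config).eval()

pixel_values = torch.rand(1, 3, 512, 512)  # a normalized RGB image batch
with torch.no_grad():
    # SegFormer logits come out at 1/4 of the input resolution.
    logits = model(pixel_values=pixel_values).logits  # (1, 2, 128, 128)

# Upsample to the input size, then take the per-pixel argmax as the mask.
mask = torch.nn.functional.interpolate(
    logits, size=(512, 512), mode="bilinear", align_corners=False
).argmax(dim=1)
print(mask.shape)  # torch.Size([1, 512, 512])
```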
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- training_steps: 10000
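To make the schedule concrete, here is a minimal sketch of the polynomial learning-rate decay named above, assuming the `transformers` defaults of no warmup, `power=1.0`, and `lr_end=1e-7` (none of which are stated in this card):

```python
def polynomial_lr(step, num_training_steps=10_000, lr_init=6e-5,
                  lr_end=1e-7, power=1.0):
    """Learning rate after `step` optimizer steps (no warmup assumed)."""
    if step >= num_training_steps:
        return lr_end
    remaining = 1 - step / num_training_steps
    # Linear interpolation from lr_init down to lr_end when power == 1.0.
    return (lr_init - lr_end) * remaining ** power + lr_end

print(polynomial_lr(0))       # ≈ 6e-05 at the start
print(polynomial_lr(10_000))  # 1e-07 at the end
```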
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Crack | Iou Background | Iou Crack |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:--------------:|:--------------:|:---------:|
| 0.1019 | 1.0 | 251 | 0.5116 | 0.1490 | 0.3280 | 0.3280 | nan | 0.3280 | 0.0 | 0.2979 |
| 0.0938 | 2.0 | 502 | 0.4725 | 0.1144 | 0.2400 | 0.2400 | nan | 0.2400 | 0.0 | 0.2287 |
| 0.098 | 3.0 | 753 | 0.5117 | 0.1276 | 0.2748 | 0.2748 | nan | 0.2748 | 0.0 | 0.2552 |
| 0.1018 | 4.0 | 1004 | 0.3870 | 0.1053 | 0.2254 | 0.2254 | nan | 0.2254 | 0.0 | 0.2106 |
| 0.0928 | 5.0 | 1255 | 0.2907 | 0.0772 | 0.1630 | 0.1630 | nan | 0.1630 | 0.0 | 0.1544 |
| 0.0936 | 6.0 | 1506 | 0.5220 | 0.1193 | 0.2544 | 0.2544 | nan | 0.2544 | 0.0 | 0.2385 |
| 0.077 | 7.0 | 1757 | 0.1608 | 0.0617 | 0.1308 | 0.1308 | nan | 0.1308 | 0.0 | 0.1235 |
| 0.0963 | 8.0 | 2008 | 0.1756 | 0.0456 | 0.0923 | 0.0923 | nan | 0.0923 | 0.0 | 0.0912 |
| 0.0958 | 9.0 | 2259 | 0.2027 | 0.0862 | 0.1813 | 0.1813 | nan | 0.1813 | 0.0 | 0.1725 |
| 0.0755 | 10.0 | 2510 | 0.2327 | 0.0888 | 0.1832 | 0.1832 | nan | 0.1832 | 0.0 | 0.1776 |
| 0.0632 | 11.0 | 2761 | 0.2169 | 0.0846 | 0.1863 | 0.1863 | nan | 0.1863 | 0.0 | 0.1693 |
| 0.0638 | 12.0 | 3012 | 0.2309 | 0.0852 | 0.1957 | 0.1957 | nan | 0.1957 | 0.0 | 0.1704 |
| 0.0509 | 13.0 | 3263 | 0.3209 | 0.1236 | 0.2910 | 0.2910 | nan | 0.2910 | 0.0 | 0.2472 |
| 0.0497 | 14.0 | 3514 | 0.3274 | 0.1045 | 0.2354 | 0.2354 | nan | 0.2354 | 0.0 | 0.2089 |
| 0.0396 | 15.0 | 3765 | 0.3415 | 0.1005 | 0.2257 | 0.2257 | nan | 0.2257 | 0.0 | 0.2010 |
| 0.0373 | 16.0 | 4016 | 0.3530 | 0.1122 | 0.2486 | 0.2486 | nan | 0.2486 | 0.0 | 0.2244 |
| 0.0388 | 17.0 | 4267 | 0.3312 | 0.0889 | 0.1974 | 0.1974 | nan | 0.1974 | 0.0 | 0.1778 |
| 0.0346 | 18.0 | 4518 | 0.3061 | 0.0903 | 0.2125 | 0.2125 | nan | 0.2125 | 0.0 | 0.1807 |
| 0.0296 | 19.0 | 4769 | 0.3223 | 0.1000 | 0.2315 | 0.2315 | nan | 0.2315 | 0.0 | 0.2000 |
| 0.0311 | 20.0 | 5020 | 0.3458 | 0.0943 | 0.2237 | 0.2237 | nan | 0.2237 | 0.0 | 0.1887 |
| 0.0303 | 21.0 | 5271 | 0.3283 | 0.0975 | 0.2255 | 0.2255 | nan | 0.2255 | 0.0 | 0.1951 |
| 0.0249 | 22.0 | 5522 | 0.3387 | 0.0998 | 0.2327 | 0.2327 | nan | 0.2327 | 0.0 | 0.1996 |
| 0.0298 | 23.0 | 5773 | 0.3332 | 0.0973 | 0.2242 | 0.2242 | nan | 0.2242 | 0.0 | 0.1946 |
| 0.0239 | 24.0 | 6024 | 0.3778 | 0.1146 | 0.2634 | 0.2634 | nan | 0.2634 | 0.0 | 0.2292 |
| 0.0238 | 25.0 | 6275 | 0.3250 | 0.0909 | 0.2081 | 0.2081 | nan | 0.2081 | 0.0 | 0.1818 |
| 0.0242 | 26.0 | 6526 | 0.3826 | 0.1002 | 0.2285 | 0.2285 | nan | 0.2285 | 0.0 | 0.2004 |
| 0.017 | 27.0 | 6777 | 0.3543 | 0.1058 | 0.2367 | 0.2367 | nan | 0.2367 | 0.0 | 0.2115 |
| 0.0241 | 28.0 | 7028 | 0.3491 | 0.0915 | 0.2069 | 0.2069 | nan | 0.2069 | 0.0 | 0.1830 |
| 0.0203 | 29.0 | 7279 | 0.3354 | 0.0899 | 0.2056 | 0.2056 | nan | 0.2056 | 0.0 | 0.1798 |
| 0.0206 | 30.0 | 7530 | 0.3592 | 0.0944 | 0.2165 | 0.2165 | nan | 0.2165 | 0.0 | 0.1888 |
| 0.0211 | 31.0 | 7781 | 0.3200 | 0.0943 | 0.2100 | 0.2100 | nan | 0.2100 | 0.0 | 0.1886 |
| 0.0209 | 32.0 | 8032 | 0.3401 | 0.0850 | 0.1941 | 0.1941 | nan | 0.1941 | 0.0 | 0.1701 |
| 0.0172 | 33.0 | 8283 | 0.3326 | 0.0879 | 0.1986 | 0.1986 | nan | 0.1986 | 0.0 | 0.1759 |
| 0.0187 | 34.0 | 8534 | 0.3343 | 0.0869 | 0.1960 | 0.1960 | nan | 0.1960 | 0.0 | 0.1739 |
| 0.0181 | 35.0 | 8785 | 0.3223 | 0.0824 | 0.1835 | 0.1835 | nan | 0.1835 | 0.0 | 0.1648 |
| 0.0168 | 36.0 | 9036 | 0.3461 | 0.0864 | 0.1933 | 0.1933 | nan | 0.1933 | 0.0 | 0.1727 |
| 0.0169 | 37.0 | 9287 | 0.3438 | 0.0848 | 0.1888 | 0.1888 | nan | 0.1888 | 0.0 | 0.1695 |
| 0.0182 | 38.0 | 9538 | 0.3506 | 0.0865 | 0.1933 | 0.1933 | nan | 0.1933 | 0.0 | 0.1730 |
| 0.0167 | 39.0 | 9789 | 0.3535 | 0.0869 | 0.1946 | 0.1946 | nan | 0.1946 | 0.0 | 0.1739 |
| 0.0174 | 39.84 | 10000 | 0.3478 | 0.0862 | 0.1924 | 0.1924 | nan | 0.1924 | 0.0 | 0.1723 |

### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cpu
- Datasets 2.14.4
- Tokenizers 0.13.3