---
license: other
tags:
- generated_from_keras_callback
model-index:
- name: nateraw/mit-b0-finetuned-sidewalks-v2
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# nateraw/mit-b0-finetuned-sidewalks-v2

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.6320
- Validation Loss: 0.5173
- Validation Mean Iou: 0.2894
- Validation Mean Accuracy: 0.3454
- Validation Overall Accuracy: 0.8435
- Validation Per Category Iou: [0.0, 0.70789146, 0.84902296, 0.65266358, 0.76099965, 0.32934391, nan, 0.29576422, 0.43988204, 0.0, 0.79276447, 0.0, 0.0, 0.0, 0.0, 0.42668367, 0.0, 0.0, 0.71717911, 0.0, 0.32151249, 0.50084444, 0.0, nan, 0.0, 0.18711455, 0.0, 0.0, 0.82903803, 0.68990498, 0.8990059, 0.0, 0.00213015, 0.14819771, 0.0]
- Validation Per Category Accuracy: [0.0, 0.84048763, 0.93514369, 0.68355212, 0.88302113, 0.458816, nan, 0.38623272, 0.69456442, 0.0, 0.92379471, 0.0, 0.0, 0.0, 0.0, 0.50677438, 0.0, 0.0, 0.90362965, 0.0, 0.4662386, 0.57368294, 0.0, nan, 0.0, 0.23281768, 0.0, 0.0, 0.9001526, 0.86786434, 0.95195314, 0.0, 0.00333751, 0.18532191, 0.0]
- Epoch: 3
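
The mean metrics above appear to be the NaN-ignoring average of the per-category values (the `nan` entries presumably mark categories with an empty union in the validation split). A quick NumPy check, included here as a sketch rather than as part of the original evaluation code, reproduces the reported mean IoU:

```python
import numpy as np

# Final-epoch per-category validation IoU, copied from the listing above.
per_category_iou = np.array([
    0.0, 0.70789146, 0.84902296, 0.65266358, 0.76099965, 0.32934391,
    np.nan, 0.29576422, 0.43988204, 0.0, 0.79276447, 0.0,
    0.0, 0.0, 0.0, 0.42668367, 0.0, 0.0,
    0.71717911, 0.0, 0.32151249, 0.50084444, 0.0, np.nan,
    0.0, 0.18711455, 0.0, 0.0, 0.82903803, 0.68990498,
    0.8990059, 0.0, 0.00213015, 0.14819771, 0.0,
])

# NaN-ignoring average over the 33 evaluated categories.
print(round(float(np.nanmean(per_category_iou)), 4))  # 0.2894
```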

## Model description

More information needed

## Intended uses & limitations

More information needed
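
Pending a fuller description, here is a minimal inference sketch. It assumes the standard SegFormer classes in 🤗 Transformers (the card lists Transformers 4.24.0 and TensorFlow 2.9.2) and that the repository ships a preprocessor configuration; the example image URL is only a placeholder.

```python
import requests
import tensorflow as tf
from PIL import Image
from transformers import SegformerFeatureExtractor, TFSegformerForSemanticSegmentation

checkpoint = "nateraw/mit-b0-finetuned-sidewalks-v2"
feature_extractor = SegformerFeatureExtractor.from_pretrained(checkpoint)
model = TFSegformerForSemanticSegmentation.from_pretrained(checkpoint)

# Placeholder image; substitute a street-level photo of your own.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = feature_extractor(images=image, return_tensors="tf")
outputs = model(**inputs)
logits = outputs.logits  # (batch, num_labels, height / 4, width / 4), channels-first

# Upsample to the input resolution and take the per-pixel argmax.
logits = tf.transpose(logits, [0, 2, 3, 1])          # channels-last for tf.image.resize
logits = tf.image.resize(logits, image.size[::-1])   # PIL .size is (width, height)
segmentation_map = tf.argmax(logits, axis=-1)[0].numpy()
```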

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 6e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
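
These values correspond to a stock Keras Adam configuration; a minimal reconstruction (an equivalent definition, not the original training script) would look like:

```python
import tensorflow as tf

# Equivalent Adam configuration to the listing above ("decay: 0.0" means no
# learning-rate decay, which is already the default).
optimizer = tf.keras.optimizers.Adam(
    learning_rate=6e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```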

### Training results

| Train Loss | Validation Loss | Validation Mean Iou | Validation Mean Accuracy | Validation Overall Accuracy | Validation Per Category Iou | Validation Per Category Accuracy | Epoch |
|:----------:|:---------------:|:-------------------:|:------------------------:|:---------------------------:|:---------------------------:|:--------------------------------:|:-----:|
| 1.4089     | 0.8220          | 0.1975              | 0.2427                   | 0.7701                      | [0.0, 0.58353931, 0.7655921, 0.04209491, 0.53135026, 0.11779776, nan, 0.07709853, 0.15950712, 0.0, 0.69634813, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.61456822, 0.0, 0.24971248, 0.27129675, 0.0, nan, 0.0, 0.07697324, 0.0, 0.0, 0.78576516, 0.61267064, 0.84564576, 0.0, 0.0, 0.08904216, 0.0] | [0.0, 0.88026971, 0.93475302, 0.04216372, 0.5484085, 0.13285614, nan, 0.08669707, 0.19044773, 0.0, 0.90089024, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.76783975, 0.0, 0.42102101, 0.28659817, 0.0, nan, 0.0, 0.08671771, 0.0, 0.0, 0.89590301, 0.74932576, 0.9434814, 0.0, 0.0, 0.14245566, 0.0] | 0     |
| 0.8462     | 0.6135          | 0.2551              | 0.2960                   | 0.8200                      | [0.0, 0.66967645, 0.80571406, 0.56416239, 0.66692248, 0.24744912, nan, 0.23994505, 0.28962463, 0.0, 0.76504783, 0.0, 0.0, 0.0, 0.0, 0.14111353, 0.0, 0.0, 0.6924468, 0.0, 0.27988701, 0.41876094, 0.0, nan, 0.0, 0.14755829, 0.0, 0.0, 0.81614463, 0.68429711, 0.87710938, 0.0, 0.0, 0.11234171, 0.0] | [0.0, 0.83805933, 0.94928385, 0.59586511, 0.72913519, 0.30595504, nan, 0.3128234, 0.34805831, 0.0, 0.87847495, 0.0, 0.0, 0.0, 0.0, 0.14205167, 0.0, 0.0, 0.87543619, 0.0, 0.36001144, 0.49498574, 0.0, nan, 0.0, 0.18179115, 0.0, 0.0, 0.92867923, 0.7496178, 0.92220166, 0.0, 0.0, 0.15398549, 0.0] | 1     |
| 0.7134     | 0.5660          | 0.2780              | 0.3320                   | 0.8286                      | [0.0, 0.64791461, 0.83800512, 0.67301044, 0.68120631, 0.27361472, nan, 0.26715802, 0.43596999, 0.0, 0.78649287, 0.0, 0.0, 0.0, 0.0, 0.41256964, 0.0, 0.0, 0.71114766, 0.0, 0.31646321, 0.44682442, 0.0, nan, 0.0, 0.17132551, 0.0, 0.0, 0.81845697, 0.67536699, 0.88940936, 0.0, 0.0, 0.1304862, 0.0] | [0.0, 0.85958877, 0.92084269, 0.82341633, 0.74725972, 0.33495972, nan, 0.40755277, 0.56591531, 0.0, 0.90641721, 0.0, 0.0, 0.0, 0.0, 0.48144408, 0.0, 0.0, 0.88294811, 0.0, 0.46962078, 0.47517397, 0.0, nan, 0.0, 0.20631607, 0.0, 0.0, 0.90956851, 0.85856042, 0.94107052, 0.0, 0.0, 0.16669713, 0.0] | 2     |
| 0.6320     | 0.5173          | 0.2894              | 0.3454                   | 0.8435                      | [0.0, 0.70789146, 0.84902296, 0.65266358, 0.76099965, 0.32934391, nan, 0.29576422, 0.43988204, 0.0, 0.79276447, 0.0, 0.0, 0.0, 0.0, 0.42668367, 0.0, 0.0, 0.71717911, 0.0, 0.32151249, 0.50084444, 0.0, nan, 0.0, 0.18711455, 0.0, 0.0, 0.82903803, 0.68990498, 0.8990059, 0.0, 0.00213015, 0.14819771, 0.0] | [0.0, 0.84048763, 0.93514369, 0.68355212, 0.88302113, 0.458816, nan, 0.38623272, 0.69456442, 0.0, 0.92379471, 0.0, 0.0, 0.0, 0.0, 0.50677438, 0.0, 0.0, 0.90362965, 0.0, 0.4662386, 0.57368294, 0.0, nan, 0.0, 0.23281768, 0.0, 0.0, 0.9001526, 0.86786434, 0.95195314, 0.0, 0.00333751, 0.18532191, 0.0] | 3     |


### Framework versions

- Transformers 4.24.0
- TensorFlow 2.9.2
- Datasets 2.7.0
- Tokenizers 0.13.2