---
license: other
tags:
- generated_from_keras_callback
model-index:
- name: nateraw/mit-b0-finetuned-sidewalks-v2
  results: []
---
|
|
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should |
|
probably proofread and complete it, then remove this comment. --> |
|
|
|
# nateraw/mit-b0-finetuned-sidewalks-v2 |
|
|
|
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. |
|
It achieves the following results after the final training epoch (train loss on the training set, all other metrics on the evaluation set):
|
- Train Loss: 0.4495 |
|
- Validation Loss: 0.4797 |
|
- Validation Mean Iou: 0.3035 |
|
- Validation Mean Accuracy: 0.3702 |
|
- Validation Overall Accuracy: 0.8468 |
|
- Validation Per Category Iou: [0.00000000e+00 7.52163526e-01 8.46563375e-01 7.16396797e-01 7.38850637e-01 3.93073019e-01 nan 3.31795957e-01 4.92991567e-01 0.00000000e+00 8.11302090e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 5.16059849e-01 0.00000000e+00 0.00000000e+00 6.56058294e-01 1.25948501e-02 2.66942435e-01 5.34406894e-01 0.00000000e+00 nan 0.00000000e+00 2.27750085e-01 4.86381323e-04 0.00000000e+00 8.48618960e-01 7.25828093e-01 9.17747637e-01 8.28380212e-03 6.74590297e-02 1.51281596e-01 0.00000000e+00]
|
- Validation Per Category Accuracy: [0.00000000e+00 8.75360044e-01 9.43650850e-01 8.78658645e-01 7.76578096e-01 4.85757596e-01 nan 4.30901582e-01 7.54126335e-01 0.00000000e+00 9.30112537e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 6.42914247e-01 0.00000000e+00 0.00000000e+00 7.57605356e-01 1.27102686e-02 6.50888458e-01 6.94757080e-01 0.00000000e+00 nan 0.00000000e+00 2.91727649e-01 4.86381323e-04 0.00000000e+00 9.42251577e-01 8.60753175e-01 9.56778008e-01 8.51551074e-03 1.38756779e-01 1.83583708e-01 0.00000000e+00]
|
- Epoch: 7 |
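
The reported Validation Mean Iou is the mean over the per-category IoU values, skipping the `nan` entries (categories absent from the validation set). A small sketch, using the final-epoch values listed above:

```python
import numpy as np

# Final-epoch per-category IoU from the list above; `nan` marks
# categories that do not occur in the validation set.
per_category_iou = np.array([
    0.00000000e+00, 7.52163526e-01, 8.46563375e-01, 7.16396797e-01,
    7.38850637e-01, 3.93073019e-01, np.nan, 3.31795957e-01,
    4.92991567e-01, 0.00000000e+00, 8.11302090e-01, 0.00000000e+00,
    0.00000000e+00, 0.00000000e+00, 0.00000000e+00, 5.16059849e-01,
    0.00000000e+00, 0.00000000e+00, 6.56058294e-01, 1.25948501e-02,
    2.66942435e-01, 5.34406894e-01, 0.00000000e+00, np.nan,
    0.00000000e+00, 2.27750085e-01, 4.86381323e-04, 0.00000000e+00,
    8.48618960e-01, 7.25828093e-01, 9.17747637e-01, 8.28380212e-03,
    6.74590297e-02, 1.51281596e-01, 0.00000000e+00,
])

# nanmean ignores the nan categories, reproducing the headline metric.
mean_iou = float(np.nanmean(per_category_iou))
print(round(mean_iou, 4))  # 0.3035
```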
|
|
|
## Model description |
|
|
|
More information needed |
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
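
As a starting point while this section is incomplete, inference with this checkpoint might look like the following sketch. It assumes the TensorFlow SegFormer classes available in Transformers 4.24 (`SegformerFeatureExtractor`, `TFSegformerForSemanticSegmentation`); `sidewalk.jpg` is a placeholder path, not a file shipped with the model.

```python
import numpy as np
from PIL import Image
from transformers import SegformerFeatureExtractor, TFSegformerForSemanticSegmentation

repo = "nateraw/mit-b0-finetuned-sidewalks-v2"
feature_extractor = SegformerFeatureExtractor.from_pretrained(repo)
model = TFSegformerForSemanticSegmentation.from_pretrained(repo)

image = Image.open("sidewalk.jpg")  # placeholder input image
inputs = feature_extractor(images=image, return_tensors="tf")
outputs = model(**inputs)

# SegFormer emits logits of shape (batch, num_labels, H/4, W/4);
# the per-pixel class is the argmax over the label axis.
segmentation = np.argmax(outputs.logits.numpy(), axis=1)[0]
print(segmentation.shape)
```

Note that the predicted map is at 1/4 of the input resolution and would typically be upsampled back to the original image size before use.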
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training: |
|
- optimizer: `{'name': 'Adam', 'learning_rate': 6e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}`
|
- training_precision: float32 |
|
|
|
### Training results |
|
|
|
| Train Loss | Validation Loss | Validation Mean Iou | Validation Mean Accuracy | Validation Overall Accuracy | Validation Per Category Iou | Validation Per Category Accuracy | Epoch |
|:----------:|:---------------:|:-------------------:|:------------------------:|:---------------------------:|:---------------------------:|:--------------------------------:|:-----:|
|
| 1.4089 | 0.8220 | 0.1975 | 0.2427 | 0.7701 | [0. 0.58353931 0.7655921 0.04209491 0.53135026 0.11779776 nan 0.07709853 0.15950712 0. 0.69634813 0. 0. 0. 0. 0. 0. 0. 0.61456822 0. 0.24971248 0.27129675 0. nan 0. 0.07697324 0. 0. 0.78576516 0.61267064 0.84564576 0. 0. 0.08904216 0. ] | [0. 0.88026971 0.93475302 0.04216372 0.5484085 0.13285614 nan 0.08669707 0.19044773 0. 0.90089024 0. 0. 0. 0. 0. 0. 0. 0.76783975 0. 0.42102101 0.28659817 0. nan 0. 0.08671771 0. 0. 0.89590301 0.74932576 0.9434814 0. 0. 0.14245566 0. ] | 0 |
|
| 0.8462 | 0.6135 | 0.2551 | 0.2960 | 0.8200 | [0. 0.66967645 0.80571406 0.56416239 0.66692248 0.24744912 nan 0.23994505 0.28962463 0. 0.76504783 0. 0. 0. 0. 0.14111353 0. 0. 0.6924468 0. 0.27988701 0.41876094 0. nan 0. 0.14755829 0. 0. 0.81614463 0.68429711 0.87710938 0. 0. 0.11234171 0. ] | [0. 0.83805933 0.94928385 0.59586511 0.72913519 0.30595504 nan 0.3128234 0.34805831 0. 0.87847495 0. 0. 0. 0. 0.14205167 0. 0. 0.87543619 0. 0.36001144 0.49498574 0. nan 0. 0.18179115 0. 0. 0.92867923 0.7496178 0.92220166 0. 0. 0.15398549 0. ] | 1 |
|
| 0.7134 | 0.5660 | 0.2780 | 0.3320 | 0.8286 | [0. 0.64791461 0.83800512 0.67301044 0.68120631 0.27361472 nan 0.26715802 0.43596999 0. 0.78649287 0. 0. 0. 0. 0.41256964 0. 0. 0.71114766 0. 0.31646321 0.44682442 0. nan 0. 0.17132551 0. 0. 0.81845697 0.67536699 0.88940936 0. 0. 0.1304862 0. ] | [0. 0.85958877 0.92084269 0.82341633 0.74725972 0.33495972 nan 0.40755277 0.56591531 0. 0.90641721 0. 0. 0. 0. 0.48144408 0. 0. 0.88294811 0. 0.46962078 0.47517397 0. nan 0. 0.20631607 0. 0. 0.90956851 0.85856042 0.94107052 0. 0. 0.16669713 0. ] | 2 |
|
| 0.6320 | 0.5173 | 0.2894 | 0.3454 | 0.8435 | [0. 0.70789146 0.84902296 0.65266358 0.76099965 0.32934391 nan 0.29576422 0.43988204 0. 0.79276447 0. 0. 0. 0. 0.42668367 0. 0. 0.71717911 0. 0.32151249 0.50084444 0. nan 0. 0.18711455 0. 0. 0.82903803 0.68990498 0.8990059 0. 0.00213015 0.14819771 0. ] | [0. 0.84048763 0.93514369 0.68355212 0.88302113 0.458816 nan 0.38623272 0.69456442 0. 0.92379471 0. 0. 0. 0. 0.50677438 0. 0. 0.90362965 0. 0.4662386 0.57368294 0. nan 0. 0.23281768 0. 0. 0.9001526 0.86786434 0.95195314 0. 0.00333751 0.18532191 0. ] | 3 |
|
| 0.5609 | 0.5099 | 0.2920 | 0.3599 | 0.8385 | [0. 0.70817583 0.84131144 0.66573523 0.81449696 0.38891117 nan 0.28124784 0.42659255 0. 0.80855146 0. 0. 0. 0. 0.46011866 0. 0. 0.65458792 0. 0.28411565 0.46758138 0. nan 0. 0.21849067 0. 0. 0.83829062 0.71207623 0.89929169 0. 0.02846127 0.13782635 0. ] | [0. 0.88632871 0.91269832 0.79044294 0.88368528 0.57405218 nan 0.35035973 0.77610775 0. 0.8889696 0. 0. 0. 0. 0.6020786 0. 0. 0.74586521 0. 0.61602403 0.54519561 0. nan 0. 0.28447396 0. 0. 0.94520232 0.85544414 0.95994042 0. 0.04680851 0.21407134 0. ] | 4 |
|
| 0.5256 | 0.4741 | 0.3045 | 0.3598 | 0.8558 | [0.00000000e+00 7.50159008e-01 8.53654462e-01 6.44928131e-01 7.90455244e-01 4.33599913e-01 nan 3.33472954e-01 4.74502513e-01 0.00000000e+00 8.01366017e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 4.67653814e-01 0.00000000e+00 0.00000000e+00 7.27412479e-01 0.00000000e+00 4.18946113e-01 5.04714837e-01 0.00000000e+00 nan 0.00000000e+00 2.00373855e-01 0.00000000e+00 0.00000000e+00 8.50200795e-01 7.41636173e-01 9.08320534e-01 2.77259907e-04 0.00000000e+00 1.45430716e-01 0.00000000e+00] | [0.00000000e+00 8.86487233e-01 9.05201886e-01 7.23139265e-01 8.91929263e-01 7.26675641e-01 nan 4.36386295e-01 6.64378543e-01 0.00000000e+00 8.89056843e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 5.65450644e-01 0.00000000e+00 0.00000000e+00 9.27446136e-01 0.00000000e+00 5.36031025e-01 5.84198054e-01 0.00000000e+00 nan 0.00000000e+00 2.42514534e-01 0.00000000e+00 0.00000000e+00 9.31954754e-01 8.26849708e-01 9.59880377e-01 2.79039335e-04 0.00000000e+00 1.77106051e-01 0.00000000e+00] | 5 |
|
| 0.4761 | 0.4922 | 0.3036 | 0.3754 | 0.8517 | [0.00000000e+00 7.18490241e-01 8.54701589e-01 5.90903088e-01 8.21902743e-01 4.76229883e-01 nan 3.32447673e-01 4.80642540e-01 0.00000000e+00 8.02904449e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 4.73285636e-01 0.00000000e+00 0.00000000e+00 7.16608930e-01 0.00000000e+00 3.16598081e-01 5.12540924e-01 0.00000000e+00 nan 0.00000000e+00 2.27702968e-01 0.00000000e+00 0.00000000e+00 8.51831675e-01 7.39827330e-01 9.07152231e-01 5.59070700e-04 3.70370370e-02 1.56538301e-01 0.00000000e+00] | [0.00000000e+00 9.20834531e-01 8.92075255e-01 7.48664032e-01 9.03709011e-01 7.40703529e-01 nan 4.40828188e-01 7.92719139e-01 0.00000000e+00 9.21593374e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 6.90292855e-01 0.00000000e+00 0.00000000e+00 8.42229041e-01 0.00000000e+00 4.75170857e-01 6.72591473e-01 0.00000000e+00 nan 0.00000000e+00 2.94713089e-01 0.00000000e+00 0.00000000e+00 9.26034809e-01 8.39522012e-01 9.66679296e-01 6.06188900e-04 1.12807676e-01 2.07280968e-01 0.00000000e+00] | 6 |
|
| 0.4495 | 0.4797 | 0.3035 | 0.3702 | 0.8468 | [0.00000000e+00 7.52163526e-01 8.46563375e-01 7.16396797e-01 7.38850637e-01 3.93073019e-01 nan 3.31795957e-01 4.92991567e-01 0.00000000e+00 8.11302090e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 5.16059849e-01 0.00000000e+00 0.00000000e+00 6.56058294e-01 1.25948501e-02 2.66942435e-01 5.34406894e-01 0.00000000e+00 nan 0.00000000e+00 2.27750085e-01 4.86381323e-04 0.00000000e+00 8.48618960e-01 7.25828093e-01 9.17747637e-01 8.28380212e-03 6.74590297e-02 1.51281596e-01 0.00000000e+00] | [0.00000000e+00 8.75360044e-01 9.43650850e-01 8.78658645e-01 7.76578096e-01 4.85757596e-01 nan 4.30901582e-01 7.54126335e-01 0.00000000e+00 9.30112537e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 6.42914247e-01 0.00000000e+00 0.00000000e+00 7.57605356e-01 1.27102686e-02 6.50888458e-01 6.94757080e-01 0.00000000e+00 nan 0.00000000e+00 2.91727649e-01 4.86381323e-04 0.00000000e+00 9.42251577e-01 8.60753175e-01 9.56778008e-01 8.51551074e-03 1.38756779e-01 1.83583708e-01 0.00000000e+00] | 7 |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.24.0 |
|
- TensorFlow 2.9.2 |
|
- Datasets 2.7.0 |
|
- Tokenizers 0.13.2 |
|
|