---
license: other
tags:
- generated_from_keras_callback
model-index:
- name: nateraw/mit-b0-finetuned-sidewalks-v2
  results: []
---
# nateraw/mit-b0-finetuned-sidewalks-v2
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.3487
- Validation Loss: 0.4486
- Validation Mean Iou: 0.3181
- Validation Mean Accuracy: 0.3898
- Validation Overall Accuracy: 0.8637
- Validation Per Category Iou: [0. 0.79416982 0.87767891 0.70942695 0.81634288 0.46749785 nan 0.42873013 0.48671464 0. 0.82752704 0. 0. 0. 0. 0.50844774 0. 0. 0.68070149 0.03976498 0.29304387 0.46322705 0. nan 0. 0.24856882 0.12795031 0. 0.84646906 0.71781094 0.92550642 0.04810685 0.04610752 0.14423047 0. ]
- Validation Per Category Accuracy: [0. 0.86951324 0.95247608 0.82408892 0.90393017 0.59760857 nan 0.5760741 0.83602638 0. 0.93420702 0. 0. 0. 0. 0.63502483 0. 0. 0.76902695 0.04024918 0.57179186 0.75842139 0. nan 0. 0.30837498 0.13239994 0. 0.95283514 0.78607095 0.96594744 0.05354669 0.18906967 0.2060098 0. ]
- Epoch: 11
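
For reference, the sketch below shows one way to run inference with this checkpoint. It is an illustration rather than an official example: it assumes the repository ships TensorFlow weights, and the file name `sidewalk.jpg` is a placeholder.

```python
# Hypothetical inference sketch (not from the original card), using the
# Transformers 4.24 / TensorFlow 2.9 APIs listed under "Framework versions".
import tensorflow as tf
from PIL import Image
from transformers import SegformerFeatureExtractor, TFSegformerForSemanticSegmentation

repo = "nateraw/mit-b0-finetuned-sidewalks-v2"
feature_extractor = SegformerFeatureExtractor.from_pretrained(repo)
model = TFSegformerForSemanticSegmentation.from_pretrained(repo)

image = Image.open("sidewalk.jpg")  # placeholder: any RGB street-level photo
inputs = feature_extractor(images=image, return_tensors="tf")

# SegFormer logits come out at 1/4 of the processed resolution, channels-first.
logits = model(**inputs).logits  # (batch, num_labels, height / 4, width / 4)

# Upsample to the original image size and take the per-pixel argmax.
logits = tf.image.resize(
    tf.transpose(logits, [0, 2, 3, 1]),  # to channels-last for tf.image.resize
    size=image.size[::-1],               # PIL's size is (width, height)
)
segmentation = tf.argmax(logits, axis=-1)[0].numpy()
```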
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 6e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
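
As a sketch (assuming the stock Keras Adam implementation, since the original training script is not included in the card), the logged configuration corresponds to:

```python
import tensorflow as tf

# Recreates the optimizer from the hyperparameter list above; `decay=0.0`
# means no additional learning-rate decay was applied on top of Adam.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=6e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    decay=0.0,
)
```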
### Training results

| Train Loss | Validation Loss | Validation Mean Iou | Validation Mean Accuracy | Validation Overall Accuracy | Validation Per Category Iou | Validation Per Category Accuracy | Epoch |
|---|---|---|---|---|---|---|---|
| 1.4089 | 0.8220 | 0.1975 | 0.2427 | 0.7701 | [0. 0.58353931 0.7655921 0.04209491 0.53135026 0.11779776 nan 0.07709853 0.15950712 0. 0.69634813 0. 0. 0. 0. 0. 0. 0. 0.61456822 0. 0.24971248 0.27129675 0. nan 0. 0.07697324 0. 0. 0.78576516 0.61267064 0.84564576 0. 0. 0.08904216 0. ] | [0. 0.88026971 0.93475302 0.04216372 0.5484085 0.13285614 nan 0.08669707 0.19044773 0. 0.90089024 0. 0. 0. 0. 0. 0. 0. 0.76783975 0. 0.42102101 0.28659817 0. nan 0. 0.08671771 0. 0. 0.89590301 0.74932576 0.9434814 0. 0. 0.14245566 0. ] | 0 |
| 0.8462 | 0.6135 | 0.2551 | 0.2960 | 0.8200 | [0. 0.66967645 0.80571406 0.56416239 0.66692248 0.24744912 nan 0.23994505 0.28962463 0. 0.76504783 0. 0. 0. 0. 0.14111353 0. 0. 0.6924468 0. 0.27988701 0.41876094 0. nan 0. 0.14755829 0. 0. 0.81614463 0.68429711 0.87710938 0. 0. 0.11234171 0. ] | [0. 0.83805933 0.94928385 0.59586511 0.72913519 0.30595504 nan 0.3128234 0.34805831 0. 0.87847495 0. 0. 0. 0. 0.14205167 0. 0. 0.87543619 0. 0.36001144 0.49498574 0. nan 0. 0.18179115 0. 0. 0.92867923 0.7496178 0.92220166 0. 0. 0.15398549 0. ] | 1 |
| 0.7134 | 0.5660 | 0.2780 | 0.3320 | 0.8286 | [0. 0.64791461 0.83800512 0.67301044 0.68120631 0.27361472 nan 0.26715802 0.43596999 0. 0.78649287 0. 0. 0. 0. 0.41256964 0. 0. 0.71114766 0. 0.31646321 0.44682442 0. nan 0. 0.17132551 0. 0. 0.81845697 0.67536699 0.88940936 0. 0. 0.1304862 0. ] | [0. 0.85958877 0.92084269 0.82341633 0.74725972 0.33495972 nan 0.40755277 0.56591531 0. 0.90641721 0. 0. 0. 0. 0.48144408 0. 0. 0.88294811 0. 0.46962078 0.47517397 0. nan 0. 0.20631607 0. 0. 0.90956851 0.85856042 0.94107052 0. 0. 0.16669713 0. ] | 2 |
| 0.6320 | 0.5173 | 0.2894 | 0.3454 | 0.8435 | [0. 0.70789146 0.84902296 0.65266358 0.76099965 0.32934391 nan 0.29576422 0.43988204 0. 0.79276447 0. 0. 0. 0. 0.42668367 0. 0. 0.71717911 0. 0.32151249 0.50084444 0. nan 0. 0.18711455 0. 0. 0.82903803 0.68990498 0.8990059 0. 0.00213015 0.14819771 0. ] | [0. 0.84048763 0.93514369 0.68355212 0.88302113 0.458816 nan 0.38623272 0.69456442 0. 0.92379471 0. 0. 0. 0. 0.50677438 0. 0. 0.90362965 0. 0.4662386 0.57368294 0. nan 0. 0.23281768 0. 0. 0.9001526 0.86786434 0.95195314 0. 0.00333751 0.18532191 0. ] | 3 |
| 0.5609 | 0.5099 | 0.2920 | 0.3599 | 0.8385 | [0. 0.70817583 0.84131144 0.66573523 0.81449696 0.38891117 nan 0.28124784 0.42659255 0. 0.80855146 0. 0. 0. 0. 0.46011866 0. 0. 0.65458792 0. 0.28411565 0.46758138 0. nan 0. 0.21849067 0. 0. 0.83829062 0.71207623 0.89929169 0. 0.02846127 0.13782635 0. ] | [0. 0.88632871 0.91269832 0.79044294 0.88368528 0.57405218 nan 0.35035973 0.77610775 0. 0.8889696 0. 0. 0. 0. 0.6020786 0. 0. 0.74586521 0. 0.61602403 0.54519561 0. nan 0. 0.28447396 0. 0. 0.94520232 0.85544414 0.95994042 0. 0.04680851 0.21407134 0. ] | 4 |
| 0.5256 | 0.4741 | 0.3045 | 0.3598 | 0.8558 | [0.00000000e+00 7.50159008e-01 8.53654462e-01 6.44928131e-01 7.90455244e-01 4.33599913e-01 nan 3.33472954e-01 4.74502513e-01 0.00000000e+00 8.01366017e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 4.67653814e-01 0.00000000e+00 0.00000000e+00 7.27412479e-01 0.00000000e+00 4.18946113e-01 5.04714837e-01 0.00000000e+00 nan 0.00000000e+00 2.00373855e-01 0.00000000e+00 0.00000000e+00 8.50200795e-01 7.41636173e-01 9.08320534e-01 2.77259907e-04 0.00000000e+00 1.45430716e-01 0.00000000e+00] | [0.00000000e+00 8.86487233e-01 9.05201886e-01 7.23139265e-01 8.91929263e-01 7.26675641e-01 nan 4.36386295e-01 6.64378543e-01 0.00000000e+00 8.89056843e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 5.65450644e-01 0.00000000e+00 0.00000000e+00 9.27446136e-01 0.00000000e+00 5.36031025e-01 5.84198054e-01 0.00000000e+00 nan 0.00000000e+00 2.42514534e-01 0.00000000e+00 0.00000000e+00 9.31954754e-01 8.26849708e-01 9.59880377e-01 2.79039335e-04 0.00000000e+00 1.77106051e-01 0.00000000e+00] | 5 |
| 0.4761 | 0.4922 | 0.3036 | 0.3754 | 0.8517 | [0.00000000e+00 7.18490241e-01 8.54701589e-01 5.90903088e-01 8.21902743e-01 4.76229883e-01 nan 3.32447673e-01 4.80642540e-01 0.00000000e+00 8.02904449e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 4.73285636e-01 0.00000000e+00 0.00000000e+00 7.16608930e-01 0.00000000e+00 3.16598081e-01 5.12540924e-01 0.00000000e+00 nan 0.00000000e+00 2.27702968e-01 0.00000000e+00 0.00000000e+00 8.51831675e-01 7.39827330e-01 9.07152231e-01 5.59070700e-04 3.70370370e-02 1.56538301e-01 0.00000000e+00] | [0.00000000e+00 9.20834531e-01 8.92075255e-01 7.48664032e-01 9.03709011e-01 7.40703529e-01 nan 4.40828188e-01 7.92719139e-01 0.00000000e+00 9.21593374e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 6.90292855e-01 0.00000000e+00 0.00000000e+00 8.42229041e-01 0.00000000e+00 4.75170857e-01 6.72591473e-01 0.00000000e+00 nan 0.00000000e+00 2.94713089e-01 0.00000000e+00 0.00000000e+00 9.26034809e-01 8.39522012e-01 9.66679296e-01 6.06188900e-04 1.12807676e-01 2.07280968e-01 0.00000000e+00] | 6 |
| 0.4495 | 0.4797 | 0.3035 | 0.3702 | 0.8468 | [0.00000000e+00 7.52163526e-01 8.46563375e-01 7.16396797e-01 7.38850637e-01 3.93073019e-01 nan 3.31795957e-01 4.92991567e-01 0.00000000e+00 8.11302090e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 5.16059849e-01 0.00000000e+00 0.00000000e+00 6.56058294e-01 1.25948501e-02 2.66942435e-01 5.34406894e-01 0.00000000e+00 nan 0.00000000e+00 2.27750085e-01 4.86381323e-04 0.00000000e+00 8.48618960e-01 7.25828093e-01 9.17747637e-01 8.28380212e-03 6.74590297e-02 1.51281596e-01 0.00000000e+00] | [0.00000000e+00 8.75360044e-01 9.43650850e-01 8.78658645e-01 7.76578096e-01 4.85757596e-01 nan 4.30901582e-01 7.54126335e-01 0.00000000e+00 9.30112537e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 6.42914247e-01 0.00000000e+00 0.00000000e+00 7.57605356e-01 1.27102686e-02 6.50888458e-01 6.94757080e-01 0.00000000e+00 nan 0.00000000e+00 2.91727649e-01 4.86381323e-04 0.00000000e+00 9.42251577e-01 8.60753175e-01 9.56778008e-01 8.51551074e-03 1.38756779e-01 1.83583708e-01 0.00000000e+00] | 7 |
| 0.4193 | 0.4487 | 0.3073 | 0.3633 | 0.8594 | [0. 0.77081114 0.86089485 0.64464211 0.82962632 0.36186873 nan 0.39092332 0.5399988 0. 0.81734925 0. 0. 0. 0. 0.50271555 0. 0. 0.70239658 0. 0.30875695 0.52195319 0. nan 0. 0.20124517 0.00696273 0. 0.84526591 0.72563399 0.91703372 0. 0.03526147 0.15693635 0. ] | [0. 0.8654775 0.95711297 0.70665759 0.93130714 0.42436958 nan 0.52892143 0.69243377 0. 0.91682626 0. 0. 0. 0. 0.62315913 0. 0. 0.86251114 0. 0.5607807 0.70416055 0. nan 0. 0.24483525 0.00698305 0. 0.921099 0.81848055 0.96789871 0. 0.06891948 0.18778302 0. ] | 8 |
| 0.3883 | 0.4824 | 0.3086 | 0.3690 | 0.8527 | [0. 0.76454291 0.86544951 0.70501066 0.77912256 0.39088976 nan 0.40275725 0.53334923 0. 0.82777802 0. 0. 0. 0. 0.49916177 0. 0. 0.68780083 0.01500768 0.31589145 0.53805504 0. nan 0. 0.22450413 0.03544121 0. 0.82663975 0.60689445 0.91513911 0.12702194 0.0163284 0.10604071 0. ] | [0. 0.86846682 0.93345513 0.77258597 0.90365389 0.54440067 nan 0.51997559 0.73323435 0. 0.92499729 0. 0. 0. 0. 0.62015064 0. 0. 0.8190305 0.01503264 0.61258781 0.62514291 0. nan 0. 0.28141855 0.03574903 0. 0.95838638 0.66828866 0.96505306 0.19804095 0.04463913 0.1315269 0. ] | 9 |
| 0.3736 | 0.4515 | 0.3180 | 0.3859 | 0.8600 | [0. 0.77296038 0.8679117 0.60122746 0.84573808 0.42877201 nan 0.40372521 0.5356554 0. 0.82057963 0. 0. 0. 0. 0.48309209 0. 0. 0.70156487 0.07165346 0.31172072 0.45383525 0. nan 0. 0.26337213 0.07457255 0. 0.85227381 0.7079085 0.92271657 0.20363628 0.03853875 0.13249146 0. ] | [0. 0.90081404 0.93156248 0.71723323 0.91251575 0.57187527 nan 0.53665381 0.74547838 0. 0.93718616 0. 0. 0. 0. 0.6410839 0. 0. 0.80529967 0.07249561 0.6074764 0.5775282 0. nan 0. 0.34898163 0.07545859 0. 0.95221746 0.80297775 0.96768443 0.26155608 0.19382562 0.17354842 0. ] | 10 |
| 0.3487 | 0.4486 | 0.3181 | 0.3898 | 0.8637 | [0. 0.79416982 0.87767891 0.70942695 0.81634288 0.46749785 nan 0.42873013 0.48671464 0. 0.82752704 0. 0. 0. 0. 0.50844774 0. 0. 0.68070149 0.03976498 0.29304387 0.46322705 0. nan 0. 0.24856882 0.12795031 0. 0.84646906 0.71781094 0.92550642 0.04810685 0.04610752 0.14423047 0. ] | [0. 0.86951324 0.95247608 0.82408892 0.90393017 0.59760857 nan 0.5760741 0.83602638 0. 0.93420702 0. 0. 0. 0. 0.63502483 0. 0. 0.76902695 0.04024918 0.57179186 0.75842139 0. nan 0. 0.30837498 0.13239994 0. 0.95283514 0.78607095 0.96594744 0.05354669 0.18906967 0.2060098 0. ] | 11 |
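
The `nan` entries mark categories that never appear in the validation labels; the scalar mean metrics are nan-aware means of the per-category arrays. A quick check (an illustration, not part of the card) reproduces the epoch-11 mean IoU:

```python
import numpy as np

# Per-category IoU for epoch 11, copied from the last table row above.
per_category_iou = np.array([
    0.0, 0.79416982, 0.87767891, 0.70942695, 0.81634288, 0.46749785, np.nan,
    0.42873013, 0.48671464, 0.0, 0.82752704, 0.0, 0.0, 0.0, 0.0, 0.50844774,
    0.0, 0.0, 0.68070149, 0.03976498, 0.29304387, 0.46322705, 0.0, np.nan,
    0.0, 0.24856882, 0.12795031, 0.0, 0.84646906, 0.71781094, 0.92550642,
    0.04810685, 0.04610752, 0.14423047, 0.0,
])

# nanmean ignores the two absent categories: (sum of non-nan values) / 33.
print(round(float(np.nanmean(per_category_iou)), 4))  # -> 0.3181, as reported
```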
### Framework versions

- Transformers 4.24.0
- TensorFlow 2.9.2
- Datasets 2.7.0
- Tokenizers 0.13.2