---
license: other
tags:
- generated_from_keras_callback
model-index:
- name: AhamadShaik/SegFormer_PADDING_x.6
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# AhamadShaik/SegFormer_PADDING_x.6
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
It achieves the following results at the final training epoch (epoch 99):
- Train Loss: 0.0159
- Train Dice Coef: 0.8034
- Train IoU: 0.6803
- Validation Loss: 0.0223
- Validation Dice Coef: 0.8606
- Validation IoU: 0.7573
- Train LR: 1e-10
- Epoch: 99
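The Dice coefficient and IoU reported above are the standard overlap metrics for binary segmentation. As a minimal sketch (not the exact metric code used in training, which the card does not include), they can be computed on binary masks like this:

```python
import numpy as np

def dice_coef(y_true, y_pred, eps=1e-7):
    """Dice coefficient: 2*|A∩B| / (|A| + |B|) for binary masks."""
    inter = np.sum(y_true * y_pred)
    return (2.0 * inter + eps) / (np.sum(y_true) + np.sum(y_pred) + eps)

def iou(y_true, y_pred, eps=1e-7):
    """Intersection over union: |A∩B| / |A∪B| for binary masks."""
    inter = np.sum(y_true * y_pred)
    union = np.sum(y_true) + np.sum(y_pred) - inter
    return (inter + eps) / (union + eps)

# Toy 4x4 masks: 6 true pixels, 6 predicted pixels, 4 overlapping.
y_true = np.array([[1, 1, 1, 0], [1, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
y_pred = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0]])
d = dice_coef(y_true, y_pred)  # 2*4 / (6+6) ≈ 0.667
j = iou(y_true, y_pred)        # 4 / 8 = 0.5
```

For a single mask pair the two metrics are related by IoU = Dice / (2 − Dice); the table's values deviate slightly from this identity because they are averaged over batches.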
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 1e-10, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Dice Coef | Train IoU | Validation Loss | Validation Dice Coef | Validation IoU | Train LR | Epoch |
|:----------:|:---------------:|:---------:|:---------------:|:--------------------:|:--------------:|:--------:|:-----:|
| 0.2116 | 0.3018 | 0.1931 | 0.0863 | 0.6813 | 0.5211 | 1e-04 | 0 |
| 0.0722 | 0.4966 | 0.3490 | 0.0565 | 0.7560 | 0.6108 | 1e-04 | 1 |
| 0.0544 | 0.5768 | 0.4227 | 0.0465 | 0.7728 | 0.6368 | 1e-04 | 2 |
| 0.0446 | 0.6305 | 0.4771 | 0.0379 | 0.8130 | 0.6869 | 1e-04 | 3 |
| 0.0422 | 0.6479 | 0.4950 | 0.0366 | 0.8005 | 0.6719 | 1e-04 | 4 |
| 0.0375 | 0.6776 | 0.5273 | 0.0315 | 0.8327 | 0.7155 | 1e-04 | 5 |
| 0.0351 | 0.6926 | 0.5428 | 0.0311 | 0.8340 | 0.7177 | 1e-04 | 6 |
| 0.0341 | 0.6967 | 0.5485 | 0.0295 | 0.8377 | 0.7228 | 1e-04 | 7 |
| 0.0307 | 0.7246 | 0.5794 | 0.0278 | 0.8444 | 0.7328 | 1e-04 | 8 |
| 0.0318 | 0.7119 | 0.5664 | 0.0278 | 0.8423 | 0.7297 | 1e-04 | 9 |
| 0.0284 | 0.7362 | 0.5940 | 0.0280 | 0.8435 | 0.7314 | 1e-04 | 10 |
| 0.0278 | 0.7382 | 0.5979 | 0.0284 | 0.8371 | 0.7232 | 1e-04 | 11 |
| 0.0268 | 0.7429 | 0.6030 | 0.0261 | 0.8504 | 0.7419 | 1e-04 | 12 |
| 0.0262 | 0.7464 | 0.6072 | 0.0285 | 0.8408 | 0.7280 | 1e-04 | 13 |
| 0.0247 | 0.7560 | 0.6189 | 0.0255 | 0.8505 | 0.7419 | 1e-04 | 14 |
| 0.0244 | 0.7580 | 0.6209 | 0.0249 | 0.8524 | 0.7450 | 1e-04 | 15 |
| 0.0221 | 0.7719 | 0.6385 | 0.0246 | 0.8503 | 0.7422 | 1e-04 | 16 |
| 0.0234 | 0.7623 | 0.6261 | 0.0233 | 0.8567 | 0.7516 | 1e-04 | 17 |
| 0.0253 | 0.7527 | 0.6147 | 0.0258 | 0.8481 | 0.7401 | 1e-04 | 18 |
| 0.0241 | 0.7597 | 0.6236 | 0.0258 | 0.8430 | 0.7331 | 1e-04 | 19 |
| 0.0230 | 0.7657 | 0.6310 | 0.0224 | 0.8571 | 0.7522 | 1e-04 | 20 |
| 0.0210 | 0.7755 | 0.6431 | 0.0220 | 0.8609 | 0.7577 | 1e-04 | 21 |
| 0.0195 | 0.7867 | 0.6572 | 0.0231 | 0.8578 | 0.7531 | 1e-04 | 22 |
| 0.0192 | 0.7880 | 0.6592 | 0.0226 | 0.8602 | 0.7568 | 1e-04 | 23 |
| 0.0185 | 0.7909 | 0.6630 | 0.0231 | 0.8591 | 0.7549 | 1e-04 | 24 |
| 0.0186 | 0.7906 | 0.6626 | 0.0221 | 0.8590 | 0.7551 | 1e-04 | 25 |
| 0.0196 | 0.7836 | 0.6531 | 0.0239 | 0.8550 | 0.7491 | 1e-04 | 26 |
| 0.0177 | 0.7975 | 0.6717 | 0.0223 | 0.8589 | 0.7549 | 5e-06 | 27 |
| 0.0173 | 0.7979 | 0.6727 | 0.0228 | 0.8585 | 0.7542 | 5e-06 | 28 |
| 0.0170 | 0.7980 | 0.6731 | 0.0215 | 0.8594 | 0.7556 | 5e-06 | 29 |
| 0.0168 | 0.8003 | 0.6755 | 0.0213 | 0.8616 | 0.7590 | 5e-06 | 30 |
| 0.0167 | 0.8016 | 0.6774 | 0.0211 | 0.8614 | 0.7587 | 5e-06 | 31 |
| 0.0167 | 0.8044 | 0.6807 | 0.0217 | 0.8598 | 0.7562 | 5e-06 | 32 |
| 0.0167 | 0.8048 | 0.6815 | 0.0211 | 0.8622 | 0.7599 | 5e-06 | 33 |
| 0.0164 | 0.8013 | 0.6773 | 0.0213 | 0.8621 | 0.7596 | 5e-06 | 34 |
| 0.0162 | 0.8025 | 0.6790 | 0.0216 | 0.8608 | 0.7578 | 5e-06 | 35 |
| 0.0163 | 0.8018 | 0.6784 | 0.0212 | 0.8615 | 0.7587 | 5e-06 | 36 |
| 0.0161 | 0.8043 | 0.6818 | 0.0211 | 0.8627 | 0.7605 | 2.5e-07 | 37 |
| 0.0161 | 0.8025 | 0.6793 | 0.0218 | 0.8604 | 0.7572 | 2.5e-07 | 38 |
| 0.0163 | 0.8039 | 0.6810 | 0.0211 | 0.8618 | 0.7592 | 2.5e-07 | 39 |
| 0.0159 | 0.8044 | 0.6816 | 0.0215 | 0.8622 | 0.7597 | 2.5e-07 | 40 |
| 0.0157 | 0.8068 | 0.6841 | 0.0213 | 0.8612 | 0.7584 | 2.5e-07 | 41 |
| 0.0159 | 0.8063 | 0.6837 | 0.0214 | 0.8615 | 0.7588 | 1.25e-08 | 42 |
| 0.0160 | 0.8040 | 0.6814 | 0.0217 | 0.8609 | 0.7578 | 1.25e-08 | 43 |
| 0.0159 | 0.8072 | 0.6852 | 0.0213 | 0.8616 | 0.7589 | 1.25e-08 | 44 |
| 0.0160 | 0.8062 | 0.6836 | 0.0215 | 0.8611 | 0.7581 | 1.25e-08 | 45 |
| 0.0159 | 0.8045 | 0.6820 | 0.0211 | 0.8623 | 0.7600 | 1.25e-08 | 46 |
| 0.0162 | 0.8027 | 0.6798 | 0.0210 | 0.8622 | 0.7599 | 6.25e-10 | 47 |
| 0.0160 | 0.8039 | 0.6807 | 0.0218 | 0.8606 | 0.7575 | 6.25e-10 | 48 |
| 0.0159 | 0.8093 | 0.6874 | 0.0220 | 0.8601 | 0.7566 | 6.25e-10 | 49 |
| 0.0159 | 0.8072 | 0.6841 | 0.0217 | 0.8622 | 0.7596 | 6.25e-10 | 50 |
| 0.0159 | 0.8045 | 0.6815 | 0.0213 | 0.8614 | 0.7586 | 6.25e-10 | 51 |
| 0.0159 | 0.8111 | 0.6894 | 0.0216 | 0.8615 | 0.7588 | 6.25e-10 | 52 |
| 0.0158 | 0.8066 | 0.6843 | 0.0213 | 0.8617 | 0.7592 | 1e-10 | 53 |
| 0.0161 | 0.8042 | 0.6813 | 0.0212 | 0.8618 | 0.7592 | 1e-10 | 54 |
| 0.0163 | 0.8058 | 0.6829 | 0.0221 | 0.8604 | 0.7570 | 1e-10 | 55 |
| 0.0164 | 0.8017 | 0.6785 | 0.0214 | 0.8612 | 0.7583 | 1e-10 | 56 |
| 0.0160 | 0.8059 | 0.6827 | 0.0210 | 0.8620 | 0.7595 | 1e-10 | 57 |
| 0.0162 | 0.8038 | 0.6805 | 0.0216 | 0.8616 | 0.7587 | 1e-10 | 58 |
| 0.0160 | 0.8022 | 0.6791 | 0.0222 | 0.8598 | 0.7562 | 1e-10 | 59 |
| 0.0161 | 0.8045 | 0.6812 | 0.0215 | 0.8614 | 0.7585 | 1e-10 | 60 |
| 0.0159 | 0.8026 | 0.6794 | 0.0213 | 0.8605 | 0.7572 | 1e-10 | 61 |
| 0.0161 | 0.8069 | 0.6846 | 0.0216 | 0.8608 | 0.7577 | 1e-10 | 62 |
| 0.0159 | 0.8088 | 0.6873 | 0.0209 | 0.8628 | 0.7607 | 1e-10 | 63 |
| 0.0161 | 0.8016 | 0.6783 | 0.0212 | 0.8616 | 0.7588 | 1e-10 | 64 |
| 0.0161 | 0.8031 | 0.6798 | 0.0213 | 0.8612 | 0.7583 | 1e-10 | 65 |
| 0.0161 | 0.8038 | 0.6811 | 0.0215 | 0.8601 | 0.7566 | 1e-10 | 66 |
| 0.0160 | 0.8052 | 0.6827 | 0.0216 | 0.8608 | 0.7576 | 1e-10 | 67 |
| 0.0161 | 0.8051 | 0.6825 | 0.0216 | 0.8610 | 0.7580 | 1e-10 | 68 |
| 0.0159 | 0.8055 | 0.6826 | 0.0218 | 0.8601 | 0.7568 | 1e-10 | 69 |
| 0.0159 | 0.8024 | 0.6793 | 0.0212 | 0.8617 | 0.7591 | 1e-10 | 70 |
| 0.0158 | 0.8043 | 0.6813 | 0.0214 | 0.8608 | 0.7578 | 1e-10 | 71 |
| 0.0161 | 0.8074 | 0.6850 | 0.0212 | 0.8610 | 0.7579 | 1e-10 | 72 |
| 0.0161 | 0.8066 | 0.6841 | 0.0216 | 0.8615 | 0.7586 | 1e-10 | 73 |
| 0.0159 | 0.8065 | 0.6841 | 0.0214 | 0.8611 | 0.7582 | 1e-10 | 74 |
| 0.0162 | 0.8039 | 0.6808 | 0.0212 | 0.8617 | 0.7591 | 1e-10 | 75 |
| 0.0160 | 0.8036 | 0.6801 | 0.0214 | 0.8616 | 0.7589 | 1e-10 | 76 |
| 0.0161 | 0.8100 | 0.6879 | 0.0211 | 0.8619 | 0.7595 | 1e-10 | 77 |
| 0.0161 | 0.8049 | 0.6816 | 0.0211 | 0.8616 | 0.7590 | 1e-10 | 78 |
| 0.0161 | 0.8037 | 0.6805 | 0.0221 | 0.8596 | 0.7558 | 1e-10 | 79 |
| 0.0159 | 0.8044 | 0.6816 | 0.0219 | 0.8615 | 0.7587 | 1e-10 | 80 |
| 0.0161 | 0.8031 | 0.6796 | 0.0214 | 0.8611 | 0.7581 | 1e-10 | 81 |
| 0.0160 | 0.8016 | 0.6782 | 0.0209 | 0.8622 | 0.7599 | 1e-10 | 82 |
| 0.0162 | 0.8040 | 0.6810 | 0.0211 | 0.8623 | 0.7601 | 1e-10 | 83 |
| 0.0159 | 0.8065 | 0.6844 | 0.0210 | 0.8624 | 0.7602 | 1e-10 | 84 |
| 0.0159 | 0.8064 | 0.6841 | 0.0216 | 0.8613 | 0.7585 | 1e-10 | 85 |
| 0.0159 | 0.8068 | 0.6851 | 0.0212 | 0.8626 | 0.7604 | 1e-10 | 86 |
| 0.0158 | 0.8049 | 0.6822 | 0.0222 | 0.8600 | 0.7564 | 1e-10 | 87 |
| 0.0161 | 0.8028 | 0.6797 | 0.0210 | 0.8621 | 0.7597 | 1e-10 | 88 |
| 0.0163 | 0.8050 | 0.6814 | 0.0218 | 0.8602 | 0.7567 | 1e-10 | 89 |
| 0.0159 | 0.8077 | 0.6858 | 0.0215 | 0.8611 | 0.7582 | 1e-10 | 90 |
| 0.0159 | 0.8067 | 0.6841 | 0.0213 | 0.8623 | 0.7599 | 1e-10 | 91 |
| 0.0160 | 0.8064 | 0.6837 | 0.0213 | 0.8615 | 0.7588 | 1e-10 | 92 |
| 0.0160 | 0.8073 | 0.6847 | 0.0209 | 0.8627 | 0.7606 | 1e-10 | 93 |
| 0.0159 | 0.8056 | 0.6833 | 0.0214 | 0.8612 | 0.7583 | 1e-10 | 94 |
| 0.0159 | 0.8073 | 0.6852 | 0.0213 | 0.8616 | 0.7590 | 1e-10 | 95 |
| 0.0158 | 0.8051 | 0.6832 | 0.0219 | 0.8615 | 0.7587 | 1e-10 | 96 |
| 0.0161 | 0.8053 | 0.6826 | 0.0220 | 0.8593 | 0.7555 | 1e-10 | 97 |
| 0.0161 | 0.8059 | 0.6832 | 0.0218 | 0.8608 | 0.7577 | 1e-10 | 98 |
| 0.0159 | 0.8034 | 0.6803 | 0.0223 | 0.8606 | 0.7573 | 1e-10 | 99 |
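The learning-rate column above decays geometrically: 1e-04 → 5e-06 → 2.5e-07 → 1.25e-08 → 6.25e-10 → 1e-10, i.e. each reduction multiplies the rate by 0.05 with a floor of 1e-10. This pattern is consistent with a Keras `ReduceLROnPlateau(factor=0.05, min_lr=1e-10)` callback, though the card does not state which callback was actually used. A sketch of that arithmetic, under those assumed values:

```python
# Assumed from the LR column: factor 0.05, floor 1e-10 (not stated in the card).
FACTOR, MIN_LR = 0.05, 1e-10

def next_lr(lr, factor=FACTOR, min_lr=MIN_LR):
    """One plateau reduction: multiply by the factor, clip at the floor."""
    return max(lr * factor, min_lr)

rates = [1e-4]
for _ in range(5):
    rates.append(next_lr(rates[-1]))
# rates reproduces the table's LR column:
# [1e-04, 5e-06, 2.5e-07, 1.25e-08, 6.25e-10, 1e-10]
```

Note the final step would be 3.125e-11 without the floor; the clip at 1e-10 explains why the last two distinct rates differ by only ×0.16 rather than ×0.05.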
### Framework versions
- Transformers 4.27.4
- TensorFlow 2.11.0
- Tokenizers 0.13.2