---
license: other
tags:
- generated_from_keras_callback
model-index:
- name: AhamadShaik/SegFormer_RESIZE_NLM
  results: []
---

# AhamadShaik/SegFormer_RESIZE_NLM

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
It achieves the following results after the final training epoch:
- Train Loss: 0.0421
- Train Dice Coef: 0.8821
- Train IoU: 0.7909
- Validation Loss: 0.0426
- Validation Dice Coef: 0.8896
- Validation IoU: 0.8023
- Train LR: 1e-10
- Epoch: 57
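
The Dice coefficient and IoU reported above are standard overlap metrics for segmentation masks. The card does not include the exact implementation used here, so the following is a minimal sketch of how these metrics are typically defined in Keras; the smoothing constant is an assumption, added to avoid division by zero on empty masks.

```python
import tensorflow as tf

def dice_coef(y_true, y_pred, smooth=1.0):
    # 2*|A intersect B| / (|A| + |B|) over flattened masks; `smooth` is an
    # assumption -- the card does not state the constant actually used.
    y_true_f = tf.reshape(tf.cast(y_true, tf.float32), [-1])
    y_pred_f = tf.reshape(tf.cast(y_pred, tf.float32), [-1])
    intersection = tf.reduce_sum(y_true_f * y_pred_f)
    return (2.0 * intersection + smooth) / (
        tf.reduce_sum(y_true_f) + tf.reduce_sum(y_pred_f) + smooth)

def iou(y_true, y_pred, smooth=1.0):
    # |A intersect B| / |A union B|, with the same smoothing assumption.
    y_true_f = tf.reshape(tf.cast(y_true, tf.float32), [-1])
    y_pred_f = tf.reshape(tf.cast(y_pred, tf.float32), [-1])
    intersection = tf.reduce_sum(y_true_f * y_pred_f)
    union = tf.reduce_sum(y_true_f) + tf.reduce_sum(y_pred_f) - intersection
    return (intersection + smooth) / (union + smooth)
```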

## Model description

This is a TensorFlow/Keras SegFormer model with the [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) (MiT-B0) backbone, fine-tuned for semantic segmentation. The Dice coefficient and IoU metrics reported above suggest a mask-prediction (likely binary segmentation) task, but the card does not document the target classes, dataset, or input resolution.
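
A minimal inference sketch, assuming the checkpoint loads with the standard `transformers` TensorFlow SegFormer classes and that the base model's preprocessor config applies (this repository may or may not ship its own `preprocessor_config.json`):

```python
import numpy as np
import tensorflow as tf
from transformers import SegformerImageProcessor, TFSegformerForSemanticSegmentation

# Preprocessor config borrowed from the base checkpoint (an assumption).
processor = SegformerImageProcessor.from_pretrained("nvidia/mit-b0")
model = TFSegformerForSemanticSegmentation.from_pretrained(
    "AhamadShaik/SegFormer_RESIZE_NLM"
)

image = np.random.randint(0, 256, (512, 512, 3), dtype=np.uint8)  # placeholder image
inputs = processor(images=image, return_tensors="tf")
outputs = model(**inputs)

# SegFormer logits come out channels-first at 1/4 resolution; move channels
# last, upsample to the input size, then take the per-pixel argmax.
logits = tf.transpose(outputs.logits, [0, 2, 3, 1])
upsampled = tf.image.resize(logits, image.shape[:2])
mask = tf.argmax(upsampled, axis=-1)[0].numpy()
```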

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 1e-10, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
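
Note that the `learning_rate` stored in the optimizer dictionary (1e-10) is the value at the end of training; the results table below shows training started at 1e-4. A hedged reconstruction of the optimizer in TensorFlow 2.10 / Keras:

```python
import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(
    learning_rate=1e-4,  # initial LR; the dict above records the final value (1e-10)
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    decay=0.0,
)
```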

### Training results

| Train Loss | Train Dice Coef | Train IoU | Validation Loss | Validation Dice Coef | Validation IoU | Train LR | Epoch |
|:----------:|:---------------:|:---------:|:---------------:|:--------------------:|:--------------:|:--------:|:-----:|
| 0.2282     | 0.5657          | 0.4102    | 0.1322          | 0.6524               | 0.4967         | 1e-04    | 0     |
| 0.1354     | 0.6853          | 0.5329    | 0.0855          | 0.7853               | 0.6544         | 1e-04    | 1     |
| 0.1105     | 0.7364          | 0.5924    | 0.0737          | 0.8147               | 0.6916         | 1e-04    | 2     |
| 0.0985     | 0.7610          | 0.6226    | 0.0632          | 0.8518               | 0.7440         | 1e-04    | 3     |
| 0.0933     | 0.7745          | 0.6399    | 0.0627          | 0.8455               | 0.7351         | 1e-04    | 4     |
| 0.0886     | 0.7856          | 0.6535    | 0.0584          | 0.8603               | 0.7566         | 1e-04    | 5     |
| 0.0831     | 0.7971          | 0.6695    | 0.0559          | 0.8621               | 0.7596         | 1e-04    | 6     |
| 0.0770     | 0.8107          | 0.6867    | 0.0530          | 0.8726               | 0.7756         | 1e-04    | 7     |
| 0.0741     | 0.8160          | 0.6942    | 0.0512          | 0.8775               | 0.7832         | 1e-04    | 8     |
| 0.0750     | 0.8163          | 0.6945    | 0.0581          | 0.8627               | 0.7606         | 1e-04    | 9     |
| 0.0678     | 0.8306          | 0.7138    | 0.0531          | 0.8719               | 0.7745         | 1e-04    | 10    |
| 0.0659     | 0.8341          | 0.7196    | 0.0519          | 0.8738               | 0.7781         | 1e-04    | 11    |
| 0.0626     | 0.8412          | 0.7294    | 0.0496          | 0.8789               | 0.7853         | 1e-04    | 12    |
| 0.0637     | 0.8383          | 0.7257    | 0.0515          | 0.8772               | 0.7828         | 1e-04    | 13    |
| 0.0601     | 0.8462          | 0.7367    | 0.0498          | 0.8765               | 0.7814         | 1e-04    | 14    |
| 0.0573     | 0.8525          | 0.7458    | 0.0474          | 0.8817               | 0.7897         | 1e-04    | 15    |
| 0.0565     | 0.8520          | 0.7456    | 0.0459          | 0.8850               | 0.7948         | 1e-04    | 16    |
| 0.0633     | 0.8381          | 0.7262    | 0.0487          | 0.8797               | 0.7868         | 1e-04    | 17    |
| 0.0558     | 0.8544          | 0.7489    | 0.0476          | 0.8828               | 0.7917         | 1e-04    | 18    |
| 0.0523     | 0.8617          | 0.7595    | 0.0454          | 0.8872               | 0.7983         | 1e-04    | 19    |
| 0.0516     | 0.8632          | 0.7617    | 0.0465          | 0.8838               | 0.7934         | 1e-04    | 20    |
| 0.0515     | 0.8636          | 0.7625    | 0.0494          | 0.8816               | 0.7894         | 1e-04    | 21    |
| 0.0518     | 0.8630          | 0.7615    | 0.0487          | 0.8836               | 0.7930         | 1e-04    | 22    |
| 0.0521     | 0.8616          | 0.7595    | 0.0483          | 0.8822               | 0.7908         | 1e-04    | 23    |
| 0.0510     | 0.8634          | 0.7624    | 0.0501          | 0.8814               | 0.7899         | 1e-04    | 24    |
| 0.0485     | 0.8703          | 0.7728    | 0.0439          | 0.8892               | 0.8018         | 5e-06    | 25    |
| 0.0464     | 0.8755          | 0.7807    | 0.0433          | 0.8890               | 0.8015         | 5e-06    | 26    |
| 0.0456     | 0.8760          | 0.7817    | 0.0439          | 0.8884               | 0.8004         | 5e-06    | 27    |
| 0.0446     | 0.8790          | 0.7860    | 0.0428          | 0.8896               | 0.8024         | 5e-06    | 28    |
| 0.0443     | 0.8786          | 0.7855    | 0.0426          | 0.8905               | 0.8038         | 5e-06    | 29    |
| 0.0439     | 0.8795          | 0.7867    | 0.0439          | 0.8881               | 0.7999         | 5e-06    | 30    |
| 0.0436     | 0.8800          | 0.7876    | 0.0429          | 0.8902               | 0.8032         | 5e-06    | 31    |
| 0.0430     | 0.8809          | 0.7890    | 0.0439          | 0.8876               | 0.7992         | 5e-06    | 32    |
| 0.0427     | 0.8812          | 0.7894    | 0.0432          | 0.8892               | 0.8016         | 5e-06    | 33    |
| 0.0431     | 0.8798          | 0.7875    | 0.0433          | 0.8895               | 0.8022         | 5e-06    | 34    |
| 0.0425     | 0.8816          | 0.7903    | 0.0435          | 0.8892               | 0.8016         | 2.5e-07  | 35    |
| 0.0420     | 0.8826          | 0.7917    | 0.0433          | 0.8894               | 0.8021         | 2.5e-07  | 36    |
| 0.0423     | 0.8833          | 0.7926    | 0.0429          | 0.8893               | 0.8018         | 2.5e-07  | 37    |
| 0.0420     | 0.8833          | 0.7929    | 0.0430          | 0.8895               | 0.8023         | 2.5e-07  | 38    |
| 0.0424     | 0.8832          | 0.7924    | 0.0437          | 0.8890               | 0.8013         | 2.5e-07  | 39    |
| 0.0422     | 0.8824          | 0.7914    | 0.0427          | 0.8897               | 0.8024         | 1.25e-08 | 40    |
| 0.0426     | 0.8824          | 0.7913    | 0.0431          | 0.8900               | 0.8030         | 1.25e-08 | 41    |
| 0.0424     | 0.8832          | 0.7926    | 0.0433          | 0.8893               | 0.8019         | 1.25e-08 | 42    |
| 0.0424     | 0.8830          | 0.7922    | 0.0436          | 0.8886               | 0.8008         | 1.25e-08 | 43    |
| 0.0427     | 0.8806          | 0.7888    | 0.0434          | 0.8893               | 0.8020         | 1.25e-08 | 44    |
| 0.0421     | 0.8829          | 0.7921    | 0.0431          | 0.8899               | 0.8028         | 6.25e-10 | 45    |
| 0.0427     | 0.8817          | 0.7901    | 0.0431          | 0.8896               | 0.8023         | 6.25e-10 | 46    |
| 0.0422     | 0.8825          | 0.7916    | 0.0433          | 0.8895               | 0.8022         | 6.25e-10 | 47    |
| 0.0423     | 0.8823          | 0.7912    | 0.0431          | 0.8897               | 0.8024         | 6.25e-10 | 48    |
| 0.0423     | 0.8826          | 0.7916    | 0.0433          | 0.8895               | 0.8021         | 6.25e-10 | 49    |
| 0.0425     | 0.8827          | 0.7918    | 0.0433          | 0.8896               | 0.8023         | 1e-10    | 50    |
| 0.0421     | 0.8838          | 0.7937    | 0.0431          | 0.8891               | 0.8014         | 1e-10    | 51    |
| 0.0424     | 0.8820          | 0.7907    | 0.0436          | 0.8884               | 0.8003         | 1e-10    | 52    |
| 0.0424     | 0.8824          | 0.7915    | 0.0426          | 0.8899               | 0.8029         | 1e-10    | 53    |
| 0.0423     | 0.8828          | 0.7920    | 0.0433          | 0.8894               | 0.8020         | 1e-10    | 54    |
| 0.0424     | 0.8818          | 0.7905    | 0.0431          | 0.8901               | 0.8031         | 1e-10    | 55    |
| 0.0421     | 0.8823          | 0.7911    | 0.0438          | 0.8887               | 0.8008         | 1e-10    | 56    |
| 0.0421     | 0.8821          | 0.7909    | 0.0426          | 0.8896               | 0.8023         | 1e-10    | 57    |
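
The learning-rate column follows a geometric pattern (1e-04 → 5e-06 → 2.5e-07 → 1.25e-08 → 6.25e-10, i.e. a factor of 0.05 per reduction, floored at 1e-10 from epoch 50 onward), which is consistent with a `ReduceLROnPlateau`-style callback. A hypothetical reconstruction follows; the monitored metric and patience are assumptions, as the card does not record them.

```python
import tensorflow as tf

lr_schedule = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss",  # assumption: the card does not name the monitored metric
    factor=0.05,         # matches the observed 1e-4 -> 5e-6 -> 2.5e-7 -> ... steps
    patience=5,          # assumption: the exact patience is not recorded
    min_lr=1e-10,        # matches the floor reached from epoch 50 onward
)
# `model`, `train_ds`, and `val_ds` are placeholders for the user's own objects:
# model.fit(train_ds, validation_data=val_ds, epochs=58, callbacks=[lr_schedule])
```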


### Framework versions

- Transformers 4.27.4
- TensorFlow 2.10.1
- Datasets 2.11.0
- Tokenizers 0.13.3