---
license: other
tags:
- generated_from_keras_callback
model-index:
- name: AhamadShaik/SegFormer_RESIZE_NLM
  results: []
---

# AhamadShaik/SegFormer_RESIZE_NLM

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0510
- Train Dice Coef: 0.8634
- Train IoU: 0.7624
- Validation Loss: 0.0501
- Validation Dice Coef: 0.8814
- Validation IoU: 0.7899
- Train LR: 1e-04
- Epoch: 24
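
Dice coefficient and IoU both measure overlap between a predicted mask and the ground-truth mask. The exact metric implementation used during training is not published; below is a minimal TensorFlow sketch of the standard definitions for binary masks:

```python
import tensorflow as tf

def dice_coef(y_true, y_pred, smooth=1e-6):
    """Dice = 2|A∩B| / (|A| + |B|); assumes binary masks in {0, 1}."""
    y_true = tf.reshape(tf.cast(y_true, tf.float32), [-1])
    y_pred = tf.reshape(tf.cast(y_pred, tf.float32), [-1])
    intersection = tf.reduce_sum(y_true * y_pred)
    return (2.0 * intersection + smooth) / (
        tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) + smooth
    )

def iou(y_true, y_pred, smooth=1e-6):
    """IoU = |A∩B| / |A∪B|; assumes binary masks in {0, 1}."""
    y_true = tf.reshape(tf.cast(y_true, tf.float32), [-1])
    y_pred = tf.reshape(tf.cast(y_pred, tf.float32), [-1])
    intersection = tf.reduce_sum(y_true * y_pred)
    union = tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) - intersection
    return (intersection + smooth) / (union + smooth)
```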

## Model description

The card documents only that this is a TensorFlow/Keras fine-tune of the SegFormer MiT-B0 backbone ([nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0)) for semantic segmentation, tracked with Dice coefficient and IoU. Further details about the task and training setup are not provided.

## Intended uses & limitations

More information needed. The reported Dice/IoU metrics indicate a semantic segmentation task; a hedged inference sketch follows below.
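
The checkpoint can presumably be loaded with the standard Transformers TF classes. A minimal inference sketch, assuming the repository ships TF weights and an image-processor config (if the processor config is missing, the base `nvidia/mit-b0` processor is a reasonable fallback); `example.png` is a hypothetical input:

```python
import tensorflow as tf
from PIL import Image
from transformers import SegformerImageProcessor, TFSegformerForSemanticSegmentation

repo = "AhamadShaik/SegFormer_RESIZE_NLM"
processor = SegformerImageProcessor.from_pretrained(repo)  # fall back to "nvidia/mit-b0" if absent
model = TFSegformerForSemanticSegmentation.from_pretrained(repo)

image = Image.open("example.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="tf")
outputs = model(**inputs)

# Logits come out channels-first at 1/4 resolution: (batch, num_labels, H/4, W/4).
logits = tf.transpose(outputs.logits, [0, 2, 3, 1])      # to channels-last for tf.image.resize
logits = tf.image.resize(logits, size=image.size[::-1])  # upsample to (height, width)
mask = tf.argmax(logits, axis=-1)[0].numpy()             # per-pixel class ids
```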

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 5e-06, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
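
Note that the logged optimizer learning rate (5e-06) differs from the per-epoch `Train LR` column in the results table below (1e-04); the card does not document a schedule, so the discrepancy is left as reported. Rebuilding the logged config in Keras is straightforward (a sketch; the original training script is not published):

```python
import tensorflow as tf

# Recreate the logged Adam configuration (TF 2.10-era keyword names).
optimizer = tf.keras.optimizers.Adam(
    learning_rate=5e-06,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    decay=0.0,  # legacy time-based decay term from the logged config
)
```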

### Training results

| Train Loss | Train Dice Coef | Train IoU | Validation Loss | Validation Dice Coef | Validation IoU | Train LR | Epoch |
|:----------:|:---------------:|:---------:|:---------------:|:--------------------:|:--------------:|:--------:|:-----:|
| 0.2282     | 0.5657          | 0.4102    | 0.1322          | 0.6524               | 0.4967         | 1e-04    | 0     |
| 0.1354     | 0.6853          | 0.5329    | 0.0855          | 0.7853               | 0.6544         | 1e-04    | 1     |
| 0.1105     | 0.7364          | 0.5924    | 0.0737          | 0.8147               | 0.6916         | 1e-04    | 2     |
| 0.0985     | 0.7610          | 0.6226    | 0.0632          | 0.8518               | 0.7440         | 1e-04    | 3     |
| 0.0933     | 0.7745          | 0.6399    | 0.0627          | 0.8455               | 0.7351         | 1e-04    | 4     |
| 0.0886     | 0.7856          | 0.6535    | 0.0584          | 0.8603               | 0.7566         | 1e-04    | 5     |
| 0.0831     | 0.7971          | 0.6695    | 0.0559          | 0.8621               | 0.7596         | 1e-04    | 6     |
| 0.0770     | 0.8107          | 0.6867    | 0.0530          | 0.8726               | 0.7756         | 1e-04    | 7     |
| 0.0741     | 0.8160          | 0.6942    | 0.0512          | 0.8775               | 0.7832         | 1e-04    | 8     |
| 0.0750     | 0.8163          | 0.6945    | 0.0581          | 0.8627               | 0.7606         | 1e-04    | 9     |
| 0.0678     | 0.8306          | 0.7138    | 0.0531          | 0.8719               | 0.7745         | 1e-04    | 10    |
| 0.0659     | 0.8341          | 0.7196    | 0.0519          | 0.8738               | 0.7781         | 1e-04    | 11    |
| 0.0626     | 0.8412          | 0.7294    | 0.0496          | 0.8789               | 0.7853         | 1e-04    | 12    |
| 0.0637     | 0.8383          | 0.7257    | 0.0515          | 0.8772               | 0.7828         | 1e-04    | 13    |
| 0.0601     | 0.8462          | 0.7367    | 0.0498          | 0.8765               | 0.7814         | 1e-04    | 14    |
| 0.0573     | 0.8525          | 0.7458    | 0.0474          | 0.8817               | 0.7897         | 1e-04    | 15    |
| 0.0565     | 0.8520          | 0.7456    | 0.0459          | 0.8850               | 0.7948         | 1e-04    | 16    |
| 0.0633     | 0.8381          | 0.7262    | 0.0487          | 0.8797               | 0.7868         | 1e-04    | 17    |
| 0.0558     | 0.8544          | 0.7489    | 0.0476          | 0.8828               | 0.7917         | 1e-04    | 18    |
| 0.0523     | 0.8617          | 0.7595    | 0.0454          | 0.8872               | 0.7983         | 1e-04    | 19    |
| 0.0516     | 0.8632          | 0.7617    | 0.0465          | 0.8838               | 0.7934         | 1e-04    | 20    |
| 0.0515     | 0.8636          | 0.7625    | 0.0494          | 0.8816               | 0.7894         | 1e-04    | 21    |
| 0.0518     | 0.8630          | 0.7615    | 0.0487          | 0.8836               | 0.7930         | 1e-04    | 22    |
| 0.0521     | 0.8616          | 0.7595    | 0.0483          | 0.8822               | 0.7908         | 1e-04    | 23    |
| 0.0510     | 0.8634          | 0.7624    | 0.0501          | 0.8814               | 0.7899         | 1e-04    | 24    |


### Framework versions

- Transformers 4.27.4
- TensorFlow 2.10.1
- Datasets 2.11.0
- Tokenizers 0.13.3