gogobattle committed
Commit 78d5163
1 Parent(s): 52a4d13

End of training
README.md CHANGED
@@ -17,17 +17,12 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
  It achieves the following results on the evaluation set:
- - eval_loss: 3.8239
- - eval_mean_iou: 0.0561
- - eval_mean_accuracy: 0.1192
- - eval_overall_accuracy: 0.4745
- - eval_per_category_iou: [0.4655971148378795, 0.451911745929416, 0.8263958610052735, 0.10643353454379628, 0.37555328491585666, 0.010895539847261911, 0.01388106905779724, 0.010106569270025517, 0.0, 0.4158521677639333, 0.0, 0.0, 0.13453689433840837, 0.0, 0.0, 0.03205838274581351, 0.029376135675348275, 0.0, 0.0, 0.0, 0.0, 0.0, 0.04515055731159541, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan]
- - eval_per_category_accuracy: [0.7553991473096148, 0.6333018497116608, 0.9809226320451422, 0.8229344785991702, 0.8472403000645834, 0.01553860819828408, 0.020446819838637015, 0.010194297249558415, 0.0, 0.4722481604601651, 0.0, 0.0, 0.14492912265338515, 0.0, 0.0, 0.03903832420852178, 0.029645476772616138, 0.0, nan, 0.0, 0.0, 0.0, 0.11732664706899533, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan]
- - eval_runtime: 21.0723
- - eval_samples_per_second: 0.475
- - eval_steps_per_second: 0.237
- - epoch: 4.0
- - step: 80
 
  ## Model description
 
@@ -47,12 +42,20 @@ More information needed
 
  The following hyperparameters were used during training:
  - learning_rate: 6e-05
- - train_batch_size: 2
- - eval_batch_size: 2
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 50
 
  ### Framework versions
 
  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 3.6538
+ - Mean Iou: 0.0513
+ - Mean Accuracy: 0.1002
+ - Overall Accuracy: 0.3674
+ - Per Category Iou: [0.41523063020750783, 0.009939916202993341, 0.9051053539418947, 0.2405546042011158, 0.5592627102446484, 0.008100205817661452, 0.4158655729996604, 0.0, nan, 0.0, 0.0, 0.0, 0.1653567193403381, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.00014481591921904725, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan]
+ - Per Category Accuracy: [0.9111665467920647, 0.5584905660377358, 0.96496051635236, 0.7798917474318848, 0.9961487722881824, 0.008167736993524596, 0.4204927396509486, nan, nan, nan, 0.0, 0.0, 0.17034574845876305, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.00014681348014681348, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan]
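
The reported Mean Iou and Mean Accuracy appear to be averages of the per-category lists with the `nan` entries (categories that never occur in the evaluation images) excluded, which is how semantic-segmentation mean-IoU metrics are conventionally aggregated. A minimal sketch of that aggregation, using toy values rather than the full 150-category lists:

```python
import math

def nan_ignoring_mean(per_category_scores):
    """Average per-category scores, skipping nan entries
    (categories absent from the evaluation set)."""
    valid = [v for v in per_category_scores if not math.isnan(v)]
    return sum(valid) / len(valid)

# Toy example: three observed categories, one absent (nan) category.
toy_iou = [0.4, 0.9, math.nan, 0.2]
print(nan_ignoring_mean(toy_iou))  # ~0.5: the nan slot is dropped, not zeroed
```

Note that a category with `nan` is excluded from the mean, whereas a category with score 0.0 pulls it down, which is why the many 0.0 entries above yield such a low Mean Iou.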
 
  ## Model description
 
  The following hyperparameters were used during training:
  - learning_rate: 6e-05
+ - train_batch_size: 4
+ - eval_batch_size: 4
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
+ - num_epochs: 5
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
+ | 3.2677 | 2.0 | 20 | 3.6205 | 0.0524 | 0.0966 | 0.3613 | [0.40213077197792346, 0.009939273919119628, 0.8721087225424647, 0.2050790121535925, 0.6743243243243243, 0.006688714489444168, 0.4140137237846719, 0.0, nan, 0.0, 0.0, 0.0, 0.15446220048180365, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.003601957810561147, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.00043712407329696463, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.03405304544848727, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan] | [0.9390478618316233, 0.46037735849056605, 0.972415533840551, 0.6499966195928399, 0.9874249968083748, 0.006809416501144552, 0.4243094579335574, nan, nan, nan, 0.0, 0.0, 0.1590859588814076, nan, 0.0, 0.0, nan, 0.0, nan, 0.003607242099449421, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.00043712407329696463, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.03405304544848727, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan] |
+ | 3.1569 | 4.0 | 40 | 3.6538 | 0.0513 | 0.1002 | 0.3674 | [0.41523063020750783, 0.009939916202993341, 0.9051053539418947, 0.2405546042011158, 0.5592627102446484, 0.008100205817661452, 0.4158655729996604, 0.0, nan, 0.0, 0.0, 0.0, 0.1653567193403381, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.00014481591921904725, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan] | [0.9111665467920647, 0.5584905660377358, 0.96496051635236, 0.7798917474318848, 0.9961487722881824, 0.008167736993524596, 0.4204927396509486, nan, nan, nan, 0.0, 0.0, 0.17034574845876305, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.00014681348014681348, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan] |
+
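
Given `learning_rate: 6e-05`, `num_epochs: 5`, and the step counts visible in the results table (step 40 at epoch 4.0, i.e. 10 optimizer steps per epoch and 50 steps in total), the linear scheduler implies a learning-rate decay like the sketch below. The zero-warmup assumption and the inferred step counts are mine; the exact Trainer configuration is not shown in this card.

```python
# Sketch of the linear LR decay implied by the hyperparameters above:
# base LR 6e-05 decayed linearly to 0 over 50 optimizer steps
# (5 epochs x 10 steps/epoch, inferred from the results table).
# Assumes zero warmup steps.
BASE_LR = 6e-05
TOTAL_STEPS = 50

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Learning rate after `step` optimizer steps under linear decay to 0."""
    return base_lr * max(0, total_steps - step) / total_steps

print(linear_lr(0))   # ~6e-05 at the start of training
print(linear_lr(25))  # ~3e-05 halfway through
print(linear_lr(50))  # 0.0 at the final step
```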
 
  ### Framework versions
 
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:e7732f5bf73a3944fc5e5282086d42f76042267af3b44bd22b94da21358bcd44
  size 15036944
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:0f4a363c9edd484616b602ea2305ccbf1d86079df62f92eac71cb794f0a2595b
  size 15036944
runs/Jun25_15-00-23_8c74e9b89fb9/events.out.tfevents.1719327623.8c74e9b89fb9.5763.1 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:576306ab7e54df180f6a705060b454eb8eb5c85fae1e30c52ccd162441c6d585
- size 20281
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:2e67db0e273d0c1214e999241caaf2a53bf27a809a1853f686297dff098d325e
+ size 22699