lombardata committed on
Commit 38e9099
1 Parent(s): d0cdfb4

Upload README.md

Files changed (1)
  1. README.md +148 -103

README.md CHANGED
@@ -1,124 +1,169 @@

  ---
- license: apache-2.0
- base_model: facebook/dinov2-large
  tags:
  - generated_from_trainer
  model-index:
  - name: bd_ortho-DinoVdeau-large-2024_11_27-batch-size64_freeze_probs
  results: []
  ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->

- # bd_ortho-DinoVdeau-large-2024_11_27-batch-size64_freeze_probs

- This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on the None dataset.
- It achieves the following results on the evaluation set:
  - Loss: 0.4551
- - Rmse: 0.0866
- - Mae: 0.0630
- - Kl Divergence: 0.1147
- - Explained Variance: 0.6593
- - Learning Rate: 0.0000

- ## Model description

- More information needed

- ## Intended uses & limitations

- More information needed

- ## Training and evaluation data

- More information needed

- ## Training procedure

- ### Training hyperparameters

  The following hyperparameters were used during training:
- - learning_rate: 0.001
- - train_batch_size: 64
- - eval_batch_size: 64
- - seed: 42
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- - lr_scheduler_type: linear
- - num_epochs: 150
- - mixed_precision_training: Native AMP
-
- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | Rmse | Mae | Kl Divergence | Explained Variance | Rate |
- |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:-------------:|:------------------:|:------:|
- | No log | 1.0 | 221 | 0.4634 | 0.1018 | 0.0760 | 0.0696 | 0.5492 | 0.001 |
- | No log | 2.0 | 442 | 0.4593 | 0.0952 | 0.0716 | 0.0038 | 0.6113 | 0.001 |
- | 0.5185 | 3.0 | 663 | 0.4574 | 0.0918 | 0.0670 | 0.0583 | 0.6245 | 0.001 |
- | 0.5185 | 4.0 | 884 | 0.4595 | 0.0955 | 0.0713 | -0.0650 | 0.6130 | 0.001 |
- | 0.4806 | 5.0 | 1105 | 0.4593 | 0.0954 | 0.0702 | -0.0835 | 0.6206 | 0.001 |
- | 0.4806 | 6.0 | 1326 | 0.4608 | 0.0977 | 0.0728 | -0.0705 | 0.6041 | 0.001 |
- | 0.4786 | 7.0 | 1547 | 0.4581 | 0.0927 | 0.0683 | -0.0044 | 0.6283 | 0.001 |
- | 0.4786 | 8.0 | 1768 | 0.4573 | 0.0916 | 0.0680 | 0.0799 | 0.6277 | 0.001 |
- | 0.4786 | 9.0 | 1989 | 0.4594 | 0.0947 | 0.0706 | 0.0233 | 0.6196 | 0.001 |
- | 0.4776 | 10.0 | 2210 | 0.4577 | 0.0918 | 0.0675 | 0.0885 | 0.6293 | 0.001 |
- | 0.4776 | 11.0 | 2431 | 0.4564 | 0.0898 | 0.0662 | 0.1296 | 0.6422 | 0.001 |
- | 0.4772 | 12.0 | 2652 | 0.4572 | 0.0913 | 0.0677 | -0.0061 | 0.6386 | 0.001 |
- | 0.4772 | 13.0 | 2873 | 0.4623 | 0.1002 | 0.0747 | -0.2060 | 0.6186 | 0.001 |
- | 0.4769 | 14.0 | 3094 | 0.4578 | 0.0925 | 0.0678 | -0.0371 | 0.6346 | 0.001 |
- | 0.4769 | 15.0 | 3315 | 0.4575 | 0.0917 | 0.0667 | 0.0458 | 0.6340 | 0.001 |
- | 0.4766 | 16.0 | 3536 | 0.4579 | 0.0926 | 0.0680 | 0.0151 | 0.6277 | 0.001 |
- | 0.4766 | 17.0 | 3757 | 0.4592 | 0.0949 | 0.0702 | -0.0679 | 0.6246 | 0.001 |
- | 0.4766 | 18.0 | 3978 | 0.4557 | 0.0887 | 0.0651 | 0.0421 | 0.6493 | 0.0001 |
- | 0.4758 | 19.0 | 4199 | 0.4556 | 0.0885 | 0.0647 | 0.0468 | 0.6508 | 0.0001 |
- | 0.4758 | 20.0 | 4420 | 0.4555 | 0.0884 | 0.0648 | 0.0405 | 0.6518 | 0.0001 |
- | 0.4741 | 21.0 | 4641 | 0.4555 | 0.0884 | 0.0650 | 0.0475 | 0.6533 | 0.0001 |
- | 0.4741 | 22.0 | 4862 | 0.4555 | 0.0883 | 0.0646 | 0.0570 | 0.6535 | 0.0001 |
- | 0.4738 | 23.0 | 5083 | 0.4551 | 0.0874 | 0.0641 | 0.0887 | 0.6570 | 0.0001 |
- | 0.4738 | 24.0 | 5304 | 0.4552 | 0.0878 | 0.0642 | 0.0555 | 0.6553 | 0.0001 |
- | 0.4736 | 25.0 | 5525 | 0.4552 | 0.0878 | 0.0645 | 0.0238 | 0.6582 | 0.0001 |
- | 0.4736 | 26.0 | 5746 | 0.4557 | 0.0885 | 0.0646 | 0.0409 | 0.6572 | 0.0001 |
- | 0.4736 | 27.0 | 5967 | 0.4551 | 0.0876 | 0.0639 | 0.0548 | 0.6576 | 0.0001 |
- | 0.4731 | 28.0 | 6188 | 0.4551 | 0.0876 | 0.0642 | 0.0273 | 0.6588 | 0.0001 |
- | 0.4731 | 29.0 | 6409 | 0.4548 | 0.0869 | 0.0634 | 0.0744 | 0.6618 | 0.0001 |
- | 0.4727 | 30.0 | 6630 | 0.4549 | 0.0873 | 0.0636 | 0.0492 | 0.6595 | 0.0001 |
- | 0.4727 | 31.0 | 6851 | 0.4548 | 0.0869 | 0.0632 | 0.0688 | 0.6613 | 0.0001 |
- | 0.4732 | 32.0 | 7072 | 0.4550 | 0.0874 | 0.0639 | 0.0271 | 0.6602 | 0.0001 |
- | 0.4732 | 33.0 | 7293 | 0.4554 | 0.0882 | 0.0647 | -0.0174 | 0.6580 | 0.0001 |
- | 0.4725 | 34.0 | 7514 | 0.4546 | 0.0866 | 0.0628 | 0.1094 | 0.6616 | 0.0001 |
- | 0.4725 | 35.0 | 7735 | 0.4550 | 0.0874 | 0.0639 | 0.0571 | 0.6583 | 0.0001 |
- | 0.4725 | 36.0 | 7956 | 0.4548 | 0.0869 | 0.0629 | 0.1453 | 0.6616 | 0.0001 |
- | 0.4727 | 37.0 | 8177 | 0.4553 | 0.0881 | 0.0645 | -0.0152 | 0.6587 | 0.0001 |
- | 0.4727 | 38.0 | 8398 | 0.4548 | 0.0870 | 0.0636 | 0.0490 | 0.6613 | 0.0001 |
- | 0.4727 | 39.0 | 8619 | 0.4548 | 0.0870 | 0.0631 | 0.0726 | 0.6610 | 0.0001 |
- | 0.4727 | 40.0 | 8840 | 0.4548 | 0.0870 | 0.0632 | 0.0637 | 0.6605 | 0.0001 |
- | 0.4721 | 41.0 | 9061 | 0.4547 | 0.0869 | 0.0634 | 0.0390 | 0.6628 | 1e-05 |
- | 0.4721 | 42.0 | 9282 | 0.4544 | 0.0862 | 0.0628 | 0.1115 | 0.6657 | 1e-05 |
- | 0.4721 | 43.0 | 9503 | 0.4546 | 0.0866 | 0.0632 | 0.0533 | 0.6646 | 1e-05 |
- | 0.4721 | 44.0 | 9724 | 0.4545 | 0.0864 | 0.0625 | 0.1350 | 0.6648 | 1e-05 |
- | 0.4721 | 45.0 | 9945 | 0.4550 | 0.0874 | 0.0642 | 0.0044 | 0.6625 | 1e-05 |
- | 0.4716 | 46.0 | 10166 | 0.4546 | 0.0867 | 0.0632 | 0.0389 | 0.6642 | 1e-05 |
- | 0.4716 | 47.0 | 10387 | 0.4545 | 0.0866 | 0.0630 | 0.0370 | 0.6651 | 1e-05 |
- | 0.4722 | 48.0 | 10608 | 0.4546 | 0.0868 | 0.0634 | 0.0194 | 0.6645 | 1e-05 |
- | 0.4722 | 49.0 | 10829 | 0.4544 | 0.0862 | 0.0627 | 0.0667 | 0.6667 | 0.0000 |
- | 0.4717 | 50.0 | 11050 | 0.4545 | 0.0865 | 0.0631 | 0.0548 | 0.6651 | 0.0000 |
- | 0.4717 | 51.0 | 11271 | 0.4545 | 0.0865 | 0.0629 | 0.0428 | 0.6651 | 0.0000 |
- | 0.4717 | 52.0 | 11492 | 0.4542 | 0.0859 | 0.0623 | 0.1236 | 0.6672 | 0.0000 |
- | 0.4718 | 53.0 | 11713 | 0.4542 | 0.0859 | 0.0625 | 0.0887 | 0.6672 | 0.0000 |
- | 0.4718 | 54.0 | 11934 | 0.4543 | 0.0862 | 0.0624 | 0.0917 | 0.6653 | 0.0000 |
- | 0.4716 | 55.0 | 12155 | 0.4546 | 0.0865 | 0.0631 | 0.0774 | 0.6650 | 0.0000 |
- | 0.4716 | 56.0 | 12376 | 0.4546 | 0.0866 | 0.0633 | 0.0473 | 0.6649 | 0.0000 |
- | 0.4717 | 57.0 | 12597 | 0.4549 | 0.0871 | 0.0639 | -0.0046 | 0.6658 | 0.0000 |
- | 0.4717 | 58.0 | 12818 | 0.4544 | 0.0864 | 0.0627 | 0.0553 | 0.6656 | 0.0000 |
- | 0.4716 | 59.0 | 13039 | 0.4545 | 0.0865 | 0.0631 | 0.0368 | 0.6654 | 0.0000 |
- | 0.4716 | 60.0 | 13260 | 0.4544 | 0.0863 | 0.0629 | 0.0471 | 0.6660 | 0.0000 |
- | 0.4716 | 61.0 | 13481 | 0.4542 | 0.0860 | 0.0624 | 0.0928 | 0.6670 | 0.0000 |
- | 0.4718 | 62.0 | 13702 | 0.4545 | 0.0866 | 0.0632 | 0.0286 | 0.6661 | 0.0000 |
-
- ### Framework versions
-
- - Transformers 4.41.0
- - Pytorch 2.5.0+cu124
- - Datasets 3.0.2
- - Tokenizers 0.19.1

  ---
+ language:
+ - eng
+ license: cc0-1.0
  tags:
+ - multilabel-image-classification
+ - multilabel
  - generated_from_trainer
+ base_model: facebook/dinov2-large
  model-index:
  - name: bd_ortho-DinoVdeau-large-2024_11_27-batch-size64_freeze_probs
  results: []
  ---

+ bd_ortho-DinoVdeau is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set:

  - Loss: 0.4551
+ - RMSE: 0.0866
+ - MAE: 0.0630
+ - KL Divergence: 0.1147
+
+ ---
+
+ # Model description
+ bd_ortho-DinoVdeau is a model built on top of the [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) backbone for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
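+
+ A minimal sketch of what such a head could look like is given below; the layer order, hidden width (1024, the DINOv2-large embedding size) and dropout rate are illustrative assumptions, and the exact definition lives in the training repository linked below.
+
+ ```python
+ import torch.nn as nn
+
+ class ClassificationHead(nn.Module):
+     """Illustrative multilabel head: Linear -> BatchNorm -> ReLU -> Dropout -> Linear."""
+
+     def __init__(self, hidden_size=1024, num_labels=12, dropout=0.1):
+         super().__init__()
+         self.head = nn.Sequential(
+             nn.Linear(hidden_size, hidden_size),
+             nn.BatchNorm1d(hidden_size),
+             nn.ReLU(),
+             nn.Dropout(dropout),
+             nn.Linear(hidden_size, num_labels),
+         )
+
+     def forward(self, features):
+         # features: pooled DINOv2 embeddings of shape (batch, hidden_size)
+         return self.head(features)
+ ```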
+ The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).
+
+ - **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
+
+ ---
+
+ # Intended uses & limitations
+ You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes and seagrass species.
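+
+ For instance, the checkpoint can be loaded for inference roughly as sketched below. This is a hedged example: the repository id, the use of the generic `AutoModelForImageClassification` class and the 0.5 decision threshold are assumptions, and a checkpoint with a custom head may instead require the loading code from the Git repository above.
+
+ ```python
+ import torch
+ from PIL import Image
+ from transformers import AutoImageProcessor, AutoModelForImageClassification
+
+ repo = "lombardata/bd_ortho-DinoVdeau-large-2024_11_27-batch-size64_freeze_probs"  # assumed repo id
+ processor = AutoImageProcessor.from_pretrained(repo)
+ model = AutoModelForImageClassification.from_pretrained(repo)
+
+ image = Image.open("reef_tile.jpg")  # replace with your own image
+ inputs = processor(images=image, return_tensors="pt")
+ with torch.no_grad():
+     logits = model(**inputs).logits
+
+ probs = torch.sigmoid(logits)[0]  # multilabel: independent per-class probabilities
+ for label, p in zip(model.config.id2label.values(), probs.tolist()):
+     flag = "*" if p > 0.5 else " "  # assumed 0.5 decision threshold
+     print(f"{flag} {label}: {p:.3f}")
+ ```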
+
+ ---
+
+ # Training and evaluation data
+ Details on the estimated number of images for each class are given in the following table:
+ | Class | train | test | val | Total |
+ |:------------------------|--------:|-------:|------:|--------:|
+ | Acropore_branched | 8074 | 2667 | 2658 | 13399 |
+ | Acropore_digitised | 3730 | 829 | 823 | 5382 |
+ | Acropore_tabular | 125 | 23 | 40 | 188 |
+ | Algae | 14027 | 4662 | 4660 | 23349 |
+ | Dead_coral | 11369 | 3364 | 3369 | 18102 |
+ | Millepore | 2 | 1 | 1 | 4 |
+ | No_acropore_encrusting | 0 | 0 | 0 | 0 |
+ | No_acropore_massive | 3265 | 423 | 463 | 4151 |
+ | No_acropore_sub_massive | 10241 | 2911 | 2924 | 16076 |
+ | Rock | 14090 | 4694 | 4693 | 23477 |
+ | Rubble | 12455 | 3915 | 3883 | 20253 |
+ | Sand | 12848 | 4098 | 4079 | 21025 |

+ ---
+
+ # Training procedure
+
+ ## Training hyperparameters
+
  The following hyperparameters were used during training:
+
+ - **Number of Epochs**: 62
+ - **Learning Rate**: 0.001
+ - **Train Batch Size**: 64
+ - **Eval Batch Size**: 64
+ - **Optimizer**: Adam
+ - **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1 (see the sketch after this list)
+ - **Freeze Encoder**: Yes
+ - **Data Augmentation**: Yes
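+
+ A minimal sketch of how this optimizer and scheduler combination is typically wired together in PyTorch is shown below; the `model` and the loop body are placeholders for illustration, not code from the DinoVdeau repository.
+
+ ```python
+ import torch
+
+ model = torch.nn.Linear(1024, 12)  # placeholder module standing in for the full DinoVdeau model
+ optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
+ scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
+     optimizer, mode="min", factor=0.1, patience=5
+ )
+
+ for epoch in range(62):
+     # ... run one training epoch here and compute the validation loss ...
+     val_loss = 0.45  # placeholder value; use the real validation loss
+     scheduler.step(val_loss)  # divides the LR by 10 after 5 epochs without improvement
+ ```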
+
+ ## Data Augmentation
+ Data were augmented using the following transformations:
+
+ Train Transforms
+ - **PreProcess**: No additional parameters
+ - **Resize**: probability=1.00
+ - **RandomHorizontalFlip**: probability=0.25
+ - **RandomVerticalFlip**: probability=0.25
+ - **ColorJiggle**: probability=0.25
+ - **RandomPerspective**: probability=0.25
+ - **Normalize**: probability=1.00
+
+ Val Transforms
+ - **PreProcess**: No additional parameters
+ - **Resize**: probability=1.00
+ - **Normalize**: probability=1.00
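+
+ As a rough illustration, the train-time pipeline could be expressed with torchvision as below ("ColorJiggle" is the Kornia name for colour jitter, approximated here by `ColorJitter`); the image size, jitter strengths and normalization statistics are assumptions, not values taken from the repository.
+
+ ```python
+ from torchvision import transforms
+
+ train_transforms = transforms.Compose([
+     transforms.Resize((224, 224)),
+     transforms.RandomHorizontalFlip(p=0.25),
+     transforms.RandomVerticalFlip(p=0.25),
+     transforms.RandomApply([transforms.ColorJitter(0.2, 0.2, 0.2, 0.1)], p=0.25),
+     transforms.RandomPerspective(p=0.25),
+     transforms.ToTensor(),
+     transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
+ ])
+ ```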
+
+ ## Training results
+
+ | Epoch | Validation Loss | MAE | RMSE | KL div | Learning Rate |
+ |:-----:|:---------------:|:------:|:------:|:-------:|:-------------:|
+ | 1 | 0.4634 | 0.0760 | 0.1018 | 0.0696 | 0.001 |
+ | 2 | 0.4593 | 0.0716 | 0.0952 | 0.0038 | 0.001 |
+ | 3 | 0.4574 | 0.0670 | 0.0918 | 0.0583 | 0.001 |
+ | 4 | 0.4595 | 0.0713 | 0.0955 | -0.0650 | 0.001 |
+ | 5 | 0.4593 | 0.0702 | 0.0954 | -0.0835 | 0.001 |
+ | 6 | 0.4608 | 0.0728 | 0.0977 | -0.0705 | 0.001 |
+ | 7 | 0.4581 | 0.0683 | 0.0927 | -0.0044 | 0.001 |
+ | 8 | 0.4573 | 0.0680 | 0.0916 | 0.0799 | 0.001 |
+ | 9 | 0.4594 | 0.0706 | 0.0947 | 0.0233 | 0.001 |
+ | 10 | 0.4577 | 0.0675 | 0.0918 | 0.0885 | 0.001 |
+ | 11 | 0.4564 | 0.0662 | 0.0898 | 0.1296 | 0.001 |
+ | 12 | 0.4572 | 0.0677 | 0.0913 | -0.0061 | 0.001 |
+ | 13 | 0.4623 | 0.0747 | 0.1002 | -0.2060 | 0.001 |
+ | 14 | 0.4578 | 0.0678 | 0.0925 | -0.0371 | 0.001 |
+ | 15 | 0.4575 | 0.0667 | 0.0917 | 0.0458 | 0.001 |
+ | 16 | 0.4579 | 0.0680 | 0.0926 | 0.0151 | 0.001 |
+ | 17 | 0.4592 | 0.0702 | 0.0949 | -0.0679 | 0.001 |
+ | 18 | 0.4557 | 0.0651 | 0.0887 | 0.0421 | 0.0001 |
+ | 19 | 0.4556 | 0.0647 | 0.0885 | 0.0468 | 0.0001 |
+ | 20 | 0.4555 | 0.0648 | 0.0884 | 0.0405 | 0.0001 |
+ | 21 | 0.4555 | 0.0650 | 0.0884 | 0.0475 | 0.0001 |
+ | 22 | 0.4555 | 0.0646 | 0.0883 | 0.0570 | 0.0001 |
+ | 23 | 0.4551 | 0.0641 | 0.0874 | 0.0887 | 0.0001 |
+ | 24 | 0.4552 | 0.0642 | 0.0878 | 0.0555 | 0.0001 |
+ | 25 | 0.4552 | 0.0645 | 0.0878 | 0.0238 | 0.0001 |
+ | 26 | 0.4557 | 0.0646 | 0.0885 | 0.0409 | 0.0001 |
+ | 27 | 0.4551 | 0.0639 | 0.0876 | 0.0548 | 0.0001 |
+ | 28 | 0.4551 | 0.0642 | 0.0876 | 0.0273 | 0.0001 |
+ | 29 | 0.4548 | 0.0634 | 0.0869 | 0.0744 | 0.0001 |
+ | 30 | 0.4549 | 0.0636 | 0.0873 | 0.0492 | 0.0001 |
+ | 31 | 0.4548 | 0.0632 | 0.0869 | 0.0688 | 0.0001 |
+ | 32 | 0.4550 | 0.0639 | 0.0874 | 0.0271 | 0.0001 |
+ | 33 | 0.4554 | 0.0647 | 0.0882 | -0.0174 | 0.0001 |
+ | 34 | 0.4546 | 0.0628 | 0.0866 | 0.1094 | 0.0001 |
+ | 35 | 0.4550 | 0.0639 | 0.0874 | 0.0571 | 0.0001 |
+ | 36 | 0.4548 | 0.0629 | 0.0869 | 0.1453 | 0.0001 |
+ | 37 | 0.4553 | 0.0645 | 0.0881 | -0.0152 | 0.0001 |
+ | 38 | 0.4548 | 0.0636 | 0.0870 | 0.0490 | 0.0001 |
+ | 39 | 0.4548 | 0.0631 | 0.0870 | 0.0726 | 0.0001 |
+ | 40 | 0.4548 | 0.0632 | 0.0870 | 0.0637 | 0.0001 |
+ | 41 | 0.4547 | 0.0634 | 0.0869 | 0.0390 | 1e-05 |
+ | 42 | 0.4544 | 0.0628 | 0.0862 | 0.1115 | 1e-05 |
+ | 43 | 0.4546 | 0.0632 | 0.0866 | 0.0533 | 1e-05 |
+ | 44 | 0.4545 | 0.0625 | 0.0864 | 0.1350 | 1e-05 |
+ | 45 | 0.4550 | 0.0642 | 0.0874 | 0.0044 | 1e-05 |
+ | 46 | 0.4546 | 0.0632 | 0.0867 | 0.0389 | 1e-05 |
+ | 47 | 0.4545 | 0.0630 | 0.0866 | 0.0370 | 1e-05 |
+ | 48 | 0.4546 | 0.0634 | 0.0868 | 0.0194 | 1e-05 |
+ | 49 | 0.4544 | 0.0627 | 0.0862 | 0.0667 | 1e-06 |
+ | 50 | 0.4545 | 0.0631 | 0.0865 | 0.0548 | 1e-06 |
+ | 51 | 0.4545 | 0.0629 | 0.0865 | 0.0428 | 1e-06 |
+ | 52 | 0.4542 | 0.0623 | 0.0859 | 0.1236 | 1e-06 |
+ | 53 | 0.4542 | 0.0625 | 0.0859 | 0.0887 | 1e-06 |
+ | 54 | 0.4543 | 0.0624 | 0.0862 | 0.0917 | 1e-06 |
+ | 55 | 0.4546 | 0.0631 | 0.0865 | 0.0774 | 1e-06 |
+ | 56 | 0.4546 | 0.0633 | 0.0866 | 0.0473 | 1e-06 |
+ | 57 | 0.4549 | 0.0639 | 0.0871 | -0.0046 | 1e-06 |
+ | 58 | 0.4544 | 0.0627 | 0.0864 | 0.0553 | 1e-06 |
+ | 59 | 0.4545 | 0.0631 | 0.0865 | 0.0368 | 1e-07 |
+ | 60 | 0.4544 | 0.0629 | 0.0863 | 0.0471 | 1e-07 |
+ | 61 | 0.4542 | 0.0624 | 0.0860 | 0.0928 | 1e-07 |
+ | 62 | 0.4545 | 0.0632 | 0.0866 | 0.0286 | 1e-07 |
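+
+ For reference, a minimal sketch of how MAE, RMSE and KL-divergence metrics of this kind can be computed from predicted and target class-probability vectors is given below; the exact metric code used during training lives in the DinoVdeau repository (the KL values reported above can be negative, so that definition evidently differs in detail from the plain formula).
+
+ ```python
+ import torch
+ import torch.nn.functional as F
+
+ def soft_label_metrics(preds, targets, eps=1e-8):
+     # preds, targets: (batch, num_classes) tensors of per-class probabilities
+     rmse = torch.sqrt(F.mse_loss(preds, targets))
+     mae = F.l1_loss(preds, targets)
+     kl = F.kl_div((preds + eps).log(), targets + eps, reduction="batchmean")  # KL(targets || preds)
+     return {"rmse": rmse.item(), "mae": mae.item(), "kl_div": kl.item()}
+ ```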
+
+ ---
+
+ # Framework Versions
+
+ - **Transformers**: 4.41.0
+ - **Pytorch**: 2.5.0+cu124
+ - **Datasets**: 3.0.2
+ - **Tokenizers**: 0.19.1