SegForCoral-b2-2025_06_30_19127-bs16_refine is a fine-tuned version of nvidia/mit-b2.


Model description

SegForCoral-b2-2025_06_30_19127-bs16_refine is a model built on top of the nvidia/mit-b2 backbone for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
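The card does not include code for the head itself, but a minimal PyTorch sketch of a head combining those layer types could look like the following. The hidden width, dropout probability, and exact layer ordering are assumptions for illustration, not values taken from this card or the training repository.

```python
import torch.nn as nn

class MultilabelHead(nn.Module):
    """Sketch of a head mixing linear, batch norm, ReLU, and dropout layers.
    Hidden size, dropout rate, and ordering are assumptions."""
    def __init__(self, in_features: int, num_labels: int, hidden: int = 512, dropout: float = 0.2):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.BatchNorm1d(hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, num_labels),  # one logit per label; a sigmoid is applied for multilabel output
        )

    def forward(self, x):
        return self.head(x)
```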

The source code for training the model can be found in this Git repository.


Intended uses & limitations

You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
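For illustration, a hedged inference sketch using the Transformers Auto classes is shown below. The repository path is a placeholder to be replaced with this model's actual Hugging Face path, and the 0.5 decision threshold is an assumption, since the card does not specify one.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id -- replace with the actual Hugging Face path of this model.
repo_id = "<org>/SegForCoral-b2-2025_06_30_19127-bs16_refine"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("reef_photo.jpg").convert("RGB")  # any underwater image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel output: apply a sigmoid and keep labels above a chosen threshold (0.5 is an assumption).
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```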


Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • Number of Epochs: 20.0
  • Learning Rate: 1e-05
  • Train Batch Size: 16
  • Eval Batch Size: 16
  • Optimizer: Adam
  • LR Scheduler Type: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
  • Freeze Encoder: No
  • Data Augmentation: No
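As a rough illustration of how these settings map onto PyTorch, the sketch below wires up the Adam optimizer and ReduceLROnPlateau scheduler with the listed values. NUM_LABELS is a placeholder, and train_one_epoch and evaluate are hypothetical helpers standing in for the actual training and validation loops in the linked repository.

```python
import torch
from transformers import AutoModelForImageClassification

NUM_LABELS = 10  # placeholder; the actual label count is defined in the training repository

# nvidia/mit-b2 backbone with a freshly initialized multilabel classification head.
model = AutoModelForImageClassification.from_pretrained(
    "nvidia/mit-b2",
    num_labels=NUM_LABELS,
    problem_type="multi_label_classification",
)

# Optimizer and LR scheduler matching the listed hyperparameters.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode="min", factor=0.1, patience=5)

for epoch in range(20):
    train_one_epoch(model, optimizer)   # hypothetical helper: one pass over the training set
    val_loss = evaluate(model)          # hypothetical helper: mean validation loss
    scheduler.step(val_loss)            # drops the LR by 10x after 5 epochs without improvement
```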

Training results

Epoch   Validation Loss   Learning Rate
1       0.6623            1e-05
2       0.6219            1e-05
3       0.5973            1e-05
4       0.5867            1e-05
5       0.5729            1e-05
6       0.5666            1e-05
7       0.5585            1e-05
8       0.5712            1e-05
9       0.5507            1e-05
10      0.5479            1e-05
11      0.5630            1e-05
12      0.5543            1e-05
13      0.5586            1e-05
14      0.5648            1e-05
15      0.5844            1e-05
16      0.5760            1e-05
17      0.5521            1e-06
18      0.5510            1e-06
19      0.5492            1e-06
20      0.5507            1e-06

Framework Versions

  • Transformers: 4.51.3
  • Pytorch: 2.6.0+cu124
  • Datasets: 3.6.0
  • Tokenizers: 0.21.1
Model Size

  • Parameters: 27.4M
  • Tensor type: F32 (safetensors)