SegIGNCoral-b0-2025_10_09_40484-bs16_refine is a fine-tuned version of nvidia/mit-b0.


Model description

SegIGNCoral-b0-2025_10_09_40484-bs16_refine is a model built on top of the nvidia/mit-b0 model for aerial image segmentation.

The source code for training the model can be found in this Git repository.


Intended uses & limitations

You can use the raw model to classify diverse marine species, covering coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
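As a minimal inference sketch, the snippet below loads the checkpoint with the Transformers SegFormer classes and produces a per-pixel class map. The Hub repository id and image path are placeholders (this card does not state the full Hub path of the checkpoint), so substitute the actual locations.

import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Placeholder repository id -- replace with the actual Hub path of this checkpoint.
checkpoint = "SegIGNCoral-b0-2025_10_09_40484-bs16_refine"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("reef_tile.jpg").convert("RGB")  # placeholder aerial image tile
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_classes, H/4, W/4)

# Upsample the low-resolution logits back to the input size, then take the
# per-pixel argmax to obtain a class-index segmentation map.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices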


Training procedure

Training hyperparameters

The following hyperparameters were used during training (a PyTorch sketch of the optimizer/scheduler setup follows the list):

  • Number of Epochs: 175
  • Learning Rate: 1e-05
  • Train Batch Size: 16
  • Eval Batch Size: 16
  • Optimizer: Adam
  • LR Scheduler Type: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
  • Freeze Encoder: No
  • Data Augmentation: No
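A minimal PyTorch sketch of this optimizer/scheduler combination; the stand-in model and the dummy loss curve are illustrative only, not taken from the training repository:

import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Tiny stand-in module; in the real run this is the mit-b0 SegFormer.
model = torch.nn.Conv2d(3, 5, kernel_size=1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)

# Matches the card: ReduceLROnPlateau with factor 0.1 and a patience of
# 5 epochs, driven by the validation loss.
scheduler = ReduceLROnPlateau(optimizer, mode="min", factor=0.1, patience=5)

for epoch in range(175):
    # ... training and validation for one epoch would happen here ...
    val_loss = 0.18 - 0.0003 * min(epoch, 120)  # dummy plateauing loss curve
    scheduler.step(val_loss)  # after 5 stagnant epochs, LR is multiplied by 0.1

This scheduling is why, in the epoch log further below, the learning rate drops from 1e-05 to 1e-06 at epoch 139 and to 1e-07 at epoch 167.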

Training results

Evaluation on the training set:

{
    "SegIGNCoral-b0-2025_10_09_40484-bs16_refine": {
        "pixel_acc": 0.8066,
        "mean_iou": 0.5697,
        "iou_per_class": {
            "Acropora Branching": 0.6439,
            "Non-acropora Massive": 0.2779,
            "Other Corals": 0.3327,
            "Sand": 0.821,
            "Seagrass": 0.7729
        }
    }
}
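For reference, the pixel accuracy, per-class IoU, and mean IoU reported here can all be derived from a single confusion matrix over the evaluated pixels. A small NumPy sketch (not the repository's evaluation code; the toy matrix is made up):

import numpy as np

def segmentation_metrics(conf):
    """conf[i, j] = pixels with ground-truth class i predicted as class j."""
    tp = np.diag(conf).astype(float)        # correctly classified pixels per class
    fp = conf.sum(axis=0) - tp              # predicted as the class, labelled otherwise
    fn = conf.sum(axis=1) - tp              # labelled as the class, predicted otherwise
    iou = tp / np.maximum(tp + fp + fn, 1)  # intersection over union per class
    return tp.sum() / conf.sum(), iou, iou.mean()

# Toy 3-class confusion matrix, purely for illustration.
conf = np.array([[50, 2, 3],
                 [4, 30, 1],
                 [2, 2, 40]])
pixel_acc, iou_per_class, mean_iou = segmentation_metrics(conf)
print(pixel_acc, iou_per_class, mean_iou)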

Evaluation on the manually annotated zones:

📊 Evaluating zone: hermitage
  ✅ Pixel Accuracy: 0.8814
  ✅ Mean Accuracy : 0.3494
  ✅ Mean IoU     : 0.3283
  Pixel Accuracy Per Class:
    Acropora Branching: 0.0000
    Non-acropora Massive: 0.0000
    Other Corals: 0.0000
    Sand: 0.9011
    Syringodium: 0.8457
  IoU Per Class:
    Acropora Branching: 0.0000
    Non-acropora Massive: 0.0000
    Other Corals: 0.0000
    Sand: 0.8805
    Syringodium: 0.7611

📊 Evaluating zone: troudeau
  ✅ Pixel Accuracy: 0.8389
  ✅ Mean Accuracy : 0.5461
  ✅ Mean IoU     : 0.3493
  Pixel Accuracy Per Class:
    Acropora Branching: 0.5263
    Non-acropora Massive: 0.3429
    Other Corals: 0.4113
    Sand: 0.9040
  IoU Per Class:
    Acropora Branching: 0.0481
    Non-acropora Massive: 0.1714
    Other Corals: 0.2827
    Sand: 0.8949

📊 Evaluating zone: stleu
  ✅ Pixel Accuracy: 0.6907
  ✅ Mean Accuracy : 0.6036
  ✅ Mean IoU     : 0.3995
  Pixel Accuracy Per Class:
    Acropora Branching: 0.9457
    Non-acropora Massive: 0.5060
    Other Corals: 0.1624
    Sand: 0.8006
  IoU Per Class:
    Acropora Branching: 0.3820
    Non-acropora Massive: 0.3172
    Other Corals: 0.1358
    Sand: 0.7631

📦 Micro-Averaged Metrics Across Zones (all pixels):
  Pixel Accuracy Per Class:
    Acropora Branching: 0.9351
    Non-acropora Massive: 0.4778
    Other Corals: 0.2117
    Sand: 0.8996
    Syringodium: 0.8457
  IoU Per Class:
    Acropora Branching: 0.0906
    Non-acropora Massive: 0.1418
    Other Corals: 0.0172
    Sand: 0.8788
    Syringodium: 0.7611

  ✅ Pixel Accuracy: 0.8781
  ✅ Mean Accuracy : 0.6740
  ✅ Mean IoU     : 0.3779
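The micro-averaged block pools all pixels from the three zones before computing each metric, which is equivalent to summing the per-zone confusion matrices and evaluating once, so zones with more annotated pixels carry more weight. A sketch reusing the hypothetical segmentation_metrics helper defined above (zone_confs is likewise a hypothetical variable):

# zone_confs: list of per-zone (C, C) confusion matrices.
total_conf = np.sum(zone_confs, axis=0)  # pool pixels across zones
pixel_acc, iou_per_class, mean_iou = segmentation_metrics(total_conf)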

Figure: confusion matrix across all classes (confusion_matrix_all_in_one).

Epoch   Validation Loss   Learning Rate
1       0.1783            1e-05
2       0.1765            1e-05
3       0.1744            1e-05
4       0.1711            1e-05
5       0.1675            1e-05
6       0.1647            1e-05
7       0.1630            1e-05
8       0.1616            1e-05
9       0.1609            1e-05
10      0.1596            1e-05
11      0.1584            1e-05
12      0.1579            1e-05
13      0.1578            1e-05
14      0.1567            1e-05
15      0.1561            1e-05
16      0.1555            1e-05
17      0.1547            1e-05
18      0.1544            1e-05
19      0.1539            1e-05
20      0.1528            1e-05
21      0.1525            1e-05
22      0.1523            1e-05
23      0.1516            1e-05
24      0.1504            1e-05
25      0.1491            1e-05
26      0.1487            1e-05
27      0.1479            1e-05
28      0.1463            1e-05
29      0.1449            1e-05
30      0.1438            1e-05
31      0.1424            1e-05
32      0.1415            1e-05
33      0.1407            1e-05
34      0.1414            1e-05
35      0.1395            1e-05
36      0.1392            1e-05
37      0.1389            1e-05
38      0.1381            1e-05
39      0.1367            1e-05
40      0.1367            1e-05
41      0.1367            1e-05
42      0.1366            1e-05
43      0.1346            1e-05
44      0.1353            1e-05
45      0.1339            1e-05
46      0.1335            1e-05
47      0.1345            1e-05
48      0.1339            1e-05
49      0.1328            1e-05
50      0.1331            1e-05
51      0.1327            1e-05
52      0.1329            1e-05
53      0.1322            1e-05
54      0.1313            1e-05
55      0.1315            1e-05
56      0.1317            1e-05
57      0.1309            1e-05
58      0.1304            1e-05
59      0.1300            1e-05
60      0.1303            1e-05
61      0.1297            1e-05
62      0.1296            1e-05
63      0.1320            1e-05
64      0.1292            1e-05
65      0.1297            1e-05
66      0.1294            1e-05
67      0.1289            1e-05
68      0.1301            1e-05
69      0.1287            1e-05
70      0.1296            1e-05
71      0.1291            1e-05
72      0.1297            1e-05
73      0.1288            1e-05
74      0.1295            1e-05
75      0.1292            1e-05
76      0.1284            1e-05
77      0.1282            1e-05
78      0.1281            1e-05
79      0.1279            1e-05
80      0.1278            1e-05
81      0.1281            1e-05
82      0.1273            1e-05
83      0.1287            1e-05
84      0.1271            1e-05
85      0.1270            1e-05
86      0.1273            1e-05
87      0.1273            1e-05
88      0.1271            1e-05
89      0.1291            1e-05
90      0.1264            1e-05
91      0.1263            1e-05
92      0.1273            1e-05
93      0.1260            1e-05
94      0.1271            1e-05
95      0.1282            1e-05
96      0.1261            1e-05
97      0.1266            1e-05
98      0.1271            1e-05
99      0.1278            1e-05
100     0.1258            1e-05
101     0.1273            1e-05
102     0.1256            1e-05
103     0.1270            1e-05
104     0.1274            1e-05
105     0.1272            1e-05
106     0.1283            1e-05
107     0.1260            1e-05
108     0.1284            1e-05
109     0.1265            1e-05
110     0.1260            1e-05
111     0.1255            1e-05
112     0.1256            1e-05
113     0.1262            1e-05
114     0.1272            1e-05
115     0.1258            1e-05
116     0.1246            1e-05
117     0.1255            1e-05
118     0.1246            1e-05
119     0.1280            1e-05
120     0.1261            1e-05
121     0.1245            1e-05
122     0.1255            1e-05
123     0.1244            1e-05
124     0.1261            1e-05
125     0.1265            1e-05
126     0.1253            1e-05
127     0.1243            1e-05
128     0.1253            1e-05
129     0.1246            1e-05
130     0.1257            1e-05
131     0.1245            1e-05
132     0.1274            1e-05
133     0.1271            1e-05
134     0.1250            1e-05
135     0.1252            1e-05
136     0.1262            1e-05
137     0.1249            1e-05
138     0.1247            1e-05
139     0.1246            1e-06
140     0.1243            1e-06
141     0.1242            1e-06
142     0.1241            1e-06
143     0.1237            1e-06
144     0.1241            1e-06
145     0.1238            1e-06
146     0.1242            1e-06
147     0.1240            1e-06
148     0.1244            1e-06
149     0.1239            1e-06
150     0.1240            1e-06
151     0.1242            1e-06
152     0.1237            1e-06
153     0.1238            1e-06
154     0.1243            1e-06
155     0.1234            1e-06
156     0.1239            1e-06
157     0.1247            1e-06
158     0.1240            1e-06
159     0.1240            1e-06
160     0.1234            1e-06
161     0.1237            1e-06
162     0.1239            1e-06
163     0.1237            1e-06
164     0.1238            1e-06
165     0.1235            1e-06
166     0.1235            1e-06
167     0.1239            1e-07
168     0.1238            1e-07
169     0.1238            1e-07
170     0.1238            1e-07
171     0.1235            1e-07
172     0.1235            1e-07
173     0.1237            1e-07
174     0.1235            1e-07
175     0.1243            1e-07

Framework Versions

  • Transformers: 4.56.2
  • PyTorch: 2.8.0+cu128
  • Datasets: 4.1.1
  • Tokenizers: 0.22.1
Model size: 3.72M parameters (F32 tensors, Safetensors format).