AhamadShaik/SegFormer_RESIZE_NLM

This model is a fine-tuned version of nvidia/mit-b0 on an unknown dataset. It achieves the following results on the training and evaluation sets:

  • Train Loss: 0.0424
  • Train Dice Coef: 0.8817
  • Train Iou: 0.7903
  • Validation Loss: 0.0436
  • Validation Dice Coef: 0.8897
  • Validation Iou: 0.8024
  • Train Lr: 1e-10
  • Epoch: 99
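
As a hedged sketch (not part of the original card), the checkpoint can presumably be loaded for inference with the TensorFlow SegFormer classes in transformers. The image path and post-processing are illustrative assumptions, and the preprocessing config is taken from the nvidia/mit-b0 base checkpoint in case this repo does not ship its own:

```python
import tensorflow as tf
from PIL import Image
from transformers import SegformerImageProcessor, TFSegformerForSemanticSegmentation

# Preprocessing falls back to the base checkpoint's config (assumption: this
# fine-tuned repo may not include its own preprocessor_config.json).
processor = SegformerImageProcessor.from_pretrained("nvidia/mit-b0")
model = TFSegformerForSemanticSegmentation.from_pretrained("AhamadShaik/SegFormer_RESIZE_NLM")

image = Image.open("example.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="tf")

# SegFormer logits come out at 1/4 of the input resolution, shaped
# (batch, num_labels, H/4, W/4); upsample before the per-pixel decision.
logits = model(**inputs).logits
logits = tf.transpose(logits, [0, 2, 3, 1])            # NCHW -> NHWC
upsampled = tf.image.resize(logits, image.size[::-1])  # resize to (H, W)
mask = tf.argmax(upsampled, axis=-1)[0].numpy()        # per-pixel class ids
```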

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'Adam', 'learning_rate': 1e-10, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
  • training_precision: float32
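
The optimizer dictionary above maps directly onto the Keras Adam optimizer. A minimal reconstruction sketch follows; note that the logged 1e-10 is the final learning rate after scheduled reductions, so a fresh run would start from a larger value (the results table below begins at 1e-04):

```python
import tensorflow as tf

# Reconstruction of the logged optimizer config. The logged learning rate
# (1e-10) is the last value after LR scheduling; per the results table,
# training actually started at 1e-04, which is used as the initial value here.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=1e-04,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    decay=0.0,
)
```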

Training results

| Train Loss | Train Dice Coef | Train Iou | Validation Loss | Validation Dice Coef | Validation Iou | Train Lr | Epoch |
|---|---|---|---|---|---|---|---|
| 0.2282 | 0.5657 | 0.4102 | 0.1322 | 0.6524 | 0.4967 | 1e-04 | 0 |
| 0.1354 | 0.6853 | 0.5329 | 0.0855 | 0.7853 | 0.6544 | 1e-04 | 1 |
| 0.1105 | 0.7364 | 0.5924 | 0.0737 | 0.8147 | 0.6916 | 1e-04 | 2 |
| 0.0985 | 0.7610 | 0.6226 | 0.0632 | 0.8518 | 0.7440 | 1e-04 | 3 |
| 0.0933 | 0.7745 | 0.6399 | 0.0627 | 0.8455 | 0.7351 | 1e-04 | 4 |
| 0.0886 | 0.7856 | 0.6535 | 0.0584 | 0.8603 | 0.7566 | 1e-04 | 5 |
| 0.0831 | 0.7971 | 0.6695 | 0.0559 | 0.8621 | 0.7596 | 1e-04 | 6 |
| 0.0770 | 0.8107 | 0.6867 | 0.0530 | 0.8726 | 0.7756 | 1e-04 | 7 |
| 0.0741 | 0.8160 | 0.6942 | 0.0512 | 0.8775 | 0.7832 | 1e-04 | 8 |
| 0.0750 | 0.8163 | 0.6945 | 0.0581 | 0.8627 | 0.7606 | 1e-04 | 9 |
| 0.0678 | 0.8306 | 0.7138 | 0.0531 | 0.8719 | 0.7745 | 1e-04 | 10 |
| 0.0659 | 0.8341 | 0.7196 | 0.0519 | 0.8738 | 0.7781 | 1e-04 | 11 |
| 0.0626 | 0.8412 | 0.7294 | 0.0496 | 0.8789 | 0.7853 | 1e-04 | 12 |
| 0.0637 | 0.8383 | 0.7257 | 0.0515 | 0.8772 | 0.7828 | 1e-04 | 13 |
| 0.0601 | 0.8462 | 0.7367 | 0.0498 | 0.8765 | 0.7814 | 1e-04 | 14 |
| 0.0573 | 0.8525 | 0.7458 | 0.0474 | 0.8817 | 0.7897 | 1e-04 | 15 |
| 0.0565 | 0.8520 | 0.7456 | 0.0459 | 0.8850 | 0.7948 | 1e-04 | 16 |
| 0.0633 | 0.8381 | 0.7262 | 0.0487 | 0.8797 | 0.7868 | 1e-04 | 17 |
| 0.0558 | 0.8544 | 0.7489 | 0.0476 | 0.8828 | 0.7917 | 1e-04 | 18 |
| 0.0523 | 0.8617 | 0.7595 | 0.0454 | 0.8872 | 0.7983 | 1e-04 | 19 |
| 0.0516 | 0.8632 | 0.7617 | 0.0465 | 0.8838 | 0.7934 | 1e-04 | 20 |
| 0.0515 | 0.8636 | 0.7625 | 0.0494 | 0.8816 | 0.7894 | 1e-04 | 21 |
| 0.0518 | 0.8630 | 0.7615 | 0.0487 | 0.8836 | 0.7930 | 1e-04 | 22 |
| 0.0521 | 0.8616 | 0.7595 | 0.0483 | 0.8822 | 0.7908 | 1e-04 | 23 |
| 0.0510 | 0.8634 | 0.7624 | 0.0501 | 0.8814 | 0.7899 | 1e-04 | 24 |
| 0.0485 | 0.8703 | 0.7728 | 0.0439 | 0.8892 | 0.8018 | 5e-06 | 25 |
| 0.0464 | 0.8755 | 0.7807 | 0.0433 | 0.8890 | 0.8015 | 5e-06 | 26 |
| 0.0456 | 0.8760 | 0.7817 | 0.0439 | 0.8884 | 0.8004 | 5e-06 | 27 |
| 0.0446 | 0.8790 | 0.7860 | 0.0428 | 0.8896 | 0.8024 | 5e-06 | 28 |
| 0.0443 | 0.8786 | 0.7855 | 0.0426 | 0.8905 | 0.8038 | 5e-06 | 29 |
| 0.0439 | 0.8795 | 0.7867 | 0.0439 | 0.8881 | 0.7999 | 5e-06 | 30 |
| 0.0436 | 0.8800 | 0.7876 | 0.0429 | 0.8902 | 0.8032 | 5e-06 | 31 |
| 0.0430 | 0.8809 | 0.7890 | 0.0439 | 0.8876 | 0.7992 | 5e-06 | 32 |
| 0.0427 | 0.8812 | 0.7894 | 0.0432 | 0.8892 | 0.8016 | 5e-06 | 33 |
| 0.0431 | 0.8798 | 0.7875 | 0.0433 | 0.8895 | 0.8022 | 5e-06 | 34 |
| 0.0425 | 0.8816 | 0.7903 | 0.0435 | 0.8892 | 0.8016 | 2.5e-07 | 35 |
| 0.0420 | 0.8826 | 0.7917 | 0.0433 | 0.8894 | 0.8021 | 2.5e-07 | 36 |
| 0.0423 | 0.8833 | 0.7926 | 0.0429 | 0.8893 | 0.8018 | 2.5e-07 | 37 |
| 0.0420 | 0.8833 | 0.7929 | 0.0430 | 0.8895 | 0.8023 | 2.5e-07 | 38 |
| 0.0424 | 0.8832 | 0.7924 | 0.0437 | 0.8890 | 0.8013 | 2.5e-07 | 39 |
| 0.0422 | 0.8824 | 0.7914 | 0.0427 | 0.8897 | 0.8024 | 1.25e-08 | 40 |
| 0.0426 | 0.8824 | 0.7913 | 0.0431 | 0.8900 | 0.8030 | 1.25e-08 | 41 |
| 0.0424 | 0.8832 | 0.7926 | 0.0433 | 0.8893 | 0.8019 | 1.25e-08 | 42 |
| 0.0424 | 0.8830 | 0.7922 | 0.0436 | 0.8886 | 0.8008 | 1.25e-08 | 43 |
| 0.0427 | 0.8806 | 0.7888 | 0.0434 | 0.8893 | 0.8020 | 1.25e-08 | 44 |
| 0.0421 | 0.8829 | 0.7921 | 0.0431 | 0.8899 | 0.8028 | 6.25e-10 | 45 |
| 0.0427 | 0.8817 | 0.7901 | 0.0431 | 0.8896 | 0.8023 | 6.25e-10 | 46 |
| 0.0422 | 0.8825 | 0.7916 | 0.0433 | 0.8895 | 0.8022 | 6.25e-10 | 47 |
| 0.0423 | 0.8823 | 0.7912 | 0.0431 | 0.8897 | 0.8024 | 6.25e-10 | 48 |
| 0.0423 | 0.8826 | 0.7916 | 0.0433 | 0.8895 | 0.8021 | 6.25e-10 | 49 |
| 0.0425 | 0.8827 | 0.7918 | 0.0433 | 0.8896 | 0.8023 | 1e-10 | 50 |
| 0.0421 | 0.8838 | 0.7937 | 0.0431 | 0.8891 | 0.8014 | 1e-10 | 51 |
| 0.0424 | 0.8820 | 0.7907 | 0.0436 | 0.8884 | 0.8003 | 1e-10 | 52 |
| 0.0424 | 0.8824 | 0.7915 | 0.0426 | 0.8899 | 0.8029 | 1e-10 | 53 |
| 0.0423 | 0.8828 | 0.7920 | 0.0433 | 0.8894 | 0.8020 | 1e-10 | 54 |
| 0.0424 | 0.8818 | 0.7905 | 0.0431 | 0.8901 | 0.8031 | 1e-10 | 55 |
| 0.0421 | 0.8823 | 0.7911 | 0.0438 | 0.8887 | 0.8008 | 1e-10 | 56 |
| 0.0421 | 0.8821 | 0.7909 | 0.0426 | 0.8896 | 0.8023 | 1e-10 | 57 |
| 0.0420 | 0.8818 | 0.7906 | 0.0428 | 0.8903 | 0.8035 | 1e-10 | 58 |
| 0.0416 | 0.8845 | 0.7945 | 0.0434 | 0.8889 | 0.8012 | 1e-10 | 59 |
| 0.0421 | 0.8830 | 0.7921 | 0.0429 | 0.8900 | 0.8029 | 1e-10 | 60 |
| 0.0420 | 0.8834 | 0.7927 | 0.0433 | 0.8888 | 0.8010 | 1e-10 | 61 |
| 0.0425 | 0.8820 | 0.7909 | 0.0429 | 0.8896 | 0.8023 | 1e-10 | 62 |
| 0.0421 | 0.8827 | 0.7919 | 0.0431 | 0.8906 | 0.8039 | 1e-10 | 63 |
| 0.0422 | 0.8815 | 0.7901 | 0.0429 | 0.8901 | 0.8031 | 1e-10 | 64 |
| 0.0420 | 0.8833 | 0.7927 | 0.0430 | 0.8899 | 0.8029 | 1e-10 | 65 |
| 0.0426 | 0.8822 | 0.7911 | 0.0431 | 0.8891 | 0.8015 | 1e-10 | 66 |
| 0.0422 | 0.8829 | 0.7923 | 0.0428 | 0.8902 | 0.8033 | 1e-10 | 67 |
| 0.0424 | 0.8813 | 0.7898 | 0.0435 | 0.8893 | 0.8019 | 1e-10 | 68 |
| 0.0420 | 0.8826 | 0.7918 | 0.0430 | 0.8896 | 0.8024 | 1e-10 | 69 |
| 0.0428 | 0.8811 | 0.7895 | 0.0434 | 0.8900 | 0.8030 | 1e-10 | 70 |
| 0.0422 | 0.8832 | 0.7926 | 0.0431 | 0.8895 | 0.8021 | 1e-10 | 71 |
| 0.0427 | 0.8816 | 0.7902 | 0.0432 | 0.8898 | 0.8026 | 1e-10 | 72 |
| 0.0426 | 0.8817 | 0.7904 | 0.0434 | 0.8891 | 0.8015 | 1e-10 | 73 |
| 0.0424 | 0.8811 | 0.7897 | 0.0434 | 0.8899 | 0.8028 | 1e-10 | 74 |
| 0.0432 | 0.8807 | 0.7890 | 0.0430 | 0.8897 | 0.8025 | 1e-10 | 75 |
| 0.0423 | 0.8816 | 0.7904 | 0.0435 | 0.8894 | 0.8019 | 1e-10 | 76 |
| 0.0418 | 0.8838 | 0.7935 | 0.0431 | 0.8897 | 0.8025 | 1e-10 | 77 |
| 0.0425 | 0.8817 | 0.7901 | 0.0428 | 0.8898 | 0.8026 | 1e-10 | 78 |
| 0.0424 | 0.8818 | 0.7904 | 0.0434 | 0.8891 | 0.8015 | 1e-10 | 79 |
| 0.0419 | 0.8828 | 0.7920 | 0.0431 | 0.8901 | 0.8031 | 1e-10 | 80 |
| 0.0429 | 0.8812 | 0.7897 | 0.0425 | 0.8903 | 0.8034 | 1e-10 | 81 |
| 0.0419 | 0.8829 | 0.7922 | 0.0427 | 0.8905 | 0.8038 | 1e-10 | 82 |
| 0.0426 | 0.8820 | 0.7908 | 0.0431 | 0.8894 | 0.8019 | 1e-10 | 83 |
| 0.0424 | 0.8830 | 0.7921 | 0.0433 | 0.8893 | 0.8018 | 1e-10 | 84 |
| 0.0420 | 0.8832 | 0.7927 | 0.0432 | 0.8894 | 0.8019 | 1e-10 | 85 |
| 0.0421 | 0.8828 | 0.7921 | 0.0426 | 0.8907 | 0.8042 | 1e-10 | 86 |
| 0.0424 | 0.8817 | 0.7903 | 0.0430 | 0.8905 | 0.8038 | 1e-10 | 87 |
| 0.0423 | 0.8819 | 0.7908 | 0.0431 | 0.8901 | 0.8032 | 1e-10 | 88 |
| 0.0428 | 0.8809 | 0.7891 | 0.0429 | 0.8897 | 0.8025 | 1e-10 | 89 |
| 0.0424 | 0.8818 | 0.7903 | 0.0434 | 0.8897 | 0.8025 | 1e-10 | 90 |
| 0.0422 | 0.8827 | 0.7918 | 0.0428 | 0.8902 | 0.8033 | 1e-10 | 91 |
| 0.0426 | 0.8813 | 0.7897 | 0.0433 | 0.8891 | 0.8016 | 1e-10 | 92 |
| 0.0418 | 0.8839 | 0.7936 | 0.0427 | 0.8898 | 0.8026 | 1e-10 | 93 |
| 0.0418 | 0.8831 | 0.7924 | 0.0431 | 0.8900 | 0.8031 | 1e-10 | 94 |
| 0.0425 | 0.8822 | 0.7912 | 0.0429 | 0.8904 | 0.8037 | 1e-10 | 95 |
| 0.0424 | 0.8812 | 0.7895 | 0.0429 | 0.8896 | 0.8023 | 1e-10 | 96 |
| 0.0423 | 0.8818 | 0.7908 | 0.0428 | 0.8900 | 0.8028 | 1e-10 | 97 |
| 0.0417 | 0.8838 | 0.7934 | 0.0427 | 0.8906 | 0.8040 | 1e-10 | 98 |
| 0.0424 | 0.8817 | 0.7903 | 0.0436 | 0.8897 | 0.8024 | 1e-10 | 99 |
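
The learning-rate column shows stepwise reductions by a factor of 20 (1e-04 → 5e-06 → 2.5e-07 → 1.25e-08 → 6.25e-10, then a 1e-10 floor), a pattern consistent with a ReduceLROnPlateau-style schedule, although the card does not state which callback was used. The Dice coefficient and IoU metrics are likewise not defined here; a common soft formulation for binary segmentation masks, offered only as an assumed sketch, is:

```python
import tensorflow as tf

SMOOTH = 1e-6  # assumed smoothing constant; the card does not specify one

def dice_coef(y_true, y_pred):
    # Soft Dice coefficient: 2*|A∩B| / (|A| + |B|), computed on flattened masks.
    y_true = tf.reshape(tf.cast(y_true, tf.float32), [-1])
    y_pred = tf.reshape(tf.cast(y_pred, tf.float32), [-1])
    intersection = tf.reduce_sum(y_true * y_pred)
    return (2.0 * intersection + SMOOTH) / (
        tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) + SMOOTH
    )

def iou(y_true, y_pred):
    # Soft Jaccard index: |A∩B| / |A∪B|.
    y_true = tf.reshape(tf.cast(y_true, tf.float32), [-1])
    y_pred = tf.reshape(tf.cast(y_pred, tf.float32), [-1])
    intersection = tf.reduce_sum(y_true * y_pred)
    union = tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) - intersection
    return (intersection + SMOOTH) / (union + SMOOTH)
```

As a consistency check, IoU = Dice / (2 - Dice): the final validation Dice of 0.8897 implies an IoU of about 0.8013, close to the reported 0.8024.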

Framework versions

  • Transformers 4.27.4
  • TensorFlow 2.10.1
  • Datasets 2.11.0
  • Tokenizers 0.13.3