---
library_name: transformers
license: other
base_model: google/mobilenet_v2_1.0_224
tags:
  - image-classification
  - vision
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: mobilenetv2-cocoa
    results: []
---

# mobilenetv2-cocoa

This model is a fine-tuned version of [google/mobilenet_v2_1.0_224](https://huggingface.co/google/mobilenet_v2_1.0_224) on the SemilleroCV/Cocoa-dataset dataset. It achieves the following results on the evaluation set:

- Loss: 0.3226
- Accuracy: 0.8953
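
The snippet below is a minimal inference sketch, not part of the original training pipeline. It assumes the checkpoint is published under the repo id `CristianR8/mobilenetv2-cocoa` (inferred from this card) and uses a placeholder image path; adjust both to your setup.

```python
from transformers import pipeline

# Repo id is an assumption inferred from this model card; replace it if the
# checkpoint lives elsewhere. The pipeline loads the image processor saved
# with the model (224x224 resize + normalization for MobileNetV2).
classifier = pipeline("image-classification", model="CristianR8/mobilenetv2-cocoa")

# "cocoa_bean.jpg" is a placeholder path to any RGB image of a cocoa sample.
print(classifier("cocoa_bean.jpg"))
```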

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
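
In the absence of details here, the following is a sketch of loading the dataset named in the description above with the `datasets` library; the split layout and column names of SemilleroCV/Cocoa-dataset are not documented on this card and should be inspected rather than assumed.

```python
from datasets import load_dataset

# The dataset id comes from the model description; split names and the
# image/label column layout are assumptions to verify by inspection.
ds = load_dataset("SemilleroCV/Cocoa-dataset")
print(ds)  # list the available splits and their sizes
```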

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100.0
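
For reference, here is a hedged reconstruction of these settings as `transformers.TrainingArguments`. The `output_dir` value and the per-epoch evaluation cadence are assumptions (the latter implied by the per-epoch results table below); model and data setup are omitted.

```python
from transformers import TrainingArguments

# A sketch reconstructing the hyperparameters listed above, not the
# author's verbatim configuration.
training_args = TrainingArguments(
    output_dir="mobilenetv2-cocoa",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    optim="adamw_torch",             # AdamW via torch
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    eval_strategy="epoch",           # assumed from the per-epoch results table
)
```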

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.569 | 1.0 | 196 | 0.5072 | 0.8628 |
| 0.3973 | 2.0 | 392 | 0.4278 | 0.8700 |
| 0.5873 | 3.0 | 588 | 0.4138 | 0.8773 |
| 0.4781 | 4.0 | 784 | 0.4718 | 0.8736 |
| 0.4483 | 5.0 | 980 | 0.4506 | 0.8773 |
| 0.655 | 6.0 | 1176 | 0.3685 | 0.8953 |
| 0.3441 | 7.0 | 1372 | 0.4751 | 0.8773 |
| 0.3166 | 8.0 | 1568 | 0.3796 | 0.8809 |
| 0.5114 | 9.0 | 1764 | 0.4087 | 0.8917 |
| 0.6452 | 10.0 | 1960 | 0.3760 | 0.8989 |
| 0.4747 | 11.0 | 2156 | 0.4223 | 0.8773 |
| 0.5145 | 12.0 | 2352 | 1.1704 | 0.5957 |
| 0.1991 | 13.0 | 2548 | 0.3454 | 0.9097 |
| 0.2396 | 14.0 | 2744 | 0.3913 | 0.8700 |
| 0.3259 | 15.0 | 2940 | 0.3689 | 0.8881 |
| 0.3434 | 16.0 | 3136 | 0.3743 | 0.8736 |
| 0.389 | 17.0 | 3332 | 0.3657 | 0.9025 |
| 0.302 | 18.0 | 3528 | 0.4218 | 0.8917 |
| 0.4693 | 19.0 | 3724 | 0.3226 | 0.8953 |
| 0.6346 | 20.0 | 3920 | 0.3277 | 0.8881 |
| 0.481 | 21.0 | 4116 | 0.3484 | 0.8700 |
| 0.2628 | 22.0 | 4312 | 0.3942 | 0.9025 |
| 0.3653 | 23.0 | 4508 | 0.3537 | 0.8989 |
| 0.344 | 24.0 | 4704 | 0.4758 | 0.8809 |
| 0.2819 | 25.0 | 4900 | 0.4318 | 0.8989 |
| 0.513 | 26.0 | 5096 | 0.4277 | 0.8412 |
| 0.201 | 27.0 | 5292 | 0.3915 | 0.8953 |
| 0.2696 | 28.0 | 5488 | 0.4401 | 0.8809 |
| 0.4204 | 29.0 | 5684 | 0.3856 | 0.8953 |
| 0.316 | 30.0 | 5880 | 0.3576 | 0.8845 |
| 0.3102 | 31.0 | 6076 | 0.4155 | 0.8809 |
| 0.1489 | 32.0 | 6272 | 0.4147 | 0.8953 |
| 0.3302 | 33.0 | 6468 | 0.4217 | 0.8953 |
| 0.3271 | 34.0 | 6664 | 0.3321 | 0.9097 |
| 0.3481 | 35.0 | 6860 | 0.3828 | 0.8809 |
| 0.3329 | 36.0 | 7056 | 0.4045 | 0.8700 |
| 0.2471 | 37.0 | 7252 | 0.5536 | 0.8664 |
| 0.2007 | 38.0 | 7448 | 0.3503 | 0.8881 |
| 0.7535 | 39.0 | 7644 | 0.4819 | 0.8809 |
| 0.1851 | 40.0 | 7840 | 0.3762 | 0.8773 |
| 0.2329 | 41.0 | 8036 | 0.4465 | 0.8845 |
| 0.2889 | 42.0 | 8232 | 0.4696 | 0.9061 |
| 0.1409 | 43.0 | 8428 | 0.4876 | 0.8809 |
| 0.2683 | 44.0 | 8624 | 0.6134 | 0.8809 |
| 0.3535 | 45.0 | 8820 | 0.4364 | 0.8809 |
| 0.1683 | 46.0 | 9016 | 0.4059 | 0.8881 |
| 0.43 | 47.0 | 9212 | 0.3955 | 0.8881 |
| 0.5702 | 48.0 | 9408 | 0.3898 | 0.8809 |
| 0.8043 | 49.0 | 9604 | 0.5963 | 0.8953 |
| 0.3742 | 50.0 | 9800 | 0.5273 | 0.8989 |
| 0.1026 | 51.0 | 9996 | 0.3999 | 0.8989 |
| 0.2357 | 52.0 | 10192 | 0.4724 | 0.8592 |
| 0.2612 | 53.0 | 10388 | 0.4169 | 0.8845 |
| 0.4747 | 54.0 | 10584 | 0.3973 | 0.8917 |
| 0.4943 | 55.0 | 10780 | 0.5156 | 0.9061 |
| 0.2296 | 56.0 | 10976 | 0.6397 | 0.8917 |
| 0.1789 | 57.0 | 11172 | 0.5098 | 0.8267 |
| 0.4355 | 58.0 | 11368 | 0.5032 | 0.8917 |
| 0.3957 | 59.0 | 11564 | 0.4205 | 0.9025 |
| 0.4806 | 60.0 | 11760 | 0.7011 | 0.8917 |
| 0.2356 | 61.0 | 11956 | 0.7832 | 0.8881 |
| 0.3865 | 62.0 | 12152 | 0.4622 | 0.8917 |
| 0.3504 | 63.0 | 12348 | 0.5889 | 0.8773 |
| 0.3766 | 64.0 | 12544 | 0.5246 | 0.8592 |
| 0.1336 | 65.0 | 12740 | 0.6462 | 0.8773 |
| 0.3275 | 66.0 | 12936 | 0.5013 | 0.8628 |
| 0.3765 | 67.0 | 13132 | 0.4857 | 0.8953 |
| 0.1622 | 68.0 | 13328 | 0.4918 | 0.8845 |
| 0.2291 | 69.0 | 13524 | 0.5734 | 0.8736 |
| 0.1786 | 70.0 | 13720 | 0.6691 | 0.8231 |
| 0.3451 | 71.0 | 13916 | 0.7318 | 0.8773 |
| 0.2313 | 72.0 | 14112 | 0.5041 | 0.8700 |
| 0.1984 | 73.0 | 14308 | 0.6518 | 0.7690 |
| 0.2345 | 74.0 | 14504 | 0.5280 | 0.8845 |
| 0.0851 | 75.0 | 14700 | 0.6302 | 0.8917 |
| 0.2234 | 76.0 | 14896 | 0.4843 | 0.8809 |
| 0.2266 | 77.0 | 15092 | 0.4900 | 0.8628 |
| 0.2735 | 78.0 | 15288 | 0.5249 | 0.8736 |
| 0.2442 | 79.0 | 15484 | 0.5061 | 0.8917 |
| 0.2246 | 80.0 | 15680 | 0.4810 | 0.8664 |
| 0.3557 | 81.0 | 15876 | 0.6420 | 0.8123 |
| 0.2017 | 82.0 | 16072 | 0.5158 | 0.8845 |
| 0.249 | 83.0 | 16268 | 0.4364 | 0.9025 |
| 0.2566 | 84.0 | 16464 | 0.5507 | 0.8736 |
| 0.1012 | 85.0 | 16660 | 0.4728 | 0.8845 |
| 0.1972 | 86.0 | 16856 | 0.5746 | 0.8809 |
| 0.7922 | 87.0 | 17052 | 0.5262 | 0.8628 |
| 0.1229 | 88.0 | 17248 | 0.6293 | 0.8845 |
| 0.0248 | 89.0 | 17444 | 0.6193 | 0.8881 |
| 0.0925 | 90.0 | 17640 | 0.4755 | 0.8700 |
| 0.1968 | 91.0 | 17836 | 0.5528 | 0.8700 |
| 0.1694 | 92.0 | 18032 | 0.4338 | 0.8953 |
| 0.2083 | 93.0 | 18228 | 1.1286 | 0.8809 |
| 0.3666 | 94.0 | 18424 | 0.6879 | 0.8267 |
| 0.1358 | 95.0 | 18620 | 0.5071 | 0.8881 |
| 0.2247 | 96.0 | 18816 | 0.5941 | 0.8520 |
| 0.2682 | 97.0 | 19012 | 0.5219 | 0.8592 |
| 0.1762 | 98.0 | 19208 | 0.6929 | 0.8520 |
| 0.2368 | 99.0 | 19404 | 0.5324 | 0.8845 |
| 0.1268 | 100.0 | 19600 | 0.6160 | 0.8881 |
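
The headline evaluation results at the top of this card (loss 0.3226, accuracy 0.8953) match epoch 19, the row with the lowest validation loss, rather than the final epoch, which suggests the best checkpoint by validation loss was retained at the end of training.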

### Framework versions

- Transformers 4.48.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0