---
library_name: transformers
license: apache-2.0
base_model: google/efficientnet-b0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: efficientnet-b0-cocoa
  results: []
---

# efficientnet-b0-cocoa

This model is a fine-tuned version of [google/efficientnet-b0](https://huggingface.co/google/efficientnet-b0) on an unknown dataset. It achieves the following results on the evaluation set (an inference example follows the list):

- Loss: 0.3624
- Accuracy: 0.8809
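
The card does not include usage code, so here is a minimal inference sketch using the Transformers `pipeline` API. The repo id `CristianR8/efficientnet-b0-cocoa` (inferred from this card) and the image path are assumptions.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for image classification.
# The repo id is an assumption based on this card's name and author.
classifier = pipeline("image-classification", model="CristianR8/efficientnet-b0-cocoa")

# Classify an image (placeholder path); returns a list of labels with scores.
predictions = classifier("cocoa_sample.jpg")
print(predictions)  # e.g. [{'label': '...', 'score': 0.93}, ...]
```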

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100.0
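
These values map directly onto `TrainingArguments`. In the sketch below, `output_dir` and the per-epoch `eval_strategy` are assumptions (the results table reports one evaluation per epoch); every other value comes from the list above.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the training setup; dataset, model, and image
# processor are omitted. Only the listed hyperparameters come from the card.
training_args = TrainingArguments(
    output_dir="efficientnet-b0-cocoa",  # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    optim="adamw_torch",                 # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    eval_strategy="epoch",               # assumption: per-epoch eval in the table
)
```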

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.0627 | 1.0 | 196 | 1.5223 | 0.5596 |
| 0.591 | 2.0 | 392 | 0.8975 | 0.8303 |
| 0.6623 | 3.0 | 588 | 0.6564 | 0.8773 |
| 0.4874 | 4.0 | 784 | 0.6842 | 0.8339 |
| 0.4671 | 5.0 | 980 | 0.4894 | 0.8809 |
| 0.5623 | 6.0 | 1176 | 0.4160 | 0.8736 |
| 0.3917 | 7.0 | 1372 | 0.4022 | 0.8845 |
| 0.3153 | 8.0 | 1568 | 0.4939 | 0.8412 |
| 0.5814 | 9.0 | 1764 | 0.3540 | 0.8773 |
| 0.5883 | 10.0 | 1960 | 0.3493 | 0.8953 |
| 0.4616 | 11.0 | 2156 | 0.7928 | 0.7762 |
| 0.499 | 12.0 | 2352 | 2.0659 | 0.2960 |
| 0.2236 | 13.0 | 2548 | 0.4444 | 0.8520 |
| 0.2083 | 14.0 | 2744 | 0.4640 | 0.8736 |
| 0.3408 | 15.0 | 2940 | 0.3775 | 0.8773 |
| 0.3529 | 16.0 | 3136 | 0.3519 | 0.8881 |
| 0.3859 | 17.0 | 3332 | 0.3310 | 0.9061 |
| 0.3557 | 18.0 | 3528 | 0.3475 | 0.8917 |
| 0.4979 | 19.0 | 3724 | 0.3839 | 0.8592 |
| 0.7133 | 20.0 | 3920 | 0.3032 | 0.9134 |
| 0.4489 | 21.0 | 4116 | 0.4246 | 0.8520 |
| 0.2605 | 22.0 | 4312 | 0.2951 | 0.8989 |
| 0.3787 | 23.0 | 4508 | 0.4357 | 0.8520 |
| 0.3015 | 24.0 | 4704 | 0.3990 | 0.8917 |
| 0.1965 | 25.0 | 4900 | 0.3536 | 0.9097 |
| 0.3903 | 26.0 | 5096 | 0.4166 | 0.8592 |
| 0.1902 | 27.0 | 5292 | 0.4354 | 0.8520 |
| 0.2089 | 28.0 | 5488 | 0.4089 | 0.8592 |
| 0.3574 | 29.0 | 5684 | 0.4787 | 0.8231 |
| 0.3532 | 30.0 | 5880 | 0.3165 | 0.9097 |
| 0.2967 | 31.0 | 6076 | 0.3105 | 0.9134 |
| 0.2364 | 32.0 | 6272 | 0.3560 | 0.9061 |
| 0.3136 | 33.0 | 6468 | 0.2657 | 0.9097 |
| 0.4061 | 34.0 | 6664 | 0.2680 | 0.9134 |
| 0.3296 | 35.0 | 6860 | 0.3798 | 0.9061 |
| 0.2905 | 36.0 | 7056 | 0.5098 | 0.8556 |
| 0.2763 | 37.0 | 7252 | 0.4219 | 0.8809 |
| 0.2454 | 38.0 | 7448 | 0.2852 | 0.9134 |
| 0.6077 | 39.0 | 7644 | 0.3603 | 0.8989 |
| 0.1966 | 40.0 | 7840 | 0.3519 | 0.8736 |
| 0.2473 | 41.0 | 8036 | 0.3343 | 0.9025 |
| 0.2795 | 42.0 | 8232 | 0.3384 | 0.9170 |
| 0.1249 | 43.0 | 8428 | 0.4046 | 0.8773 |
| 0.2943 | 44.0 | 8624 | 0.3953 | 0.8917 |
| 0.3002 | 45.0 | 8820 | 0.5003 | 0.8592 |
| 0.1525 | 46.0 | 9016 | 0.3232 | 0.9170 |
| 0.4022 | 47.0 | 9212 | 0.3113 | 0.9170 |
| 0.4994 | 48.0 | 9408 | 0.4494 | 0.8556 |
| 0.6512 | 49.0 | 9604 | 0.3722 | 0.9206 |
| 0.3152 | 50.0 | 9800 | 0.2852 | 0.9097 |
| 0.1165 | 51.0 | 9996 | 0.4138 | 0.8628 |
| 0.216 | 52.0 | 10192 | 0.3413 | 0.8953 |
| 0.1455 | 53.0 | 10388 | 0.3046 | 0.9170 |
| 0.554 | 54.0 | 10584 | 0.2849 | 0.8989 |
| 0.3586 | 55.0 | 10780 | 0.3517 | 0.9134 |
| 0.2239 | 56.0 | 10976 | 0.4538 | 0.9025 |
| 0.1725 | 57.0 | 11172 | 0.4492 | 0.8592 |
| 0.4689 | 58.0 | 11368 | 0.4739 | 0.8628 |
| 0.3565 | 59.0 | 11564 | 0.2831 | 0.9206 |
| 0.2259 | 60.0 | 11760 | 0.3465 | 0.9206 |
| 0.2212 | 61.0 | 11956 | 0.2884 | 0.9314 |
| 0.2648 | 62.0 | 12152 | 0.4875 | 0.8448 |
| 0.3438 | 63.0 | 12348 | 0.3989 | 0.9061 |
| 0.4785 | 64.0 | 12544 | 0.5953 | 0.8520 |
| 0.06 | 65.0 | 12740 | 0.2954 | 0.9278 |
| 0.1965 | 66.0 | 12936 | 0.5033 | 0.8520 |
| 0.3548 | 67.0 | 13132 | 0.4132 | 0.8809 |
| 0.1279 | 68.0 | 13328 | 0.3743 | 0.9170 |
| 0.2879 | 69.0 | 13524 | 0.6423 | 0.7762 |
| 0.1757 | 70.0 | 13720 | 0.5979 | 0.8014 |
| 0.3338 | 71.0 | 13916 | 0.4398 | 0.8989 |
| 0.1604 | 72.0 | 14112 | 0.5634 | 0.8231 |
| 0.1078 | 73.0 | 14308 | 0.6204 | 0.7762 |
| 0.258 | 74.0 | 14504 | 0.3685 | 0.8953 |
| 0.1227 | 75.0 | 14700 | 0.7026 | 0.8159 |
| 0.2257 | 76.0 | 14896 | 0.4048 | 0.9170 |
| 0.1786 | 77.0 | 15092 | 0.4891 | 0.8845 |
| 0.2006 | 78.0 | 15288 | 0.4216 | 0.8773 |
| 0.3144 | 79.0 | 15484 | 0.2721 | 0.8953 |
| 0.1969 | 80.0 | 15680 | 0.4270 | 0.8484 |
| 0.1405 | 81.0 | 15876 | 0.7632 | 0.7834 |
| 0.1427 | 82.0 | 16072 | 0.3249 | 0.9025 |
| 0.2493 | 83.0 | 16268 | 0.3838 | 0.8989 |
| 0.331 | 84.0 | 16464 | 0.3330 | 0.9206 |
| 0.1231 | 85.0 | 16660 | 0.3246 | 0.8700 |
| 0.2781 | 86.0 | 16856 | 0.3710 | 0.8736 |
| 0.7193 | 87.0 | 17052 | 0.3384 | 0.9061 |
| 0.1149 | 88.0 | 17248 | 0.3703 | 0.9097 |
| 0.0269 | 89.0 | 17444 | 0.5013 | 0.8592 |
| 0.0967 | 90.0 | 17640 | 0.3456 | 0.8989 |
| 0.177 | 91.0 | 17836 | 0.3799 | 0.8881 |
| 0.1917 | 92.0 | 18032 | 0.3239 | 0.9061 |
| 0.2082 | 93.0 | 18228 | 0.4861 | 0.8989 |
| 0.3836 | 94.0 | 18424 | 0.4444 | 0.8736 |
| 0.1 | 95.0 | 18620 | 0.3713 | 0.8845 |
| 0.1785 | 96.0 | 18816 | 0.4279 | 0.8303 |
| 0.19 | 97.0 | 19012 | 0.6588 | 0.8412 |
| 0.099 | 98.0 | 19208 | 0.6632 | 0.8267 |
| 0.1467 | 99.0 | 19404 | 0.4642 | 0.8809 |
| 0.2617 | 100.0 | 19600 | 0.3624 | 0.8809 |

### Framework versions

- Transformers 4.48.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
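
A quick sanity check that a local environment matches the versions listed above:

```python
import transformers, torch, datasets, tokenizers

# Print installed versions to compare against the card's pins.
print("Transformers:", transformers.__version__)  # expected 4.48.0.dev0
print("PyTorch:", torch.__version__)              # expected 2.5.1+cu124
print("Datasets:", datasets.__version__)          # expected 3.1.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.21.0
```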