
google-vit-base-patch16-224-Waste-O-I-classification

This model is a version of google/vit-base-patch16-224 fine-tuned by Miguel Calderon on the imagefolder dataset (a quick inference example follows the results). It achieves the following results on the evaluation set:

  • Accuracy: 0.956
  • Loss: 0.3036
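
A quick way to try the model is the Transformers image-classification pipeline. The sketch below is illustrative only: it assumes the Hub repo id Giecom/google-vit-base-patch16-224-Waste-O-I-classification and a placeholder local image path, and the class labels are read from the model's config rather than hard-coded here.

```python
from transformers import pipeline

# Load the fine-tuned ViT classifier from the Hub.
classifier = pipeline(
    "image-classification",
    model="Giecom/google-vit-base-patch16-224-Waste-O-I-classification",
)

# "waste.jpg" is a placeholder path to any local image you want to classify.
predictions = classifier("waste.jpg")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.4f}")
```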

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 0.0002
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 4
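
The training script itself is not included in this card, but the hyperparameters above map roughly onto the TrainingArguments sketch below. The output directory and the 1000-step evaluation cadence (inferred from the results table below) are assumptions; anything not listed is left at library defaults.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the run configuration from the hyperparameters above.
training_args = TrainingArguments(
    output_dir="vit-waste-classification",  # assumed name, not taken from the card
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=4,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # the results table reports metrics every 1000 steps
    eval_steps=1000,
)
```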

Training results

| Training Loss | Epoch  | Step  | Accuracy | Validation Loss |
|:-------------:|:------:|:-----:|:--------:|:---------------:|
| 0.2168        | 0.1580 | 1000  | 0.9525   | 0.1303          |
| 0.196         | 0.3159 | 2000  | 0.941    | 0.1638          |
| 0.1993        | 0.4739 | 3000  | 0.9285   | 0.2206          |
| 0.1849        | 0.6318 | 4000  | 0.9225   | 0.2288          |
| 0.199         | 0.7898 | 5000  | 0.9105   | 0.3331          |
| 0.2171        | 0.9477 | 6000  | 0.944    | 0.1582          |
| 0.1209        | 1.1057 | 7000  | 0.9495   | 0.1887          |
| 0.114         | 1.2636 | 8000  | 0.932    | 0.1950          |
| 0.1268        | 1.4216 | 9000  | 0.9335   | 0.1965          |
| 0.1272        | 1.5795 | 10000 | 0.9165   | 0.3112          |
| 0.1003        | 1.7375 | 11000 | 0.9575   | 0.1353          |
| 0.0844        | 1.8954 | 12000 | 0.9345   | 0.2635          |
| 0.0757        | 2.0534 | 13000 | 0.952    | 0.1434          |
| 0.053         | 2.2113 | 14000 | 0.933    | 0.3203          |
| 0.0994        | 2.3693 | 15000 | 0.9405   | 0.2165          |
| 0.0248        | 2.5272 | 16000 | 0.951    | 0.2400          |
| 0.0842        | 2.6852 | 17000 | 0.906    | 0.4092          |
| 0.0733        | 2.8432 | 18000 | 0.9515   | 0.1937          |
| 0.0542        | 3.0011 | 19000 | 0.938    | 0.2911          |
| 0.0202        | 3.1591 | 20000 | 0.936    | 0.3648          |
| 0.0237        | 3.3170 | 21000 | 0.9355   | 0.3618          |
| 0.0294        | 3.4750 | 22000 | 0.9255   | 0.4209          |
| 0.0375        | 3.6329 | 23000 | 0.943    | 0.2840          |
| 0.0176        | 3.7909 | 24000 | 0.9525   | 0.2604          |
| 0.0252        | 3.9488 | 25000 | 0.9515   | 0.2500          |
| 0.0024        | 4.1068 | 26000 | 0.9545   | 0.2892          |
| 0.0119        | 4.2647 | 27000 | 0.956    | 0.3036          |

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.4.0+cpu
  • Datasets 2.20.0
  • Tokenizers 0.19.1
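
To confirm a local environment matches the versions listed above, a minimal check (assuming the packages are installed under their usual import names) is:

```python
import transformers, torch, datasets, tokenizers

# Print installed versions to compare against the ones listed in this card.
print("Transformers:", transformers.__version__)  # expected 4.44.0
print("PyTorch:", torch.__version__)              # expected 2.4.0+cpu
print("Datasets:", datasets.__version__)          # expected 2.20.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.19.1
```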