
vit-base-patch16-224-dmae-va-da-40D

This model is a fine-tuned version of google/vit-base-patch16-224 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3404
  • Accuracy: 0.9302
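
As a quick way to try the checkpoint, the snippet below is a minimal inference sketch using the Transformers image-classification pipeline. The file name `example.jpg` is a placeholder for your own image, and the predicted labels depend on the (unspecified) fine-tuning dataset.

```python
# Minimal inference sketch; "example.jpg" is a placeholder image path.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="Augusto777/vit-base-patch16-224-dmae-va-da-40D",
)

predictions = classifier("example.jpg")  # local path, URL, or PIL image
print(predictions)  # list of {"label": ..., "score": ...} dicts
```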

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 40
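
For reference, here is a hypothetical sketch of how the settings above could be expressed as `transformers.TrainingArguments`. The `output_dir` name is a placeholder, the model, dataset, and Trainer setup are omitted, and the actual training script may have differed; the Adam betas and epsilon listed above match the TrainingArguments defaults.

```python
# Hypothetical reconstruction of the hyperparameters listed above;
# output_dir is a placeholder, not necessarily the original value.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-dmae-va-da-40D",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 effective train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=40,
)
```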

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.92  | 3    | 1.2901          | 0.4186   |
| No log        | 1.85  | 6    | 1.2314          | 0.4419   |
| No log        | 2.77  | 9    | 1.1530          | 0.4651   |
| 1.2976        | 4.0   | 13   | 0.9852          | 0.6047   |
| 1.2976        | 4.92  | 16   | 0.8450          | 0.7674   |
| 1.2976        | 5.85  | 19   | 0.8367          | 0.6512   |
| 1.2976        | 6.77  | 22   | 0.7545          | 0.7209   |
| 0.8294        | 8.0   | 26   | 0.6711          | 0.7907   |
| 0.8294        | 8.92  | 29   | 0.6739          | 0.7209   |
| 0.8294        | 9.85  | 32   | 0.6010          | 0.7442   |
| 0.8294        | 10.77 | 35   | 0.5369          | 0.7442   |
| 0.4293        | 12.0  | 39   | 0.5272          | 0.7907   |
| 0.4293        | 12.92 | 42   | 0.5217          | 0.7442   |
| 0.4293        | 13.85 | 45   | 0.4844          | 0.7674   |
| 0.2695        | 14.77 | 48   | 0.4948          | 0.7907   |
| 0.2695        | 16.0  | 52   | 0.4776          | 0.7674   |
| 0.2695        | 16.92 | 55   | 0.4410          | 0.7907   |
| 0.2695        | 17.85 | 58   | 0.4871          | 0.7442   |
| 0.1905        | 18.77 | 61   | 0.4375          | 0.7907   |
| 0.1905        | 20.0  | 65   | 0.4578          | 0.8140   |
| 0.1905        | 20.92 | 68   | 0.4956          | 0.8140   |
| 0.1905        | 21.85 | 71   | 0.4500          | 0.8140   |
| 0.135         | 22.77 | 74   | 0.4071          | 0.8605   |
| 0.135         | 24.0  | 78   | 0.4158          | 0.8605   |
| 0.135         | 24.92 | 81   | 0.4380          | 0.8372   |
| 0.1485        | 25.85 | 84   | 0.4281          | 0.8140   |
| 0.1485        | 26.77 | 87   | 0.3777          | 0.8837   |
| 0.1485        | 28.0  | 91   | 0.3404          | 0.9302   |
| 0.1485        | 28.92 | 94   | 0.3581          | 0.9070   |
| 0.1001        | 29.85 | 97   | 0.3807          | 0.8605   |
| 0.1001        | 30.77 | 100  | 0.3700          | 0.8837   |
| 0.1001        | 32.0  | 104  | 0.3730          | 0.8837   |
| 0.1001        | 32.92 | 107  | 0.3868          | 0.8837   |
| 0.0797        | 33.85 | 110  | 0.3883          | 0.8605   |
| 0.0797        | 34.77 | 113  | 0.3933          | 0.8372   |
| 0.0797        | 36.0  | 117  | 0.3998          | 0.8372   |
| 0.0991        | 36.92 | 120  | 0.4014          | 0.8372   |

Framework versions

  • Transformers 4.34.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1
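
As an optional convenience, the snippet below checks an installed environment against the versions listed above; it assumes the four packages are already installed.

```python
# Optional environment check against the versions reported on this card.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)  # card reports 4.34.1
print("PyTorch:", torch.__version__)              # card reports 2.1.0+cu118
print("Datasets:", datasets.__version__)          # card reports 2.14.5
print("Tokenizers:", tokenizers.__version__)      # card reports 0.14.1
```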

Model tree for Augusto777/vit-base-patch16-224-dmae-va-da-40D

Fine-tuned from google/vit-base-patch16-224