
vit-base-patch16-224-dmae-va-da-40B

This model is a fine-tuned version of google/vit-base-patch16-224 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2421
  • Accuracy: 0.9302
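
The card does not document the label set, but the checkpoint can be loaded like any ViT image classifier in Transformers. Below is a minimal inference sketch; it assumes the checkpoint is publicly available on the Hub under the repo id Augusto777/vit-base-patch16-224-dmae-va-da-40B, and the image path is a placeholder.

```python
# Minimal inference sketch; "example.jpg" is a hypothetical local image file.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="Augusto777/vit-base-patch16-224-dmae-va-da-40B",
)

# Returns the top predicted labels with confidence scores.
print(classifier("example.jpg"))
```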

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 40
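
For reference, the listing below sketches how these values map onto `transformers.TrainingArguments`. The output directory and the evaluation/save schedule are assumptions (the card does not state them), and the Adam betas and epsilon listed above are the Trainer defaults, so they are left implicit.

```python
# Hedged TrainingArguments sketch matching the hyperparameters above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-dmae-va-da-40B",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # 32 * 4 = effective train batch size of 128
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=40,
    evaluation_strategy="epoch",     # assumption; the eval schedule is not documented
    save_strategy="epoch",           # assumption
)
```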

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.92  | 3    | 1.4669          | 0.3256   |
| No log        | 1.85  | 6    | 1.2689          | 0.4419   |
| No log        | 2.77  | 9    | 1.1591          | 0.4651   |
| 1.3901        | 4.0   | 13   | 0.9778          | 0.5814   |
| 1.3901        | 4.92  | 16   | 0.8885          | 0.6512   |
| 1.3901        | 5.85  | 19   | 0.7885          | 0.6512   |
| 0.9794        | 6.77  | 22   | 0.6854          | 0.7442   |
| 0.9794        | 8.0   | 26   | 0.5822          | 0.7674   |
| 0.9794        | 8.92  | 29   | 0.4929          | 0.8605   |
| 0.6573        | 9.85  | 32   | 0.4822          | 0.8605   |
| 0.6573        | 10.77 | 35   | 0.4529          | 0.8372   |
| 0.6573        | 12.0  | 39   | 0.4203          | 0.7907   |
| 0.4166        | 12.92 | 42   | 0.3889          | 0.8605   |
| 0.4166        | 13.85 | 45   | 0.3697          | 0.8605   |
| 0.4166        | 14.77 | 48   | 0.3991          | 0.8140   |
| 0.3376        | 16.0  | 52   | 0.3038          | 0.9070   |
| 0.3376        | 16.92 | 55   | 0.3139          | 0.8837   |
| 0.3376        | 17.85 | 58   | 0.2821          | 0.8837   |
| 0.191         | 18.77 | 61   | 0.2905          | 0.8837   |
| 0.191         | 20.0  | 65   | 0.2616          | 0.8605   |
| 0.191         | 20.92 | 68   | 0.2636          | 0.8837   |
| 0.2065        | 21.85 | 71   | 0.2864          | 0.9070   |
| 0.2065        | 22.77 | 74   | 0.2833          | 0.8605   |
| 0.2065        | 24.0  | 78   | 0.2507          | 0.9070   |
| 0.1328        | 24.92 | 81   | 0.2890          | 0.8837   |
| 0.1328        | 25.85 | 84   | 0.3065          | 0.8837   |
| 0.1328        | 26.77 | 87   | 0.2891          | 0.8837   |
| 0.1065        | 28.0  | 91   | 0.2815          | 0.8837   |
| 0.1065        | 28.92 | 94   | 0.2753          | 0.8837   |
| 0.1065        | 29.85 | 97   | 0.2768          | 0.8837   |
| 0.1122        | 30.77 | 100  | 0.2864          | 0.8837   |
| 0.1122        | 32.0  | 104  | 0.2563          | 0.9070   |
| 0.1122        | 32.92 | 107  | 0.2421          | 0.9302   |
| 0.0879        | 33.85 | 110  | 0.2453          | 0.9070   |
| 0.0879        | 34.77 | 113  | 0.2434          | 0.8837   |
| 0.0879        | 36.0  | 117  | 0.2406          | 0.8837   |
| 0.1082        | 36.92 | 120  | 0.2407          | 0.8837   |

Framework versions

  • Transformers 4.34.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1