dinov2-base-finetuned-har

This model is a fine-tuned version of facebook/dinov2-base, trained on an image-classification dataset loaded via the imagefolder builder. It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 0.4424
  • Accuracy: 0.8968
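
The snippet below is a minimal usage sketch, not part of the original card: it assumes the checkpoint is published under the repository id amauriciogonzalez/dinov2-base-finetuned-har (the id shown for this model) and that it exposes the standard Transformers image-classification head; the image path is a placeholder.

```python
# Hedged inference sketch: classify a single image with the fine-tuned checkpoint.
# The repository id and the image path are assumptions made for illustration only.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "amauriciogonzalez/dinov2-base-finetuned-har"  # assumed repository id
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])  # predicted class label
```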

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
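
As a hedged illustration rather than the author's actual training script, the hyperparameters above map onto transformers.TrainingArguments roughly as sketched below; the output directory name is hypothetical, and the Adam settings listed above match the Trainer's default optimizer configuration.

```python
# Sketch of a TrainingArguments configuration matching the listed hyperparameters.
# Assumption: training used the Hugging Face Trainer, as the card format suggests.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dinov2-base-finetuned-har",  # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 effective train batch size
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer's default Adam settings.
)
```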

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.9155        | 0.9910  | 83   | 0.6204          | 0.8053   |
| 0.749         | 1.9940  | 167  | 0.4433          | 0.8667   |
| 0.8197        | 2.9970  | 251  | 0.4826          | 0.8571   |
| 0.6854        | 4.0     | 335  | 0.4243          | 0.8725   |
| 0.7058        | 4.9910  | 418  | 0.4349          | 0.8593   |
| 0.6717        | 5.9940  | 502  | 0.4984          | 0.8434   |
| 0.6544        | 6.9970  | 586  | 0.4730          | 0.8545   |
| 0.5846        | 8.0     | 670  | 0.4631          | 0.8630   |
| 0.5207        | 8.9910  | 753  | 0.4072          | 0.8751   |
| 0.4977        | 9.9940  | 837  | 0.4627          | 0.8608   |
| 0.4974        | 10.9970 | 921  | 0.4600          | 0.8661   |
| 0.4502        | 12.0    | 1005 | 0.4548          | 0.8725   |
| 0.4051        | 12.9910 | 1088 | 0.4404          | 0.8709   |
| 0.3862        | 13.9940 | 1172 | 0.4498          | 0.8772   |
| 0.351         | 14.9970 | 1256 | 0.4859          | 0.8677   |
| 0.3807        | 16.0    | 1340 | 0.5189          | 0.8556   |
| 0.3538        | 16.9910 | 1423 | 0.4959          | 0.8646   |
| 0.3181        | 17.9940 | 1507 | 0.4831          | 0.8698   |
| 0.3225        | 18.9970 | 1591 | 0.4890          | 0.8804   |
| 0.3257        | 20.0    | 1675 | 0.4817          | 0.8735   |
| 0.2667        | 20.9910 | 1758 | 0.5199          | 0.8683   |
| 0.2863        | 21.9940 | 1842 | 0.4835          | 0.8683   |
| 0.2527        | 22.9970 | 1926 | 0.4764          | 0.8772   |
| 0.2657        | 24.0    | 2010 | 0.4651          | 0.8767   |
| 0.1995        | 24.9910 | 2093 | 0.5079          | 0.8693   |
| 0.2481        | 25.9940 | 2177 | 0.5112          | 0.8698   |
| 0.2072        | 26.9970 | 2261 | 0.5082          | 0.8831   |
| 0.2164        | 28.0    | 2345 | 0.5002          | 0.8730   |
| 0.2198        | 28.9910 | 2428 | 0.4785          | 0.8778   |
| 0.2137        | 29.9940 | 2512 | 0.5012          | 0.8889   |
| 0.1936        | 30.9970 | 2596 | 0.4961          | 0.8757   |
| 0.2255        | 32.0    | 2680 | 0.4987          | 0.8788   |
| 0.1818        | 32.9910 | 2763 | 0.4840          | 0.8852   |
| 0.1644        | 33.9940 | 2847 | 0.4694          | 0.8862   |
| 0.1799        | 34.9970 | 2931 | 0.4599          | 0.8915   |
| 0.1624        | 36.0    | 3015 | 0.5122          | 0.8852   |
| 0.157         | 36.9910 | 3098 | 0.4546          | 0.8899   |
| 0.2165        | 37.9940 | 3182 | 0.5097          | 0.8836   |
| 0.1565        | 38.9970 | 3266 | 0.4566          | 0.8952   |
| 0.1476        | 40.0    | 3350 | 0.4579          | 0.8915   |
| 0.1296        | 40.9910 | 3433 | 0.4595          | 0.8931   |
| 0.1159        | 41.9940 | 3517 | 0.4841          | 0.8884   |
| 0.1071        | 42.9970 | 3601 | 0.4730          | 0.8820   |
| 0.1017        | 44.0    | 3685 | 0.4470          | 0.8931   |
| 0.11          | 44.9910 | 3768 | 0.4557          | 0.8910   |
| 0.126         | 45.9940 | 3852 | 0.4585          | 0.8926   |
| 0.1079        | 46.9970 | 3936 | 0.4551          | 0.8905   |
| 0.1194        | 48.0    | 4020 | 0.4401          | 0.8947   |
| 0.11          | 48.9910 | 4103 | 0.4424          | 0.8968   |
| 0.1104        | 49.5522 | 4150 | 0.4414          | 0.8958   |

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.1+cu121
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 86.6M params (Safetensors, F32)