---
license: mit
base_model: google/vivit-b-16x2-kinetics400
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vivit-b-16x2-kinetics400-ft-76388
  results: []
---

# vivit-b-16x2-kinetics400-ft-76388

This model is a fine-tuned version of [google/vivit-b-16x2-kinetics400](https://huggingface.co/google/vivit-b-16x2-kinetics400) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9924
- Accuracy: 0.5595

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 5500

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.1083        | 0.0202  | 111  | 1.1112          | 0.3347   |
| 1.0789        | 1.0202  | 222  | 1.0576          | 0.4259   |
| 1.0767        | 2.0202  | 333  | 1.0863          | 0.4246   |
| 1.1114        | 3.0202  | 444  | 1.1061          | 0.3704   |
| 1.0832        | 4.0202  | 555  | 1.0536          | 0.4193   |
| 1.0622        | 5.0202  | 666  | 1.0720          | 0.4577   |
| 1.0874        | 6.0202  | 777  | 1.0304          | 0.4709   |
| 0.9742        | 7.0202  | 888  | 1.0340          | 0.4511   |
| 0.9848        | 8.0202  | 999  | 1.0367          | 0.4669   |
| 1.12          | 9.0202  | 1110 | 1.0269          | 0.4193   |
| 1.0484        | 10.0202 | 1221 | 1.0105          | 0.4511   |
| 0.9445        | 11.0202 | 1332 | 1.0052          | 0.4881   |
| 1.032         | 12.0202 | 1443 | 1.0365          | 0.4524   |
| 0.987         | 13.0202 | 1554 | 1.0019          | 0.5106   |
| 1.0797        | 14.0202 | 1665 | 1.0128          | 0.4656   |
| 0.9196        | 15.0202 | 1776 | 1.0431          | 0.5013   |
| 1.0727        | 16.0202 | 1887 | 1.0016          | 0.5344   |
| 0.9481        | 17.0202 | 1998 | 0.9983          | 0.5265   |
| 0.9034        | 18.0202 | 2109 | 1.0221          | 0.5013   |
| 0.8569        | 19.0202 | 2220 | 0.9825          | 0.5265   |
| 0.9256        | 20.0202 | 2331 | 0.9678          | 0.5397   |
| 1.0311        | 21.0202 | 2442 | 0.9574          | 0.5106   |
| 0.8651        | 22.0202 | 2553 | 1.0048          | 0.4987   |
| 0.9384        | 23.0202 | 2664 | 0.9717          | 0.5225   |
| 0.9545        | 24.0202 | 2775 | 0.9763          | 0.5172   |
| 0.9187        | 25.0202 | 2886 | 0.9628          | 0.5212   |
| 0.7953        | 26.0202 | 2997 | 0.9523          | 0.5265   |
| 0.8793        | 27.0202 | 3108 | 0.9977          | 0.5370   |
| 0.7897        | 28.0202 | 3219 | 0.9965          | 0.5317   |
| 0.8034        | 29.0202 | 3330 | 0.9272          | 0.5463   |
| 0.8469        | 30.0202 | 3441 | 0.9231          | 0.5384   |
| 0.79          | 31.0202 | 3552 | 0.9281          | 0.5728   |
| 0.8516        | 32.0202 | 3663 | 0.9310          | 0.5569   |
| 0.8138        | 33.0202 | 3774 | 0.9582          | 0.5675   |
| 0.8322        | 34.0202 | 3885 | 0.9741          | 0.5622   |
| 0.8064        | 35.0202 | 3996 | 0.9573          | 0.5754   |
| 0.8767        | 36.0202 | 4107 | 0.9290          | 0.5714   |
| 0.7978        | 37.0202 | 4218 | 0.9449          | 0.5728   |
| 0.8113        | 38.0202 | 4329 | 0.9493          | 0.5780   |
| 0.8065        | 39.0202 | 4440 | 0.9015          | 0.5926   |
| 0.7989        | 40.0202 | 4551 | 0.9139          | 0.5886   |
| 0.6323        | 41.0202 | 4662 | 0.9004          | 0.5992   |
| 0.6847        | 42.0202 | 4773 | 0.9083          | 0.6124   |
| 0.7711        | 43.0202 | 4884 | 0.9023          | 0.5979   |
| 0.5815        | 44.0202 | 4995 | 0.9247          | 0.6058   |
| 0.8821        | 45.0202 | 5106 | 0.9071          | 0.6058   |
| 0.7436        | 46.0202 | 5217 | 0.8924          | 0.6085   |
| 0.6863        | 47.0202 | 5328 | 0.8965          | 0.6111   |
| 0.7035        | 48.0202 | 5439 | 0.8941          | 0.6045   |
| 0.6348        | 49.0111 | 5500 | 0.8950          | 0.6124   |

### Framework versions

- Transformers 4.41.2
- Pytorch 1.13.0+cu117
- Datasets 2.20.0
- Tokenizers 0.19.1