# VideoMAE_Base_WLASL_100_200_epochs_p20_SR_8
This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset (presumably WLASL-100, per the model name). It achieves the following results on the evaluation set:
- Loss: 2.8301
- Top 1 Accuracy: 0.5296
- Top 5 Accuracy: 0.7988
- Top 10 Accuracy: 0.8698
- Accuracy: 0.5296
- Precision: 0.5768
- Recall: 0.5296
- F1: 0.5133
## Model description
More information needed
## Intended uses & limitations
More information needed
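In the absence of documented usage, here is a minimal inference sketch. The 16-frame input (the `videomae-base` default) and the random placeholder clip are assumptions, not the author's preprocessing pipeline:

```python
# Hypothetical usage sketch; frame count and placeholder input are assumptions,
# not the author's preprocessing pipeline.
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

model_id = "Shawon16/VideoMAE_Base_WLASL_100_200_epochs_p20_SR_8"
processor = VideoMAEImageProcessor.from_pretrained(model_id)
model = VideoMAEForVideoClassification.from_pretrained(model_id)
model.eval()

# Placeholder input: 16 RGB frames of 224x224 (replace with frames sampled
# from a real sign-language clip).
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Print the five most likely gloss labels, mirroring the Top 5 Accuracy metric.
top5 = logits.softmax(dim=-1).topk(5, dim=-1)
for prob, idx in zip(top5.values[0], top5.indices[0]):
    print(f"{model.config.id2label[idx.item()]}: {prob:.3f}")
```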
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 36000
- mixed_precision_training: Native AMP
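For reference, these hyperparameters roughly correspond to a `TrainingArguments` configuration like the sketch below. The output directory and the per-epoch evaluation/checkpointing strategy are assumptions, and dataset loading and `Trainer` wiring are omitted:

```python
# Hypothetical TrainingArguments mirroring the hyperparameters listed above;
# output_dir and the per-epoch eval/save strategies are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="VideoMAE_Base_WLASL_100_200_epochs_p20_SR_8",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=4,   # effective train batch size of 8
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=36000,
    fp16=True,                       # "Native AMP" mixed precision
    eval_strategy="epoch",           # the results table logs one eval per epoch
    save_strategy="epoch",
)
```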
### Training results
| Training Loss | Epoch | Step | Validation Loss | Top 1 Accuracy | Top 5 Accuracy | Top 10 Accuracy | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|---|---|---|---|---|---|
| 18.5755 | 0.005 | 180 | 4.6413 | 0.0089 | 0.0444 | 0.1006 | 0.0089 | 0.0005 | 0.0089 | 0.0010 |
| 18.4843 | 1.0050 | 360 | 4.6308 | 0.0089 | 0.0621 | 0.1006 | 0.0089 | 0.0006 | 0.0089 | 0.0012 |
| 18.4592 | 2.0050 | 541 | 4.6209 | 0.0207 | 0.0710 | 0.1065 | 0.0207 | 0.0017 | 0.0207 | 0.0027 |
| 18.4604 | 3.005 | 721 | 4.6171 | 0.0178 | 0.0680 | 0.1213 | 0.0178 | 0.0005 | 0.0178 | 0.0010 |
| 18.4543 | 4.0050 | 901 | 4.6106 | 0.0237 | 0.0858 | 0.1450 | 0.0237 | 0.0008 | 0.0237 | 0.0016 |
| 18.3384 | 5.0050 | 1081 | 4.6190 | 0.0178 | 0.0680 | 0.1095 | 0.0178 | 0.0003 | 0.0178 | 0.0007 |
| 18.1947 | 6.0050 | 1262 | 4.6096 | 0.0296 | 0.0917 | 0.1361 | 0.0296 | 0.0038 | 0.0296 | 0.0057 |
| 18.1104 | 7.005 | 1442 | 4.5992 | 0.0355 | 0.0947 | 0.1450 | 0.0355 | 0.0060 | 0.0355 | 0.0082 |
| 18.0825 | 8.0050 | 1622 | 4.5940 | 0.0178 | 0.0976 | 0.1568 | 0.0178 | 0.0059 | 0.0178 | 0.0080 |
| 17.9081 | 9.0050 | 1802 | 4.5627 | 0.0325 | 0.1065 | 0.1686 | 0.0325 | 0.0076 | 0.0325 | 0.0103 |
| 17.4986 | 10.0050 | 1983 | 4.4287 | 0.0355 | 0.1213 | 0.2012 | 0.0355 | 0.0087 | 0.0355 | 0.0126 |
| 16.3869 | 11.005 | 2163 | 4.1366 | 0.0651 | 0.2337 | 0.3757 | 0.0651 | 0.0228 | 0.0651 | 0.0289 |
| 15.0278 | 12.0050 | 2343 | 3.7821 | 0.1065 | 0.3580 | 0.5178 | 0.1065 | 0.0601 | 0.1065 | 0.0572 |
| 13.4974 | 13.0050 | 2523 | 3.4419 | 0.2012 | 0.5030 | 0.6686 | 0.2012 | 0.1761 | 0.2012 | 0.1544 |
| 11.5774 | 14.0050 | 2704 | 3.2605 | 0.2189 | 0.5355 | 0.6982 | 0.2189 | 0.2069 | 0.2189 | 0.1725 |
| 10.263 | 15.005 | 2884 | 2.8508 | 0.3195 | 0.6627 | 0.8077 | 0.3195 | 0.3730 | 0.3195 | 0.2952 |
| 8.1589 | 16.0050 | 3064 | 2.5945 | 0.3905 | 0.7308 | 0.8462 | 0.3905 | 0.4168 | 0.3905 | 0.3592 |
| 6.8221 | 17.0050 | 3244 | 2.4311 | 0.3994 | 0.7337 | 0.8728 | 0.3994 | 0.4000 | 0.3994 | 0.3640 |
| 5.4923 | 18.0050 | 3425 | 2.2139 | 0.4615 | 0.7929 | 0.8669 | 0.4645 | 0.4886 | 0.4645 | 0.4350 |
| 4.1619 | 19.005 | 3605 | 2.1384 | 0.4734 | 0.7840 | 0.8817 | 0.4734 | 0.5276 | 0.4734 | 0.4503 |
| 3.3413 | 20.0050 | 3785 | 1.9583 | 0.5118 | 0.8107 | 0.9053 | 0.5118 | 0.5485 | 0.5118 | 0.4908 |
| 2.5832 | 21.0050 | 3965 | 1.8604 | 0.5000 | 0.8284 | 0.9201 | 0.5000 | 0.5000 | 0.5000 | 0.4713 |
| 1.9003 | 22.0050 | 4146 | 1.9390 | 0.5296 | 0.8195 | 0.8905 | 0.5296 | 0.5860 | 0.5296 | 0.5140 |
| 1.4226 | 23.005 | 4326 | 1.9288 | 0.5266 | 0.8077 | 0.9083 | 0.5266 | 0.6341 | 0.5266 | 0.5303 |
| 1.1341 | 24.0050 | 4506 | 1.8854 | 0.5266 | 0.7899 | 0.8964 | 0.5266 | 0.5719 | 0.5266 | 0.5079 |
| 0.9315 | 25.0050 | 4686 | 1.7328 | 0.5769 | 0.8462 | 0.8994 | 0.5769 | 0.6261 | 0.5769 | 0.5565 |
| 0.7536 | 26.0050 | 4867 | 1.8349 | 0.5385 | 0.8195 | 0.8964 | 0.5385 | 0.6288 | 0.5385 | 0.5296 |
| 0.4518 | 27.005 | 5047 | 1.7999 | 0.5533 | 0.8550 | 0.9172 | 0.5533 | 0.6276 | 0.5533 | 0.5437 |
| 0.3322 | 28.0050 | 5227 | 1.6931 | 0.6006 | 0.8491 | 0.9260 | 0.6006 | 0.6578 | 0.6006 | 0.5896 |
| 0.403 | 29.0050 | 5407 | 1.8000 | 0.5740 | 0.8462 | 0.9142 | 0.5740 | 0.6240 | 0.5740 | 0.5584 |
| 0.1837 | 30.0050 | 5588 | 1.8765 | 0.5769 | 0.8373 | 0.8935 | 0.5769 | 0.6391 | 0.5769 | 0.5664 |
| 0.1579 | 31.005 | 5768 | 2.0752 | 0.5473 | 0.8432 | 0.8994 | 0.5473 | 0.6337 | 0.5473 | 0.5426 |
| 0.2079 | 32.0050 | 5948 | 1.9234 | 0.5947 | 0.8136 | 0.8964 | 0.5947 | 0.6524 | 0.5947 | 0.5791 |
| 0.2738 | 33.0050 | 6128 | 1.8529 | 0.6036 | 0.8491 | 0.9024 | 0.6065 | 0.6622 | 0.6065 | 0.5892 |
| 0.2621 | 34.0050 | 6309 | 1.9906 | 0.5740 | 0.8284 | 0.9172 | 0.5740 | 0.6191 | 0.5740 | 0.5640 |
| 0.2024 | 35.005 | 6489 | 1.8942 | 0.5976 | 0.8639 | 0.9260 | 0.5976 | 0.6615 | 0.5976 | 0.5886 |
| 0.0983 | 36.0050 | 6669 | 2.0340 | 0.5858 | 0.8254 | 0.8846 | 0.5858 | 0.6500 | 0.5858 | 0.5729 |
| 0.0592 | 37.0050 | 6849 | 1.8493 | 0.6095 | 0.8609 | 0.9231 | 0.6095 | 0.6775 | 0.6095 | 0.5986 |
| 0.0922 | 38.0050 | 7030 | 1.9036 | 0.6302 | 0.8669 | 0.9260 | 0.6302 | 0.6825 | 0.6302 | 0.6125 |
| 0.1547 | 39.005 | 7210 | 1.9897 | 0.6036 | 0.8432 | 0.9053 | 0.6036 | 0.6726 | 0.6036 | 0.5948 |
| 0.1162 | 40.0050 | 7390 | 2.3056 | 0.5828 | 0.8284 | 0.8876 | 0.5828 | 0.6518 | 0.5828 | 0.5675 |
| 0.0514 | 41.0050 | 7570 | 2.3211 | 0.5888 | 0.7988 | 0.8817 | 0.5888 | 0.6510 | 0.5888 | 0.5767 |
| 0.1138 | 42.0050 | 7751 | 2.3149 | 0.5740 | 0.8491 | 0.9053 | 0.5740 | 0.6285 | 0.5740 | 0.5628 |
| 0.1197 | 43.005 | 7931 | 2.1156 | 0.6124 | 0.8669 | 0.9172 | 0.6124 | 0.6907 | 0.6124 | 0.6073 |
| 0.0673 | 44.0050 | 8111 | 2.2835 | 0.5828 | 0.8580 | 0.9083 | 0.5828 | 0.6274 | 0.5828 | 0.5641 |
| 0.1501 | 45.0050 | 8291 | 2.2719 | 0.5917 | 0.8521 | 0.8905 | 0.5917 | 0.6419 | 0.5917 | 0.5757 |
| 0.2022 | 46.0050 | 8472 | 2.3422 | 0.5562 | 0.8491 | 0.9053 | 0.5562 | 0.6034 | 0.5562 | 0.5402 |
| 0.2185 | 47.005 | 8652 | 2.6431 | 0.5237 | 0.8284 | 0.8817 | 0.5237 | 0.5808 | 0.5237 | 0.5125 |
| 0.2385 | 48.0050 | 8832 | 2.3147 | 0.5799 | 0.8521 | 0.9053 | 0.5799 | 0.6324 | 0.5799 | 0.5623 |
| 0.1769 | 49.0050 | 9012 | 2.3451 | 0.5769 | 0.8373 | 0.8876 | 0.5769 | 0.6246 | 0.5769 | 0.5622 |
| 0.1927 | 50.0050 | 9193 | 2.7140 | 0.5562 | 0.8018 | 0.8728 | 0.5562 | 0.6025 | 0.5562 | 0.5347 |
| 0.2048 | 51.005 | 9373 | 2.3876 | 0.5917 | 0.8225 | 0.8935 | 0.5917 | 0.6367 | 0.5917 | 0.5748 |
| 0.1608 | 52.0050 | 9553 | 2.6983 | 0.5266 | 0.8077 | 0.8580 | 0.5266 | 0.5645 | 0.5266 | 0.5013 |
| 0.1256 | 53.0050 | 9733 | 2.7464 | 0.5385 | 0.8018 | 0.8905 | 0.5385 | 0.5773 | 0.5385 | 0.5257 |
| 0.1327 | 54.0050 | 9914 | 2.5133 | 0.5651 | 0.7988 | 0.8846 | 0.5651 | 0.5812 | 0.5651 | 0.5342 |
| 0.0503 | 55.005 | 10094 | 2.5687 | 0.5769 | 0.8314 | 0.8964 | 0.5769 | 0.6336 | 0.5769 | 0.5635 |
| 0.0841 | 56.0050 | 10274 | 2.7311 | 0.5503 | 0.8284 | 0.8905 | 0.5503 | 0.6025 | 0.5503 | 0.5329 |
| 0.0888 | 57.0050 | 10454 | 2.6771 | 0.5592 | 0.8195 | 0.8935 | 0.5592 | 0.6142 | 0.5592 | 0.5473 |
| 0.0629 | 58.0050 | 10635 | 2.8301 | 0.5296 | 0.7988 | 0.8698 | 0.5296 | 0.5768 | 0.5296 | 0.5133 |
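The metric columns in the table could be produced by a `compute_metrics` callback along the following lines. This is a sketch, not the author's evaluation code; the weighted averaging mode is inferred from the recall column matching accuracy:

```python
# Hypothetical compute_metrics callback approximating the reported metrics.
# The "weighted" averaging mode is an assumption (weighted recall equals
# accuracy, which matches the numbers above); the original script is not shown.
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    precision_recall_fscore_support,
    top_k_accuracy_score,
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    class_ids = np.arange(logits.shape[-1])
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "top_1_accuracy": top_k_accuracy_score(labels, logits, k=1, labels=class_ids),
        "top_5_accuracy": top_k_accuracy_score(labels, logits, k=5, labels=class_ids),
        "top_10_accuracy": top_k_accuracy_score(labels, logits, k=10, labels=class_ids),
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```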
### Framework versions
- Transformers 4.46.1
- PyTorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.1