---

license: cc-by-nc-4.0
base_model: facebook/timesformer-base-finetuned-k400
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: tsf-gs-rots-wtoken-DRPT0.3-r224-f150-6.6-h768-i3072-p32-b8-e50
  results: []
---


<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# tsf-gs-rots-wtoken-DRPT0.3-r224-f150-6.6-h768-i3072-p32-b8-e50

This model is a fine-tuned version of [facebook/timesformer-base-finetuned-k400](https://huggingface.co/facebook/timesformer-base-finetuned-k400) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2019
- Accuracy: 0.6043
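
Since this checkpoint is a video-classification fine-tune of TimeSformer, a minimal inference sketch using the Transformers API is shown below. The repo id and the random dummy clip are placeholders, not part of the original card; substitute the checkpoint's actual hub path and frames sampled from a real video.

```python
import numpy as np
import torch
from transformers import AutoImageProcessor, TimesformerForVideoClassification

# Placeholder repo id -- replace with the actual hub path of this checkpoint.
model_id = "tsf-gs-rots-wtoken-DRPT0.3-r224-f150-6.6-h768-i3072-p32-b8-e50"

processor = AutoImageProcessor.from_pretrained(model_id)
model = TimesformerForVideoClassification.from_pretrained(model_id)
model.eval()

# Dummy clip: a list of 224x224 RGB frames. Replace with real video frames;
# the frame count must match the model's num_frames config.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(8)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```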

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 5400
- mixed_precision_training: Native AMP
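
For reproducibility, these values map onto Transformers' `TrainingArguments` roughly as in the sketch below. The `output_dir` is a placeholder, and the Adam betas/epsilon listed above match the library defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# A minimal sketch, assuming library-default optimizer settings
# cover the Adam betas=(0.9,0.999) and epsilon=1e-08 above.
training_args = TrainingArguments(
    output_dir="tsf-gs-rots-wtoken-DRPT0.3-r224-f150-6.6-h768-i3072-p32-b8-e50",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=5400,  # training_steps above
    fp16=True,       # Native AMP mixed-precision training
)
```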



### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.1714        | 0.0202  | 109  | 1.1130          | 0.3369   |
| 1.1431        | 1.0202  | 218  | 1.1018          | 0.3369   |
| 1.1363        | 2.0202  | 327  | 1.1018          | 0.3369   |
| 1.1117        | 3.0202  | 436  | 1.1062          | 0.3369   |
| 1.0901        | 4.0202  | 545  | 1.1024          | 0.3369   |
| 1.061         | 5.0202  | 654  | 1.1049          | 0.3262   |
| 1.1047        | 6.0202  | 763  | 1.0996          | 0.3262   |
| 1.0911        | 7.0202  | 872  | 1.1056          | 0.3369   |
| 1.1305        | 8.0202  | 981  | 1.0994          | 0.3262   |
| 1.1145        | 9.0202  | 1090 | 1.0978          | 0.3262   |
| 1.1075        | 10.0202 | 1199 | 1.0921          | 0.3369   |
| 1.0886        | 11.0202 | 1308 | 1.0732          | 0.4118   |
| 1.2003        | 12.0202 | 1417 | 1.1430          | 0.3369   |
| 1.1006        | 13.0202 | 1526 | 1.0699          | 0.4599   |
| 1.0594        | 14.0202 | 1635 | 1.0786          | 0.3743   |
| 1.0283        | 15.0202 | 1744 | 1.0056          | 0.4813   |
| 1.0804        | 16.0202 | 1853 | 1.0239          | 0.4332   |
| 1.1055        | 17.0202 | 1962 | 1.0384          | 0.5027   |
| 1.0577        | 18.0202 | 2071 | 0.9683          | 0.4813   |
| 0.9961        | 19.0202 | 2180 | 1.1497          | 0.3422   |
| 0.9365        | 20.0202 | 2289 | 1.1324          | 0.4064   |
| 1.0121        | 21.0202 | 2398 | 1.0838          | 0.5080   |
| 0.8774        | 22.0202 | 2507 | 1.1312          | 0.5722   |
| 0.8922        | 23.0202 | 2616 | 1.1480          | 0.5241   |
| 1.0667        | 24.0202 | 2725 | 0.9482          | 0.6096   |
| 0.825         | 25.0202 | 2834 | 0.8572          | 0.6524   |
| 0.985         | 26.0202 | 2943 | 0.9455          | 0.6043   |
| 0.8331        | 27.0202 | 3052 | 0.7787          | 0.6310   |
| 0.8799        | 28.0202 | 3161 | 0.8728          | 0.5989   |
| 0.7616        | 29.0202 | 3270 | 0.7851          | 0.6952   |
| 0.6198        | 30.0202 | 3379 | 0.8857          | 0.6524   |
| 0.9501        | 31.0202 | 3488 | 0.7847          | 0.6791   |
| 0.708         | 32.0202 | 3597 | 0.7956          | 0.6684   |
| 0.697         | 33.0202 | 3706 | 0.8540          | 0.6471   |
| 0.7463        | 34.0202 | 3815 | 0.7873          | 0.7005   |
| 0.6234        | 35.0202 | 3924 | 1.0743          | 0.5615   |
| 0.7174        | 36.0202 | 4033 | 0.7159          | 0.7219   |
| 0.6577        | 37.0202 | 4142 | 0.8100          | 0.6417   |
| 0.7482        | 38.0202 | 4251 | 0.8508          | 0.6631   |
| 0.6993        | 39.0202 | 4360 | 0.8796          | 0.6684   |
| 0.6782        | 40.0202 | 4469 | 0.9814          | 0.6578   |
| 0.614         | 41.0202 | 4578 | 0.8482          | 0.6631   |
| 0.7516        | 42.0202 | 4687 | 0.9654          | 0.6471   |
| 0.7425        | 43.0202 | 4796 | 1.0050          | 0.6257   |
| 0.6818        | 44.0202 | 4905 | 1.1266          | 0.6096   |
| 0.4599        | 45.0202 | 5014 | 1.1824          | 0.6364   |
| 0.7561        | 46.0202 | 5123 | 1.1567          | 0.6417   |
| 0.7543        | 47.0202 | 5232 | 1.2946          | 0.5882   |
| 0.4198        | 48.0202 | 5341 | 1.2598          | 0.5882   |
| 0.6238        | 49.0109 | 5400 | 1.2019          | 0.6043   |





### Framework versions

- Transformers 4.41.2
- Pytorch 1.13.0+cu117
- Datasets 2.20.0
- Tokenizers 0.19.1