Commit 621dde0 ("Model save"), committed by beingbatman
Parent(s): d8c36cd

Files changed:
- README.md +209 -0
- model.safetensors +1 -1

README.md ADDED
@@ -0,0 +1,209 @@
---
library_name: transformers
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-large-finetuned-kinetics
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: MAE-CT-M1N0-M12_v8_split3_v3
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# MAE-CT-M1N0-M12_v8_split3_v3

This model is a fine-tuned version of [MCG-NJU/videomae-large-finetuned-kinetics](https://huggingface.co/MCG-NJU/videomae-large-finetuned-kinetics) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0548
- Accuracy: 0.8462

## Model description

More information needed

## Intended uses & limitations

More information needed
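The card does not restate the input format, but the base checkpoint is a VideoMAE-Large video classifier, which consumes short clips tokenized into space-time patches. As a rough sketch, under the standard VideoMAE geometry (16 frames at 224×224, 16×16 spatial patches, tubelet size 2 — assumed defaults, not stated by this card), the encoder sequence length works out as:

```python
# Token count for one VideoMAE input clip (assumed default geometry:
# 16 frames, 224x224 resolution, 16x16 patches, tubelet size 2).
def videomae_tokens(frames=16, size=224, patch=16, tubelet=2):
    per_frame = (size // patch) ** 2        # 14 * 14 = 196 spatial patches
    return (frames // tubelet) * per_frame  # 8 temporal slices of 196 patches

print(videomae_tokens())  # 1568
```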

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 10350
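With a warmup ratio of 0.1 over 10350 steps, the linear schedule ramps up for roughly 1035 steps and then decays to zero. A minimal sketch of the resulting per-step learning rate (this mirrors the shape of the linear-with-warmup schedule; the helper name is ours, not part of the training code):

```python
# Per-step learning rate under the card's schedule: linear warmup for
# warmup_ratio * total_steps, then linear decay to zero at total_steps.
def linear_warmup_lr(step, base_lr=1e-05, total_steps=10350, warmup_ratio=0.1):
    warmup_steps = int(total_steps * warmup_ratio)  # 1035 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps        # linear ramp-up
    # linear decay from base_lr down to 0 over the remaining steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(1035))   # peak learning rate: 1e-05
print(linear_warmup_lr(10350))  # 0.0 at the final step
```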

### Training results

| Training Loss | Epoch    | Step  | Validation Loss | Accuracy |
|:-------------:|:--------:|:-----:|:---------------:|:--------:|
| 0.6676        | 0.0068   | 70    | 0.6765          | 0.6538   |
| 0.6735        | 1.0068   | 140   | 0.6379          | 0.6667   |
| 0.6391        | 2.0068   | 210   | 0.6261          | 0.6667   |
| 0.8085        | 3.0068   | 280   | 0.6050          | 0.6667   |
| 0.4774        | 4.0068   | 350   | 0.5725          | 0.6538   |
| 0.5668        | 5.0068   | 420   | 0.5514          | 0.7179   |
| 0.6607        | 6.0068   | 490   | 0.4834          | 0.7564   |
| 1.0769        | 7.0068   | 560   | 0.7450          | 0.6667   |
| 0.5425        | 8.0068   | 630   | 0.4623          | 0.7949   |
| 0.2959        | 9.0068   | 700   | 0.5113          | 0.7564   |
| 0.7674        | 10.0068  | 770   | 0.5330          | 0.7308   |
| 0.3015        | 11.0068  | 840   | 0.6790          | 0.7436   |
| 0.6252        | 12.0068  | 910   | 1.6537          | 0.6667   |
| 0.4554        | 13.0068  | 980   | 0.8944          | 0.7564   |
| 0.364         | 14.0068  | 1050  | 0.8103          | 0.7179   |
| 0.444         | 15.0068  | 1120  | 0.7103          | 0.7821   |
| 0.104         | 16.0068  | 1190  | 0.9000          | 0.7179   |
| 0.5647        | 17.0068  | 1260  | 1.4782          | 0.7051   |
| 0.783         | 18.0068  | 1330  | 0.8539          | 0.8205   |
| 0.5938        | 19.0068  | 1400  | 0.8426          | 0.7564   |
| 0.5341        | 20.0068  | 1470  | 0.9862          | 0.7949   |
| 0.3391        | 21.0068  | 1540  | 1.1159          | 0.7692   |
| 0.2071        | 22.0068  | 1610  | 1.5833          | 0.7692   |
| 0.1159        | 23.0068  | 1680  | 1.0205          | 0.8205   |
| 0.1579        | 24.0068  | 1750  | 1.4633          | 0.7692   |
| 0.1042        | 25.0068  | 1820  | 1.5864          | 0.7564   |
| 0.1466        | 26.0068  | 1890  | 1.2990          | 0.7692   |
| 0.0006        | 27.0068  | 1960  | 1.5597          | 0.7564   |
| 0.001         | 28.0068  | 2030  | 1.6434          | 0.7564   |
| 0.0096        | 29.0068  | 2100  | 1.4161          | 0.7821   |
| 0.0015        | 30.0068  | 2170  | 1.3385          | 0.7949   |
| 0.3627        | 31.0068  | 2240  | 1.7864          | 0.7436   |
| 0.1541        | 32.0068  | 2310  | 1.5618          | 0.7821   |
| 0.1285        | 33.0068  | 2380  | 1.3062          | 0.7949   |
| 0.2193        | 34.0068  | 2450  | 1.5554          | 0.7821   |
| 0.0002        | 35.0068  | 2520  | 1.5445          | 0.7949   |
| 0.0003        | 36.0068  | 2590  | 1.7189          | 0.7821   |
| 0.092         | 37.0068  | 2660  | 1.4980          | 0.7692   |
| 0.2403        | 38.0068  | 2730  | 2.0993          | 0.7436   |
| 0.0001        | 39.0068  | 2800  | 1.8646          | 0.7436   |
| 0.0001        | 40.0068  | 2870  | 1.6642          | 0.7436   |
| 0.2108        | 41.0068  | 2940  | 1.4566          | 0.7949   |
| 0.1085        | 42.0068  | 3010  | 1.2915          | 0.7821   |
| 0.0019        | 43.0068  | 3080  | 1.2691          | 0.8333   |
| 0.0072        | 44.0068  | 3150  | 2.2703          | 0.7179   |
| 0.0002        | 45.0068  | 3220  | 1.5151          | 0.7564   |
| 0.4057        | 46.0068  | 3290  | 1.4170          | 0.7564   |
| 0.0862        | 47.0068  | 3360  | 2.1072          | 0.7308   |
| 0.0478        | 48.0068  | 3430  | 1.6826          | 0.7949   |
| 0.001         | 49.0068  | 3500  | 1.4349          | 0.7564   |
| 0.0001        | 50.0068  | 3570  | 1.4423          | 0.7949   |
| 0.0056        | 51.0068  | 3640  | 2.2833          | 0.7051   |
| 0.0004        | 52.0068  | 3710  | 1.0548          | 0.8462   |
| 0.1768        | 53.0068  | 3780  | 1.2974          | 0.8077   |
| 0.0001        | 54.0068  | 3850  | 1.2797          | 0.8333   |
| 0.1027        | 55.0068  | 3920  | 1.5159          | 0.8077   |
| 0.1638        | 56.0068  | 3990  | 1.9403          | 0.7564   |
| 0.0001        | 57.0068  | 4060  | 1.5075          | 0.8077   |
| 0.0003        | 58.0068  | 4130  | 2.1291          | 0.7436   |
| 0.004         | 59.0068  | 4200  | 1.6104          | 0.7949   |
| 0.0214        | 60.0068  | 4270  | 1.7017          | 0.7949   |
| 0.0           | 61.0068  | 4340  | 1.6485          | 0.8205   |
| 0.0           | 62.0068  | 4410  | 1.6668          | 0.8077   |
| 0.1604        | 63.0068  | 4480  | 1.7437          | 0.7949   |
| 0.0002        | 64.0068  | 4550  | 1.6770          | 0.7821   |
| 0.0           | 65.0068  | 4620  | 1.7766          | 0.7821   |
| 0.0214        | 66.0068  | 4690  | 1.6351          | 0.7821   |
| 0.0           | 67.0068  | 4760  | 1.6878          | 0.7821   |
| 0.0           | 68.0068  | 4830  | 1.8764          | 0.7564   |
| 0.2082        | 69.0068  | 4900  | 1.7799          | 0.7564   |
| 0.0           | 70.0068  | 4970  | 1.7388          | 0.7949   |
| 0.0           | 71.0068  | 5040  | 1.6719          | 0.7949   |
| 0.0001        | 72.0068  | 5110  | 1.6066          | 0.7949   |
| 0.0001        | 73.0068  | 5180  | 2.1181          | 0.7564   |
| 0.0           | 74.0068  | 5250  | 2.1773          | 0.7564   |
| 0.0           | 75.0068  | 5320  | 2.5632          | 0.7179   |
| 0.0           | 76.0068  | 5390  | 1.1549          | 0.8077   |
| 0.0006        | 77.0068  | 5460  | 2.2296          | 0.7436   |
| 0.0           | 78.0068  | 5530  | 2.5382          | 0.7308   |
| 0.0001        | 79.0068  | 5600  | 1.5726          | 0.7821   |
| 0.0002        | 80.0068  | 5670  | 1.5704          | 0.8077   |
| 0.0199        | 81.0068  | 5740  | 1.5503          | 0.8077   |
| 0.0001        | 82.0068  | 5810  | 1.3654          | 0.8077   |
| 0.0           | 83.0068  | 5880  | 1.4727          | 0.8077   |
| 0.0001        | 84.0068  | 5950  | 2.2248          | 0.7179   |
| 0.0           | 85.0068  | 6020  | 1.7188          | 0.7821   |
| 0.0101        | 86.0068  | 6090  | 1.9271          | 0.7436   |
| 0.0001        | 87.0068  | 6160  | 1.5356          | 0.8333   |
| 0.0           | 88.0068  | 6230  | 1.6683          | 0.8077   |
| 0.0           | 89.0068  | 6300  | 2.5566          | 0.7308   |
| 0.0           | 90.0068  | 6370  | 1.7776          | 0.8077   |
| 0.0028        | 91.0068  | 6440  | 1.7494          | 0.7949   |
| 0.0           | 92.0068  | 6510  | 1.6141          | 0.8077   |
| 0.0           | 93.0068  | 6580  | 2.8243          | 0.7179   |
| 0.0           | 94.0068  | 6650  | 1.9026          | 0.7051   |
| 0.0002        | 95.0068  | 6720  | 2.2368          | 0.6667   |
| 0.0998        | 96.0068  | 6790  | 2.4985          | 0.7179   |
| 0.0           | 97.0068  | 6860  | 2.0686          | 0.7436   |
| 0.0074        | 98.0068  | 6930  | 2.7098          | 0.7051   |
| 0.0           | 99.0068  | 7000  | 2.2765          | 0.7308   |
| 0.0           | 100.0068 | 7070  | 2.2793          | 0.7308   |
| 0.0           | 101.0068 | 7140  | 2.2027          | 0.7308   |
| 0.0           | 102.0068 | 7210  | 2.2387          | 0.7308   |
| 0.0           | 103.0068 | 7280  | 2.1971          | 0.7308   |
| 0.0           | 104.0068 | 7350  | 2.3246          | 0.7179   |
| 0.0           | 105.0068 | 7420  | 1.5935          | 0.8077   |
| 0.0           | 106.0068 | 7490  | 1.4796          | 0.8077   |
| 0.0001        | 107.0068 | 7560  | 1.7052          | 0.7821   |
| 0.0           | 108.0068 | 7630  | 1.6022          | 0.8077   |
| 0.0002        | 109.0068 | 7700  | 1.6749          | 0.8077   |
| 0.0           | 110.0068 | 7770  | 1.7948          | 0.7436   |
| 0.0           | 111.0068 | 7840  | 1.8455          | 0.7949   |
| 0.0           | 112.0068 | 7910  | 1.8600          | 0.7949   |
| 0.0           | 113.0068 | 7980  | 1.8183          | 0.7949   |
| 0.0           | 114.0068 | 8050  | 1.7862          | 0.7949   |
| 0.0           | 115.0068 | 8120  | 1.8597          | 0.7821   |
| 0.0           | 116.0068 | 8190  | 1.8203          | 0.7821   |
| 0.0           | 117.0068 | 8260  | 1.8343          | 0.7949   |
| 0.0           | 118.0068 | 8330  | 1.8417          | 0.7949   |
| 0.0           | 119.0068 | 8400  | 1.7663          | 0.7821   |
| 0.0           | 120.0068 | 8470  | 1.9611          | 0.7821   |
| 0.0           | 121.0068 | 8540  | 1.9584          | 0.7949   |
| 0.0           | 122.0068 | 8610  | 1.5671          | 0.8205   |
| 0.0           | 123.0068 | 8680  | 2.3456          | 0.7564   |
| 0.0           | 124.0068 | 8750  | 2.3453          | 0.7564   |
| 0.0           | 125.0068 | 8820  | 2.4120          | 0.7436   |
| 0.0           | 126.0068 | 8890  | 2.3774          | 0.7436   |
| 0.0           | 127.0068 | 8960  | 2.3609          | 0.7436   |
| 0.0           | 128.0068 | 9030  | 2.3531          | 0.7564   |
| 0.0           | 129.0068 | 9100  | 1.9910          | 0.7821   |
| 0.0           | 130.0068 | 9170  | 2.0032          | 0.7821   |
| 0.0           | 131.0068 | 9240  | 2.0645          | 0.7692   |
| 0.0           | 132.0068 | 9310  | 2.0598          | 0.7692   |
| 0.0           | 133.0068 | 9380  | 2.0594          | 0.7692   |
| 0.0           | 134.0068 | 9450  | 2.0568          | 0.7692   |
| 0.0           | 135.0068 | 9520  | 2.0522          | 0.7821   |
| 0.0           | 136.0068 | 9590  | 1.9971          | 0.7564   |
| 0.0           | 137.0068 | 9660  | 1.9977          | 0.7564   |
| 0.0           | 138.0068 | 9730  | 2.0896          | 0.7692   |
| 0.0           | 139.0068 | 9800  | 2.1550          | 0.7692   |
| 0.0           | 140.0068 | 9870  | 2.1875          | 0.7692   |
| 0.0           | 141.0068 | 9940  | 2.1874          | 0.7692   |
| 0.0           | 142.0068 | 10010 | 2.1822          | 0.7692   |
| 0.0           | 143.0068 | 10080 | 2.1818          | 0.7692   |
| 0.0           | 144.0068 | 10150 | 2.1806          | 0.7692   |
| 0.0           | 145.0068 | 10220 | 2.1803          | 0.7692   |
| 0.0           | 146.0068 | 10290 | 2.1804          | 0.7692   |
| 0.0           | 147.0058 | 10350 | 2.1803          | 0.7692   |

### Framework versions

- Transformers 4.46.2
- Pytorch 2.0.1+cu117
- Datasets 3.0.1
- Tokenizers 0.20.0
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:3069c194e2b6dd0e4587e021333a4b340c794095dc7abbb16a789fbb5a7a2dea
 size 1215496208