beingbatman committed
Commit d867726 • 1 parent: dd5795f

Model save

Files changed:
- README.md: +209 −0 (added)
- model.safetensors: +1 −1 (changed)

README.md (ADDED, @@ -0,0 +1,209 @@)
---
library_name: transformers
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-large-finetuned-kinetics
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: MAE-CT-M1N0-M12_v8_split5_v3
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# MAE-CT-M1N0-M12_v8_split5_v3

This model is a fine-tuned version of [MCG-NJU/videomae-large-finetuned-kinetics](https://huggingface.co/MCG-NJU/videomae-large-finetuned-kinetics) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1517
- Accuracy: 0.8701
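The reported accuracies all fall on multiples of 1/77 (0.8701 ≈ 67/77, 0.7792 ≈ 60/77, 0.6234 ≈ 48/77), which suggests an evaluation set of roughly 77 clips. This is an inference from the metric granularity, not something the card states; a quick sanity check:

```python
# Assumed eval-set size, inferred from the granularity of the reported metrics.
EVAL_CLIPS = 77

for reported in (0.8701, 0.7792, 0.6234):
    correct = round(reported * EVAL_CLIPS)
    # each reported accuracy should round-trip through n_correct / EVAL_CLIPS
    assert abs(correct / EVAL_CLIPS - reported) < 5e-5, reported

print(round(0.8701 * EVAL_CLIPS))  # → 67 correctly classified clips
```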

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 10350
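With `lr_scheduler_warmup_ratio: 0.1` over 10350 steps, the linear scheduler warms up for 1035 steps and then decays to zero. A minimal sketch of that schedule (it mirrors transformers' `get_linear_schedule_with_warmup`, not the exact Trainer internals):

```python
def linear_lr(step: int, total_steps: int = 10350,
              base_lr: float = 1e-5, warmup_ratio: float = 0.1) -> float:
    """Linear warmup followed by linear decay to zero."""
    warmup_steps = int(warmup_ratio * total_steps)  # 1035 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # ramp 0 -> base_lr
    # decay base_lr -> 0 over the remaining steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_lr(1035))   # peak learning rate: 1e-05
print(linear_lr(10350))  # end of training: 0.0
```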

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:--------:|:-----:|:---------------:|:--------:|
| 0.685 | 0.0068 | 70 | 0.6757 | 0.7792 |
| 0.5601 | 1.0068 | 140 | 0.6218 | 0.6234 |
| 0.6632 | 2.0068 | 210 | 0.6157 | 0.6234 |
| 0.5153 | 3.0068 | 280 | 0.5660 | 0.6364 |
| 0.5008 | 4.0068 | 350 | 0.5238 | 0.7662 |
| 0.4879 | 5.0068 | 420 | 0.5012 | 0.7792 |
| 0.3636 | 6.0068 | 490 | 0.5640 | 0.7013 |
| 0.7238 | 7.0068 | 560 | 0.5756 | 0.7013 |
| 0.3339 | 8.0068 | 630 | 0.9895 | 0.6883 |
| 0.4152 | 9.0068 | 700 | 0.5031 | 0.8182 |
| 0.3126 | 10.0068 | 770 | 0.5350 | 0.7273 |
| 0.4479 | 11.0068 | 840 | 0.4278 | 0.8312 |
| 0.5548 | 12.0068 | 910 | 0.6865 | 0.7013 |
| 0.1509 | 13.0068 | 980 | 0.8144 | 0.7143 |
| 0.4038 | 14.0068 | 1050 | 0.6039 | 0.7922 |
| 0.2748 | 15.0068 | 1120 | 1.1834 | 0.7662 |
| 0.4552 | 16.0068 | 1190 | 0.7594 | 0.7532 |
| 0.5584 | 17.0068 | 1260 | 0.9481 | 0.7922 |
| 0.0919 | 18.0068 | 1330 | 1.0080 | 0.7662 |
| 0.2309 | 19.0068 | 1400 | 0.8453 | 0.8182 |
| 0.191 | 20.0068 | 1470 | 1.0695 | 0.7662 |
| 0.2013 | 21.0068 | 1540 | 1.4657 | 0.7403 |
| 0.6645 | 22.0068 | 1610 | 1.0602 | 0.8052 |
| 0.1083 | 23.0068 | 1680 | 1.2148 | 0.7532 |
| 0.0885 | 24.0068 | 1750 | 1.2008 | 0.7792 |
| 0.0015 | 25.0068 | 1820 | 1.2987 | 0.7532 |
| 0.2372 | 26.0068 | 1890 | 1.6225 | 0.7532 |
| 0.001 | 27.0068 | 1960 | 1.1689 | 0.7662 |
| 0.0006 | 28.0068 | 2030 | 1.3817 | 0.7532 |
| 0.0002 | 29.0068 | 2100 | 1.7143 | 0.7273 |
| 0.0012 | 30.0068 | 2170 | 1.8865 | 0.7273 |
| 0.153 | 31.0068 | 2240 | 2.4574 | 0.6623 |
| 0.1308 | 32.0068 | 2310 | 1.1800 | 0.8052 |
| 0.0002 | 33.0068 | 2380 | 1.2817 | 0.7792 |
| 0.0001 | 34.0068 | 2450 | 1.2770 | 0.7792 |
| 0.0001 | 35.0068 | 2520 | 1.2779 | 0.7922 |
| 0.0001 | 36.0068 | 2590 | 1.3971 | 0.7792 |
| 0.0001 | 37.0068 | 2660 | 1.1263 | 0.8182 |
| 0.0001 | 38.0068 | 2730 | 1.1233 | 0.8182 |
| 0.0675 | 39.0068 | 2800 | 1.4885 | 0.7662 |
| 0.0002 | 40.0068 | 2870 | 1.8406 | 0.7013 |
| 0.0001 | 41.0068 | 2940 | 1.9085 | 0.7532 |
| 0.0005 | 42.0068 | 3010 | 1.9380 | 0.7143 |
| 0.1589 | 43.0068 | 3080 | 0.9674 | 0.8312 |
| 0.0001 | 44.0068 | 3150 | 1.5574 | 0.7403 |
| 0.0353 | 45.0068 | 3220 | 1.1688 | 0.8312 |
| 0.0001 | 46.0068 | 3290 | 1.7684 | 0.7143 |
| 0.0002 | 47.0068 | 3360 | 1.3363 | 0.7792 |
| 0.1237 | 48.0068 | 3430 | 1.2230 | 0.7922 |
| 0.0001 | 49.0068 | 3500 | 1.4665 | 0.7792 |
| 0.0 | 50.0068 | 3570 | 1.5472 | 0.7662 |
| 0.1479 | 51.0068 | 3640 | 2.3369 | 0.7273 |
| 0.0001 | 52.0068 | 3710 | 2.2529 | 0.6753 |
| 0.1081 | 53.0068 | 3780 | 1.4745 | 0.7273 |
| 0.0002 | 54.0068 | 3850 | 1.5813 | 0.7403 |
| 0.0119 | 55.0068 | 3920 | 1.6007 | 0.7662 |
| 0.1478 | 56.0068 | 3990 | 2.3310 | 0.7143 |
| 0.0001 | 57.0068 | 4060 | 1.4788 | 0.8052 |
| 0.0001 | 58.0068 | 4130 | 1.1851 | 0.8442 |
| 0.0001 | 59.0068 | 4200 | 1.1920 | 0.8571 |
| 0.0904 | 60.0068 | 4270 | 1.1858 | 0.8312 |
| 0.0001 | 61.0068 | 4340 | 1.4534 | 0.7662 |
| 0.0017 | 62.0068 | 4410 | 1.6716 | 0.7792 |
| 0.0001 | 63.0068 | 4480 | 2.2017 | 0.6883 |
| 0.3407 | 64.0068 | 4550 | 1.2424 | 0.8052 |
| 0.0001 | 65.0068 | 4620 | 1.5786 | 0.7792 |
| 0.0002 | 66.0068 | 4690 | 1.3379 | 0.8182 |
| 0.0005 | 67.0068 | 4760 | 1.1517 | 0.8701 |
| 0.0 | 68.0068 | 4830 | 1.5294 | 0.7792 |
| 0.0 | 69.0068 | 4900 | 2.4381 | 0.6883 |
| 0.0032 | 70.0068 | 4970 | 1.7952 | 0.7532 |
| 0.0 | 71.0068 | 5040 | 3.0253 | 0.6753 |
| 0.214 | 72.0068 | 5110 | 1.9327 | 0.7143 |
| 0.0 | 73.0068 | 5180 | 2.0236 | 0.7532 |
| 0.0 | 74.0068 | 5250 | 1.9076 | 0.7662 |
| 0.0 | 75.0068 | 5320 | 1.7070 | 0.8052 |
| 0.0003 | 76.0068 | 5390 | 1.8621 | 0.7532 |
| 0.0 | 77.0068 | 5460 | 1.8847 | 0.7662 |
| 0.0 | 78.0068 | 5530 | 1.8880 | 0.7662 |
| 0.0001 | 79.0068 | 5600 | 1.8182 | 0.7792 |
| 0.0 | 80.0068 | 5670 | 1.7965 | 0.8052 |
| 0.0001 | 81.0068 | 5740 | 3.0536 | 0.6753 |
| 0.0005 | 82.0068 | 5810 | 1.5427 | 0.7922 |
| 0.0006 | 83.0068 | 5880 | 1.8892 | 0.7403 |
| 0.0001 | 84.0068 | 5950 | 1.9648 | 0.7403 |
| 0.0 | 85.0068 | 6020 | 1.7625 | 0.7532 |
| 0.1655 | 86.0068 | 6090 | 1.6751 | 0.7662 |
| 0.0 | 87.0068 | 6160 | 1.8559 | 0.7403 |
| 0.0 | 88.0068 | 6230 | 1.8886 | 0.7532 |
| 0.0 | 89.0068 | 6300 | 1.8957 | 0.7532 |
| 0.0 | 90.0068 | 6370 | 1.8181 | 0.7662 |
| 0.0 | 91.0068 | 6440 | 1.8299 | 0.7532 |
| 0.0 | 92.0068 | 6510 | 1.5186 | 0.8182 |
| 0.0393 | 93.0068 | 6580 | 1.9234 | 0.7792 |
| 0.0 | 94.0068 | 6650 | 2.1199 | 0.7273 |
| 0.0 | 95.0068 | 6720 | 2.1309 | 0.7403 |
| 0.0009 | 96.0068 | 6790 | 1.9311 | 0.7532 |
| 0.0001 | 97.0068 | 6860 | 1.7858 | 0.7792 |
| 0.0894 | 98.0068 | 6930 | 1.5577 | 0.8052 |
| 0.0 | 99.0068 | 7000 | 1.8138 | 0.7792 |
| 0.0 | 100.0068 | 7070 | 2.0068 | 0.7532 |
| 0.0163 | 101.0068 | 7140 | 1.8340 | 0.7922 |
| 0.0 | 102.0068 | 7210 | 1.3226 | 0.8312 |
| 0.0 | 103.0068 | 7280 | 2.4607 | 0.7532 |
| 0.0683 | 104.0068 | 7350 | 1.7550 | 0.7922 |
| 0.0 | 105.0068 | 7420 | 1.4900 | 0.8312 |
| 0.0 | 106.0068 | 7490 | 1.5684 | 0.7662 |
| 0.0 | 107.0068 | 7560 | 1.7333 | 0.8052 |
| 0.0 | 108.0068 | 7630 | 1.4233 | 0.7922 |
| 0.0001 | 109.0068 | 7700 | 1.7542 | 0.7792 |
| 0.0 | 110.0068 | 7770 | 1.4554 | 0.8052 |
| 0.0 | 111.0068 | 7840 | 1.3538 | 0.8571 |
| 0.0 | 112.0068 | 7910 | 1.4165 | 0.8571 |
| 0.0 | 113.0068 | 7980 | 1.4229 | 0.8571 |
| 0.0 | 114.0068 | 8050 | 1.4191 | 0.8571 |
| 0.0 | 115.0068 | 8120 | 1.4364 | 0.8571 |
| 0.0 | 116.0068 | 8190 | 1.4575 | 0.8312 |
| 0.0 | 117.0068 | 8260 | 1.4640 | 0.8312 |
| 0.0 | 118.0068 | 8330 | 1.4807 | 0.8312 |
| 0.0 | 119.0068 | 8400 | 1.5030 | 0.8312 |
| 0.0 | 120.0068 | 8470 | 1.5188 | 0.8312 |
| 0.0 | 121.0068 | 8540 | 1.5642 | 0.8182 |
| 0.0 | 122.0068 | 8610 | 1.5663 | 0.8182 |
| 0.0 | 123.0068 | 8680 | 1.5686 | 0.8182 |
| 0.0 | 124.0068 | 8750 | 1.4284 | 0.8571 |
| 0.0 | 125.0068 | 8820 | 1.4352 | 0.8571 |
| 0.0 | 126.0068 | 8890 | 1.4392 | 0.8571 |
| 0.0 | 127.0068 | 8960 | 1.5200 | 0.8442 |
| 0.0 | 128.0068 | 9030 | 1.5244 | 0.8442 |
| 0.0 | 129.0068 | 9100 | 1.5282 | 0.8442 |
| 0.0 | 130.0068 | 9170 | 1.5338 | 0.8442 |
| 0.0 | 131.0068 | 9240 | 1.5489 | 0.8442 |
| 0.0 | 132.0068 | 9310 | 1.5530 | 0.8442 |
| 0.0 | 133.0068 | 9380 | 1.5586 | 0.8442 |
| 0.0 | 134.0068 | 9450 | 1.5642 | 0.8442 |
| 0.0 | 135.0068 | 9520 | 1.5596 | 0.8442 |
| 0.0 | 136.0068 | 9590 | 1.5681 | 0.8442 |
| 0.0 | 137.0068 | 9660 | 1.4498 | 0.8182 |
| 0.0 | 138.0068 | 9730 | 1.6159 | 0.8312 |
| 0.0 | 139.0068 | 9800 | 1.6950 | 0.8182 |
| 0.0 | 140.0068 | 9870 | 1.6978 | 0.8182 |
| 0.0 | 141.0068 | 9940 | 1.6985 | 0.8182 |
| 0.0 | 142.0068 | 10010 | 1.6995 | 0.8182 |
| 0.0 | 143.0068 | 10080 | 1.7037 | 0.8052 |
| 0.0 | 144.0068 | 10150 | 1.7056 | 0.8052 |
| 0.0 | 145.0068 | 10220 | 1.7054 | 0.8052 |
| 0.0 | 146.0068 | 10290 | 1.7054 | 0.8052 |
| 0.0 | 147.0058 | 10350 | 1.7041 | 0.8052 |
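The headline metrics near the top of this card (Loss 1.1517, Accuracy 0.8701) match the epoch-67 row (step 4760) rather than the final step (1.7041 / 0.8052), so the reported evaluation presumably reflects the best checkpoint, not the last one. Selecting the best row programmatically, over a hypothetical subset of the table above:

```python
# (step, validation_loss, accuracy) for a few rows of the results table
rows = [
    (4760, 1.1517, 0.8701),   # epoch 67
    (7840, 1.3538, 0.8571),   # epoch 111
    (10350, 1.7041, 0.8052),  # final step, epoch ~147
]

# best checkpoint by accuracy, breaking ties by lower validation loss
best = max(rows, key=lambda r: (r[2], -r[1]))
print(best)  # → (4760, 1.1517, 0.8701)
```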

### Framework versions

- Transformers 4.46.2
- Pytorch 2.0.1+cu117
- Datasets 3.0.1
- Tokenizers 0.20.0
model.safetensors (CHANGED, @@ -1,3 +1,3 @@)

 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:5e5d0e57a1528802c801ab3fbed49431f8417c6342baf5bb1ee5338ae974f101
 size 1215496208