SimonMA committed
Commit 1e1d7e7 • 1 Parent(s): 8a940bf

End of training

README.md ADDED
---
base_model: codellama/CodeLlama-7b-Instruct-hf
library_name: peft
license: llama2
tags:
- trl
- sft
- generated_from_trainer
model-index:
- name: ECS-Codellama-7b-lora-rps-adapter
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ECS-Codellama-7b-lora-rps-adapter

This model is a fine-tuned version of [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2955

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 4

### Training results

| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 0.1784        | 2.6210 | 15000 | 0.2849          |
| 0.2039        | 2.6297 | 15050 | 0.2825          |
| 0.194         | 2.6385 | 15100 | 0.2842          |
| 0.2073        | 2.6472 | 15150 | 0.2844          |
| 0.1818        | 2.6559 | 15200 | 0.2841          |
| 0.1858        | 2.6647 | 15250 | 0.2837          |
| 0.191         | 2.6734 | 15300 | 0.2821          |
| 0.2024        | 2.6822 | 15350 | 0.2814          |
| 0.1699        | 2.6909 | 15400 | 0.2832          |
| 0.1782        | 2.6996 | 15450 | 0.2813          |
| 0.1971        | 2.7084 | 15500 | 0.2818          |
| 0.1974        | 2.7171 | 15550 | 0.2811          |
| 0.1867        | 2.7258 | 15600 | 0.2818          |
| 0.1843        | 2.7346 | 15650 | 0.2836          |
| 0.192         | 2.7433 | 15700 | 0.2834          |
| 0.2191        | 2.7521 | 15750 | 0.2800          |
| 0.1797        | 2.7608 | 15800 | 0.2797          |
| 0.1871        | 2.7695 | 15850 | 0.2817          |
| 0.1893        | 2.7783 | 15900 | 0.2817          |
| 0.1845        | 2.7870 | 15950 | 0.2824          |
| 0.1954        | 2.7957 | 16000 | 0.2828          |
| 0.1752        | 2.8045 | 16050 | 0.2824          |
| 0.213         | 2.8132 | 16100 | 0.2803          |
| 0.1953        | 2.8219 | 16150 | 0.2818          |
| 0.1959        | 2.8307 | 16200 | 0.2807          |
| 0.1904        | 2.8394 | 16250 | 0.2814          |
| 0.191         | 2.8482 | 16300 | 0.2806          |
| 0.1783        | 2.8569 | 16350 | 0.2803          |
| 0.1997        | 2.8656 | 16400 | 0.2802          |
| 0.2195        | 2.8744 | 16450 | 0.2787          |
| 0.189         | 2.8831 | 16500 | 0.2800          |
| 0.1951        | 2.8918 | 16550 | 0.2788          |
| 0.1985        | 2.9006 | 16600 | 0.2789          |
| 0.2169        | 2.9093 | 16650 | 0.2785          |
| 0.195         | 2.9180 | 16700 | 0.2788          |
| 0.1744        | 2.9268 | 16750 | 0.2800          |
| 0.1635        | 2.9355 | 16800 | 0.2800          |
| 0.1877        | 2.9443 | 16850 | 0.2782          |
| 0.1977        | 2.9530 | 16900 | 0.2770          |
| 0.1808        | 2.9617 | 16950 | 0.2781          |
| 0.1824        | 2.9705 | 17000 | 0.2784          |
| 0.1947        | 2.9792 | 17050 | 0.2781          |
| 0.1946        | 2.9879 | 17100 | 0.2767          |
| 0.1742        | 2.9967 | 17150 | 0.2770          |
| 0.1527        | 3.0054 | 17200 | 0.2886          |
| 0.1205        | 3.0142 | 17250 | 0.2929          |
| 0.1261        | 3.0229 | 17300 | 0.2981          |
| 0.1122        | 3.0316 | 17350 | 0.2997          |
| 0.1441        | 3.0404 | 17400 | 0.2979          |
| 0.1202        | 3.0491 | 17450 | 0.3007          |
| 0.1285        | 3.0578 | 17500 | 0.2983          |
| 0.149         | 3.0666 | 17550 | 0.3007          |
| 0.1369        | 3.0753 | 17600 | 0.2968          |
| 0.1225        | 3.0840 | 17650 | 0.2994          |
| 0.132         | 3.0928 | 17700 | 0.3007          |
| 0.1296        | 3.1015 | 17750 | 0.3006          |
| 0.1207        | 3.1103 | 17800 | 0.3000          |
| 0.1385        | 3.1190 | 17850 | 0.2981          |
| 0.1347        | 3.1277 | 17900 | 0.3000          |
| 0.114         | 3.1365 | 17950 | 0.2994          |
| 0.1233        | 3.1452 | 18000 | 0.2991          |
| 0.1284        | 3.1539 | 18050 | 0.2991          |
| 0.1222        | 3.1627 | 18100 | 0.3005          |
| 0.1367        | 3.1714 | 18150 | 0.2988          |
| 0.1308        | 3.1802 | 18200 | 0.2992          |
| 0.1138        | 3.1889 | 18250 | 0.3001          |
| 0.1259        | 3.1976 | 18300 | 0.2979          |
| 0.1383        | 3.2064 | 18350 | 0.2993          |
| 0.1288        | 3.2151 | 18400 | 0.2989          |
| 0.1364        | 3.2238 | 18450 | 0.2974          |
| 0.1232        | 3.2326 | 18500 | 0.2989          |
| 0.1348        | 3.2413 | 18550 | 0.3012          |
| 0.1168        | 3.2500 | 18600 | 0.2998          |
| 0.1342        | 3.2588 | 18650 | 0.3026          |
| 0.1385        | 3.2675 | 18700 | 0.2979          |
| 0.1298        | 3.2763 | 18750 | 0.2962          |
| 0.1373        | 3.2850 | 18800 | 0.2950          |
| 0.1292        | 3.2937 | 18850 | 0.2986          |
| 0.1329        | 3.3025 | 18900 | 0.2965          |
| 0.1324        | 3.3112 | 18950 | 0.3016          |
| 0.1176        | 3.3199 | 19000 | 0.2991          |
| 0.1444        | 3.3287 | 19050 | 0.2940          |
| 0.1395        | 3.3374 | 19100 | 0.2960          |
| 0.1247        | 3.3461 | 19150 | 0.2975          |
| 0.1313        | 3.3549 | 19200 | 0.2976          |
| 0.1299        | 3.3636 | 19250 | 0.2967          |
| 0.1339        | 3.3724 | 19300 | 0.2969          |
| 0.128         | 3.3811 | 19350 | 0.2949          |
| 0.1296        | 3.3898 | 19400 | 0.2978          |
| 0.1346        | 3.3986 | 19450 | 0.2961          |
| 0.1388        | 3.4073 | 19500 | 0.2960          |
| 0.1236        | 3.4160 | 19550 | 0.2951          |
| 0.1203        | 3.4248 | 19600 | 0.2952          |
| 0.1161        | 3.4335 | 19650 | 0.2977          |
| 0.1158        | 3.4423 | 19700 | 0.2955          |
| 0.1292        | 3.4510 | 19750 | 0.2979          |
| 0.1224        | 3.4597 | 19800 | 0.2976          |
| 0.1241        | 3.4685 | 19850 | 0.2979          |
| 0.1411        | 3.4772 | 19900 | 0.2953          |
| 0.1337        | 3.4859 | 19950 | 0.2966          |
| 0.1298        | 3.4947 | 20000 | 0.2964          |
| 0.1176        | 3.5034 | 20050 | 0.2958          |
| 0.1175        | 3.5121 | 20100 | 0.2966          |
| 0.1409        | 3.5209 | 20150 | 0.2952          |
| 0.1339        | 3.5296 | 20200 | 0.2951          |
| 0.1348        | 3.5384 | 20250 | 0.2956          |
| 0.1281        | 3.5471 | 20300 | 0.2956          |
| 0.1293        | 3.5558 | 20350 | 0.2981          |
| 0.1257        | 3.5646 | 20400 | 0.2969          |
| 0.1152        | 3.5733 | 20450 | 0.2955          |
| 0.1276        | 3.5820 | 20500 | 0.2960          |
| 0.1366        | 3.5908 | 20550 | 0.2977          |
| 0.1364        | 3.5995 | 20600 | 0.2982          |
| 0.134         | 3.6082 | 20650 | 0.2967          |
| 0.1266        | 3.6170 | 20700 | 0.2965          |
| 0.1215        | 3.6257 | 20750 | 0.2970          |
| 0.1253        | 3.6345 | 20800 | 0.2991          |
| 0.116         | 3.6432 | 20850 | 0.2976          |
| 0.1255        | 3.6519 | 20900 | 0.2972          |
| 0.1271        | 3.6607 | 20950 | 0.2969          |
| 0.1155        | 3.6694 | 21000 | 0.2970          |
| 0.1223        | 3.6781 | 21050 | 0.2968          |
| 0.1317        | 3.6869 | 21100 | 0.2956          |
| 0.1257        | 3.6956 | 21150 | 0.2957          |
| 0.1262        | 3.7044 | 21200 | 0.2952          |
| 0.1215        | 3.7131 | 21250 | 0.2957          |
| 0.1285        | 3.7218 | 21300 | 0.2955          |
| 0.1264        | 3.7306 | 21350 | 0.2956          |
| 0.1364        | 3.7393 | 21400 | 0.2967          |
| 0.1213        | 3.7480 | 21450 | 0.2966          |
| 0.1316        | 3.7568 | 21500 | 0.2972          |
| 0.1174        | 3.7655 | 21550 | 0.2991          |
| 0.1167        | 3.7742 | 21600 | 0.2982          |
| 0.1274        | 3.7830 | 21650 | 0.2974          |
| 0.1302        | 3.7917 | 21700 | 0.2960          |
| 0.118         | 3.8005 | 21750 | 0.2958          |
| 0.1264        | 3.8092 | 21800 | 0.2977          |
| 0.1115        | 3.8179 | 21850 | 0.2971          |
| 0.1128        | 3.8267 | 21900 | 0.2973          |
| 0.1186        | 3.8354 | 21950 | 0.2965          |
| 0.1173        | 3.8441 | 22000 | 0.2965          |
| 0.1293        | 3.8529 | 22050 | 0.2963          |
| 0.1226        | 3.8616 | 22100 | 0.2964          |
| 0.1173        | 3.8703 | 22150 | 0.2964          |
| 0.1343        | 3.8791 | 22200 | 0.2966          |
| 0.1365        | 3.8878 | 22250 | 0.2962          |
| 0.1187        | 3.8966 | 22300 | 0.2963          |
| 0.1132        | 3.9053 | 22350 | 0.2963          |
| 0.1328        | 3.9140 | 22400 | 0.2961          |
| 0.1394        | 3.9228 | 22450 | 0.2956          |
| 0.1312        | 3.9315 | 22500 | 0.2959          |
| 0.1256        | 3.9402 | 22550 | 0.2958          |
| 0.1272        | 3.9490 | 22600 | 0.2955          |
| 0.1128        | 3.9577 | 22650 | 0.2954          |
| 0.1193        | 3.9665 | 22700 | 0.2955          |
| 0.1169        | 3.9752 | 22750 | 0.2954          |
| 0.1308        | 3.9839 | 22800 | 0.2954          |
| 0.1185        | 3.9927 | 22850 | 0.2955          |

### Framework versions

- PEFT 0.12.0
- Transformers 4.43.3
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
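Editor's note: the card lists hyperparameters but no training script, so here is a minimal sketch of how such a run is typically wired up with TRL's `SFTTrainer`. The dataset file, its `text` field, and the output directory are placeholders (the training data is not described in this commit); the LoRA settings come from `adapter_config.json` below, and TRL has moved some of these options into `SFTConfig` in newer releases.

```python
# Hedged sketch of an SFT run matching the hyperparameters listed above.
# Assumptions: dataset path/format and the text field name are placeholders.
from datasets import load_dataset
from peft import LoraConfig
from transformers import TrainingArguments
from trl import SFTTrainer

dataset = load_dataset("json", data_files="my_rps_dataset.jsonl", split="train")  # placeholder

args = TrainingArguments(
    output_dir="ECS-Codellama-7b-lora-rps-adapter",
    learning_rate=2e-4,            # card: learning_rate 0.0002
    per_device_train_batch_size=2, # card: train_batch_size 2
    per_device_eval_batch_size=2,  # card: eval_batch_size 2
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.03,
    num_train_epochs=4,
    # Adam betas (0.9, 0.999) and epsilon 1e-08 are the defaults, as on the card.
)

peft_config = LoraConfig(
    r=128,
    lora_alpha=8,
    lora_dropout=0.05,
    use_rslora=True,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj", "gate_proj", "up_proj", "down_proj"],
    task_type="CAUSAL_LM",
)

trainer = SFTTrainer(
    model="codellama/CodeLlama-7b-Instruct-hf",
    args=args,
    train_dataset=dataset,
    peft_config=peft_config,
    dataset_text_field="text",  # placeholder field name
)
trainer.train()
```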
adapter_config.json ADDED
```json
{
  "alpha_pattern": {},
  "auto_mapping": null,
  "base_model_name_or_path": "codellama/CodeLlama-7b-Instruct-hf",
  "bias": "none",
  "fan_in_fan_out": false,
  "inference_mode": true,
  "init_lora_weights": true,
  "layer_replication": null,
  "layers_pattern": null,
  "layers_to_transform": null,
  "loftq_config": {},
  "lora_alpha": 8,
  "lora_dropout": 0.05,
  "megatron_config": null,
  "megatron_core": "megatron.core",
  "modules_to_save": null,
  "peft_type": "LORA",
  "r": 128,
  "rank_pattern": {},
  "revision": null,
  "target_modules": [
    "down_proj",
    "v_proj",
    "k_proj",
    "o_proj",
    "up_proj",
    "gate_proj",
    "q_proj"
  ],
  "task_type": "CAUSAL_LM",
  "use_dora": false,
  "use_rslora": true
}
```
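Note the pairing of a high rank (`r: 128`) with a small `lora_alpha: 8` and `use_rslora: true`: rank-stabilized LoRA scales the adapter update by `lora_alpha / sqrt(r)` instead of the classic `lora_alpha / r`, which keeps the effective scale from collapsing at high ranks. A quick check of what that means for this config:

```python
# Effective LoRA scaling for this adapter (alpha=8, r=128).
import math

alpha, r = 8, 128
print(alpha / r)             # 0.0625  -> classic LoRA scaling
print(alpha / math.sqrt(r))  # ~0.7071 -> rsLoRA scaling, used here since use_rslora=true
```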
adapter_model.safetensors ADDED
```
version https://git-lfs.github.com/spec/v1
oid sha256:2712c5b37aaa5d7e34ec99b45db14dd31473a2dc10e4dfbea3c18dba19b557a4
size 2332095256
```
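This LFS pointer holds the adapter weights themselves. A minimal loading sketch; the repo id `SimonMA/ECS-Codellama-7b-lora-rps-adapter` is an assumption inferred from the committer and model name and may differ:

```python
# Sketch: stacking this adapter on top of the base model for inference.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

ADAPTER = "SimonMA/ECS-Codellama-7b-lora-rps-adapter"  # assumed repo id

base = AutoModelForCausalLM.from_pretrained(
    "codellama/CodeLlama-7b-Instruct-hf",
    torch_dtype=torch.float16,
    device_map="auto",
)
model = PeftModel.from_pretrained(base, ADAPTER)
tokenizer = AutoTokenizer.from_pretrained(ADAPTER)  # picks up the <PAD> token below

# Optionally fold the LoRA deltas into the base weights for faster inference:
model = model.merge_and_unload()
```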
added_tokens.json ADDED
```json
{
  "<PAD>": 32016
}
```
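`added_tokens.json` records a single `<PAD>` token appended at id 32016; CodeLlama ships without a dedicated padding token, which batched SFT needs. The commit does not include the preprocessing script, but a conventional way to produce a file like this is:

```python
# Hypothetical preprocessing step that yields an added_tokens.json like the one above.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")
model = AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")

tokenizer.add_special_tokens({"pad_token": "<PAD>"})  # new token takes the next free id
model.resize_token_embeddings(len(tokenizer))         # grow embeddings to cover it
model.config.pad_token_id = tokenizer.pad_token_id
```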
runs/Aug04_13-36-40_0d6c6470f9b3/events.out.tfevents.1722778612.0d6c6470f9b3.3512.0 ADDED
```
version https://git-lfs.github.com/spec/v1
oid sha256:b5767ded8cd6ecef1437227db09fc3f9a3f15337f98c3d4c5b4f728eb8f029f1
size 146631
```
special_tokens_map.json ADDED
```json
{
  "additional_special_tokens": [
    "▁<PRE>",
    "▁<MID>",
    "▁<SUF>",
    "▁<EOT>"
  ],
  "bos_token": {
    "content": "<s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "eos_token": {
    "content": "</s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<PAD>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "<unk>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
```
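The `▁<PRE>`/`▁<MID>`/`▁<SUF>`/`▁<EOT>` entries are CodeLlama's infilling markers; `CodeLlamaTokenizer` inserts them automatically when a prompt contains the fill token (`<FILL_ME>`, see `tokenizer_config.json` below). A sketch of that mechanism against the base model; whether this adapter preserves infilling ability after instruction-style SFT is an open assumption, not something tested here:

```python
# Infilling sketch: <FILL_ME> is expanded by the tokenizer into the
# ▁<PRE> prefix ▁<SUF> suffix ▁<MID> layout built from the tokens above.
from transformers import AutoModelForCausalLM, CodeLlamaTokenizer

tokenizer = CodeLlamaTokenizer.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")
model = AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")

prompt = "def add(a: int, b: int) -> int:\n    <FILL_ME>\n"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
out = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(out[0, input_ids.shape[1]:], skip_special_tokens=True))
```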
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer.model ADDED
```
version https://git-lfs.github.com/spec/v1
oid sha256:45ccb9c8b6b561889acea59191d66986d314e7cbd6a78abc6e49b139ca91c1e6
size 500058
```
tokenizer_config.json ADDED
```json
{
  "add_bos_token": true,
  "add_eos_token": false,
  "added_tokens_decoder": {
    "0": {
      "content": "<unk>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "1": {
      "content": "<s>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "2": {
      "content": "</s>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "32007": {
      "content": "▁<PRE>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "32008": {
      "content": "▁<SUF>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "32009": {
      "content": "▁<MID>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "32010": {
      "content": "▁<EOT>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "32016": {
      "content": "<PAD>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    }
  },
  "additional_special_tokens": [
    "▁<PRE>",
    "▁<MID>",
    "▁<SUF>",
    "▁<EOT>"
  ],
  "bos_token": "<s>",
  "chat_template": "{% if messages[0]['role'] == 'system' %}{% set loop_messages = messages[1:] %}{% set system_message = messages[0]['content'] %}{% else %}{% set loop_messages = messages %}{% set system_message = false %}{% endif %}{% for message in loop_messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% if loop.index0 == 0 and system_message != false %}{% set content = '<<SYS>>\\n' + system_message + '\\n<</SYS>>\\n\\n' + message['content'] %}{% else %}{% set content = message['content'] %}{% endif %}{% if message['role'] == 'user' %}{{ bos_token + '[INST] ' + content | trim + ' [/INST]' }}{% elif message['role'] == 'assistant' %}{{ ' ' + content | trim + ' ' + eos_token }}{% endif %}{% endfor %}",
  "clean_up_tokenization_spaces": false,
  "eos_token": "</s>",
  "eot_token": "▁<EOT>",
  "fill_token": "<FILL_ME>",
  "legacy": null,
  "middle_token": "▁<MID>",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "<PAD>",
  "prefix_token": "▁<PRE>",
  "sp_model_kwargs": {},
  "suffix_token": "▁<SUF>",
  "tokenizer_class": "CodeLlamaTokenizer",
  "unk_token": "<unk>",
  "use_default_system_prompt": false
}
```
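The embedded `chat_template` is the standard Llama-2 `[INST]`/`<<SYS>>` format, so prompts for this adapter can be built with `apply_chat_template` rather than by hand. A small usage sketch (the tokenizer repo id is again an assumed placeholder):

```python
# Rendering a conversation through the chat template above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("SimonMA/ECS-Codellama-7b-lora-rps-adapter")  # assumed repo id

messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a function that reverses a string."},
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False)
print(prompt)
# <s>[INST] <<SYS>>
# You are a helpful coding assistant.
# <</SYS>>
#
# Write a function that reverses a string. [/INST]
```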
training_args.bin ADDED
```
version https://git-lfs.github.com/spec/v1
oid sha256:e7c904905085e0075baf404f3edf4d53d0a090289d4108a987eed95d4b997698
size 5496
```