Elfrino committed on
Commit
3e599a6
1 Parent(s): d8e465e

Upload folder using huggingface_hub

README.md ADDED
@@ -0,0 +1,406 @@
+ ---
+ base_model:
+ - Undi95/PsyMedRP-v1-20B
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+
+ ---
+ # merge
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the passthrough merge method.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [Undi95/PsyMedRP-v1-20B](https://huggingface.co/Undi95/PsyMedRP-v1-20B)
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ # The amount to attenuate the Q and K matrices of the *FIRST COPY* of each layer.
+ # NOTE: This scales the score matrix values by QK_ATTENUATION_FACTOR^2 (e.g. sqrt(1/2)^2 = 1/2).
+ const_tag: &QK_ATTENUATION_FACTOR 0.84 # 0.84^2 ≈ sqrt(1/2) <- changed; v4 used sqrt(1/2) ≈ 0.7071067812
+
+ # The amount to scale the contribution to the residual stream (to hopefully reduce overshoot).
+ const_tag: &RESIDUAL_SCALE_FACTOR 0.71 # ≈ sqrt(1/2) <- changed; v4 used 0.7071067812
+
+ # Make the first copy *ONLY* take a more "bird's eye view" (i.e. pay attention to more of the context).
+ model1-filter-env: &MODEL1_FILTER_ENV
+   parameters:
+     scale:
+       - filter: q_proj
+         value: *QK_ATTENUATION_FACTOR
+       - filter: k_proj
+         value: *QK_ATTENUATION_FACTOR
+       - filter: down_proj
+         value: *RESIDUAL_SCALE_FACTOR
+       - value: 1.0
+
+ # Make the second copy pay attention to the context as before.
+ model2-filter-env: &MODEL2_FILTER_ENV
+   parameters:
+     scale:
+       - filter: down_proj
+         value: *RESIDUAL_SCALE_FACTOR
+       - value: 1.0
+
+ slices:
+
+   # The first 10 layers are not duplicated.
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [0, 10]
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [10, 11]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [10, 11]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [11, 12]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [11, 12]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [12, 13]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [12, 13]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [13, 14]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [13, 14]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [14, 15]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [14, 15]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [15, 16]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [15, 16]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [16, 17]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [16, 17]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [17, 18]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [17, 18]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [18, 19]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [18, 19]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [19, 20]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [19, 20]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [20, 21]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [20, 21]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [21, 22]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [21, 22]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [22, 23]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [22, 23]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [23, 24]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [23, 24]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [24, 25]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [24, 25]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [25, 26]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [25, 26]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [26, 27]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [26, 27]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [27, 28]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [27, 28]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [28, 29]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [28, 29]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [29, 30]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [29, 30]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [30, 31]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [30, 31]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [31, 32]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [31, 32]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [32, 33]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [32, 33]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [33, 34]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [33, 34]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [34, 35]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [34, 35]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [35, 36]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [35, 36]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [36, 37]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [36, 37]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [37, 38]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [37, 38]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [38, 39]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [38, 39]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [39, 40]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [39, 40]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [40, 41]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [40, 41]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [41, 42]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [41, 42]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [42, 43]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [42, 43]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [43, 44]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [43, 44]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [44, 45]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [44, 45]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [45, 46]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [45, 46]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [46, 47]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [46, 47]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [47, 48]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [47, 48]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [48, 49]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [48, 49]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [49, 50]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [49, 50]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [50, 51]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [50, 51]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [51, 52]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [51, 52]
+         <<: *MODEL2_FILTER_ENV
+
+   # The last 10 layers are not duplicated.
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [52, 62]
+
+ merge_method: passthrough
+ dtype: float16
+ ```
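Why the NOTE in the YAML talks about the *square* of the attenuation factor: scaling both the `q_proj` and `k_proj` weights by a factor s scales every pre-softmax attention logit by s², which acts like a temperature on the attention weights. A sketch of the arithmetic for standard scaled dot-product attention (a general identity, not something stated on this card):

```latex
\operatorname{softmax}\!\left(\frac{(sQ)(sK)^{\top}}{\sqrt{d_k}}\right)
  = \operatorname{softmax}\!\left(s^{2}\,\frac{QK^{\top}}{\sqrt{d_k}}\right),
\qquad s = 0.84 \;\Rightarrow\; s^{2} \approx 0.706 \approx \sqrt{1/2}
```

Multiplying the logits by roughly sqrt(1/2) flattens the softmax (an effective temperature of about 1.42), which is presumably the "bird's eye view" the comments describe; the second copy keeps s = 1 and attends as before.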
config.json ADDED
@@ -0,0 +1,29 @@
+ {
+   "_name_or_path": "Undi95/PsyMedRP-v1-20B",
+   "architectures": [
+     "LlamaForCausalLM"
+   ],
+   "attention_bias": false,
+   "attention_dropout": 0.0,
+   "bos_token_id": 1,
+   "eos_token_id": 2,
+   "hidden_act": "silu",
+   "hidden_size": 5120,
+   "initializer_range": 0.02,
+   "intermediate_size": 13824,
+   "max_position_embeddings": 4096,
+   "mlp_bias": false,
+   "model_type": "llama",
+   "num_attention_heads": 40,
+   "num_hidden_layers": 104,
+   "num_key_value_heads": 40,
+   "pretraining_tp": 1,
+   "rms_norm_eps": 1e-05,
+   "rope_scaling": null,
+   "rope_theta": 10000.0,
+   "tie_word_embeddings": false,
+   "torch_dtype": "float16",
+   "transformers_version": "4.44.1",
+   "use_cache": false,
+   "vocab_size": 32000
+ }
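The `num_hidden_layers` value follows directly from the slice plan above: 10 unduplicated head layers, the 42 layers in [10, 52) emitted twice, and 10 unduplicated tail layers. A quick check of that arithmetic (illustrative Python, not part of the repo):

```python
# Layer count implied by the mergekit slice plan: layers [0, 10) and [52, 62)
# appear once; every layer in [10, 52) appears twice (MODEL1 + MODEL2 copies).
base_layers = 62          # depth of Undi95/PsyMedRP-v1-20B implied by the slices
dup_start, dup_end = 10, 52
merged = dup_start + 2 * (dup_end - dup_start) + (base_layers - dup_end)
assert merged == 104      # matches "num_hidden_layers": 104 above
print(merged)
```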
mergekit_config.yml ADDED
@@ -0,0 +1,377 @@
+ # The amount to attenuate the Q and K matrices of the *FIRST COPY* of each layer.
+ # NOTE: This scales the score matrix values by QK_ATTENUATION_FACTOR^2 (e.g. sqrt(1/2)^2 = 1/2).
+ const_tag: &QK_ATTENUATION_FACTOR 0.84 # 0.84^2 ≈ sqrt(1/2) <- changed; v4 used sqrt(1/2) ≈ 0.7071067812
+
+ # The amount to scale the contribution to the residual stream (to hopefully reduce overshoot).
+ const_tag: &RESIDUAL_SCALE_FACTOR 0.71 # ≈ sqrt(1/2) <- changed; v4 used 0.7071067812
+
+ # Make the first copy *ONLY* take a more "bird's eye view" (i.e. pay attention to more of the context).
+ model1-filter-env: &MODEL1_FILTER_ENV
+   parameters:
+     scale:
+       - filter: q_proj
+         value: *QK_ATTENUATION_FACTOR
+       - filter: k_proj
+         value: *QK_ATTENUATION_FACTOR
+       - filter: down_proj
+         value: *RESIDUAL_SCALE_FACTOR
+       - value: 1.0
+
+ # Make the second copy pay attention to the context as before.
+ model2-filter-env: &MODEL2_FILTER_ENV
+   parameters:
+     scale:
+       - filter: down_proj
+         value: *RESIDUAL_SCALE_FACTOR
+       - value: 1.0
+
+ slices:
+
+   # The first 10 layers are not duplicated.
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [0, 10]
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [10, 11]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [10, 11]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [11, 12]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [11, 12]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [12, 13]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [12, 13]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [13, 14]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [13, 14]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [14, 15]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [14, 15]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [15, 16]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [15, 16]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [16, 17]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [16, 17]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [17, 18]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [17, 18]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [18, 19]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [18, 19]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [19, 20]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [19, 20]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [20, 21]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [20, 21]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [21, 22]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [21, 22]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [22, 23]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [22, 23]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [23, 24]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [23, 24]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [24, 25]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [24, 25]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [25, 26]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [25, 26]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [26, 27]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [26, 27]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [27, 28]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [27, 28]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [28, 29]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [28, 29]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [29, 30]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [29, 30]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [30, 31]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [30, 31]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [31, 32]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [31, 32]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [32, 33]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [32, 33]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [33, 34]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [33, 34]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [34, 35]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [34, 35]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [35, 36]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [35, 36]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [36, 37]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [36, 37]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [37, 38]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [37, 38]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [38, 39]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [38, 39]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [39, 40]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [39, 40]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [40, 41]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [40, 41]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [41, 42]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [41, 42]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [42, 43]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [42, 43]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [43, 44]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [43, 44]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [44, 45]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [44, 45]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [45, 46]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [45, 46]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [46, 47]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [46, 47]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [47, 48]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [47, 48]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [48, 49]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [48, 49]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [49, 50]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [49, 50]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [50, 51]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [50, 51]
+         <<: *MODEL2_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [51, 52]
+         <<: *MODEL1_FILTER_ENV
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [51, 52]
+         <<: *MODEL2_FILTER_ENV
+
+   # The last 10 layers are not duplicated.
+   - sources:
+       - model: Undi95/PsyMedRP-v1-20B
+         layer_range: [52, 62]
+
+ merge_method: passthrough
+ dtype: float16
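This file is the exact configuration mergekit consumed, so re-running the merge should just be a matter of pointing mergekit's CLI at it (e.g. `mergekit-yaml mergekit_config.yml ./merged`). The resulting ~66 GB float16 checkpoint loads like any sharded Llama model; a minimal loading sketch, assuming a hypothetical local output path:

```python
# Minimal loading sketch; "./merged" is a hypothetical local path
# (the mergekit-yaml output directory or a downloaded snapshot).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

path = "./merged"
tokenizer = AutoTokenizer.from_pretrained(path)
model = AutoModelForCausalLM.from_pretrained(
    path,
    torch_dtype=torch.float16,  # matches "dtype: float16" in the config
    device_map="auto",          # ~66 GB of fp16 weights; shard across devices
)
```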
model-00001-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dee13225e373ff1db366577c54f4dbb5426150837bc85831917a040eeeaf9be5
+ size 4886514544
model-00002-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6c7670f687215ada19da28aa78a963e6fd91520b2e41e592ee2e2fe55952b8ce
+ size 4933722144
model-00003-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:54c36761260c7fd4d55f9a1d3a0afde827a0b551834cb5f3f54114adedab4cd6
+ size 4933711792
model-00004-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7ebd058853626a98bde64177f665842a1a39fdbac4ccaa4cc48a838da44c3566
+ size 4865553992
model-00005-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b693a2a6f2f6b078dbb5c98d37874084b7aa1ad4e5fb130639a7c81ec2ef7cce
+ size 4933722144
model-00006-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2160d53c0b5f478f29041b7117dcff45ed428ad728f588ab4acebc8555ca4509
+ size 4970422184
model-00007-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b89f7909612370f2d8db95581ff16a65b17290272bd3a47087d3454572f69c2c
+ size 4970422184
model-00008-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c61c55203470fb932d8f1ea76fd0f331c5b6a56076feec2d90e94f84c80545fb
+ size 4933701424
model-00009-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6ab72d900ca214206334e6827cbc27315eefcb820373073fe48ea200ba9d98e9
+ size 4949451008
model-00010-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:eb1e8a1f67049c0dd49d8f1b4780977bc086cd4f34d8b0e0233f964443bf1712
+ size 4970422184
model-00011-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:08706217c6bb98f8f6f77682ec3652a42e5342d0d941186b531134a9b21c6b6a
+ size 4970422184
model-00012-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:39ba9dad1e7a6f374bb2e71c78ffa7860c1e890be27409ef36f4272f855f7f5e
+ size 4986140688
model-00013-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:231bb6335a4a10ca0d070d6d55e22789f377a5a90e99bbc20cfea4a4f59c6905
+ size 4933722160
model-00014-of-00014.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2cab3ab482c23ff061d7cb1043c3cc6a5163d90a8f52a8625f12a4f43fad949b
+ size 2396082104
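Each of the fourteen `.safetensors` entries above is a Git LFS pointer rather than the weights themselves: three plain-text lines giving the spec version, the sha256 object id of the real payload, and its size in bytes. A small parser to illustrate the format (hypothetical helper, not part of the repo):

```python
def parse_lfs_pointer(text: str) -> dict:
    # A Git LFS pointer is "key value" lines: version, oid sha256:<hex>, size <bytes>.
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    return {
        "version": fields["version"],
        "sha256": fields["oid"].split(":", 1)[1],
        "size_bytes": int(fields["size"]),
    }

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:dee13225e373ff1db366577c54f4dbb5426150837bc85831917a040eeeaf9be5
size 4886514544"""
print(parse_lfs_pointer(pointer)["size_bytes"])  # 4886514544 (~4.9 GB shard)
```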
model.safetensors.index.json ADDED
@@ -0,0 +1 @@
+ {"metadata": {"mergekit_version": "0.0.4.4", "total_size": 66633902080}, "weight_map": {"lm_head.weight": "model-00001-of-00014.safetensors", "model.embed_tokens.weight": "model-00001-of-00014.safetensors", "model.layers.0.input_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.0.mlp.down_proj.weight": "model-00001-of-00014.safetensors", "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00014.safetensors", "model.layers.0.mlp.up_proj.weight": "model-00001-of-00014.safetensors", "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00014.safetensors", "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00014.safetensors", "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00014.safetensors", "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00014.safetensors", "model.layers.1.input_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.1.mlp.down_proj.weight": "model-00001-of-00014.safetensors", "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00014.safetensors", "model.layers.1.mlp.up_proj.weight": "model-00001-of-00014.safetensors", "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00014.safetensors", "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00014.safetensors", "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00014.safetensors", "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00014.safetensors", "model.layers.11.input_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.10.input_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.11.mlp.down_proj.weight": "model-00001-of-00014.safetensors", "model.layers.10.mlp.down_proj.weight": "model-00001-of-00014.safetensors", "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00014.safetensors", "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00014.safetensors", "model.layers.11.mlp.up_proj.weight": "model-00001-of-00014.safetensors", "model.layers.10.mlp.up_proj.weight": "model-00001-of-00014.safetensors", "model.layers.11.post_attention_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00014.safetensors", "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00014.safetensors", "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00014.safetensors", "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00014.safetensors", "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00014.safetensors", "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00014.safetensors", "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00014.safetensors", "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00014.safetensors", "model.layers.13.input_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.12.input_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.13.mlp.down_proj.weight": "model-00001-of-00014.safetensors", "model.layers.12.mlp.down_proj.weight": "model-00001-of-00014.safetensors", "model.layers.13.mlp.gate_proj.weight": "model-00001-of-00014.safetensors", "model.layers.12.mlp.gate_proj.weight": "model-00001-of-00014.safetensors", "model.layers.13.mlp.up_proj.weight": "model-00001-of-00014.safetensors", 
"model.layers.12.mlp.up_proj.weight": "model-00001-of-00014.safetensors", "model.layers.13.post_attention_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.12.post_attention_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.13.self_attn.k_proj.weight": "model-00001-of-00014.safetensors", "model.layers.12.self_attn.k_proj.weight": "model-00001-of-00014.safetensors", "model.layers.13.self_attn.o_proj.weight": "model-00001-of-00014.safetensors", "model.layers.12.self_attn.o_proj.weight": "model-00001-of-00014.safetensors", "model.layers.13.self_attn.q_proj.weight": "model-00001-of-00014.safetensors", "model.layers.12.self_attn.q_proj.weight": "model-00001-of-00014.safetensors", "model.layers.13.self_attn.v_proj.weight": "model-00001-of-00014.safetensors", "model.layers.12.self_attn.v_proj.weight": "model-00001-of-00014.safetensors", "model.layers.15.input_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.14.input_layernorm.weight": "model-00001-of-00014.safetensors", "model.layers.15.mlp.down_proj.weight": "model-00001-of-00014.safetensors", "model.layers.14.mlp.down_proj.weight": "model-00001-of-00014.safetensors", "model.layers.15.mlp.gate_proj.weight": "model-00001-of-00014.safetensors", "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00014.safetensors", "model.layers.15.mlp.up_proj.weight": "model-00002-of-00014.safetensors", "model.layers.14.mlp.up_proj.weight": "model-00002-of-00014.safetensors", "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00014.safetensors", "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00014.safetensors", "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00014.safetensors", "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00014.safetensors", "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00014.safetensors", "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00014.safetensors", "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00014.safetensors", "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00014.safetensors", "model.layers.17.input_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.16.input_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.17.mlp.down_proj.weight": "model-00002-of-00014.safetensors", "model.layers.16.mlp.down_proj.weight": "model-00002-of-00014.safetensors", "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00014.safetensors", "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00014.safetensors", "model.layers.17.mlp.up_proj.weight": "model-00002-of-00014.safetensors", "model.layers.16.mlp.up_proj.weight": "model-00002-of-00014.safetensors", "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00014.safetensors", "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00014.safetensors", "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00014.safetensors", "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00014.safetensors", "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00014.safetensors", "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00014.safetensors", 
"model.layers.17.self_attn.v_proj.weight": "model-00002-of-00014.safetensors", "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00014.safetensors", "model.layers.19.input_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.18.input_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.19.mlp.down_proj.weight": "model-00002-of-00014.safetensors", "model.layers.18.mlp.down_proj.weight": "model-00002-of-00014.safetensors", "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00014.safetensors", "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00014.safetensors", "model.layers.19.mlp.up_proj.weight": "model-00002-of-00014.safetensors", "model.layers.18.mlp.up_proj.weight": "model-00002-of-00014.safetensors", "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00014.safetensors", "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00014.safetensors", "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00014.safetensors", "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00014.safetensors", "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00014.safetensors", "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00014.safetensors", "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00014.safetensors", "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00014.safetensors", "model.layers.21.input_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.20.input_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.21.mlp.down_proj.weight": "model-00002-of-00014.safetensors", "model.layers.20.mlp.down_proj.weight": "model-00002-of-00014.safetensors", "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00014.safetensors", "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00014.safetensors", "model.layers.21.mlp.up_proj.weight": "model-00002-of-00014.safetensors", "model.layers.20.mlp.up_proj.weight": "model-00002-of-00014.safetensors", "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00014.safetensors", "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00014.safetensors", "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00014.safetensors", "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00014.safetensors", "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00014.safetensors", "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00014.safetensors", "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00014.safetensors", "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00014.safetensors", "model.layers.23.input_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.22.input_layernorm.weight": "model-00002-of-00014.safetensors", "model.layers.23.mlp.down_proj.weight": "model-00002-of-00014.safetensors", "model.layers.22.mlp.down_proj.weight": "model-00002-of-00014.safetensors", "model.layers.23.mlp.gate_proj.weight": "model-00003-of-00014.safetensors", "model.layers.22.mlp.gate_proj.weight": "model-00003-of-00014.safetensors", "model.layers.23.mlp.up_proj.weight": "model-00003-of-00014.safetensors", 
"model.layers.22.mlp.up_proj.weight": "model-00003-of-00014.safetensors", "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.22.post_attention_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.23.self_attn.k_proj.weight": "model-00003-of-00014.safetensors", "model.layers.22.self_attn.k_proj.weight": "model-00003-of-00014.safetensors", "model.layers.23.self_attn.o_proj.weight": "model-00003-of-00014.safetensors", "model.layers.22.self_attn.o_proj.weight": "model-00003-of-00014.safetensors", "model.layers.23.self_attn.q_proj.weight": "model-00003-of-00014.safetensors", "model.layers.22.self_attn.q_proj.weight": "model-00003-of-00014.safetensors", "model.layers.23.self_attn.v_proj.weight": "model-00003-of-00014.safetensors", "model.layers.22.self_attn.v_proj.weight": "model-00003-of-00014.safetensors", "model.layers.25.input_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.24.input_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.25.mlp.down_proj.weight": "model-00003-of-00014.safetensors", "model.layers.24.mlp.down_proj.weight": "model-00003-of-00014.safetensors", "model.layers.25.mlp.gate_proj.weight": "model-00003-of-00014.safetensors", "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00014.safetensors", "model.layers.25.mlp.up_proj.weight": "model-00003-of-00014.safetensors", "model.layers.24.mlp.up_proj.weight": "model-00003-of-00014.safetensors", "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.25.self_attn.k_proj.weight": "model-00003-of-00014.safetensors", "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00014.safetensors", "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00014.safetensors", "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00014.safetensors", "model.layers.25.self_attn.q_proj.weight": "model-00003-of-00014.safetensors", "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00014.safetensors", "model.layers.25.self_attn.v_proj.weight": "model-00003-of-00014.safetensors", "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00014.safetensors", "model.layers.27.input_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.26.input_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.27.mlp.down_proj.weight": "model-00003-of-00014.safetensors", "model.layers.26.mlp.down_proj.weight": "model-00003-of-00014.safetensors", "model.layers.27.mlp.gate_proj.weight": "model-00003-of-00014.safetensors", "model.layers.26.mlp.gate_proj.weight": "model-00003-of-00014.safetensors", "model.layers.27.mlp.up_proj.weight": "model-00003-of-00014.safetensors", "model.layers.26.mlp.up_proj.weight": "model-00003-of-00014.safetensors", "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.27.self_attn.k_proj.weight": "model-00003-of-00014.safetensors", "model.layers.26.self_attn.k_proj.weight": "model-00003-of-00014.safetensors", "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00014.safetensors", "model.layers.26.self_attn.o_proj.weight": "model-00003-of-00014.safetensors", "model.layers.27.self_attn.q_proj.weight": "model-00003-of-00014.safetensors", "model.layers.26.self_attn.q_proj.weight": "model-00003-of-00014.safetensors", 
"model.layers.27.self_attn.v_proj.weight": "model-00003-of-00014.safetensors", "model.layers.26.self_attn.v_proj.weight": "model-00003-of-00014.safetensors", "model.layers.29.input_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.28.input_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.29.mlp.down_proj.weight": "model-00003-of-00014.safetensors", "model.layers.28.mlp.down_proj.weight": "model-00003-of-00014.safetensors", "model.layers.29.mlp.gate_proj.weight": "model-00003-of-00014.safetensors", "model.layers.28.mlp.gate_proj.weight": "model-00003-of-00014.safetensors", "model.layers.29.mlp.up_proj.weight": "model-00003-of-00014.safetensors", "model.layers.28.mlp.up_proj.weight": "model-00003-of-00014.safetensors", "model.layers.29.post_attention_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.29.self_attn.k_proj.weight": "model-00003-of-00014.safetensors", "model.layers.28.self_attn.k_proj.weight": "model-00003-of-00014.safetensors", "model.layers.29.self_attn.o_proj.weight": "model-00003-of-00014.safetensors", "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00014.safetensors", "model.layers.29.self_attn.q_proj.weight": "model-00003-of-00014.safetensors", "model.layers.28.self_attn.q_proj.weight": "model-00003-of-00014.safetensors", "model.layers.29.self_attn.v_proj.weight": "model-00003-of-00014.safetensors", "model.layers.28.self_attn.v_proj.weight": "model-00003-of-00014.safetensors", "model.layers.2.input_layernorm.weight": "model-00003-of-00014.safetensors", "model.layers.2.mlp.down_proj.weight": "model-00003-of-00014.safetensors", "model.layers.2.mlp.gate_proj.weight": "model-00004-of-00014.safetensors", "model.layers.2.mlp.up_proj.weight": "model-00004-of-00014.safetensors", "model.layers.2.post_attention_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.2.self_attn.k_proj.weight": "model-00004-of-00014.safetensors", "model.layers.2.self_attn.o_proj.weight": "model-00004-of-00014.safetensors", "model.layers.2.self_attn.q_proj.weight": "model-00004-of-00014.safetensors", "model.layers.2.self_attn.v_proj.weight": "model-00004-of-00014.safetensors", "model.layers.31.input_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.30.input_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.31.mlp.down_proj.weight": "model-00004-of-00014.safetensors", "model.layers.30.mlp.down_proj.weight": "model-00004-of-00014.safetensors", "model.layers.31.mlp.gate_proj.weight": "model-00004-of-00014.safetensors", "model.layers.30.mlp.gate_proj.weight": "model-00004-of-00014.safetensors", "model.layers.31.mlp.up_proj.weight": "model-00004-of-00014.safetensors", "model.layers.30.mlp.up_proj.weight": "model-00004-of-00014.safetensors", "model.layers.31.post_attention_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.30.post_attention_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.31.self_attn.k_proj.weight": "model-00004-of-00014.safetensors", "model.layers.30.self_attn.k_proj.weight": "model-00004-of-00014.safetensors", "model.layers.31.self_attn.o_proj.weight": "model-00004-of-00014.safetensors", "model.layers.30.self_attn.o_proj.weight": "model-00004-of-00014.safetensors", "model.layers.31.self_attn.q_proj.weight": "model-00004-of-00014.safetensors", "model.layers.30.self_attn.q_proj.weight": "model-00004-of-00014.safetensors", 
"model.layers.31.self_attn.v_proj.weight": "model-00004-of-00014.safetensors", "model.layers.30.self_attn.v_proj.weight": "model-00004-of-00014.safetensors", "model.layers.33.input_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.32.input_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.33.mlp.down_proj.weight": "model-00004-of-00014.safetensors", "model.layers.32.mlp.down_proj.weight": "model-00004-of-00014.safetensors", "model.layers.33.mlp.gate_proj.weight": "model-00004-of-00014.safetensors", "model.layers.32.mlp.gate_proj.weight": "model-00004-of-00014.safetensors", "model.layers.33.mlp.up_proj.weight": "model-00004-of-00014.safetensors", "model.layers.32.mlp.up_proj.weight": "model-00004-of-00014.safetensors", "model.layers.33.post_attention_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.32.post_attention_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.33.self_attn.k_proj.weight": "model-00004-of-00014.safetensors", "model.layers.32.self_attn.k_proj.weight": "model-00004-of-00014.safetensors", "model.layers.33.self_attn.o_proj.weight": "model-00004-of-00014.safetensors", "model.layers.32.self_attn.o_proj.weight": "model-00004-of-00014.safetensors", "model.layers.33.self_attn.q_proj.weight": "model-00004-of-00014.safetensors", "model.layers.32.self_attn.q_proj.weight": "model-00004-of-00014.safetensors", "model.layers.33.self_attn.v_proj.weight": "model-00004-of-00014.safetensors", "model.layers.32.self_attn.v_proj.weight": "model-00004-of-00014.safetensors", "model.layers.35.input_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.34.input_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.35.mlp.down_proj.weight": "model-00004-of-00014.safetensors", "model.layers.34.mlp.down_proj.weight": "model-00004-of-00014.safetensors", "model.layers.35.mlp.gate_proj.weight": "model-00004-of-00014.safetensors", "model.layers.34.mlp.gate_proj.weight": "model-00004-of-00014.safetensors", "model.layers.35.mlp.up_proj.weight": "model-00004-of-00014.safetensors", "model.layers.34.mlp.up_proj.weight": "model-00004-of-00014.safetensors", "model.layers.35.post_attention_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.34.post_attention_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.35.self_attn.k_proj.weight": "model-00004-of-00014.safetensors", "model.layers.34.self_attn.k_proj.weight": "model-00004-of-00014.safetensors", "model.layers.35.self_attn.o_proj.weight": "model-00004-of-00014.safetensors", "model.layers.34.self_attn.o_proj.weight": "model-00004-of-00014.safetensors", "model.layers.35.self_attn.q_proj.weight": "model-00004-of-00014.safetensors", "model.layers.34.self_attn.q_proj.weight": "model-00004-of-00014.safetensors", "model.layers.35.self_attn.v_proj.weight": "model-00004-of-00014.safetensors", "model.layers.34.self_attn.v_proj.weight": "model-00004-of-00014.safetensors", "model.layers.37.input_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.36.input_layernorm.weight": "model-00004-of-00014.safetensors", "model.layers.37.mlp.down_proj.weight": "model-00004-of-00014.safetensors", "model.layers.36.mlp.down_proj.weight": "model-00004-of-00014.safetensors", "model.layers.37.mlp.gate_proj.weight": "model-00004-of-00014.safetensors", "model.layers.36.mlp.gate_proj.weight": "model-00004-of-00014.safetensors", "model.layers.37.mlp.up_proj.weight": "model-00005-of-00014.safetensors", 
"model.layers.36.mlp.up_proj.weight": "model-00005-of-00014.safetensors", "model.layers.37.post_attention_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.36.post_attention_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.37.self_attn.k_proj.weight": "model-00005-of-00014.safetensors", "model.layers.36.self_attn.k_proj.weight": "model-00005-of-00014.safetensors", "model.layers.37.self_attn.o_proj.weight": "model-00005-of-00014.safetensors", "model.layers.36.self_attn.o_proj.weight": "model-00005-of-00014.safetensors", "model.layers.37.self_attn.q_proj.weight": "model-00005-of-00014.safetensors", "model.layers.36.self_attn.q_proj.weight": "model-00005-of-00014.safetensors", "model.layers.37.self_attn.v_proj.weight": "model-00005-of-00014.safetensors", "model.layers.36.self_attn.v_proj.weight": "model-00005-of-00014.safetensors", "model.layers.39.input_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.38.input_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.39.mlp.down_proj.weight": "model-00005-of-00014.safetensors", "model.layers.38.mlp.down_proj.weight": "model-00005-of-00014.safetensors", "model.layers.39.mlp.gate_proj.weight": "model-00005-of-00014.safetensors", "model.layers.38.mlp.gate_proj.weight": "model-00005-of-00014.safetensors", "model.layers.39.mlp.up_proj.weight": "model-00005-of-00014.safetensors", "model.layers.38.mlp.up_proj.weight": "model-00005-of-00014.safetensors", "model.layers.39.post_attention_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.38.post_attention_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.39.self_attn.k_proj.weight": "model-00005-of-00014.safetensors", "model.layers.38.self_attn.k_proj.weight": "model-00005-of-00014.safetensors", "model.layers.39.self_attn.o_proj.weight": "model-00005-of-00014.safetensors", "model.layers.38.self_attn.o_proj.weight": "model-00005-of-00014.safetensors", "model.layers.39.self_attn.q_proj.weight": "model-00005-of-00014.safetensors", "model.layers.38.self_attn.q_proj.weight": "model-00005-of-00014.safetensors", "model.layers.39.self_attn.v_proj.weight": "model-00005-of-00014.safetensors", "model.layers.38.self_attn.v_proj.weight": "model-00005-of-00014.safetensors", "model.layers.41.input_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.40.input_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.41.mlp.down_proj.weight": "model-00005-of-00014.safetensors", "model.layers.40.mlp.down_proj.weight": "model-00005-of-00014.safetensors", "model.layers.41.mlp.gate_proj.weight": "model-00005-of-00014.safetensors", "model.layers.40.mlp.gate_proj.weight": "model-00005-of-00014.safetensors", "model.layers.41.mlp.up_proj.weight": "model-00005-of-00014.safetensors", "model.layers.40.mlp.up_proj.weight": "model-00005-of-00014.safetensors", "model.layers.41.post_attention_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.40.post_attention_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.41.self_attn.k_proj.weight": "model-00005-of-00014.safetensors", "model.layers.40.self_attn.k_proj.weight": "model-00005-of-00014.safetensors", "model.layers.41.self_attn.o_proj.weight": "model-00005-of-00014.safetensors", "model.layers.40.self_attn.o_proj.weight": "model-00005-of-00014.safetensors", "model.layers.41.self_attn.q_proj.weight": "model-00005-of-00014.safetensors", "model.layers.40.self_attn.q_proj.weight": "model-00005-of-00014.safetensors", 
"model.layers.41.self_attn.v_proj.weight": "model-00005-of-00014.safetensors", "model.layers.40.self_attn.v_proj.weight": "model-00005-of-00014.safetensors", "model.layers.43.input_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.42.input_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.43.mlp.down_proj.weight": "model-00005-of-00014.safetensors", "model.layers.42.mlp.down_proj.weight": "model-00005-of-00014.safetensors", "model.layers.43.mlp.gate_proj.weight": "model-00005-of-00014.safetensors", "model.layers.42.mlp.gate_proj.weight": "model-00005-of-00014.safetensors", "model.layers.43.mlp.up_proj.weight": "model-00005-of-00014.safetensors", "model.layers.42.mlp.up_proj.weight": "model-00005-of-00014.safetensors", "model.layers.43.post_attention_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.42.post_attention_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.43.self_attn.k_proj.weight": "model-00005-of-00014.safetensors", "model.layers.42.self_attn.k_proj.weight": "model-00005-of-00014.safetensors", "model.layers.43.self_attn.o_proj.weight": "model-00005-of-00014.safetensors", "model.layers.42.self_attn.o_proj.weight": "model-00005-of-00014.safetensors", "model.layers.43.self_attn.q_proj.weight": "model-00005-of-00014.safetensors", "model.layers.42.self_attn.q_proj.weight": "model-00005-of-00014.safetensors", "model.layers.43.self_attn.v_proj.weight": "model-00005-of-00014.safetensors", "model.layers.42.self_attn.v_proj.weight": "model-00005-of-00014.safetensors", "model.layers.45.input_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.44.input_layernorm.weight": "model-00005-of-00014.safetensors", "model.layers.45.mlp.down_proj.weight": "model-00005-of-00014.safetensors", "model.layers.44.mlp.down_proj.weight": "model-00005-of-00014.safetensors", "model.layers.45.mlp.gate_proj.weight": "model-00005-of-00014.safetensors", "model.layers.44.mlp.gate_proj.weight": "model-00006-of-00014.safetensors", "model.layers.45.mlp.up_proj.weight": "model-00006-of-00014.safetensors", "model.layers.44.mlp.up_proj.weight": "model-00006-of-00014.safetensors", "model.layers.45.post_attention_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.44.post_attention_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.45.self_attn.k_proj.weight": "model-00006-of-00014.safetensors", "model.layers.44.self_attn.k_proj.weight": "model-00006-of-00014.safetensors", "model.layers.45.self_attn.o_proj.weight": "model-00006-of-00014.safetensors", "model.layers.44.self_attn.o_proj.weight": "model-00006-of-00014.safetensors", "model.layers.45.self_attn.q_proj.weight": "model-00006-of-00014.safetensors", "model.layers.44.self_attn.q_proj.weight": "model-00006-of-00014.safetensors", "model.layers.45.self_attn.v_proj.weight": "model-00006-of-00014.safetensors", "model.layers.44.self_attn.v_proj.weight": "model-00006-of-00014.safetensors", "model.layers.47.input_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.46.input_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.47.mlp.down_proj.weight": "model-00006-of-00014.safetensors", "model.layers.46.mlp.down_proj.weight": "model-00006-of-00014.safetensors", "model.layers.47.mlp.gate_proj.weight": "model-00006-of-00014.safetensors", "model.layers.46.mlp.gate_proj.weight": "model-00006-of-00014.safetensors", "model.layers.47.mlp.up_proj.weight": "model-00006-of-00014.safetensors", 
"model.layers.46.mlp.up_proj.weight": "model-00006-of-00014.safetensors", "model.layers.47.post_attention_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.46.post_attention_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.47.self_attn.k_proj.weight": "model-00006-of-00014.safetensors", "model.layers.46.self_attn.k_proj.weight": "model-00006-of-00014.safetensors", "model.layers.47.self_attn.o_proj.weight": "model-00006-of-00014.safetensors", "model.layers.46.self_attn.o_proj.weight": "model-00006-of-00014.safetensors", "model.layers.47.self_attn.q_proj.weight": "model-00006-of-00014.safetensors", "model.layers.46.self_attn.q_proj.weight": "model-00006-of-00014.safetensors", "model.layers.47.self_attn.v_proj.weight": "model-00006-of-00014.safetensors", "model.layers.46.self_attn.v_proj.weight": "model-00006-of-00014.safetensors", "model.layers.49.input_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.48.input_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.49.mlp.down_proj.weight": "model-00006-of-00014.safetensors", "model.layers.48.mlp.down_proj.weight": "model-00006-of-00014.safetensors", "model.layers.49.mlp.gate_proj.weight": "model-00006-of-00014.safetensors", "model.layers.48.mlp.gate_proj.weight": "model-00006-of-00014.safetensors", "model.layers.49.mlp.up_proj.weight": "model-00006-of-00014.safetensors", "model.layers.48.mlp.up_proj.weight": "model-00006-of-00014.safetensors", "model.layers.49.post_attention_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.48.post_attention_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.49.self_attn.k_proj.weight": "model-00006-of-00014.safetensors", "model.layers.48.self_attn.k_proj.weight": "model-00006-of-00014.safetensors", "model.layers.49.self_attn.o_proj.weight": "model-00006-of-00014.safetensors", "model.layers.48.self_attn.o_proj.weight": "model-00006-of-00014.safetensors", "model.layers.49.self_attn.q_proj.weight": "model-00006-of-00014.safetensors", "model.layers.48.self_attn.q_proj.weight": "model-00006-of-00014.safetensors", "model.layers.49.self_attn.v_proj.weight": "model-00006-of-00014.safetensors", "model.layers.48.self_attn.v_proj.weight": "model-00006-of-00014.safetensors", "model.layers.3.input_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.3.mlp.down_proj.weight": "model-00006-of-00014.safetensors", "model.layers.3.mlp.gate_proj.weight": "model-00006-of-00014.safetensors", "model.layers.3.mlp.up_proj.weight": "model-00006-of-00014.safetensors", "model.layers.3.post_attention_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.3.self_attn.k_proj.weight": "model-00006-of-00014.safetensors", "model.layers.3.self_attn.o_proj.weight": "model-00006-of-00014.safetensors", "model.layers.3.self_attn.q_proj.weight": "model-00006-of-00014.safetensors", "model.layers.3.self_attn.v_proj.weight": "model-00006-of-00014.safetensors", "model.layers.51.input_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.50.input_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.51.mlp.down_proj.weight": "model-00006-of-00014.safetensors", "model.layers.50.mlp.down_proj.weight": "model-00006-of-00014.safetensors", "model.layers.51.mlp.gate_proj.weight": "model-00006-of-00014.safetensors", "model.layers.50.mlp.gate_proj.weight": "model-00006-of-00014.safetensors", "model.layers.51.mlp.up_proj.weight": "model-00006-of-00014.safetensors", 
"model.layers.50.mlp.up_proj.weight": "model-00006-of-00014.safetensors", "model.layers.51.post_attention_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.50.post_attention_layernorm.weight": "model-00006-of-00014.safetensors", "model.layers.51.self_attn.k_proj.weight": "model-00006-of-00014.safetensors", "model.layers.50.self_attn.k_proj.weight": "model-00006-of-00014.safetensors", "model.layers.51.self_attn.o_proj.weight": "model-00007-of-00014.safetensors", "model.layers.50.self_attn.o_proj.weight": "model-00007-of-00014.safetensors", "model.layers.51.self_attn.q_proj.weight": "model-00007-of-00014.safetensors", "model.layers.50.self_attn.q_proj.weight": "model-00007-of-00014.safetensors", "model.layers.51.self_attn.v_proj.weight": "model-00007-of-00014.safetensors", "model.layers.50.self_attn.v_proj.weight": "model-00007-of-00014.safetensors", "model.layers.53.input_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.52.input_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.53.mlp.down_proj.weight": "model-00007-of-00014.safetensors", "model.layers.52.mlp.down_proj.weight": "model-00007-of-00014.safetensors", "model.layers.53.mlp.gate_proj.weight": "model-00007-of-00014.safetensors", "model.layers.52.mlp.gate_proj.weight": "model-00007-of-00014.safetensors", "model.layers.53.mlp.up_proj.weight": "model-00007-of-00014.safetensors", "model.layers.52.mlp.up_proj.weight": "model-00007-of-00014.safetensors", "model.layers.53.post_attention_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.52.post_attention_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.53.self_attn.k_proj.weight": "model-00007-of-00014.safetensors", "model.layers.52.self_attn.k_proj.weight": "model-00007-of-00014.safetensors", "model.layers.53.self_attn.o_proj.weight": "model-00007-of-00014.safetensors", "model.layers.52.self_attn.o_proj.weight": "model-00007-of-00014.safetensors", "model.layers.53.self_attn.q_proj.weight": "model-00007-of-00014.safetensors", "model.layers.52.self_attn.q_proj.weight": "model-00007-of-00014.safetensors", "model.layers.53.self_attn.v_proj.weight": "model-00007-of-00014.safetensors", "model.layers.52.self_attn.v_proj.weight": "model-00007-of-00014.safetensors", "model.layers.55.input_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.54.input_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.55.mlp.down_proj.weight": "model-00007-of-00014.safetensors", "model.layers.54.mlp.down_proj.weight": "model-00007-of-00014.safetensors", "model.layers.55.mlp.gate_proj.weight": "model-00007-of-00014.safetensors", "model.layers.54.mlp.gate_proj.weight": "model-00007-of-00014.safetensors", "model.layers.55.mlp.up_proj.weight": "model-00007-of-00014.safetensors", "model.layers.54.mlp.up_proj.weight": "model-00007-of-00014.safetensors", "model.layers.55.post_attention_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.54.post_attention_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.55.self_attn.k_proj.weight": "model-00007-of-00014.safetensors", "model.layers.54.self_attn.k_proj.weight": "model-00007-of-00014.safetensors", "model.layers.55.self_attn.o_proj.weight": "model-00007-of-00014.safetensors", "model.layers.54.self_attn.o_proj.weight": "model-00007-of-00014.safetensors", "model.layers.55.self_attn.q_proj.weight": "model-00007-of-00014.safetensors", "model.layers.54.self_attn.q_proj.weight": "model-00007-of-00014.safetensors", 
"model.layers.55.self_attn.v_proj.weight": "model-00007-of-00014.safetensors", "model.layers.54.self_attn.v_proj.weight": "model-00007-of-00014.safetensors", "model.layers.57.input_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.56.input_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.57.mlp.down_proj.weight": "model-00007-of-00014.safetensors", "model.layers.56.mlp.down_proj.weight": "model-00007-of-00014.safetensors", "model.layers.57.mlp.gate_proj.weight": "model-00007-of-00014.safetensors", "model.layers.56.mlp.gate_proj.weight": "model-00007-of-00014.safetensors", "model.layers.57.mlp.up_proj.weight": "model-00007-of-00014.safetensors", "model.layers.56.mlp.up_proj.weight": "model-00007-of-00014.safetensors", "model.layers.57.post_attention_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.56.post_attention_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.57.self_attn.k_proj.weight": "model-00007-of-00014.safetensors", "model.layers.56.self_attn.k_proj.weight": "model-00007-of-00014.safetensors", "model.layers.57.self_attn.o_proj.weight": "model-00007-of-00014.safetensors", "model.layers.56.self_attn.o_proj.weight": "model-00007-of-00014.safetensors", "model.layers.57.self_attn.q_proj.weight": "model-00007-of-00014.safetensors", "model.layers.56.self_attn.q_proj.weight": "model-00007-of-00014.safetensors", "model.layers.57.self_attn.v_proj.weight": "model-00007-of-00014.safetensors", "model.layers.56.self_attn.v_proj.weight": "model-00007-of-00014.safetensors", "model.layers.59.input_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.58.input_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.59.mlp.down_proj.weight": "model-00007-of-00014.safetensors", "model.layers.58.mlp.down_proj.weight": "model-00007-of-00014.safetensors", "model.layers.59.mlp.gate_proj.weight": "model-00007-of-00014.safetensors", "model.layers.58.mlp.gate_proj.weight": "model-00007-of-00014.safetensors", "model.layers.59.mlp.up_proj.weight": "model-00007-of-00014.safetensors", "model.layers.58.mlp.up_proj.weight": "model-00007-of-00014.safetensors", "model.layers.59.post_attention_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.58.post_attention_layernorm.weight": "model-00007-of-00014.safetensors", "model.layers.59.self_attn.k_proj.weight": "model-00008-of-00014.safetensors", "model.layers.58.self_attn.k_proj.weight": "model-00008-of-00014.safetensors", "model.layers.59.self_attn.o_proj.weight": "model-00008-of-00014.safetensors", "model.layers.58.self_attn.o_proj.weight": "model-00008-of-00014.safetensors", "model.layers.59.self_attn.q_proj.weight": "model-00008-of-00014.safetensors", "model.layers.58.self_attn.q_proj.weight": "model-00008-of-00014.safetensors", "model.layers.59.self_attn.v_proj.weight": "model-00008-of-00014.safetensors", "model.layers.58.self_attn.v_proj.weight": "model-00008-of-00014.safetensors", "model.layers.61.input_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.60.input_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.61.mlp.down_proj.weight": "model-00008-of-00014.safetensors", "model.layers.60.mlp.down_proj.weight": "model-00008-of-00014.safetensors", "model.layers.61.mlp.gate_proj.weight": "model-00008-of-00014.safetensors", "model.layers.60.mlp.gate_proj.weight": "model-00008-of-00014.safetensors", "model.layers.61.mlp.up_proj.weight": "model-00008-of-00014.safetensors", 
"model.layers.60.mlp.up_proj.weight": "model-00008-of-00014.safetensors", "model.layers.61.post_attention_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.60.post_attention_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.61.self_attn.k_proj.weight": "model-00008-of-00014.safetensors", "model.layers.60.self_attn.k_proj.weight": "model-00008-of-00014.safetensors", "model.layers.61.self_attn.o_proj.weight": "model-00008-of-00014.safetensors", "model.layers.60.self_attn.o_proj.weight": "model-00008-of-00014.safetensors", "model.layers.61.self_attn.q_proj.weight": "model-00008-of-00014.safetensors", "model.layers.60.self_attn.q_proj.weight": "model-00008-of-00014.safetensors", "model.layers.61.self_attn.v_proj.weight": "model-00008-of-00014.safetensors", "model.layers.60.self_attn.v_proj.weight": "model-00008-of-00014.safetensors", "model.layers.63.input_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.62.input_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.63.mlp.down_proj.weight": "model-00008-of-00014.safetensors", "model.layers.62.mlp.down_proj.weight": "model-00008-of-00014.safetensors", "model.layers.63.mlp.gate_proj.weight": "model-00008-of-00014.safetensors", "model.layers.62.mlp.gate_proj.weight": "model-00008-of-00014.safetensors", "model.layers.63.mlp.up_proj.weight": "model-00008-of-00014.safetensors", "model.layers.62.mlp.up_proj.weight": "model-00008-of-00014.safetensors", "model.layers.63.post_attention_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.62.post_attention_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.63.self_attn.k_proj.weight": "model-00008-of-00014.safetensors", "model.layers.62.self_attn.k_proj.weight": "model-00008-of-00014.safetensors", "model.layers.63.self_attn.o_proj.weight": "model-00008-of-00014.safetensors", "model.layers.62.self_attn.o_proj.weight": "model-00008-of-00014.safetensors", "model.layers.63.self_attn.q_proj.weight": "model-00008-of-00014.safetensors", "model.layers.62.self_attn.q_proj.weight": "model-00008-of-00014.safetensors", "model.layers.63.self_attn.v_proj.weight": "model-00008-of-00014.safetensors", "model.layers.62.self_attn.v_proj.weight": "model-00008-of-00014.safetensors", "model.layers.65.input_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.64.input_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.65.mlp.down_proj.weight": "model-00008-of-00014.safetensors", "model.layers.64.mlp.down_proj.weight": "model-00008-of-00014.safetensors", "model.layers.65.mlp.gate_proj.weight": "model-00008-of-00014.safetensors", "model.layers.64.mlp.gate_proj.weight": "model-00008-of-00014.safetensors", "model.layers.65.mlp.up_proj.weight": "model-00008-of-00014.safetensors", "model.layers.64.mlp.up_proj.weight": "model-00008-of-00014.safetensors", "model.layers.65.post_attention_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.64.post_attention_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.65.self_attn.k_proj.weight": "model-00008-of-00014.safetensors", "model.layers.64.self_attn.k_proj.weight": "model-00008-of-00014.safetensors", "model.layers.65.self_attn.o_proj.weight": "model-00008-of-00014.safetensors", "model.layers.64.self_attn.o_proj.weight": "model-00008-of-00014.safetensors", "model.layers.65.self_attn.q_proj.weight": "model-00008-of-00014.safetensors", "model.layers.64.self_attn.q_proj.weight": "model-00008-of-00014.safetensors", 
"model.layers.65.self_attn.v_proj.weight": "model-00008-of-00014.safetensors", "model.layers.64.self_attn.v_proj.weight": "model-00008-of-00014.safetensors", "model.layers.67.input_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.66.input_layernorm.weight": "model-00008-of-00014.safetensors", "model.layers.67.mlp.down_proj.weight": "model-00008-of-00014.safetensors", "model.layers.66.mlp.down_proj.weight": "model-00008-of-00014.safetensors", "model.layers.67.mlp.gate_proj.weight": "model-00008-of-00014.safetensors", "model.layers.66.mlp.gate_proj.weight": "model-00008-of-00014.safetensors", "model.layers.67.mlp.up_proj.weight": "model-00008-of-00014.safetensors", "model.layers.66.mlp.up_proj.weight": "model-00009-of-00014.safetensors", "model.layers.67.post_attention_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.66.post_attention_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.67.self_attn.k_proj.weight": "model-00009-of-00014.safetensors", "model.layers.66.self_attn.k_proj.weight": "model-00009-of-00014.safetensors", "model.layers.67.self_attn.o_proj.weight": "model-00009-of-00014.safetensors", "model.layers.66.self_attn.o_proj.weight": "model-00009-of-00014.safetensors", "model.layers.67.self_attn.q_proj.weight": "model-00009-of-00014.safetensors", "model.layers.66.self_attn.q_proj.weight": "model-00009-of-00014.safetensors", "model.layers.67.self_attn.v_proj.weight": "model-00009-of-00014.safetensors", "model.layers.66.self_attn.v_proj.weight": "model-00009-of-00014.safetensors", "model.layers.69.input_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.68.input_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.69.mlp.down_proj.weight": "model-00009-of-00014.safetensors", "model.layers.68.mlp.down_proj.weight": "model-00009-of-00014.safetensors", "model.layers.69.mlp.gate_proj.weight": "model-00009-of-00014.safetensors", "model.layers.68.mlp.gate_proj.weight": "model-00009-of-00014.safetensors", "model.layers.69.mlp.up_proj.weight": "model-00009-of-00014.safetensors", "model.layers.68.mlp.up_proj.weight": "model-00009-of-00014.safetensors", "model.layers.69.post_attention_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.68.post_attention_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.69.self_attn.k_proj.weight": "model-00009-of-00014.safetensors", "model.layers.68.self_attn.k_proj.weight": "model-00009-of-00014.safetensors", "model.layers.69.self_attn.o_proj.weight": "model-00009-of-00014.safetensors", "model.layers.68.self_attn.o_proj.weight": "model-00009-of-00014.safetensors", "model.layers.69.self_attn.q_proj.weight": "model-00009-of-00014.safetensors", "model.layers.68.self_attn.q_proj.weight": "model-00009-of-00014.safetensors", "model.layers.69.self_attn.v_proj.weight": "model-00009-of-00014.safetensors", "model.layers.68.self_attn.v_proj.weight": "model-00009-of-00014.safetensors", "model.layers.4.input_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.4.mlp.down_proj.weight": "model-00009-of-00014.safetensors", "model.layers.4.mlp.gate_proj.weight": "model-00009-of-00014.safetensors", "model.layers.4.mlp.up_proj.weight": "model-00009-of-00014.safetensors", "model.layers.4.post_attention_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.4.self_attn.k_proj.weight": "model-00009-of-00014.safetensors", "model.layers.4.self_attn.o_proj.weight": "model-00009-of-00014.safetensors", 
"model.layers.4.self_attn.q_proj.weight": "model-00009-of-00014.safetensors", "model.layers.4.self_attn.v_proj.weight": "model-00009-of-00014.safetensors", "model.layers.71.input_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.70.input_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.71.mlp.down_proj.weight": "model-00009-of-00014.safetensors", "model.layers.70.mlp.down_proj.weight": "model-00009-of-00014.safetensors", "model.layers.71.mlp.gate_proj.weight": "model-00009-of-00014.safetensors", "model.layers.70.mlp.gate_proj.weight": "model-00009-of-00014.safetensors", "model.layers.71.mlp.up_proj.weight": "model-00009-of-00014.safetensors", "model.layers.70.mlp.up_proj.weight": "model-00009-of-00014.safetensors", "model.layers.71.post_attention_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.70.post_attention_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.71.self_attn.k_proj.weight": "model-00009-of-00014.safetensors", "model.layers.70.self_attn.k_proj.weight": "model-00009-of-00014.safetensors", "model.layers.71.self_attn.o_proj.weight": "model-00009-of-00014.safetensors", "model.layers.70.self_attn.o_proj.weight": "model-00009-of-00014.safetensors", "model.layers.71.self_attn.q_proj.weight": "model-00009-of-00014.safetensors", "model.layers.70.self_attn.q_proj.weight": "model-00009-of-00014.safetensors", "model.layers.71.self_attn.v_proj.weight": "model-00009-of-00014.safetensors", "model.layers.70.self_attn.v_proj.weight": "model-00009-of-00014.safetensors", "model.layers.73.input_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.72.input_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.73.mlp.down_proj.weight": "model-00009-of-00014.safetensors", "model.layers.72.mlp.down_proj.weight": "model-00009-of-00014.safetensors", "model.layers.73.mlp.gate_proj.weight": "model-00009-of-00014.safetensors", "model.layers.72.mlp.gate_proj.weight": "model-00009-of-00014.safetensors", "model.layers.73.mlp.up_proj.weight": "model-00009-of-00014.safetensors", "model.layers.72.mlp.up_proj.weight": "model-00009-of-00014.safetensors", "model.layers.73.post_attention_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.72.post_attention_layernorm.weight": "model-00009-of-00014.safetensors", "model.layers.73.self_attn.k_proj.weight": "model-00009-of-00014.safetensors", "model.layers.72.self_attn.k_proj.weight": "model-00009-of-00014.safetensors", "model.layers.73.self_attn.o_proj.weight": "model-00009-of-00014.safetensors", "model.layers.72.self_attn.o_proj.weight": "model-00009-of-00014.safetensors", "model.layers.73.self_attn.q_proj.weight": "model-00009-of-00014.safetensors", "model.layers.72.self_attn.q_proj.weight": "model-00009-of-00014.safetensors", "model.layers.73.self_attn.v_proj.weight": "model-00009-of-00014.safetensors", "model.layers.72.self_attn.v_proj.weight": "model-00010-of-00014.safetensors", "model.layers.75.input_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.74.input_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.75.mlp.down_proj.weight": "model-00010-of-00014.safetensors", "model.layers.74.mlp.down_proj.weight": "model-00010-of-00014.safetensors", "model.layers.75.mlp.gate_proj.weight": "model-00010-of-00014.safetensors", "model.layers.74.mlp.gate_proj.weight": "model-00010-of-00014.safetensors", "model.layers.75.mlp.up_proj.weight": "model-00010-of-00014.safetensors", 
"model.layers.74.mlp.up_proj.weight": "model-00010-of-00014.safetensors", "model.layers.75.post_attention_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.74.post_attention_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.75.self_attn.k_proj.weight": "model-00010-of-00014.safetensors", "model.layers.74.self_attn.k_proj.weight": "model-00010-of-00014.safetensors", "model.layers.75.self_attn.o_proj.weight": "model-00010-of-00014.safetensors", "model.layers.74.self_attn.o_proj.weight": "model-00010-of-00014.safetensors", "model.layers.75.self_attn.q_proj.weight": "model-00010-of-00014.safetensors", "model.layers.74.self_attn.q_proj.weight": "model-00010-of-00014.safetensors", "model.layers.75.self_attn.v_proj.weight": "model-00010-of-00014.safetensors", "model.layers.74.self_attn.v_proj.weight": "model-00010-of-00014.safetensors", "model.layers.77.input_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.76.input_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.77.mlp.down_proj.weight": "model-00010-of-00014.safetensors", "model.layers.76.mlp.down_proj.weight": "model-00010-of-00014.safetensors", "model.layers.77.mlp.gate_proj.weight": "model-00010-of-00014.safetensors", "model.layers.76.mlp.gate_proj.weight": "model-00010-of-00014.safetensors", "model.layers.77.mlp.up_proj.weight": "model-00010-of-00014.safetensors", "model.layers.76.mlp.up_proj.weight": "model-00010-of-00014.safetensors", "model.layers.77.post_attention_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.76.post_attention_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.77.self_attn.k_proj.weight": "model-00010-of-00014.safetensors", "model.layers.76.self_attn.k_proj.weight": "model-00010-of-00014.safetensors", "model.layers.77.self_attn.o_proj.weight": "model-00010-of-00014.safetensors", "model.layers.76.self_attn.o_proj.weight": "model-00010-of-00014.safetensors", "model.layers.77.self_attn.q_proj.weight": "model-00010-of-00014.safetensors", "model.layers.76.self_attn.q_proj.weight": "model-00010-of-00014.safetensors", "model.layers.77.self_attn.v_proj.weight": "model-00010-of-00014.safetensors", "model.layers.76.self_attn.v_proj.weight": "model-00010-of-00014.safetensors", "model.layers.79.input_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.78.input_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.79.mlp.down_proj.weight": "model-00010-of-00014.safetensors", "model.layers.78.mlp.down_proj.weight": "model-00010-of-00014.safetensors", "model.layers.79.mlp.gate_proj.weight": "model-00010-of-00014.safetensors", "model.layers.78.mlp.gate_proj.weight": "model-00010-of-00014.safetensors", "model.layers.79.mlp.up_proj.weight": "model-00010-of-00014.safetensors", "model.layers.78.mlp.up_proj.weight": "model-00010-of-00014.safetensors", "model.layers.79.post_attention_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.78.post_attention_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.79.self_attn.k_proj.weight": "model-00010-of-00014.safetensors", "model.layers.78.self_attn.k_proj.weight": "model-00010-of-00014.safetensors", "model.layers.79.self_attn.o_proj.weight": "model-00010-of-00014.safetensors", "model.layers.78.self_attn.o_proj.weight": "model-00010-of-00014.safetensors", "model.layers.79.self_attn.q_proj.weight": "model-00010-of-00014.safetensors", "model.layers.78.self_attn.q_proj.weight": "model-00010-of-00014.safetensors", 
"model.layers.79.self_attn.v_proj.weight": "model-00010-of-00014.safetensors", "model.layers.78.self_attn.v_proj.weight": "model-00010-of-00014.safetensors", "model.layers.81.input_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.80.input_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.81.mlp.down_proj.weight": "model-00010-of-00014.safetensors", "model.layers.80.mlp.down_proj.weight": "model-00010-of-00014.safetensors", "model.layers.81.mlp.gate_proj.weight": "model-00010-of-00014.safetensors", "model.layers.80.mlp.gate_proj.weight": "model-00010-of-00014.safetensors", "model.layers.81.mlp.up_proj.weight": "model-00010-of-00014.safetensors", "model.layers.80.mlp.up_proj.weight": "model-00010-of-00014.safetensors", "model.layers.81.post_attention_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.80.post_attention_layernorm.weight": "model-00010-of-00014.safetensors", "model.layers.81.self_attn.k_proj.weight": "model-00010-of-00014.safetensors", "model.layers.80.self_attn.k_proj.weight": "model-00010-of-00014.safetensors", "model.layers.81.self_attn.o_proj.weight": "model-00010-of-00014.safetensors", "model.layers.80.self_attn.o_proj.weight": "model-00010-of-00014.safetensors", "model.layers.81.self_attn.q_proj.weight": "model-00010-of-00014.safetensors", "model.layers.80.self_attn.q_proj.weight": "model-00011-of-00014.safetensors", "model.layers.81.self_attn.v_proj.weight": "model-00011-of-00014.safetensors", "model.layers.80.self_attn.v_proj.weight": "model-00011-of-00014.safetensors", "model.layers.83.input_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.82.input_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.83.mlp.down_proj.weight": "model-00011-of-00014.safetensors", "model.layers.82.mlp.down_proj.weight": "model-00011-of-00014.safetensors", "model.layers.83.mlp.gate_proj.weight": "model-00011-of-00014.safetensors", "model.layers.82.mlp.gate_proj.weight": "model-00011-of-00014.safetensors", "model.layers.83.mlp.up_proj.weight": "model-00011-of-00014.safetensors", "model.layers.82.mlp.up_proj.weight": "model-00011-of-00014.safetensors", "model.layers.83.post_attention_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.82.post_attention_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.83.self_attn.k_proj.weight": "model-00011-of-00014.safetensors", "model.layers.82.self_attn.k_proj.weight": "model-00011-of-00014.safetensors", "model.layers.83.self_attn.o_proj.weight": "model-00011-of-00014.safetensors", "model.layers.82.self_attn.o_proj.weight": "model-00011-of-00014.safetensors", "model.layers.83.self_attn.q_proj.weight": "model-00011-of-00014.safetensors", "model.layers.82.self_attn.q_proj.weight": "model-00011-of-00014.safetensors", "model.layers.83.self_attn.v_proj.weight": "model-00011-of-00014.safetensors", "model.layers.82.self_attn.v_proj.weight": "model-00011-of-00014.safetensors", "model.layers.85.input_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.84.input_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.85.mlp.down_proj.weight": "model-00011-of-00014.safetensors", "model.layers.84.mlp.down_proj.weight": "model-00011-of-00014.safetensors", "model.layers.85.mlp.gate_proj.weight": "model-00011-of-00014.safetensors", "model.layers.84.mlp.gate_proj.weight": "model-00011-of-00014.safetensors", "model.layers.85.mlp.up_proj.weight": "model-00011-of-00014.safetensors", 
"model.layers.84.mlp.up_proj.weight": "model-00011-of-00014.safetensors", "model.layers.85.post_attention_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.84.post_attention_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.85.self_attn.k_proj.weight": "model-00011-of-00014.safetensors", "model.layers.84.self_attn.k_proj.weight": "model-00011-of-00014.safetensors", "model.layers.85.self_attn.o_proj.weight": "model-00011-of-00014.safetensors", "model.layers.84.self_attn.o_proj.weight": "model-00011-of-00014.safetensors", "model.layers.85.self_attn.q_proj.weight": "model-00011-of-00014.safetensors", "model.layers.84.self_attn.q_proj.weight": "model-00011-of-00014.safetensors", "model.layers.85.self_attn.v_proj.weight": "model-00011-of-00014.safetensors", "model.layers.84.self_attn.v_proj.weight": "model-00011-of-00014.safetensors", "model.layers.87.input_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.86.input_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.87.mlp.down_proj.weight": "model-00011-of-00014.safetensors", "model.layers.86.mlp.down_proj.weight": "model-00011-of-00014.safetensors", "model.layers.87.mlp.gate_proj.weight": "model-00011-of-00014.safetensors", "model.layers.86.mlp.gate_proj.weight": "model-00011-of-00014.safetensors", "model.layers.87.mlp.up_proj.weight": "model-00011-of-00014.safetensors", "model.layers.86.mlp.up_proj.weight": "model-00011-of-00014.safetensors", "model.layers.87.post_attention_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.86.post_attention_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.87.self_attn.k_proj.weight": "model-00011-of-00014.safetensors", "model.layers.86.self_attn.k_proj.weight": "model-00011-of-00014.safetensors", "model.layers.87.self_attn.o_proj.weight": "model-00011-of-00014.safetensors", "model.layers.86.self_attn.o_proj.weight": "model-00011-of-00014.safetensors", "model.layers.87.self_attn.q_proj.weight": "model-00011-of-00014.safetensors", "model.layers.86.self_attn.q_proj.weight": "model-00011-of-00014.safetensors", "model.layers.87.self_attn.v_proj.weight": "model-00011-of-00014.safetensors", "model.layers.86.self_attn.v_proj.weight": "model-00011-of-00014.safetensors", "model.layers.89.input_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.88.input_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.89.mlp.down_proj.weight": "model-00011-of-00014.safetensors", "model.layers.88.mlp.down_proj.weight": "model-00011-of-00014.safetensors", "model.layers.89.mlp.gate_proj.weight": "model-00011-of-00014.safetensors", "model.layers.88.mlp.gate_proj.weight": "model-00011-of-00014.safetensors", "model.layers.89.mlp.up_proj.weight": "model-00011-of-00014.safetensors", "model.layers.88.mlp.up_proj.weight": "model-00011-of-00014.safetensors", "model.layers.89.post_attention_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.88.post_attention_layernorm.weight": "model-00011-of-00014.safetensors", "model.layers.89.self_attn.k_proj.weight": "model-00011-of-00014.safetensors", "model.layers.88.self_attn.k_proj.weight": "model-00011-of-00014.safetensors", "model.layers.89.self_attn.o_proj.weight": "model-00011-of-00014.safetensors", "model.layers.88.self_attn.o_proj.weight": "model-00012-of-00014.safetensors", "model.layers.89.self_attn.q_proj.weight": "model-00012-of-00014.safetensors", "model.layers.88.self_attn.q_proj.weight": "model-00012-of-00014.safetensors", 
"model.layers.89.self_attn.v_proj.weight": "model-00012-of-00014.safetensors", "model.layers.88.self_attn.v_proj.weight": "model-00012-of-00014.safetensors", "model.layers.5.input_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.5.mlp.down_proj.weight": "model-00012-of-00014.safetensors", "model.layers.5.mlp.gate_proj.weight": "model-00012-of-00014.safetensors", "model.layers.5.mlp.up_proj.weight": "model-00012-of-00014.safetensors", "model.layers.5.post_attention_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.5.self_attn.k_proj.weight": "model-00012-of-00014.safetensors", "model.layers.5.self_attn.o_proj.weight": "model-00012-of-00014.safetensors", "model.layers.5.self_attn.q_proj.weight": "model-00012-of-00014.safetensors", "model.layers.5.self_attn.v_proj.weight": "model-00012-of-00014.safetensors", "model.layers.91.input_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.90.input_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.91.mlp.down_proj.weight": "model-00012-of-00014.safetensors", "model.layers.90.mlp.down_proj.weight": "model-00012-of-00014.safetensors", "model.layers.91.mlp.gate_proj.weight": "model-00012-of-00014.safetensors", "model.layers.90.mlp.gate_proj.weight": "model-00012-of-00014.safetensors", "model.layers.91.mlp.up_proj.weight": "model-00012-of-00014.safetensors", "model.layers.90.mlp.up_proj.weight": "model-00012-of-00014.safetensors", "model.layers.91.post_attention_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.90.post_attention_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.91.self_attn.k_proj.weight": "model-00012-of-00014.safetensors", "model.layers.90.self_attn.k_proj.weight": "model-00012-of-00014.safetensors", "model.layers.91.self_attn.o_proj.weight": "model-00012-of-00014.safetensors", "model.layers.90.self_attn.o_proj.weight": "model-00012-of-00014.safetensors", "model.layers.91.self_attn.q_proj.weight": "model-00012-of-00014.safetensors", "model.layers.90.self_attn.q_proj.weight": "model-00012-of-00014.safetensors", "model.layers.91.self_attn.v_proj.weight": "model-00012-of-00014.safetensors", "model.layers.90.self_attn.v_proj.weight": "model-00012-of-00014.safetensors", "model.layers.93.input_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.92.input_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.93.mlp.down_proj.weight": "model-00012-of-00014.safetensors", "model.layers.92.mlp.down_proj.weight": "model-00012-of-00014.safetensors", "model.layers.93.mlp.gate_proj.weight": "model-00012-of-00014.safetensors", "model.layers.92.mlp.gate_proj.weight": "model-00012-of-00014.safetensors", "model.layers.93.mlp.up_proj.weight": "model-00012-of-00014.safetensors", "model.layers.92.mlp.up_proj.weight": "model-00012-of-00014.safetensors", "model.layers.93.post_attention_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.92.post_attention_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.93.self_attn.k_proj.weight": "model-00012-of-00014.safetensors", "model.layers.92.self_attn.k_proj.weight": "model-00012-of-00014.safetensors", "model.layers.93.self_attn.o_proj.weight": "model-00012-of-00014.safetensors", "model.layers.92.self_attn.o_proj.weight": "model-00012-of-00014.safetensors", "model.layers.93.self_attn.q_proj.weight": "model-00012-of-00014.safetensors", "model.layers.92.self_attn.q_proj.weight": "model-00012-of-00014.safetensors", 
"model.layers.93.self_attn.v_proj.weight": "model-00012-of-00014.safetensors", "model.layers.92.self_attn.v_proj.weight": "model-00012-of-00014.safetensors", "model.layers.94.input_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.94.mlp.down_proj.weight": "model-00012-of-00014.safetensors", "model.layers.94.mlp.gate_proj.weight": "model-00012-of-00014.safetensors", "model.layers.94.mlp.up_proj.weight": "model-00012-of-00014.safetensors", "model.layers.94.post_attention_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.94.self_attn.k_proj.weight": "model-00012-of-00014.safetensors", "model.layers.94.self_attn.o_proj.weight": "model-00012-of-00014.safetensors", "model.layers.94.self_attn.q_proj.weight": "model-00012-of-00014.safetensors", "model.layers.94.self_attn.v_proj.weight": "model-00012-of-00014.safetensors", "model.layers.95.input_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.95.mlp.down_proj.weight": "model-00012-of-00014.safetensors", "model.layers.95.mlp.gate_proj.weight": "model-00012-of-00014.safetensors", "model.layers.95.mlp.up_proj.weight": "model-00012-of-00014.safetensors", "model.layers.95.post_attention_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.95.self_attn.k_proj.weight": "model-00012-of-00014.safetensors", "model.layers.95.self_attn.o_proj.weight": "model-00012-of-00014.safetensors", "model.layers.95.self_attn.q_proj.weight": "model-00012-of-00014.safetensors", "model.layers.95.self_attn.v_proj.weight": "model-00012-of-00014.safetensors", "model.layers.96.input_layernorm.weight": "model-00012-of-00014.safetensors", "model.layers.96.mlp.down_proj.weight": "model-00012-of-00014.safetensors", "model.layers.96.mlp.gate_proj.weight": "model-00012-of-00014.safetensors", "model.layers.96.mlp.up_proj.weight": "model-00013-of-00014.safetensors", "model.layers.96.post_attention_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.96.self_attn.k_proj.weight": "model-00013-of-00014.safetensors", "model.layers.96.self_attn.o_proj.weight": "model-00013-of-00014.safetensors", "model.layers.96.self_attn.q_proj.weight": "model-00013-of-00014.safetensors", "model.layers.96.self_attn.v_proj.weight": "model-00013-of-00014.safetensors", "model.layers.97.input_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.97.mlp.down_proj.weight": "model-00013-of-00014.safetensors", "model.layers.97.mlp.gate_proj.weight": "model-00013-of-00014.safetensors", "model.layers.97.mlp.up_proj.weight": "model-00013-of-00014.safetensors", "model.layers.97.post_attention_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.97.self_attn.k_proj.weight": "model-00013-of-00014.safetensors", "model.layers.97.self_attn.o_proj.weight": "model-00013-of-00014.safetensors", "model.layers.97.self_attn.q_proj.weight": "model-00013-of-00014.safetensors", "model.layers.97.self_attn.v_proj.weight": "model-00013-of-00014.safetensors", "model.layers.98.input_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.98.mlp.down_proj.weight": "model-00013-of-00014.safetensors", "model.layers.98.mlp.gate_proj.weight": "model-00013-of-00014.safetensors", "model.layers.98.mlp.up_proj.weight": "model-00013-of-00014.safetensors", "model.layers.98.post_attention_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.98.self_attn.k_proj.weight": "model-00013-of-00014.safetensors", "model.layers.98.self_attn.o_proj.weight": "model-00013-of-00014.safetensors", 
"model.layers.98.self_attn.q_proj.weight": "model-00013-of-00014.safetensors", "model.layers.98.self_attn.v_proj.weight": "model-00013-of-00014.safetensors", "model.layers.99.input_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.99.mlp.down_proj.weight": "model-00013-of-00014.safetensors", "model.layers.99.mlp.gate_proj.weight": "model-00013-of-00014.safetensors", "model.layers.99.mlp.up_proj.weight": "model-00013-of-00014.safetensors", "model.layers.99.post_attention_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.99.self_attn.k_proj.weight": "model-00013-of-00014.safetensors", "model.layers.99.self_attn.o_proj.weight": "model-00013-of-00014.safetensors", "model.layers.99.self_attn.q_proj.weight": "model-00013-of-00014.safetensors", "model.layers.99.self_attn.v_proj.weight": "model-00013-of-00014.safetensors", "model.layers.100.input_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.100.mlp.down_proj.weight": "model-00013-of-00014.safetensors", "model.layers.100.mlp.gate_proj.weight": "model-00013-of-00014.safetensors", "model.layers.100.mlp.up_proj.weight": "model-00013-of-00014.safetensors", "model.layers.100.post_attention_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.100.self_attn.k_proj.weight": "model-00013-of-00014.safetensors", "model.layers.100.self_attn.o_proj.weight": "model-00013-of-00014.safetensors", "model.layers.100.self_attn.q_proj.weight": "model-00013-of-00014.safetensors", "model.layers.100.self_attn.v_proj.weight": "model-00013-of-00014.safetensors", "model.layers.101.input_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.101.mlp.down_proj.weight": "model-00013-of-00014.safetensors", "model.layers.101.mlp.gate_proj.weight": "model-00013-of-00014.safetensors", "model.layers.101.mlp.up_proj.weight": "model-00013-of-00014.safetensors", "model.layers.101.post_attention_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.101.self_attn.k_proj.weight": "model-00013-of-00014.safetensors", "model.layers.101.self_attn.o_proj.weight": "model-00013-of-00014.safetensors", "model.layers.101.self_attn.q_proj.weight": "model-00013-of-00014.safetensors", "model.layers.101.self_attn.v_proj.weight": "model-00013-of-00014.safetensors", "model.layers.6.input_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.6.mlp.down_proj.weight": "model-00013-of-00014.safetensors", "model.layers.6.mlp.gate_proj.weight": "model-00013-of-00014.safetensors", "model.layers.6.mlp.up_proj.weight": "model-00013-of-00014.safetensors", "model.layers.6.post_attention_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.6.self_attn.k_proj.weight": "model-00013-of-00014.safetensors", "model.layers.6.self_attn.o_proj.weight": "model-00013-of-00014.safetensors", "model.layers.6.self_attn.q_proj.weight": "model-00013-of-00014.safetensors", "model.layers.6.self_attn.v_proj.weight": "model-00013-of-00014.safetensors", "model.layers.102.input_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.102.mlp.down_proj.weight": "model-00013-of-00014.safetensors", "model.layers.102.mlp.gate_proj.weight": "model-00013-of-00014.safetensors", "model.layers.102.mlp.up_proj.weight": "model-00013-of-00014.safetensors", "model.layers.102.post_attention_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.102.self_attn.k_proj.weight": "model-00013-of-00014.safetensors", "model.layers.102.self_attn.o_proj.weight": "model-00013-of-00014.safetensors", 
"model.layers.102.self_attn.q_proj.weight": "model-00013-of-00014.safetensors", "model.layers.102.self_attn.v_proj.weight": "model-00013-of-00014.safetensors", "model.layers.103.input_layernorm.weight": "model-00013-of-00014.safetensors", "model.layers.103.mlp.down_proj.weight": "model-00013-of-00014.safetensors", "model.layers.103.mlp.gate_proj.weight": "model-00014-of-00014.safetensors", "model.layers.103.mlp.up_proj.weight": "model-00014-of-00014.safetensors", "model.layers.103.post_attention_layernorm.weight": "model-00014-of-00014.safetensors", "model.layers.103.self_attn.k_proj.weight": "model-00014-of-00014.safetensors", "model.layers.103.self_attn.o_proj.weight": "model-00014-of-00014.safetensors", "model.layers.103.self_attn.q_proj.weight": "model-00014-of-00014.safetensors", "model.layers.103.self_attn.v_proj.weight": "model-00014-of-00014.safetensors", "model.layers.7.input_layernorm.weight": "model-00014-of-00014.safetensors", "model.layers.7.mlp.down_proj.weight": "model-00014-of-00014.safetensors", "model.layers.7.mlp.gate_proj.weight": "model-00014-of-00014.safetensors", "model.layers.7.mlp.up_proj.weight": "model-00014-of-00014.safetensors", "model.layers.7.post_attention_layernorm.weight": "model-00014-of-00014.safetensors", "model.layers.7.self_attn.k_proj.weight": "model-00014-of-00014.safetensors", "model.layers.7.self_attn.o_proj.weight": "model-00014-of-00014.safetensors", "model.layers.7.self_attn.q_proj.weight": "model-00014-of-00014.safetensors", "model.layers.7.self_attn.v_proj.weight": "model-00014-of-00014.safetensors", "model.layers.8.input_layernorm.weight": "model-00014-of-00014.safetensors", "model.layers.8.mlp.down_proj.weight": "model-00014-of-00014.safetensors", "model.layers.8.mlp.gate_proj.weight": "model-00014-of-00014.safetensors", "model.layers.8.mlp.up_proj.weight": "model-00014-of-00014.safetensors", "model.layers.8.post_attention_layernorm.weight": "model-00014-of-00014.safetensors", "model.layers.8.self_attn.k_proj.weight": "model-00014-of-00014.safetensors", "model.layers.8.self_attn.o_proj.weight": "model-00014-of-00014.safetensors", "model.layers.8.self_attn.q_proj.weight": "model-00014-of-00014.safetensors", "model.layers.8.self_attn.v_proj.weight": "model-00014-of-00014.safetensors", "model.layers.9.input_layernorm.weight": "model-00014-of-00014.safetensors", "model.layers.9.mlp.down_proj.weight": "model-00014-of-00014.safetensors", "model.layers.9.mlp.gate_proj.weight": "model-00014-of-00014.safetensors", "model.layers.9.mlp.up_proj.weight": "model-00014-of-00014.safetensors", "model.layers.9.post_attention_layernorm.weight": "model-00014-of-00014.safetensors", "model.layers.9.self_attn.k_proj.weight": "model-00014-of-00014.safetensors", "model.layers.9.self_attn.o_proj.weight": "model-00014-of-00014.safetensors", "model.layers.9.self_attn.q_proj.weight": "model-00014-of-00014.safetensors", "model.layers.9.self_attn.v_proj.weight": "model-00014-of-00014.safetensors", "model.norm.weight": "model-00014-of-00014.safetensors"}}
special_tokens_map.json ADDED
@@ -0,0 +1,28 @@
+ {
+   "additional_special_tokens": [
+     "<unk>",
+     "<s>",
+     "</s>"
+   ],
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
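
These entries are what `transformers` reads to populate the tokenizer's special tokens. A quick sanity check, again assuming a local clone at a placeholder path:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("ckpt")  # "ckpt" is a placeholder path

# The bos/eos/unk strings should match special_tokens_map.json above.
print(tok.bos_token, tok.eos_token, tok.unk_token)  # <s> </s> <unk>
print(tok.additional_special_tokens)                # ['<unk>', '<s>', '</s>']
```
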
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
+ size 499723
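
The block above is a Git LFS pointer, not the model file itself. After fetching the real blob (e.g. via `git lfs pull`), the downloaded `tokenizer.model` can be checked against the pointer's `oid` and `size`, roughly as follows:

```python
import hashlib
import os

path = "tokenizer.model"  # the file fetched by git-lfs

# Hash the file in chunks and compare against the pointer's sha256 oid.
h = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)

assert h.hexdigest() == "9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347"
assert os.path.getsize(path) == 499723
print("tokenizer.model matches its LFS pointer")
```
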
tokenizer_config.json ADDED
@@ -0,0 +1,47 @@
+ {
+   "add_bos_token": true,
+   "add_eos_token": false,
+   "add_prefix_space": null,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "additional_special_tokens": [
+     "<unk>",
+     "<s>",
+     "</s>"
+   ],
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "</s>",
+   "legacy": false,
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": null,
+   "padding_side": "right",
+   "sp_model_kwargs": {},
+   "tokenizer_class": "LlamaTokenizer",
+   "unk_token": "<unk>",
+   "use_default_system_prompt": true
+ }
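
These settings control how the Llama tokenizer frames inputs: `add_bos_token: true` with `add_eos_token: false` means every encoded string is prefixed with `<s>` (id 1 per `added_tokens_decoder`) and gets no closing `</s>`. A minimal illustration, once more assuming a local clone at a placeholder path:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("ckpt")  # placeholder local path

ids = tok("Hello").input_ids
# With add_bos_token=true / add_eos_token=false, the sequence starts with
# the <s> id (1) and does not end with the </s> id (2).
print(ids[0] == tok.bos_token_id)   # True
print(ids[-1] != tok.eos_token_id)  # True
```
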