Commit 83bd4edb467de03c19556684751a091486d00a381dd8251f60dcf380422d794b

Files changed:
- README.md +174 -0
- config.json +38 -0
- huggingface-metadata.txt +86 -0
- measurement.json +0 -0
- model.safetensors.index.json +1 -0
- tokenizer.json +0 -0
- tokenizer.model +3 -0
- tokenizer_config.json +0 -0

README.md
ADDED
@@ -0,0 +1,174 @@
---
license: other
license_name: mistral-ai-research-licence
license_link: https://mistral.ai/licenses/MRL-0.1.md
base_model: []
library_name: transformers
tags:
- mergekit
- lumikabra-195B

---
# lumikabra-195B v0.3

<div style="width: auto; margin-left: auto; margin-right: auto; margin-bottom: 3cm">
<img src="https://huggingface.co/schnapper79/lumikabra-123B_v0.1/resolve/main/lumikabra.png" alt="Lumikabra" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>

This is a large (195B) variant of lumikabra v0.3, built as a passthrough self-merge of lumikabra-123B v0.3.

## Merge Details
### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following model was included in the merge:
* /workspace/text-generation-webui/models/schnapper79_lumikabra-123B_v0.3

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: bfloat16
merge_method: passthrough
slices:
  - sources:
      - layer_range: [0, 70]
        model: /workspace/text-generation-webui/models/schnapper79_lumikabra-123B_v0.3
  - sources:
      - layer_range: [18, 88]
        model: /workspace/text-generation-webui/models/schnapper79_lumikabra-123B_v0.3
```

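Since both slices come from the same 88-layer base model, the merged stack is layers 0-69 followed by layers 18-87, i.e. layers 18-69 appear twice. A minimal sketch (plain Python, with the `layer_range` values copied from the YAML above) of how the 140-layer stack is assembled:

```python
# Half-open layer ranges taken from the merge config's slice definitions.
slices = [(0, 70), (18, 88)]

# A passthrough merge simply concatenates the sliced layer lists,
# so the merged model's layer i maps back to a base-model layer index.
merged_layers = [layer for start, end in slices for layer in range(start, end)]

print(len(merged_layers))    # 140, matching num_hidden_layers in config.json
print(merged_layers[68:72])  # the seam between slices: [68, 69, 18, 19]
```

This explains the `num_hidden_layers: 140` in the quantized config below: 70 + 70 layers, with 52 of them duplicated.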
# Example Storywriting

### Settings
```
max_new_tokens: 4096
min_p: 0.05
smoothing factor: 0.25
```

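For reference, min_p is a relative probability cutoff: tokens are kept only if their probability is at least `min_p` times that of the most likely token. A toy sketch of the idea (illustrative token probabilities, not the text-generation-webui implementation):

```python
# Keep tokens whose probability is >= min_p * max probability, then renormalize.
def min_p_filter(probs, min_p=0.05):
    threshold = min_p * max(probs.values())
    kept = {tok: p for tok, p in probs.items() if p >= threshold}
    total = sum(kept.values())
    return {tok: p / total for tok, p in kept.items()}

probs = {"the": 0.6, "a": 0.25, "and": 0.1, "xyzzy": 0.001}
filtered = min_p_filter(probs, min_p=0.05)
# "xyzzy" falls below the cutoff (0.001 < 0.05 * 0.6) and is dropped;
# the remaining tokens are renormalized to sum to 1.
```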
<details>
<summary><b>Write me a 3,000 word opening chapter of a 'high fantasy' novel, drawing inspiration from the writing styles of Patrick Rothfuss. Use first person personal. Include dialogue and internal monologues. The POV character for the opening chapter should be a 19 year old female elf bard who is looking for adventure. Add darkness to the story</b></summary>

The tavern door creaked open, and a gust of chill night air followed me inside. I pulled my cloak tighter around my shoulders as I stepped into the dimly lit room, my boots thumping softly on the worn wooden floor. A few patrons glanced up from their mugs, but most paid me no mind. Just another traveler seeking shelter from the cold.

I made my way to the bar and signaled the bartender, a burly dwarf with a thick beard and arms like tree trunks. He grunted in acknowledgment and ambled over.

"What'll it be?" His voice was like gravel.

"Ale, please. And a room for the night, if you have one."

He nodded and disappeared into the back, returning a moment later with a frothy mug. I fished a few copper coins from my pouch and slid them across the bar.

"Upstairs, third door on the left," he said, pocketing the coins.

I thanked him and found a quiet corner to sit and sip my drink. The ale was bitter and lukewarm, but it chased away the chill. I leaned back against the wall and let my eyes wander around the room, taking in the faces of the other patrons. Most were rough-looking sorts - mercenaries, bandits, and thieves, if I had to guess. Not the kind of company a young elven maiden should keep.

But I was no ordinary maiden. I was a bard, a spinner of tales and a weaver of songs. And I had come to this forsaken corner of the realm in search of a story.

The tales of old spoke of a forgotten city hidden deep in the Whispering Woods, a place of ancient magic and untold treasures. Most dismissed it as a myth, a bedtime story to frighten children. But I had heard whispers of late, rumors that the city was real and that a map to its location had recently resurfaced.

I had followed the trail of those whispers to this very tavern, and I had a feeling that the answers I sought were somewhere in this room. I just had to be patient and keep my ears open.

As if on cue, a trio of rough-looking men entered the tavern, their laughter booming through the room. They pushed their way to the bar and ordered a round of drinks, their voices loud and boisterous. I sipped my ale and listened, trying to pick out snippets of their conversation over the din of the crowd.

"...swear it's there, just like the old man said..."

"...lost city of gold, he called it..."

"...map's the key, but it ain't cheap..."

I leaned forward, my heart quickening. This was it. This was the lead I had been searching for. I just had to figure out how to get my hands on that map.

I was still pondering my next move when a shadow fell over my table. I looked up to see a tall, lean man with a scar running down his cheek, his eyes cold and appraising.

"You look lost, little bird," he said, his voice like honey over steel. "Maybe I can help you find your way."

I met his gaze evenly, refusing to be intimidated. "I'm not lost. Just passing through."

He smiled, but it didn't reach his eyes. "Passing through, huh? On your way to somewhere special?"

I shrugged nonchalantly. "Just wherever the road takes me."

He chuckled, a low, humorless sound. "The road can be a dangerous place for a pretty little thing like you. Especially at night."

I felt a chill run down my spine, but I kept my voice steady. "I can take care of myself."

He leaned in closer, his breath hot on my cheek. "I'm sure you can. But maybe you'd rather have some company."

I was about to tell him exactly where he could shove his company when a commotion at the bar caught my attention. The three men were on their feet, their voices raised in anger. One of them shoved the other, and a moment later, fists were flying.

The tavern erupted into chaos as tables overturned and mugs shattered on the floor. I leapt to my feet, my hand instinctively reaching for the dagger at my hip. But before I could draw it, a strong hand clamped down on my wrist.

"I wouldn't do that if I were you," the scarred man said, his voice low and dangerous. "You might get hurt."

I tried to wrench my arm free, but his grip was like iron. "Let go of me," I hissed.

He smiled, a slow, predatory curve of his lips. "Not yet, little bird. You and I have some business to discuss."

I opened my mouth to reply, but before I could, a deafening roar shook the tavern walls. The scarred man's grip loosened in surprise, and I took the opportunity to twist free. I whirled around just in time to see a massive, shadowy form crash through the tavern door, its eyes glowing like embers in the dark.

The room fell silent, the brawlers frozen in mid-swing. The creature let out another bone-shaking roar, its breath hot and sulfurous. I caught a whiff of brimstone and decay, and my stomach turned.

A demon. Here, in this backwater tavern.

The scarred man cursed and drew a wicked-looking dagger from his belt. "Stay behind me," he growled, pushing me back against the wall.

I bristled at the command, but I wasn't foolish enough to argue. I had my own dagger out now, the silver blade glinting in the flickering torchlight. But against a demon, it might as well have been a twig.

The creature stalked into the room, its claws gouging deep furrows in the floorboards. The patrons scrambled out of its way, their bravado evaporating like morning mist. The scarred man held his ground, his eyes narrowed and his blade steady.

"You don't belong here, hellspawn," he said, his voice cold and hard. "Crawl back to the abyss before I send you there myself."

The demon hissed, a sound like water on hot coals. "You dare threaten me, mortal?" it rumbled, its voice like distant thunder. "I will feast on your entrails and use your skull as a chamber pot."

The scarred man smiled, a thin, humorless slash of his lips. "You're welcome to try."

And then, faster than I could blink, the demon lunged.

I dove to the side as the creature's massive body slammed into the wall where I had been standing a heartbeat before. The scarred man was already moving, his blade a silver blur as he danced around the demon's swipes.

I scrambled to my feet, my heart pounding in my ears. I had to help him. I had to do something. But what could I do against a creature like this?

The answer came to me in a flash of inspiration. I might not be a warrior, but I was a bard. And bards had their own kind of magic.

I took a deep breath and began to sing.

The words flowed from me like a river, ancient and powerful. The song was one of the oldest I knew, a lament for the fallen heroes of a forgotten age. It spoke of courage in the face of darkness, of light that could not be extinguished.

As I sang, the tavern grew still. The scarred man paused in his dance, his blade hanging motionless in the air. The demon froze, its eyes locked on me, its mouth hanging open in a grotesque parody of awe.

I sang on, pouring my heart and soul into the words. The song swelled around me, filling the room with light and sound. The demon staggered back, its eyes wide and fearful.

I stepped forward, my voice rising in triumph. The song was a weapon now, a blade of pure silver fire. I drove it into the demon's heart, and the creature screamed.

The sound tore through the air like a dying star, and then the demon was gone, dissolved into a cloud of oily smoke. The scarred man stood motionless, his blade hanging limply at his side.

I let the last note of the song fade into silence, my chest heaving with the effort. The tavern was still, the only sound the crackle of the fire and the distant howl of the wind.

The scarred man turned to me, his eyes wide with shock and something that might have been fear. "What are you?" he whispered.

I smiled, a tired, bitter curve of my lips. "Just a bard," I said. "Passing through."

He stared at me for a long moment, then shook his head and sheathed his blade. "You're more than that," he said. "And I think you know it."

I shrugged, suddenly weary to my bones. "What I am doesn't matter. What matters is that we're alive, and that thing is dead."

He nodded slowly. "Thanks to you."

I didn't answer. I just turned and walked away, leaving him standing there in the wreckage of the tavern. I had what I had come for. The map, and the beginnings of a story that would be told for generations.

As I climbed the stairs to my room, I couldn't shake the feeling that my adventure was just beginning. And that the road ahead would be darker and more dangerous than anything I had ever imagined.

But I was a bard, a spinner of tales and a weaver of songs. And I had a story to tell.

So I stepped into the night, and I followed the song.
</details>
config.json
ADDED
@@ -0,0 +1,38 @@

{
  "_name_or_path": "/workspace/text-generation-webui/models/schnapper79_lumikabra-123B_v0.3",
  "architectures": [
    "MistralForCausalLM"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "head_dim": 128,
  "hidden_act": "silu",
  "hidden_size": 12288,
  "initializer_range": 0.02,
  "intermediate_size": 28672,
  "max_position_embeddings": 131072,
  "model_type": "mistral",
  "num_attention_heads": 96,
  "num_hidden_layers": 140,
  "num_key_value_heads": 8,
  "rms_norm_eps": 1e-05,
  "rope_theta": 1000000.0,
  "sliding_window": null,
  "tie_word_embeddings": false,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.44.2",
  "use_cache": true,
  "vocab_size": 32768,
  "quantization_config": {
    "quant_method": "exl2",
    "version": "0.2.3",
    "bits": 4.5,
    "head_bits": 6,
    "calibration": {
      "rows": 115,
      "length": 2048,
      "dataset": "(default)"
    }
  }
}
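As a sanity check, the unquantized parameter count implied by this config can be computed directly from the shapes of a GQA Mistral-style decoder; it lands at ~195B and, at 2 bytes per bfloat16 weight, exactly matches the `total_size` of 389,171,208,192 bytes recorded in model.safetensors.index.json below. A sketch (values copied from config.json; the per-layer breakdown assumes the standard Mistral layer layout):

```python
# Shapes from config.json above.
hidden, inter, layers = 12288, 28672, 140
heads, kv_heads, head_dim = 96, 8, 128
vocab = 32768

# Attention: q_proj/o_proj are hidden x (heads*head_dim); k_proj/v_proj
# are hidden x (kv_heads*head_dim) because of grouped-query attention.
attn = hidden * heads * head_dim * 2 + hidden * kv_heads * head_dim * 2
mlp = hidden * inter * 3          # gate_proj, up_proj, down_proj
norms = hidden * 2                # input + post-attention RMSNorm
per_layer = attn + mlp + norms

embeds = vocab * hidden * 2       # embed_tokens + lm_head (untied)
total = layers * per_layer + embeds + hidden  # + final norm

print(total)      # 194,585,604,096 parameters (~195B)
print(total * 2)  # 389,171,208,192 bytes in bfloat16
```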
huggingface-metadata.txt
ADDED
@@ -0,0 +1,86 @@
url: https://huggingface.co/schnapper79/lumikabra-195B_v0.3
branch: main
download date: 2024-10-13 16:32:59
sha256sum:
c15fbee97fd6a1053a0242de4b69fcd18d83486aaefb9f9b206e2e66cd7e5132 model-00001-of-00081.safetensors
95a1be3519207a56f51deb85a788013c6081794d2af03b58a60c96bfe93735e5 model-00002-of-00081.safetensors
561bef5b2fdd3d814639c55b761d2f5d6f248acbf06279820b79295d13e6f7e1 model-00003-of-00081.safetensors
f3aea6320a7254f1e10b0b1f68d20429a0bb7068882d1e90de125192762d3546 model-00004-of-00081.safetensors
58324e621dec2ae32a2d9cc18d91f29e044d5dc258c9e7f572bab47f64dd6aed model-00005-of-00081.safetensors
8b8320b63a92bba5db3d3855737f5f8eda778d79763328c00012b3fcbac26d85 model-00006-of-00081.safetensors
7f2d1437d858deab0d4c4b09d7da4f0d775895a8ab94426b7fd54a8cd3206272 model-00007-of-00081.safetensors
11e2a84dfd1558d3013b5dc1e92762a9bf49c7f037623d667d21992776f0c386 model-00008-of-00081.safetensors
fe2aff6a7f55d8e6efe2ad234e08c00075584e7d53d94f3e63e8edf391c27068 model-00009-of-00081.safetensors
d9e2590625d59c96761a878209845df1ac33a698817f888e652bbde5833d9187 model-00010-of-00081.safetensors
04194ea711c660b39a8e86f4210e02d64000d9939565f24c63f3ae955819d090 model-00011-of-00081.safetensors
dbd864178cc810586ca8516bc5aaf685a77b6899657beae53ba6c928182ed9d6 model-00012-of-00081.safetensors
9bf05727f939f688a3f03620d1de4b0246037d133f640a9ec2b613de7214e68a model-00013-of-00081.safetensors
4ce8e23daa2867e180aae75943b94140e41b4f140b83fedd7ca2fdabfcd42922 model-00014-of-00081.safetensors
1bbf4bc7e0b3e1eb9414da7ee011df36ee3462de37e227349c8df89727483e4b model-00015-of-00081.safetensors
b481de6a6c8370473a07f3e1b657ac5f0e2f95834d7d8ca3b1d940853c2ee25b model-00016-of-00081.safetensors
8f9bd016258d6bfaeeb8fde6700c904f61821ea44814bcacd59813760a6131ab model-00017-of-00081.safetensors
d0ec431f39a57ebd6bb2f98a43638032a45809309add095e14f6cc3f43292611 model-00018-of-00081.safetensors
4eeb36daf9f56fe5a38536bf17286bb0ee44c8c4aa375cd14dce7bcf66df70ef model-00019-of-00081.safetensors
128f37bd2d41272d08f407cff0ecf45f93dc58cc4f406bf13e2619c8ade32225 model-00020-of-00081.safetensors
24ab9a97b9cc6b5db2602520be1964e98d662bb83efdc3af5eb473caf1922c71 model-00021-of-00081.safetensors
d412855e58ffd78b4fee38c26fac566c338cf4ce66e986de04466cd182ba053a model-00022-of-00081.safetensors
a4b6fe40eb9f9f3b81cc6b8a48e055df5d9995cabbc2105e13147b967e848286 model-00023-of-00081.safetensors
3ea0a66ed5efdee19d42cee91e761eb68a7a2ffd39a094ae77e95078c232ce89 model-00024-of-00081.safetensors
b1fb4a7c270a439922a7b3725af2c1bc37c7878195120cbef36d41f3893ec2bd model-00025-of-00081.safetensors
609fed1eeb8c15924df486a6f86e1cbc01e062cc95b6629f816706dc1c801120 model-00026-of-00081.safetensors
2b607a12d403e3cc756c33d1cf0bd19bcb0158727bb1e345b2bdb42580eceee4 model-00027-of-00081.safetensors
8c0cc0237b7188ad5cfc4e886c30a8e5b003fba7dd322e7c2ca94a7ceecb6cd9 model-00028-of-00081.safetensors
6aca173fa142c66d1b2d53a67f0f9592f57e5e4d20686830df18490d8b5eef8d model-00029-of-00081.safetensors
51e99ffc49849d4c8aaaef2b12cc79c90dae07657f6ea000090a406885c07453 model-00030-of-00081.safetensors
7c40e13fe6bfa8dd9715c9e9738bde8fe4acc732105fc5b4d4ad89a9b53611c6 model-00031-of-00081.safetensors
887267c30cbc0ad6d1f940d939c4010c23916947b7c28ff3454908cdfd0eb610 model-00032-of-00081.safetensors
27d8eace77c44a0ac431e63d6ad8220030db62425b4ba677a7c55410b636ebd2 model-00033-of-00081.safetensors
926bddeb967097ba17ce1dd3199d87bae513fc1b4e6b9497663d497aed0b0663 model-00034-of-00081.safetensors
c5a370038f0757fe1bdb6e597fb1232e46829c85327623cf3a63b9dd78cf73f0 model-00035-of-00081.safetensors
268c98cd3354c47c9caa18813970de716f72d0be659616c2035234533d28cffe model-00036-of-00081.safetensors
1a5fd7edcdd1e019c3cbedbf71e4ad78f07c53011c4adc75f786083430ad9f88 model-00037-of-00081.safetensors
f6538a58c41391235c08e5123fa455a3a4db41497df69f2df6ecdb4641528db7 model-00038-of-00081.safetensors
3d17b68ed9eaa95164e1d22f1c5cefe986a78ae1b32babff3c65c74edf1215cc model-00039-of-00081.safetensors
9bf2943738d9d92a9a8c76f8bddd97436918b5346ee06948f752dbb125abe418 model-00040-of-00081.safetensors
f894c8cbfab158faf9f803fa40b08cbc40659c337405ed2d37ee0257bf2609fc model-00041-of-00081.safetensors
5dac82a3b3ce1a7c80d4726661d67b43216fe70391020ae1c3bf4e5b7aae2e5c model-00042-of-00081.safetensors
136709b6bb88090a1dd5dc5553a51991dff456ae10db92f33f15e8396b2aade1 model-00043-of-00081.safetensors
72c4f9bdc7dce3fdba731752f8f7b74e605b9537bec2affc51e30f2fae016e89 model-00044-of-00081.safetensors
4ce0060670267e88d731250d5d9c0d5c8373c0c70519f2025999f9fe461ea302 model-00045-of-00081.safetensors
8026a822cb5e5219de14baa25f36d292f94775c05b390e44b8fbb9c2d88a2ce6 model-00046-of-00081.safetensors
381707441d26467c6434cf53ace2dc52f6b23b318b23116bda297ae11ec281b0 model-00047-of-00081.safetensors
2cf7417c2cca48ecab002d77ba66a55db60b8b60c587361a09ebdbdcd3635a36 model-00048-of-00081.safetensors
62ec681752a6f17cffa7d662650b1df08fe6c671b53e1ec358a883cee2f11257 model-00049-of-00081.safetensors
48935dc4b1bbd45e235055216cb5141319c9286b3ab19718bc2df0db20784793 model-00050-of-00081.safetensors
f5ea4949903bcd35b98eb62df59fbd5ad769de987759b2a4b23b3ff013297580 model-00051-of-00081.safetensors
48c405a69f31671bfce3acf89f45b718cfde2bdcdf6655c4fb415b31d5b7ef5a model-00052-of-00081.safetensors
de0dd59d743ea48e2f0cf8d509f9cb3c362322832edd8bb67028295401c76d16 model-00053-of-00081.safetensors
022f47946dc6c78fe9bf5170f39af0f00331e3cb474d48fc5dcf07a337b1151a model-00054-of-00081.safetensors
41f2eff7a0d8b78e773f9b23cda578731462b7fa661d0014bde24ce2f8efb0f0 model-00055-of-00081.safetensors
c589ef53b2b172be9b928bae2f0ec25adc0cb528ecfebfd9ef6420b27e9112d6 model-00056-of-00081.safetensors
0f1fb88c07b5c71140a420f21df048f1030cc9a2973dda04fc1f78a17becdd62 model-00057-of-00081.safetensors
4da5d455b1473bb8ec9383dc7b63e375c9bd581d8565d1609c52f20c890b8670 model-00058-of-00081.safetensors
3ae39f48f13b27db133951cf34424c9e78213e2667a1bb4b041ab01dd1fc1b20 model-00059-of-00081.safetensors
df9eb2712ada22352c8851466b89b0b3fee09d0bb077d60b4e3f7bb2cdcb0700 model-00060-of-00081.safetensors
18aa64f8affb33faedcb6e30705affba37782ab142f5ca9178878ee75759eb74 model-00061-of-00081.safetensors
f37ccf1c10021abe824ec1d6356cbda00dbf3ebc6684c342bb3e569dbb1b75da model-00062-of-00081.safetensors
fce692ef82533d2bd3155bb99d488087d3cb180d8395b4327f95b59df77c99bf model-00063-of-00081.safetensors
e70aaca48f0696be2110f7c47a3b635d2e1e6c7744c4967d66a8dc3848ce06e2 model-00064-of-00081.safetensors
c4c970e911de3edf91200754a4a33b992e8c7f702d21c042f17c6d666c0328d2 model-00065-of-00081.safetensors
2b0a642510e0804d68e75d3259c0f39c54672adb022af8dc8e95433bf813ee6e model-00066-of-00081.safetensors
9d68b073d509592baed89c7c1655a35eb8ccb500fb78a93462fe6f067ea996c8 model-00067-of-00081.safetensors
60c17a3f7632a1f4a8e491f3ffe34bdaf025f7004003a5f2d5ec09d3850617fd model-00068-of-00081.safetensors
79aa62f6654221c90275a691ffda1446e37a4a09bb4af9b02272ab2384d4de01 model-00069-of-00081.safetensors
30c3c9abc6b04ed9bcaac5cf2029a83be6789e5f5ae542603d5246b9f6207f65 model-00070-of-00081.safetensors
2948c8fb6793beeb4722cd892eb6ebb5f10663b818cf421e54d039a2354568f7 model-00071-of-00081.safetensors
d0e7b6bfea16d8a5060cfd3e5adc9ddeb6ad9be211b9097600226dedceffa180 model-00072-of-00081.safetensors
5aa2ba3874442d255bfe3f4433dd34c79676fbdc074268ed81ca2f276fb2bd79 model-00073-of-00081.safetensors
084a44534cd22820bb6586a8dbbfc039f121bce794361c131fd5d0646afe2371 model-00074-of-00081.safetensors
1740db08b31085a359133269ec303a474c1bc7523d59c67dc83f5b7a9c68b134 model-00075-of-00081.safetensors
a5bdd33fa3ff774c594da1f8ba7db34085cce14ab6c98843ca52a3afe68a930f model-00076-of-00081.safetensors
9a33992a18cc2e972d449949a4a06199f0c1d2f4af733209cd70d61a51037966 model-00077-of-00081.safetensors
7e5145b07b947c9e51eaf9e44533dc8c2043a4a33760ae75c3fc631c2ac11749 model-00078-of-00081.safetensors
5968277c99ccb97eaab8fb2ba78d55cf7d90f296bf93af324debfb225a7fbc80 model-00079-of-00081.safetensors
09e94db2766ceefc1a570a2aa5adb451f0f11ada760dfcd0a14c3be5603c2e33 model-00080-of-00081.safetensors
162d98b60e0949f07458e62a75db776e54a3721df1a4c6062aba63758f9e16b7 model-00081-of-00081.safetensors
59f95e28944c062244741268596badc900df86c7f5ded05088d2da22a7379e06 tokenizer.model
measurement.json
ADDED
The diff for this file is too large to render.
model.safetensors.index.json
ADDED
@@ -0,0 +1 @@
{"metadata": {"mergekit_version": "0.0.4.4", "total_size": 389171208192}, "weight_map": {"lm_head.weight": "model-00001-of-00081.safetensors", "model.embed_tokens.weight": "model-00001-of-00081.safetensors", "model.layers.0.input_layernorm.weight": "model-00001-of-00081.safetensors", "model.layers.0.mlp.down_proj.weight": "model-00001-of-00081.safetensors", "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00081.safetensors", "model.layers.0.mlp.up_proj.weight": "model-00001-of-00081.safetensors", "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00081.safetensors", "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00081.safetensors", "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00081.safetensors", "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00081.safetensors", "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00081.safetensors", "model.layers.1.input_layernorm.weight": "model-00001-of-00081.safetensors", "model.layers.1.mlp.down_proj.weight": "model-00002-of-00081.safetensors", "model.layers.1.mlp.gate_proj.weight": "model-00002-of-00081.safetensors", "model.layers.1.mlp.up_proj.weight": "model-00002-of-00081.safetensors", "model.layers.1.post_attention_layernorm.weight": "model-00002-of-00081.safetensors", "model.layers.1.self_attn.k_proj.weight": "model-00002-of-00081.safetensors", "model.layers.1.self_attn.o_proj.weight": "model-00002-of-00081.safetensors", "model.layers.1.self_attn.q_proj.weight": "model-00002-of-00081.safetensors", "model.layers.1.self_attn.v_proj.weight": "model-00002-of-00081.safetensors", "model.layers.10.input_layernorm.weight": "model-00002-of-00081.safetensors", "model.layers.10.mlp.down_proj.weight": "model-00002-of-00081.safetensors", "model.layers.10.mlp.gate_proj.weight": "model-00002-of-00081.safetensors", "model.layers.10.mlp.up_proj.weight": "model-00002-of-00081.safetensors", "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00081.safetensors", 
"model.layers.10.self_attn.k_proj.weight": "model-00002-of-00081.safetensors", "model.layers.10.self_attn.o_proj.weight": "model-00003-of-00081.safetensors", "model.layers.10.self_attn.q_proj.weight": "model-00003-of-00081.safetensors", "model.layers.10.self_attn.v_proj.weight": "model-00003-of-00081.safetensors", "model.layers.11.input_layernorm.weight": "model-00003-of-00081.safetensors", "model.layers.11.mlp.down_proj.weight": "model-00003-of-00081.safetensors", "model.layers.11.mlp.gate_proj.weight": "model-00003-of-00081.safetensors", "model.layers.11.mlp.up_proj.weight": "model-00003-of-00081.safetensors", "model.layers.11.post_attention_layernorm.weight": "model-00003-of-00081.safetensors", "model.layers.11.self_attn.k_proj.weight": "model-00003-of-00081.safetensors", "model.layers.11.self_attn.o_proj.weight": "model-00003-of-00081.safetensors", "model.layers.11.self_attn.q_proj.weight": "model-00003-of-00081.safetensors", "model.layers.11.self_attn.v_proj.weight": "model-00003-of-00081.safetensors", "model.layers.12.input_layernorm.weight": "model-00003-of-00081.safetensors", "model.layers.12.mlp.down_proj.weight": "model-00003-of-00081.safetensors", "model.layers.12.mlp.gate_proj.weight": "model-00003-of-00081.safetensors", "model.layers.12.mlp.up_proj.weight": "model-00004-of-00081.safetensors", "model.layers.12.post_attention_layernorm.weight": "model-00004-of-00081.safetensors", "model.layers.12.self_attn.k_proj.weight": "model-00004-of-00081.safetensors", "model.layers.12.self_attn.o_proj.weight": "model-00004-of-00081.safetensors", "model.layers.12.self_attn.q_proj.weight": "model-00004-of-00081.safetensors", "model.layers.12.self_attn.v_proj.weight": "model-00004-of-00081.safetensors", "model.layers.13.input_layernorm.weight": "model-00004-of-00081.safetensors", "model.layers.13.mlp.down_proj.weight": "model-00004-of-00081.safetensors", "model.layers.13.mlp.gate_proj.weight": "model-00004-of-00081.safetensors", "model.layers.13.mlp.up_proj.weight": 
"model-00004-of-00081.safetensors", "model.layers.13.post_attention_layernorm.weight": "model-00004-of-00081.safetensors", "model.layers.13.self_attn.k_proj.weight": "model-00004-of-00081.safetensors", "model.layers.13.self_attn.o_proj.weight": "model-00004-of-00081.safetensors", "model.layers.13.self_attn.q_proj.weight": "model-00004-of-00081.safetensors", "model.layers.13.self_attn.v_proj.weight": "model-00004-of-00081.safetensors", "model.layers.14.input_layernorm.weight": "model-00004-of-00081.safetensors", "model.layers.14.mlp.down_proj.weight": "model-00004-of-00081.safetensors", "model.layers.14.mlp.gate_proj.weight": "model-00005-of-00081.safetensors", "model.layers.14.mlp.up_proj.weight": "model-00005-of-00081.safetensors", "model.layers.14.post_attention_layernorm.weight": "model-00005-of-00081.safetensors", "model.layers.14.self_attn.k_proj.weight": "model-00005-of-00081.safetensors", "model.layers.14.self_attn.o_proj.weight": "model-00005-of-00081.safetensors", "model.layers.14.self_attn.q_proj.weight": "model-00005-of-00081.safetensors", "model.layers.14.self_attn.v_proj.weight": "model-00005-of-00081.safetensors", "model.layers.15.input_layernorm.weight": "model-00005-of-00081.safetensors", "model.layers.15.mlp.down_proj.weight": "model-00005-of-00081.safetensors", "model.layers.15.mlp.gate_proj.weight": "model-00005-of-00081.safetensors", "model.layers.15.mlp.up_proj.weight": "model-00005-of-00081.safetensors", "model.layers.15.post_attention_layernorm.weight": "model-00005-of-00081.safetensors", "model.layers.15.self_attn.k_proj.weight": "model-00005-of-00081.safetensors", "model.layers.15.self_attn.o_proj.weight": "model-00005-of-00081.safetensors", "model.layers.15.self_attn.q_proj.weight": "model-00005-of-00081.safetensors", "model.layers.15.self_attn.v_proj.weight": "model-00005-of-00081.safetensors", "model.layers.16.input_layernorm.weight": "model-00005-of-00081.safetensors", "model.layers.16.mlp.down_proj.weight": 
"model-00006-of-00081.safetensors", "model.layers.16.mlp.gate_proj.weight": "model-00006-of-00081.safetensors", "model.layers.16.mlp.up_proj.weight": "model-00006-of-00081.safetensors", "model.layers.16.post_attention_layernorm.weight": "model-00006-of-00081.safetensors", "model.layers.16.self_attn.k_proj.weight": "model-00006-of-00081.safetensors", "model.layers.16.self_attn.o_proj.weight": "model-00006-of-00081.safetensors", "model.layers.16.self_attn.q_proj.weight": "model-00006-of-00081.safetensors", "model.layers.16.self_attn.v_proj.weight": "model-00006-of-00081.safetensors", "model.layers.17.input_layernorm.weight": "model-00006-of-00081.safetensors", "model.layers.17.mlp.down_proj.weight": "model-00006-of-00081.safetensors", "model.layers.17.mlp.gate_proj.weight": "model-00006-of-00081.safetensors", "model.layers.17.mlp.up_proj.weight": "model-00006-of-00081.safetensors", "model.layers.17.post_attention_layernorm.weight": "model-00006-of-00081.safetensors", "model.layers.17.self_attn.k_proj.weight": "model-00006-of-00081.safetensors", "model.layers.17.self_attn.o_proj.weight": "model-00007-of-00081.safetensors", "model.layers.17.self_attn.q_proj.weight": "model-00007-of-00081.safetensors", "model.layers.17.self_attn.v_proj.weight": "model-00007-of-00081.safetensors", "model.layers.70.input_layernorm.weight": "model-00007-of-00081.safetensors", "model.layers.18.input_layernorm.weight": "model-00007-of-00081.safetensors", "model.layers.70.mlp.down_proj.weight": "model-00007-of-00081.safetensors", "model.layers.18.mlp.down_proj.weight": "model-00007-of-00081.safetensors", "model.layers.70.mlp.gate_proj.weight": "model-00007-of-00081.safetensors", "model.layers.18.mlp.gate_proj.weight": "model-00007-of-00081.safetensors", "model.layers.70.mlp.up_proj.weight": "model-00007-of-00081.safetensors", "model.layers.18.mlp.up_proj.weight": "model-00007-of-00081.safetensors", "model.layers.70.post_attention_layernorm.weight": "model-00007-of-00081.safetensors", 
"model.layers.18.post_attention_layernorm.weight": "model-00007-of-00081.safetensors", "model.layers.70.self_attn.k_proj.weight": "model-00007-of-00081.safetensors", "model.layers.18.self_attn.k_proj.weight": "model-00007-of-00081.safetensors", "model.layers.70.self_attn.o_proj.weight": "model-00008-of-00081.safetensors", "model.layers.18.self_attn.o_proj.weight": "model-00008-of-00081.safetensors", "model.layers.70.self_attn.q_proj.weight": "model-00008-of-00081.safetensors", "model.layers.18.self_attn.q_proj.weight": "model-00008-of-00081.safetensors", "model.layers.70.self_attn.v_proj.weight": "model-00008-of-00081.safetensors", "model.layers.18.self_attn.v_proj.weight": "model-00008-of-00081.safetensors", "model.layers.71.input_layernorm.weight": "model-00008-of-00081.safetensors", "model.layers.19.input_layernorm.weight": "model-00008-of-00081.safetensors", "model.layers.71.mlp.down_proj.weight": "model-00008-of-00081.safetensors", "model.layers.19.mlp.down_proj.weight": "model-00008-of-00081.safetensors", "model.layers.71.mlp.gate_proj.weight": "model-00008-of-00081.safetensors", "model.layers.19.mlp.gate_proj.weight": "model-00008-of-00081.safetensors", "model.layers.71.mlp.up_proj.weight": "model-00008-of-00081.safetensors", "model.layers.19.mlp.up_proj.weight": "model-00009-of-00081.safetensors", "model.layers.71.post_attention_layernorm.weight": "model-00009-of-00081.safetensors", "model.layers.19.post_attention_layernorm.weight": "model-00009-of-00081.safetensors", "model.layers.71.self_attn.k_proj.weight": "model-00009-of-00081.safetensors", "model.layers.19.self_attn.k_proj.weight": "model-00009-of-00081.safetensors", "model.layers.71.self_attn.o_proj.weight": "model-00009-of-00081.safetensors", "model.layers.19.self_attn.o_proj.weight": "model-00009-of-00081.safetensors", "model.layers.71.self_attn.q_proj.weight": "model-00009-of-00081.safetensors", "model.layers.19.self_attn.q_proj.weight": "model-00009-of-00081.safetensors", 
"model.layers.71.self_attn.v_proj.weight": "model-00009-of-00081.safetensors", "model.layers.19.self_attn.v_proj.weight": "model-00009-of-00081.safetensors", "model.layers.2.input_layernorm.weight": "model-00009-of-00081.safetensors", "model.layers.2.mlp.down_proj.weight": "model-00009-of-00081.safetensors", "model.layers.2.mlp.gate_proj.weight": "model-00009-of-00081.safetensors", "model.layers.2.mlp.up_proj.weight": "model-00009-of-00081.safetensors", "model.layers.2.post_attention_layernorm.weight": "model-00009-of-00081.safetensors", "model.layers.2.self_attn.k_proj.weight": "model-00009-of-00081.safetensors", "model.layers.2.self_attn.o_proj.weight": "model-00009-of-00081.safetensors", "model.layers.2.self_attn.q_proj.weight": "model-00009-of-00081.safetensors", "model.layers.2.self_attn.v_proj.weight": "model-00009-of-00081.safetensors", "model.layers.72.input_layernorm.weight": "model-00009-of-00081.safetensors", "model.layers.20.input_layernorm.weight": "model-00009-of-00081.safetensors", "model.layers.72.mlp.down_proj.weight": "model-00010-of-00081.safetensors", "model.layers.20.mlp.down_proj.weight": "model-00010-of-00081.safetensors", "model.layers.72.mlp.gate_proj.weight": "model-00010-of-00081.safetensors", "model.layers.20.mlp.gate_proj.weight": "model-00010-of-00081.safetensors", "model.layers.72.mlp.up_proj.weight": "model-00010-of-00081.safetensors", "model.layers.20.mlp.up_proj.weight": "model-00010-of-00081.safetensors", "model.layers.72.post_attention_layernorm.weight": "model-00010-of-00081.safetensors", "model.layers.20.post_attention_layernorm.weight": "model-00010-of-00081.safetensors", "model.layers.72.self_attn.k_proj.weight": "model-00010-of-00081.safetensors", "model.layers.20.self_attn.k_proj.weight": "model-00010-of-00081.safetensors", "model.layers.72.self_attn.o_proj.weight": "model-00010-of-00081.safetensors", "model.layers.20.self_attn.o_proj.weight": "model-00010-of-00081.safetensors", "model.layers.72.self_attn.q_proj.weight": 
"model-00011-of-00081.safetensors", "model.layers.20.self_attn.q_proj.weight": "model-00011-of-00081.safetensors", "model.layers.72.self_attn.v_proj.weight": "model-00011-of-00081.safetensors", "model.layers.20.self_attn.v_proj.weight": "model-00011-of-00081.safetensors", "model.layers.73.input_layernorm.weight": "model-00011-of-00081.safetensors", "model.layers.21.input_layernorm.weight": "model-00011-of-00081.safetensors", "model.layers.73.mlp.down_proj.weight": "model-00011-of-00081.safetensors", "model.layers.21.mlp.down_proj.weight": "model-00011-of-00081.safetensors", "model.layers.73.mlp.gate_proj.weight": "model-00011-of-00081.safetensors", "model.layers.21.mlp.gate_proj.weight": "model-00011-of-00081.safetensors", "model.layers.73.mlp.up_proj.weight": "model-00011-of-00081.safetensors", "model.layers.21.mlp.up_proj.weight": "model-00011-of-00081.safetensors", "model.layers.73.post_attention_layernorm.weight": "model-00011-of-00081.safetensors", "model.layers.21.post_attention_layernorm.weight": "model-00011-of-00081.safetensors", "model.layers.73.self_attn.k_proj.weight": "model-00011-of-00081.safetensors", "model.layers.21.self_attn.k_proj.weight": "model-00011-of-00081.safetensors", "model.layers.73.self_attn.o_proj.weight": "model-00012-of-00081.safetensors", "model.layers.21.self_attn.o_proj.weight": "model-00012-of-00081.safetensors", "model.layers.73.self_attn.q_proj.weight": "model-00012-of-00081.safetensors", "model.layers.21.self_attn.q_proj.weight": "model-00012-of-00081.safetensors", "model.layers.73.self_attn.v_proj.weight": "model-00012-of-00081.safetensors", "model.layers.21.self_attn.v_proj.weight": "model-00012-of-00081.safetensors", "model.layers.74.input_layernorm.weight": "model-00012-of-00081.safetensors", "model.layers.22.input_layernorm.weight": "model-00012-of-00081.safetensors", "model.layers.74.mlp.down_proj.weight": "model-00012-of-00081.safetensors", "model.layers.22.mlp.down_proj.weight": "model-00012-of-00081.safetensors", 
"model.layers.74.mlp.gate_proj.weight": "model-00012-of-00081.safetensors", "model.layers.22.mlp.gate_proj.weight": "model-00012-of-00081.safetensors", "model.layers.74.mlp.up_proj.weight": "model-00012-of-00081.safetensors", "model.layers.22.mlp.up_proj.weight": "model-00013-of-00081.safetensors", "model.layers.74.post_attention_layernorm.weight": "model-00013-of-00081.safetensors", "model.layers.22.post_attention_layernorm.weight": "model-00013-of-00081.safetensors", "model.layers.74.self_attn.k_proj.weight": "model-00013-of-00081.safetensors", "model.layers.22.self_attn.k_proj.weight": "model-00013-of-00081.safetensors", "model.layers.74.self_attn.o_proj.weight": "model-00013-of-00081.safetensors", "model.layers.22.self_attn.o_proj.weight": "model-00013-of-00081.safetensors", "model.layers.74.self_attn.q_proj.weight": "model-00013-of-00081.safetensors", "model.layers.22.self_attn.q_proj.weight": "model-00013-of-00081.safetensors", "model.layers.74.self_attn.v_proj.weight": "model-00013-of-00081.safetensors", "model.layers.22.self_attn.v_proj.weight": "model-00013-of-00081.safetensors", "model.layers.75.input_layernorm.weight": "model-00013-of-00081.safetensors", "model.layers.23.input_layernorm.weight": "model-00013-of-00081.safetensors", "model.layers.75.mlp.down_proj.weight": "model-00013-of-00081.safetensors", "model.layers.23.mlp.down_proj.weight": "model-00013-of-00081.safetensors", "model.layers.75.mlp.gate_proj.weight": "model-00013-of-00081.safetensors", "model.layers.23.mlp.gate_proj.weight": "model-00013-of-00081.safetensors", "model.layers.75.mlp.up_proj.weight": "model-00014-of-00081.safetensors", "model.layers.23.mlp.up_proj.weight": "model-00014-of-00081.safetensors", "model.layers.75.post_attention_layernorm.weight": "model-00014-of-00081.safetensors", "model.layers.23.post_attention_layernorm.weight": "model-00014-of-00081.safetensors", "model.layers.75.self_attn.k_proj.weight": "model-00014-of-00081.safetensors", 
"model.layers.23.self_attn.k_proj.weight": "model-00014-of-00081.safetensors", "model.layers.75.self_attn.o_proj.weight": "model-00014-of-00081.safetensors", "model.layers.23.self_attn.o_proj.weight": "model-00014-of-00081.safetensors", "model.layers.75.self_attn.q_proj.weight": "model-00014-of-00081.safetensors", "model.layers.23.self_attn.q_proj.weight": "model-00014-of-00081.safetensors", "model.layers.75.self_attn.v_proj.weight": "model-00014-of-00081.safetensors", "model.layers.23.self_attn.v_proj.weight": "model-00014-of-00081.safetensors", "model.layers.76.input_layernorm.weight": "model-00014-of-00081.safetensors", "model.layers.24.input_layernorm.weight": "model-00014-of-00081.safetensors", "model.layers.76.mlp.down_proj.weight": "model-00014-of-00081.safetensors", "model.layers.24.mlp.down_proj.weight": "model-00014-of-00081.safetensors", "model.layers.76.mlp.gate_proj.weight": "model-00014-of-00081.safetensors", "model.layers.24.mlp.gate_proj.weight": "model-00015-of-00081.safetensors", "model.layers.76.mlp.up_proj.weight": "model-00015-of-00081.safetensors", "model.layers.24.mlp.up_proj.weight": "model-00015-of-00081.safetensors", "model.layers.76.post_attention_layernorm.weight": "model-00015-of-00081.safetensors", "model.layers.24.post_attention_layernorm.weight": "model-00015-of-00081.safetensors", "model.layers.76.self_attn.k_proj.weight": "model-00015-of-00081.safetensors", "model.layers.24.self_attn.k_proj.weight": "model-00015-of-00081.safetensors", "model.layers.76.self_attn.o_proj.weight": "model-00015-of-00081.safetensors", "model.layers.24.self_attn.o_proj.weight": "model-00015-of-00081.safetensors", "model.layers.76.self_attn.q_proj.weight": "model-00015-of-00081.safetensors", "model.layers.24.self_attn.q_proj.weight": "model-00015-of-00081.safetensors", "model.layers.76.self_attn.v_proj.weight": "model-00015-of-00081.safetensors", "model.layers.24.self_attn.v_proj.weight": "model-00015-of-00081.safetensors", 
"model.layers.77.input_layernorm.weight": "model-00015-of-00081.safetensors", "model.layers.25.input_layernorm.weight": "model-00015-of-00081.safetensors", "model.layers.77.mlp.down_proj.weight": "model-00015-of-00081.safetensors", "model.layers.25.mlp.down_proj.weight": "model-00015-of-00081.safetensors", "model.layers.77.mlp.gate_proj.weight": "model-00016-of-00081.safetensors", "model.layers.25.mlp.gate_proj.weight": "model-00016-of-00081.safetensors", "model.layers.77.mlp.up_proj.weight": "model-00016-of-00081.safetensors", "model.layers.25.mlp.up_proj.weight": "model-00016-of-00081.safetensors", "model.layers.77.post_attention_layernorm.weight": "model-00016-of-00081.safetensors", "model.layers.25.post_attention_layernorm.weight": "model-00016-of-00081.safetensors", "model.layers.77.self_attn.k_proj.weight": "model-00016-of-00081.safetensors", "model.layers.25.self_attn.k_proj.weight": "model-00016-of-00081.safetensors", "model.layers.77.self_attn.o_proj.weight": "model-00016-of-00081.safetensors", "model.layers.25.self_attn.o_proj.weight": "model-00016-of-00081.safetensors", "model.layers.77.self_attn.q_proj.weight": "model-00016-of-00081.safetensors", "model.layers.25.self_attn.q_proj.weight": "model-00016-of-00081.safetensors", "model.layers.77.self_attn.v_proj.weight": "model-00016-of-00081.safetensors", "model.layers.25.self_attn.v_proj.weight": "model-00016-of-00081.safetensors", "model.layers.78.input_layernorm.weight": "model-00016-of-00081.safetensors", "model.layers.26.input_layernorm.weight": "model-00016-of-00081.safetensors", "model.layers.78.mlp.down_proj.weight": "model-00016-of-00081.safetensors", "model.layers.26.mlp.down_proj.weight": "model-00017-of-00081.safetensors", "model.layers.78.mlp.gate_proj.weight": "model-00017-of-00081.safetensors", "model.layers.26.mlp.gate_proj.weight": "model-00017-of-00081.safetensors", "model.layers.78.mlp.up_proj.weight": "model-00017-of-00081.safetensors", "model.layers.26.mlp.up_proj.weight": 
"model-00017-of-00081.safetensors", "model.layers.78.post_attention_layernorm.weight": "model-00017-of-00081.safetensors", "model.layers.26.post_attention_layernorm.weight": "model-00017-of-00081.safetensors", "model.layers.78.self_attn.k_proj.weight": "model-00017-of-00081.safetensors", "model.layers.26.self_attn.k_proj.weight": "model-00017-of-00081.safetensors", "model.layers.78.self_attn.o_proj.weight": "model-00017-of-00081.safetensors", "model.layers.26.self_attn.o_proj.weight": "model-00017-of-00081.safetensors", "model.layers.78.self_attn.q_proj.weight": "model-00017-of-00081.safetensors", "model.layers.26.self_attn.q_proj.weight": "model-00017-of-00081.safetensors", "model.layers.78.self_attn.v_proj.weight": "model-00017-of-00081.safetensors", "model.layers.26.self_attn.v_proj.weight": "model-00017-of-00081.safetensors", "model.layers.79.input_layernorm.weight": "model-00017-of-00081.safetensors", "model.layers.27.input_layernorm.weight": "model-00017-of-00081.safetensors", "model.layers.79.mlp.down_proj.weight": "model-00018-of-00081.safetensors", "model.layers.27.mlp.down_proj.weight": "model-00018-of-00081.safetensors", "model.layers.79.mlp.gate_proj.weight": "model-00018-of-00081.safetensors", "model.layers.27.mlp.gate_proj.weight": "model-00018-of-00081.safetensors", "model.layers.79.mlp.up_proj.weight": "model-00018-of-00081.safetensors", "model.layers.27.mlp.up_proj.weight": "model-00018-of-00081.safetensors", "model.layers.79.post_attention_layernorm.weight": "model-00018-of-00081.safetensors", "model.layers.27.post_attention_layernorm.weight": "model-00018-of-00081.safetensors", "model.layers.79.self_attn.k_proj.weight": "model-00018-of-00081.safetensors", "model.layers.27.self_attn.k_proj.weight": "model-00018-of-00081.safetensors", "model.layers.79.self_attn.o_proj.weight": "model-00018-of-00081.safetensors", "model.layers.27.self_attn.o_proj.weight": "model-00018-of-00081.safetensors", "model.layers.79.self_attn.q_proj.weight": 
"model-00019-of-00081.safetensors", "model.layers.27.self_attn.q_proj.weight": "model-00019-of-00081.safetensors", "model.layers.79.self_attn.v_proj.weight": "model-00019-of-00081.safetensors", "model.layers.27.self_attn.v_proj.weight": "model-00019-of-00081.safetensors", "model.layers.80.input_layernorm.weight": "model-00019-of-00081.safetensors", "model.layers.28.input_layernorm.weight": "model-00019-of-00081.safetensors", "model.layers.80.mlp.down_proj.weight": "model-00019-of-00081.safetensors", "model.layers.28.mlp.down_proj.weight": "model-00019-of-00081.safetensors", "model.layers.80.mlp.gate_proj.weight": "model-00019-of-00081.safetensors", "model.layers.28.mlp.gate_proj.weight": "model-00019-of-00081.safetensors", "model.layers.80.mlp.up_proj.weight": "model-00019-of-00081.safetensors", "model.layers.28.mlp.up_proj.weight": "model-00019-of-00081.safetensors", "model.layers.80.post_attention_layernorm.weight": "model-00019-of-00081.safetensors", "model.layers.28.post_attention_layernorm.weight": "model-00019-of-00081.safetensors", "model.layers.80.self_attn.k_proj.weight": "model-00019-of-00081.safetensors", "model.layers.28.self_attn.k_proj.weight": "model-00019-of-00081.safetensors", "model.layers.80.self_attn.o_proj.weight": "model-00020-of-00081.safetensors", "model.layers.28.self_attn.o_proj.weight": "model-00020-of-00081.safetensors", "model.layers.80.self_attn.q_proj.weight": "model-00020-of-00081.safetensors", "model.layers.28.self_attn.q_proj.weight": "model-00020-of-00081.safetensors", "model.layers.80.self_attn.v_proj.weight": "model-00020-of-00081.safetensors", "model.layers.28.self_attn.v_proj.weight": "model-00020-of-00081.safetensors", "model.layers.81.input_layernorm.weight": "model-00020-of-00081.safetensors", "model.layers.29.input_layernorm.weight": "model-00020-of-00081.safetensors", "model.layers.81.mlp.down_proj.weight": "model-00020-of-00081.safetensors", "model.layers.29.mlp.down_proj.weight": "model-00020-of-00081.safetensors", 
"model.layers.81.mlp.gate_proj.weight": "model-00020-of-00081.safetensors", "model.layers.29.mlp.gate_proj.weight": "model-00020-of-00081.safetensors", "model.layers.81.mlp.up_proj.weight": "model-00020-of-00081.safetensors", "model.layers.29.mlp.up_proj.weight": "model-00021-of-00081.safetensors", "model.layers.81.post_attention_layernorm.weight": "model-00021-of-00081.safetensors", "model.layers.29.post_attention_layernorm.weight": "model-00021-of-00081.safetensors", "model.layers.81.self_attn.k_proj.weight": "model-00021-of-00081.safetensors", "model.layers.29.self_attn.k_proj.weight": "model-00021-of-00081.safetensors", "model.layers.81.self_attn.o_proj.weight": "model-00021-of-00081.safetensors", "model.layers.29.self_attn.o_proj.weight": "model-00021-of-00081.safetensors", "model.layers.81.self_attn.q_proj.weight": "model-00021-of-00081.safetensors", "model.layers.29.self_attn.q_proj.weight": "model-00021-of-00081.safetensors", "model.layers.81.self_attn.v_proj.weight": "model-00021-of-00081.safetensors", "model.layers.29.self_attn.v_proj.weight": "model-00021-of-00081.safetensors", "model.layers.3.input_layernorm.weight": "model-00021-of-00081.safetensors", "model.layers.3.mlp.down_proj.weight": "model-00021-of-00081.safetensors", "model.layers.3.mlp.gate_proj.weight": "model-00021-of-00081.safetensors", "model.layers.3.mlp.up_proj.weight": "model-00021-of-00081.safetensors", "model.layers.3.post_attention_layernorm.weight": "model-00021-of-00081.safetensors", "model.layers.3.self_attn.k_proj.weight": "model-00021-of-00081.safetensors", "model.layers.3.self_attn.o_proj.weight": "model-00021-of-00081.safetensors", "model.layers.3.self_attn.q_proj.weight": "model-00021-of-00081.safetensors", "model.layers.3.self_attn.v_proj.weight": "model-00021-of-00081.safetensors", "model.layers.82.input_layernorm.weight": "model-00021-of-00081.safetensors", "model.layers.30.input_layernorm.weight": "model-00021-of-00081.safetensors", "model.layers.82.mlp.down_proj.weight": 
"model-00022-of-00081.safetensors", "model.layers.30.mlp.down_proj.weight": "model-00022-of-00081.safetensors", "model.layers.82.mlp.gate_proj.weight": "model-00022-of-00081.safetensors", "model.layers.30.mlp.gate_proj.weight": "model-00022-of-00081.safetensors", "model.layers.82.mlp.up_proj.weight": "model-00022-of-00081.safetensors", "model.layers.30.mlp.up_proj.weight": "model-00022-of-00081.safetensors", "model.layers.82.post_attention_layernorm.weight": "model-00022-of-00081.safetensors", "model.layers.30.post_attention_layernorm.weight": "model-00022-of-00081.safetensors", "model.layers.82.self_attn.k_proj.weight": "model-00022-of-00081.safetensors", "model.layers.30.self_attn.k_proj.weight": "model-00022-of-00081.safetensors", "model.layers.82.self_attn.o_proj.weight": "model-00022-of-00081.safetensors", "model.layers.30.self_attn.o_proj.weight": "model-00022-of-00081.safetensors", "model.layers.82.self_attn.q_proj.weight": "model-00023-of-00081.safetensors", "model.layers.30.self_attn.q_proj.weight": "model-00023-of-00081.safetensors", "model.layers.82.self_attn.v_proj.weight": "model-00023-of-00081.safetensors", "model.layers.30.self_attn.v_proj.weight": "model-00023-of-00081.safetensors", "model.layers.83.input_layernorm.weight": "model-00023-of-00081.safetensors", "model.layers.31.input_layernorm.weight": "model-00023-of-00081.safetensors", "model.layers.83.mlp.down_proj.weight": "model-00023-of-00081.safetensors", "model.layers.31.mlp.down_proj.weight": "model-00023-of-00081.safetensors", "model.layers.83.mlp.gate_proj.weight": "model-00023-of-00081.safetensors", "model.layers.31.mlp.gate_proj.weight": "model-00023-of-00081.safetensors", "model.layers.83.mlp.up_proj.weight": "model-00023-of-00081.safetensors", "model.layers.31.mlp.up_proj.weight": "model-00023-of-00081.safetensors", "model.layers.83.post_attention_layernorm.weight": "model-00023-of-00081.safetensors", "model.layers.31.post_attention_layernorm.weight": "model-00023-of-00081.safetensors", 
"model.layers.83.self_attn.k_proj.weight": "model-00023-of-00081.safetensors", "model.layers.31.self_attn.k_proj.weight": "model-00023-of-00081.safetensors", "model.layers.83.self_attn.o_proj.weight": "model-00024-of-00081.safetensors", "model.layers.31.self_attn.o_proj.weight": "model-00024-of-00081.safetensors", "model.layers.83.self_attn.q_proj.weight": "model-00024-of-00081.safetensors", "model.layers.31.self_attn.q_proj.weight": "model-00024-of-00081.safetensors", "model.layers.83.self_attn.v_proj.weight": "model-00024-of-00081.safetensors", "model.layers.31.self_attn.v_proj.weight": "model-00024-of-00081.safetensors", "model.layers.84.input_layernorm.weight": "model-00024-of-00081.safetensors", "model.layers.32.input_layernorm.weight": "model-00024-of-00081.safetensors", "model.layers.84.mlp.down_proj.weight": "model-00024-of-00081.safetensors", "model.layers.32.mlp.down_proj.weight": "model-00024-of-00081.safetensors", "model.layers.84.mlp.gate_proj.weight": "model-00024-of-00081.safetensors", "model.layers.32.mlp.gate_proj.weight": "model-00024-of-00081.safetensors", "model.layers.84.mlp.up_proj.weight": "model-00024-of-00081.safetensors", "model.layers.32.mlp.up_proj.weight": "model-00025-of-00081.safetensors", "model.layers.84.post_attention_layernorm.weight": "model-00025-of-00081.safetensors", "model.layers.32.post_attention_layernorm.weight": "model-00025-of-00081.safetensors", "model.layers.84.self_attn.k_proj.weight": "model-00025-of-00081.safetensors", "model.layers.32.self_attn.k_proj.weight": "model-00025-of-00081.safetensors", "model.layers.84.self_attn.o_proj.weight": "model-00025-of-00081.safetensors", "model.layers.32.self_attn.o_proj.weight": "model-00025-of-00081.safetensors", "model.layers.84.self_attn.q_proj.weight": "model-00025-of-00081.safetensors", "model.layers.32.self_attn.q_proj.weight": "model-00025-of-00081.safetensors", "model.layers.84.self_attn.v_proj.weight": "model-00025-of-00081.safetensors", 
"model.layers.32.self_attn.v_proj.weight": "model-00025-of-00081.safetensors", "model.layers.85.input_layernorm.weight": "model-00025-of-00081.safetensors", "model.layers.33.input_layernorm.weight": "model-00025-of-00081.safetensors", "model.layers.85.mlp.down_proj.weight": "model-00025-of-00081.safetensors", "model.layers.33.mlp.down_proj.weight": "model-00025-of-00081.safetensors", "model.layers.85.mlp.gate_proj.weight": "model-00025-of-00081.safetensors", "model.layers.33.mlp.gate_proj.weight": "model-00025-of-00081.safetensors", "model.layers.85.mlp.up_proj.weight": "model-00026-of-00081.safetensors", "model.layers.33.mlp.up_proj.weight": "model-00026-of-00081.safetensors", "model.layers.85.post_attention_layernorm.weight": "model-00026-of-00081.safetensors", "model.layers.33.post_attention_layernorm.weight": "model-00026-of-00081.safetensors", "model.layers.85.self_attn.k_proj.weight": "model-00026-of-00081.safetensors", "model.layers.33.self_attn.k_proj.weight": "model-00026-of-00081.safetensors", "model.layers.85.self_attn.o_proj.weight": "model-00026-of-00081.safetensors", "model.layers.33.self_attn.o_proj.weight": "model-00026-of-00081.safetensors", "model.layers.85.self_attn.q_proj.weight": "model-00026-of-00081.safetensors", "model.layers.33.self_attn.q_proj.weight": "model-00026-of-00081.safetensors", "model.layers.85.self_attn.v_proj.weight": "model-00026-of-00081.safetensors", "model.layers.33.self_attn.v_proj.weight": "model-00026-of-00081.safetensors", "model.layers.86.input_layernorm.weight": "model-00026-of-00081.safetensors", "model.layers.34.input_layernorm.weight": "model-00026-of-00081.safetensors", "model.layers.86.mlp.down_proj.weight": "model-00026-of-00081.safetensors", "model.layers.34.mlp.down_proj.weight": "model-00026-of-00081.safetensors", "model.layers.86.mlp.gate_proj.weight": "model-00026-of-00081.safetensors", "model.layers.34.mlp.gate_proj.weight": "model-00027-of-00081.safetensors", "model.layers.86.mlp.up_proj.weight": 
"model-00027-of-00081.safetensors", "model.layers.34.mlp.up_proj.weight": "model-00027-of-00081.safetensors", "model.layers.86.post_attention_layernorm.weight": "model-00027-of-00081.safetensors", "model.layers.34.post_attention_layernorm.weight": "model-00027-of-00081.safetensors", "model.layers.86.self_attn.k_proj.weight": "model-00027-of-00081.safetensors", "model.layers.34.self_attn.k_proj.weight": "model-00027-of-00081.safetensors", "model.layers.86.self_attn.o_proj.weight": "model-00027-of-00081.safetensors", "model.layers.34.self_attn.o_proj.weight": "model-00027-of-00081.safetensors", "model.layers.86.self_attn.q_proj.weight": "model-00027-of-00081.safetensors", "model.layers.34.self_attn.q_proj.weight": "model-00027-of-00081.safetensors", "model.layers.86.self_attn.v_proj.weight": "model-00027-of-00081.safetensors", "model.layers.34.self_attn.v_proj.weight": "model-00027-of-00081.safetensors", "model.layers.87.input_layernorm.weight": "model-00027-of-00081.safetensors", "model.layers.35.input_layernorm.weight": "model-00027-of-00081.safetensors", "model.layers.87.mlp.down_proj.weight": "model-00027-of-00081.safetensors", "model.layers.35.mlp.down_proj.weight": "model-00027-of-00081.safetensors", "model.layers.87.mlp.gate_proj.weight": "model-00028-of-00081.safetensors", "model.layers.35.mlp.gate_proj.weight": "model-00028-of-00081.safetensors", "model.layers.87.mlp.up_proj.weight": "model-00028-of-00081.safetensors", "model.layers.35.mlp.up_proj.weight": "model-00028-of-00081.safetensors", "model.layers.87.post_attention_layernorm.weight": "model-00028-of-00081.safetensors", "model.layers.35.post_attention_layernorm.weight": "model-00028-of-00081.safetensors", "model.layers.87.self_attn.k_proj.weight": "model-00028-of-00081.safetensors", "model.layers.35.self_attn.k_proj.weight": "model-00028-of-00081.safetensors", "model.layers.87.self_attn.o_proj.weight": "model-00028-of-00081.safetensors", "model.layers.35.self_attn.o_proj.weight": 
"model-00028-of-00081.safetensors", "model.layers.87.self_attn.q_proj.weight": "model-00028-of-00081.safetensors", "model.layers.35.self_attn.q_proj.weight": "model-00028-of-00081.safetensors", "model.layers.87.self_attn.v_proj.weight": "model-00028-of-00081.safetensors", "model.layers.35.self_attn.v_proj.weight": "model-00028-of-00081.safetensors", "model.layers.88.input_layernorm.weight": "model-00028-of-00081.safetensors", "model.layers.36.input_layernorm.weight": "model-00028-of-00081.safetensors", "model.layers.88.mlp.down_proj.weight": "model-00028-of-00081.safetensors", "model.layers.36.mlp.down_proj.weight": "model-00029-of-00081.safetensors", "model.layers.88.mlp.gate_proj.weight": "model-00029-of-00081.safetensors", "model.layers.36.mlp.gate_proj.weight": "model-00029-of-00081.safetensors", "model.layers.88.mlp.up_proj.weight": "model-00029-of-00081.safetensors", "model.layers.36.mlp.up_proj.weight": "model-00029-of-00081.safetensors", "model.layers.88.post_attention_layernorm.weight": "model-00029-of-00081.safetensors", "model.layers.36.post_attention_layernorm.weight": "model-00029-of-00081.safetensors", "model.layers.88.self_attn.k_proj.weight": "model-00029-of-00081.safetensors", "model.layers.36.self_attn.k_proj.weight": "model-00029-of-00081.safetensors", "model.layers.88.self_attn.o_proj.weight": "model-00029-of-00081.safetensors", "model.layers.36.self_attn.o_proj.weight": "model-00029-of-00081.safetensors", "model.layers.88.self_attn.q_proj.weight": "model-00029-of-00081.safetensors", "model.layers.36.self_attn.q_proj.weight": "model-00029-of-00081.safetensors", "model.layers.88.self_attn.v_proj.weight": "model-00029-of-00081.safetensors", "model.layers.36.self_attn.v_proj.weight": "model-00029-of-00081.safetensors", "model.layers.89.input_layernorm.weight": "model-00029-of-00081.safetensors", "model.layers.37.input_layernorm.weight": "model-00029-of-00081.safetensors", "model.layers.89.mlp.down_proj.weight": "model-00030-of-00081.safetensors", 
"model.layers.37.mlp.down_proj.weight": "model-00030-of-00081.safetensors", "model.layers.89.mlp.gate_proj.weight": "model-00030-of-00081.safetensors", "model.layers.37.mlp.gate_proj.weight": "model-00030-of-00081.safetensors", "model.layers.89.mlp.up_proj.weight": "model-00030-of-00081.safetensors", "model.layers.37.mlp.up_proj.weight": "model-00030-of-00081.safetensors", "model.layers.89.post_attention_layernorm.weight": "model-00030-of-00081.safetensors", "model.layers.37.post_attention_layernorm.weight": "model-00030-of-00081.safetensors", "model.layers.89.self_attn.k_proj.weight": "model-00030-of-00081.safetensors", "model.layers.37.self_attn.k_proj.weight": "model-00030-of-00081.safetensors", "model.layers.89.self_attn.o_proj.weight": "model-00030-of-00081.safetensors", "model.layers.37.self_attn.o_proj.weight": "model-00030-of-00081.safetensors", "model.layers.89.self_attn.q_proj.weight": "model-00031-of-00081.safetensors", "model.layers.37.self_attn.q_proj.weight": "model-00031-of-00081.safetensors", "model.layers.89.self_attn.v_proj.weight": "model-00031-of-00081.safetensors", "model.layers.37.self_attn.v_proj.weight": "model-00031-of-00081.safetensors", "model.layers.90.input_layernorm.weight": "model-00031-of-00081.safetensors", "model.layers.38.input_layernorm.weight": "model-00031-of-00081.safetensors", "model.layers.90.mlp.down_proj.weight": "model-00031-of-00081.safetensors", "model.layers.38.mlp.down_proj.weight": "model-00031-of-00081.safetensors", "model.layers.90.mlp.gate_proj.weight": "model-00031-of-00081.safetensors", "model.layers.38.mlp.gate_proj.weight": "model-00031-of-00081.safetensors", "model.layers.90.mlp.up_proj.weight": "model-00031-of-00081.safetensors", "model.layers.38.mlp.up_proj.weight": "model-00031-of-00081.safetensors", "model.layers.90.post_attention_layernorm.weight": "model-00031-of-00081.safetensors", "model.layers.38.post_attention_layernorm.weight": "model-00031-of-00081.safetensors", 
"model.layers.90.self_attn.k_proj.weight": "model-00031-of-00081.safetensors", "model.layers.38.self_attn.k_proj.weight": "model-00031-of-00081.safetensors", "model.layers.90.self_attn.o_proj.weight": "model-00032-of-00081.safetensors", "model.layers.38.self_attn.o_proj.weight": "model-00032-of-00081.safetensors", "model.layers.90.self_attn.q_proj.weight": "model-00032-of-00081.safetensors", "model.layers.38.self_attn.q_proj.weight": "model-00032-of-00081.safetensors", "model.layers.90.self_attn.v_proj.weight": "model-00032-of-00081.safetensors", "model.layers.38.self_attn.v_proj.weight": "model-00032-of-00081.safetensors", "model.layers.91.input_layernorm.weight": "model-00032-of-00081.safetensors", "model.layers.39.input_layernorm.weight": "model-00032-of-00081.safetensors", "model.layers.91.mlp.down_proj.weight": "model-00032-of-00081.safetensors", "model.layers.39.mlp.down_proj.weight": "model-00032-of-00081.safetensors", "model.layers.91.mlp.gate_proj.weight": "model-00032-of-00081.safetensors", "model.layers.39.mlp.gate_proj.weight": "model-00032-of-00081.safetensors", "model.layers.91.mlp.up_proj.weight": "model-00032-of-00081.safetensors", "model.layers.39.mlp.up_proj.weight": "model-00033-of-00081.safetensors", "model.layers.91.post_attention_layernorm.weight": "model-00033-of-00081.safetensors", "model.layers.39.post_attention_layernorm.weight": "model-00033-of-00081.safetensors", "model.layers.91.self_attn.k_proj.weight": "model-00033-of-00081.safetensors", "model.layers.39.self_attn.k_proj.weight": "model-00033-of-00081.safetensors", "model.layers.91.self_attn.o_proj.weight": "model-00033-of-00081.safetensors", "model.layers.39.self_attn.o_proj.weight": "model-00033-of-00081.safetensors", "model.layers.91.self_attn.q_proj.weight": "model-00033-of-00081.safetensors", "model.layers.39.self_attn.q_proj.weight": "model-00033-of-00081.safetensors", "model.layers.91.self_attn.v_proj.weight": "model-00033-of-00081.safetensors", 
"model.layers.39.self_attn.v_proj.weight": "model-00033-of-00081.safetensors", "model.layers.4.input_layernorm.weight": "model-00033-of-00081.safetensors", "model.layers.4.mlp.down_proj.weight": "model-00033-of-00081.safetensors", "model.layers.4.mlp.gate_proj.weight": "model-00033-of-00081.safetensors", "model.layers.4.mlp.up_proj.weight": "model-00033-of-00081.safetensors", "model.layers.4.post_attention_layernorm.weight": "model-00033-of-00081.safetensors", "model.layers.4.self_attn.k_proj.weight": "model-00033-of-00081.safetensors", "model.layers.4.self_attn.o_proj.weight": "model-00033-of-00081.safetensors", "model.layers.4.self_attn.q_proj.weight": "model-00033-of-00081.safetensors", "model.layers.4.self_attn.v_proj.weight": "model-00033-of-00081.safetensors", "model.layers.92.input_layernorm.weight": "model-00033-of-00081.safetensors", "model.layers.40.input_layernorm.weight": "model-00033-of-00081.safetensors", "model.layers.92.mlp.down_proj.weight": "model-00034-of-00081.safetensors", "model.layers.40.mlp.down_proj.weight": "model-00034-of-00081.safetensors", "model.layers.92.mlp.gate_proj.weight": "model-00034-of-00081.safetensors", "model.layers.40.mlp.gate_proj.weight": "model-00034-of-00081.safetensors", "model.layers.92.mlp.up_proj.weight": "model-00034-of-00081.safetensors", "model.layers.40.mlp.up_proj.weight": "model-00034-of-00081.safetensors", "model.layers.92.post_attention_layernorm.weight": "model-00034-of-00081.safetensors", "model.layers.40.post_attention_layernorm.weight": "model-00034-of-00081.safetensors", "model.layers.92.self_attn.k_proj.weight": "model-00034-of-00081.safetensors", "model.layers.40.self_attn.k_proj.weight": "model-00034-of-00081.safetensors", "model.layers.92.self_attn.o_proj.weight": "model-00034-of-00081.safetensors", "model.layers.40.self_attn.o_proj.weight": "model-00034-of-00081.safetensors", "model.layers.92.self_attn.q_proj.weight": "model-00035-of-00081.safetensors", "model.layers.40.self_attn.q_proj.weight": 
"model-00035-of-00081.safetensors", "model.layers.92.self_attn.v_proj.weight": "model-00035-of-00081.safetensors", "model.layers.40.self_attn.v_proj.weight": "model-00035-of-00081.safetensors", "model.layers.93.input_layernorm.weight": "model-00035-of-00081.safetensors", "model.layers.41.input_layernorm.weight": "model-00035-of-00081.safetensors", "model.layers.93.mlp.down_proj.weight": "model-00035-of-00081.safetensors", "model.layers.41.mlp.down_proj.weight": "model-00035-of-00081.safetensors", "model.layers.93.mlp.gate_proj.weight": "model-00035-of-00081.safetensors", "model.layers.41.mlp.gate_proj.weight": "model-00035-of-00081.safetensors", "model.layers.93.mlp.up_proj.weight": "model-00035-of-00081.safetensors", "model.layers.41.mlp.up_proj.weight": "model-00035-of-00081.safetensors", "model.layers.93.post_attention_layernorm.weight": "model-00035-of-00081.safetensors", "model.layers.41.post_attention_layernorm.weight": "model-00035-of-00081.safetensors", "model.layers.93.self_attn.k_proj.weight": "model-00035-of-00081.safetensors", "model.layers.41.self_attn.k_proj.weight": "model-00035-of-00081.safetensors", "model.layers.93.self_attn.o_proj.weight": "model-00036-of-00081.safetensors", "model.layers.41.self_attn.o_proj.weight": "model-00036-of-00081.safetensors", "model.layers.93.self_attn.q_proj.weight": "model-00036-of-00081.safetensors", "model.layers.41.self_attn.q_proj.weight": "model-00036-of-00081.safetensors", "model.layers.93.self_attn.v_proj.weight": "model-00036-of-00081.safetensors", "model.layers.41.self_attn.v_proj.weight": "model-00036-of-00081.safetensors", "model.layers.94.input_layernorm.weight": "model-00036-of-00081.safetensors", "model.layers.42.input_layernorm.weight": "model-00036-of-00081.safetensors", "model.layers.94.mlp.down_proj.weight": "model-00036-of-00081.safetensors", "model.layers.42.mlp.down_proj.weight": "model-00036-of-00081.safetensors", "model.layers.94.mlp.gate_proj.weight": "model-00036-of-00081.safetensors", 
"model.layers.42.mlp.gate_proj.weight": "model-00036-of-00081.safetensors", "model.layers.94.mlp.up_proj.weight": "model-00036-of-00081.safetensors", "model.layers.42.mlp.up_proj.weight": "model-00037-of-00081.safetensors", "model.layers.94.post_attention_layernorm.weight": "model-00037-of-00081.safetensors", "model.layers.42.post_attention_layernorm.weight": "model-00037-of-00081.safetensors", "model.layers.94.self_attn.k_proj.weight": "model-00037-of-00081.safetensors", "model.layers.42.self_attn.k_proj.weight": "model-00037-of-00081.safetensors", "model.layers.94.self_attn.o_proj.weight": "model-00037-of-00081.safetensors", "model.layers.42.self_attn.o_proj.weight": "model-00037-of-00081.safetensors", "model.layers.94.self_attn.q_proj.weight": "model-00037-of-00081.safetensors", "model.layers.42.self_attn.q_proj.weight": "model-00037-of-00081.safetensors", "model.layers.94.self_attn.v_proj.weight": "model-00037-of-00081.safetensors", "model.layers.42.self_attn.v_proj.weight": "model-00037-of-00081.safetensors", "model.layers.95.input_layernorm.weight": "model-00037-of-00081.safetensors", "model.layers.43.input_layernorm.weight": "model-00037-of-00081.safetensors", "model.layers.95.mlp.down_proj.weight": "model-00037-of-00081.safetensors", "model.layers.43.mlp.down_proj.weight": "model-00037-of-00081.safetensors", "model.layers.95.mlp.gate_proj.weight": "model-00037-of-00081.safetensors", "model.layers.43.mlp.gate_proj.weight": "model-00037-of-00081.safetensors", "model.layers.95.mlp.up_proj.weight": "model-00038-of-00081.safetensors", "model.layers.43.mlp.up_proj.weight": "model-00038-of-00081.safetensors", "model.layers.95.post_attention_layernorm.weight": "model-00038-of-00081.safetensors", "model.layers.43.post_attention_layernorm.weight": "model-00038-of-00081.safetensors", "model.layers.95.self_attn.k_proj.weight": "model-00038-of-00081.safetensors", "model.layers.43.self_attn.k_proj.weight": "model-00038-of-00081.safetensors", 
"model.layers.95.self_attn.o_proj.weight": "model-00038-of-00081.safetensors", "model.layers.43.self_attn.o_proj.weight": "model-00038-of-00081.safetensors", "model.layers.95.self_attn.q_proj.weight": "model-00038-of-00081.safetensors", "model.layers.43.self_attn.q_proj.weight": "model-00038-of-00081.safetensors", "model.layers.95.self_attn.v_proj.weight": "model-00038-of-00081.safetensors", "model.layers.43.self_attn.v_proj.weight": "model-00038-of-00081.safetensors", "model.layers.96.input_layernorm.weight": "model-00038-of-00081.safetensors", "model.layers.44.input_layernorm.weight": "model-00038-of-00081.safetensors", "model.layers.96.mlp.down_proj.weight": "model-00038-of-00081.safetensors", "model.layers.44.mlp.down_proj.weight": "model-00038-of-00081.safetensors", "model.layers.96.mlp.gate_proj.weight": "model-00038-of-00081.safetensors", "model.layers.44.mlp.gate_proj.weight": "model-00039-of-00081.safetensors", "model.layers.96.mlp.up_proj.weight": "model-00039-of-00081.safetensors", "model.layers.44.mlp.up_proj.weight": "model-00039-of-00081.safetensors", "model.layers.96.post_attention_layernorm.weight": "model-00039-of-00081.safetensors", "model.layers.44.post_attention_layernorm.weight": "model-00039-of-00081.safetensors", "model.layers.96.self_attn.k_proj.weight": "model-00039-of-00081.safetensors", "model.layers.44.self_attn.k_proj.weight": "model-00039-of-00081.safetensors", "model.layers.96.self_attn.o_proj.weight": "model-00039-of-00081.safetensors", "model.layers.44.self_attn.o_proj.weight": "model-00039-of-00081.safetensors", "model.layers.96.self_attn.q_proj.weight": "model-00039-of-00081.safetensors", "model.layers.44.self_attn.q_proj.weight": "model-00039-of-00081.safetensors", "model.layers.96.self_attn.v_proj.weight": "model-00039-of-00081.safetensors", "model.layers.44.self_attn.v_proj.weight": "model-00039-of-00081.safetensors", "model.layers.97.input_layernorm.weight": "model-00039-of-00081.safetensors", 
"model.layers.45.input_layernorm.weight": "model-00039-of-00081.safetensors", "model.layers.97.mlp.down_proj.weight": "model-00039-of-00081.safetensors", "model.layers.45.mlp.down_proj.weight": "model-00039-of-00081.safetensors", "model.layers.97.mlp.gate_proj.weight": "model-00040-of-00081.safetensors", "model.layers.45.mlp.gate_proj.weight": "model-00040-of-00081.safetensors", "model.layers.97.mlp.up_proj.weight": "model-00040-of-00081.safetensors", "model.layers.45.mlp.up_proj.weight": "model-00040-of-00081.safetensors", "model.layers.97.post_attention_layernorm.weight": "model-00040-of-00081.safetensors", "model.layers.45.post_attention_layernorm.weight": "model-00040-of-00081.safetensors", "model.layers.97.self_attn.k_proj.weight": "model-00040-of-00081.safetensors", "model.layers.45.self_attn.k_proj.weight": "model-00040-of-00081.safetensors", "model.layers.97.self_attn.o_proj.weight": "model-00040-of-00081.safetensors", "model.layers.45.self_attn.o_proj.weight": "model-00040-of-00081.safetensors", "model.layers.97.self_attn.q_proj.weight": "model-00040-of-00081.safetensors", "model.layers.45.self_attn.q_proj.weight": "model-00040-of-00081.safetensors", "model.layers.97.self_attn.v_proj.weight": "model-00040-of-00081.safetensors", "model.layers.45.self_attn.v_proj.weight": "model-00040-of-00081.safetensors", "model.layers.98.input_layernorm.weight": "model-00040-of-00081.safetensors", "model.layers.46.input_layernorm.weight": "model-00040-of-00081.safetensors", "model.layers.98.mlp.down_proj.weight": "model-00040-of-00081.safetensors", "model.layers.46.mlp.down_proj.weight": "model-00041-of-00081.safetensors", "model.layers.98.mlp.gate_proj.weight": "model-00041-of-00081.safetensors", "model.layers.46.mlp.gate_proj.weight": "model-00041-of-00081.safetensors", "model.layers.98.mlp.up_proj.weight": "model-00041-of-00081.safetensors", "model.layers.46.mlp.up_proj.weight": "model-00041-of-00081.safetensors", "model.layers.98.post_attention_layernorm.weight": 
"model-00041-of-00081.safetensors", "model.layers.46.post_attention_layernorm.weight": "model-00041-of-00081.safetensors", "model.layers.98.self_attn.k_proj.weight": "model-00041-of-00081.safetensors", "model.layers.46.self_attn.k_proj.weight": "model-00041-of-00081.safetensors", "model.layers.98.self_attn.o_proj.weight": "model-00041-of-00081.safetensors", "model.layers.46.self_attn.o_proj.weight": "model-00041-of-00081.safetensors", "model.layers.98.self_attn.q_proj.weight": "model-00041-of-00081.safetensors", "model.layers.46.self_attn.q_proj.weight": "model-00041-of-00081.safetensors", "model.layers.98.self_attn.v_proj.weight": "model-00041-of-00081.safetensors", "model.layers.46.self_attn.v_proj.weight": "model-00041-of-00081.safetensors", "model.layers.99.input_layernorm.weight": "model-00041-of-00081.safetensors", "model.layers.47.input_layernorm.weight": "model-00041-of-00081.safetensors", "model.layers.99.mlp.down_proj.weight": "model-00042-of-00081.safetensors", "model.layers.47.mlp.down_proj.weight": "model-00042-of-00081.safetensors", "model.layers.99.mlp.gate_proj.weight": "model-00042-of-00081.safetensors", "model.layers.47.mlp.gate_proj.weight": "model-00042-of-00081.safetensors", "model.layers.99.mlp.up_proj.weight": "model-00042-of-00081.safetensors", "model.layers.47.mlp.up_proj.weight": "model-00042-of-00081.safetensors", "model.layers.99.post_attention_layernorm.weight": "model-00042-of-00081.safetensors", "model.layers.47.post_attention_layernorm.weight": "model-00042-of-00081.safetensors", "model.layers.99.self_attn.k_proj.weight": "model-00042-of-00081.safetensors", "model.layers.47.self_attn.k_proj.weight": "model-00042-of-00081.safetensors", "model.layers.99.self_attn.o_proj.weight": "model-00042-of-00081.safetensors", "model.layers.47.self_attn.o_proj.weight": "model-00042-of-00081.safetensors", "model.layers.99.self_attn.q_proj.weight": "model-00043-of-00081.safetensors", "model.layers.47.self_attn.q_proj.weight": 
"model-00043-of-00081.safetensors", "model.layers.99.self_attn.v_proj.weight": "model-00043-of-00081.safetensors", "model.layers.47.self_attn.v_proj.weight": "model-00043-of-00081.safetensors", "model.layers.100.input_layernorm.weight": "model-00043-of-00081.safetensors", "model.layers.48.input_layernorm.weight": "model-00043-of-00081.safetensors", "model.layers.100.mlp.down_proj.weight": "model-00043-of-00081.safetensors", "model.layers.48.mlp.down_proj.weight": "model-00043-of-00081.safetensors", "model.layers.100.mlp.gate_proj.weight": "model-00043-of-00081.safetensors", "model.layers.48.mlp.gate_proj.weight": "model-00043-of-00081.safetensors", "model.layers.100.mlp.up_proj.weight": "model-00043-of-00081.safetensors", "model.layers.48.mlp.up_proj.weight": "model-00043-of-00081.safetensors", "model.layers.100.post_attention_layernorm.weight": "model-00043-of-00081.safetensors", "model.layers.48.post_attention_layernorm.weight": "model-00043-of-00081.safetensors", "model.layers.100.self_attn.k_proj.weight": "model-00043-of-00081.safetensors", "model.layers.48.self_attn.k_proj.weight": "model-00043-of-00081.safetensors", "model.layers.100.self_attn.o_proj.weight": "model-00044-of-00081.safetensors", "model.layers.48.self_attn.o_proj.weight": "model-00044-of-00081.safetensors", "model.layers.100.self_attn.q_proj.weight": "model-00044-of-00081.safetensors", "model.layers.48.self_attn.q_proj.weight": "model-00044-of-00081.safetensors", "model.layers.100.self_attn.v_proj.weight": "model-00044-of-00081.safetensors", "model.layers.48.self_attn.v_proj.weight": "model-00044-of-00081.safetensors", "model.layers.101.input_layernorm.weight": "model-00044-of-00081.safetensors", "model.layers.49.input_layernorm.weight": "model-00044-of-00081.safetensors", "model.layers.101.mlp.down_proj.weight": "model-00044-of-00081.safetensors", "model.layers.49.mlp.down_proj.weight": "model-00044-of-00081.safetensors", "model.layers.101.mlp.gate_proj.weight": 
"model-00044-of-00081.safetensors", "model.layers.49.mlp.gate_proj.weight": "model-00044-of-00081.safetensors", "model.layers.101.mlp.up_proj.weight": "model-00044-of-00081.safetensors", "model.layers.49.mlp.up_proj.weight": "model-00045-of-00081.safetensors", "model.layers.101.post_attention_layernorm.weight": "model-00045-of-00081.safetensors", "model.layers.49.post_attention_layernorm.weight": "model-00045-of-00081.safetensors", "model.layers.101.self_attn.k_proj.weight": "model-00045-of-00081.safetensors", "model.layers.49.self_attn.k_proj.weight": "model-00045-of-00081.safetensors", "model.layers.101.self_attn.o_proj.weight": "model-00045-of-00081.safetensors", "model.layers.49.self_attn.o_proj.weight": "model-00045-of-00081.safetensors", "model.layers.101.self_attn.q_proj.weight": "model-00045-of-00081.safetensors", "model.layers.49.self_attn.q_proj.weight": "model-00045-of-00081.safetensors", "model.layers.101.self_attn.v_proj.weight": "model-00045-of-00081.safetensors", "model.layers.49.self_attn.v_proj.weight": "model-00045-of-00081.safetensors", "model.layers.5.input_layernorm.weight": "model-00045-of-00081.safetensors", "model.layers.5.mlp.down_proj.weight": "model-00045-of-00081.safetensors", "model.layers.5.mlp.gate_proj.weight": "model-00045-of-00081.safetensors", "model.layers.5.mlp.up_proj.weight": "model-00045-of-00081.safetensors", "model.layers.5.post_attention_layernorm.weight": "model-00045-of-00081.safetensors", "model.layers.5.self_attn.k_proj.weight": "model-00045-of-00081.safetensors", "model.layers.5.self_attn.o_proj.weight": "model-00045-of-00081.safetensors", "model.layers.5.self_attn.q_proj.weight": "model-00045-of-00081.safetensors", "model.layers.5.self_attn.v_proj.weight": "model-00045-of-00081.safetensors", "model.layers.102.input_layernorm.weight": "model-00045-of-00081.safetensors", "model.layers.50.input_layernorm.weight": "model-00045-of-00081.safetensors", "model.layers.102.mlp.down_proj.weight": 
"model-00046-of-00081.safetensors", "model.layers.50.mlp.down_proj.weight": "model-00046-of-00081.safetensors", "model.layers.102.mlp.gate_proj.weight": "model-00046-of-00081.safetensors", "model.layers.50.mlp.gate_proj.weight": "model-00046-of-00081.safetensors", "model.layers.102.mlp.up_proj.weight": "model-00046-of-00081.safetensors", "model.layers.50.mlp.up_proj.weight": "model-00046-of-00081.safetensors", "model.layers.102.post_attention_layernorm.weight": "model-00046-of-00081.safetensors", "model.layers.50.post_attention_layernorm.weight": "model-00046-of-00081.safetensors", "model.layers.102.self_attn.k_proj.weight": "model-00046-of-00081.safetensors", "model.layers.50.self_attn.k_proj.weight": "model-00046-of-00081.safetensors", "model.layers.102.self_attn.o_proj.weight": "model-00046-of-00081.safetensors", "model.layers.50.self_attn.o_proj.weight": "model-00046-of-00081.safetensors", "model.layers.102.self_attn.q_proj.weight": "model-00047-of-00081.safetensors", "model.layers.50.self_attn.q_proj.weight": "model-00047-of-00081.safetensors", "model.layers.102.self_attn.v_proj.weight": "model-00047-of-00081.safetensors", "model.layers.50.self_attn.v_proj.weight": "model-00047-of-00081.safetensors", "model.layers.103.input_layernorm.weight": "model-00047-of-00081.safetensors", "model.layers.51.input_layernorm.weight": "model-00047-of-00081.safetensors", "model.layers.103.mlp.down_proj.weight": "model-00047-of-00081.safetensors", "model.layers.51.mlp.down_proj.weight": "model-00047-of-00081.safetensors", "model.layers.103.mlp.gate_proj.weight": "model-00047-of-00081.safetensors", "model.layers.51.mlp.gate_proj.weight": "model-00047-of-00081.safetensors", "model.layers.103.mlp.up_proj.weight": "model-00047-of-00081.safetensors", "model.layers.51.mlp.up_proj.weight": "model-00047-of-00081.safetensors", "model.layers.103.post_attention_layernorm.weight": "model-00047-of-00081.safetensors", "model.layers.51.post_attention_layernorm.weight": 
"model-00047-of-00081.safetensors", "model.layers.103.self_attn.k_proj.weight": "model-00047-of-00081.safetensors", "model.layers.51.self_attn.k_proj.weight": "model-00047-of-00081.safetensors", "model.layers.103.self_attn.o_proj.weight": "model-00048-of-00081.safetensors", "model.layers.51.self_attn.o_proj.weight": "model-00048-of-00081.safetensors", "model.layers.103.self_attn.q_proj.weight": "model-00048-of-00081.safetensors", "model.layers.51.self_attn.q_proj.weight": "model-00048-of-00081.safetensors", "model.layers.103.self_attn.v_proj.weight": "model-00048-of-00081.safetensors", "model.layers.51.self_attn.v_proj.weight": "model-00048-of-00081.safetensors", "model.layers.104.input_layernorm.weight": "model-00048-of-00081.safetensors", "model.layers.52.input_layernorm.weight": "model-00048-of-00081.safetensors", "model.layers.104.mlp.down_proj.weight": "model-00048-of-00081.safetensors", "model.layers.52.mlp.down_proj.weight": "model-00048-of-00081.safetensors", "model.layers.104.mlp.gate_proj.weight": "model-00048-of-00081.safetensors", "model.layers.52.mlp.gate_proj.weight": "model-00048-of-00081.safetensors", "model.layers.104.mlp.up_proj.weight": "model-00048-of-00081.safetensors", "model.layers.52.mlp.up_proj.weight": "model-00049-of-00081.safetensors", "model.layers.104.post_attention_layernorm.weight": "model-00049-of-00081.safetensors", "model.layers.52.post_attention_layernorm.weight": "model-00049-of-00081.safetensors", "model.layers.104.self_attn.k_proj.weight": "model-00049-of-00081.safetensors", "model.layers.52.self_attn.k_proj.weight": "model-00049-of-00081.safetensors", "model.layers.104.self_attn.o_proj.weight": "model-00049-of-00081.safetensors", "model.layers.52.self_attn.o_proj.weight": "model-00049-of-00081.safetensors", "model.layers.104.self_attn.q_proj.weight": "model-00049-of-00081.safetensors", "model.layers.52.self_attn.q_proj.weight": "model-00049-of-00081.safetensors", "model.layers.104.self_attn.v_proj.weight": 
"model-00049-of-00081.safetensors", "model.layers.52.self_attn.v_proj.weight": "model-00049-of-00081.safetensors", "model.layers.105.input_layernorm.weight": "model-00049-of-00081.safetensors", "model.layers.53.input_layernorm.weight": "model-00049-of-00081.safetensors", "model.layers.105.mlp.down_proj.weight": "model-00049-of-00081.safetensors", "model.layers.53.mlp.down_proj.weight": "model-00049-of-00081.safetensors", "model.layers.105.mlp.gate_proj.weight": "model-00049-of-00081.safetensors", "model.layers.53.mlp.gate_proj.weight": "model-00049-of-00081.safetensors", "model.layers.105.mlp.up_proj.weight": "model-00050-of-00081.safetensors", "model.layers.53.mlp.up_proj.weight": "model-00050-of-00081.safetensors", "model.layers.105.post_attention_layernorm.weight": "model-00050-of-00081.safetensors", "model.layers.53.post_attention_layernorm.weight": "model-00050-of-00081.safetensors", "model.layers.105.self_attn.k_proj.weight": "model-00050-of-00081.safetensors", "model.layers.53.self_attn.k_proj.weight": "model-00050-of-00081.safetensors", "model.layers.105.self_attn.o_proj.weight": "model-00050-of-00081.safetensors", "model.layers.53.self_attn.o_proj.weight": "model-00050-of-00081.safetensors", "model.layers.105.self_attn.q_proj.weight": "model-00050-of-00081.safetensors", "model.layers.53.self_attn.q_proj.weight": "model-00050-of-00081.safetensors", "model.layers.105.self_attn.v_proj.weight": "model-00050-of-00081.safetensors", "model.layers.53.self_attn.v_proj.weight": "model-00050-of-00081.safetensors", "model.layers.106.input_layernorm.weight": "model-00050-of-00081.safetensors", "model.layers.54.input_layernorm.weight": "model-00050-of-00081.safetensors", "model.layers.106.mlp.down_proj.weight": "model-00050-of-00081.safetensors", "model.layers.54.mlp.down_proj.weight": "model-00050-of-00081.safetensors", "model.layers.106.mlp.gate_proj.weight": "model-00050-of-00081.safetensors", "model.layers.54.mlp.gate_proj.weight": 
"model-00051-of-00081.safetensors", "model.layers.106.mlp.up_proj.weight": "model-00051-of-00081.safetensors", "model.layers.54.mlp.up_proj.weight": "model-00051-of-00081.safetensors", "model.layers.106.post_attention_layernorm.weight": "model-00051-of-00081.safetensors", "model.layers.54.post_attention_layernorm.weight": "model-00051-of-00081.safetensors", "model.layers.106.self_attn.k_proj.weight": "model-00051-of-00081.safetensors", "model.layers.54.self_attn.k_proj.weight": "model-00051-of-00081.safetensors", "model.layers.106.self_attn.o_proj.weight": "model-00051-of-00081.safetensors", "model.layers.54.self_attn.o_proj.weight": "model-00051-of-00081.safetensors", "model.layers.106.self_attn.q_proj.weight": "model-00051-of-00081.safetensors", "model.layers.54.self_attn.q_proj.weight": "model-00051-of-00081.safetensors", "model.layers.106.self_attn.v_proj.weight": "model-00051-of-00081.safetensors", "model.layers.54.self_attn.v_proj.weight": "model-00051-of-00081.safetensors", "model.layers.107.input_layernorm.weight": "model-00051-of-00081.safetensors", "model.layers.55.input_layernorm.weight": "model-00051-of-00081.safetensors", "model.layers.107.mlp.down_proj.weight": "model-00051-of-00081.safetensors", "model.layers.55.mlp.down_proj.weight": "model-00051-of-00081.safetensors", "model.layers.107.mlp.gate_proj.weight": "model-00052-of-00081.safetensors", "model.layers.55.mlp.gate_proj.weight": "model-00052-of-00081.safetensors", "model.layers.107.mlp.up_proj.weight": "model-00052-of-00081.safetensors", "model.layers.55.mlp.up_proj.weight": "model-00052-of-00081.safetensors", "model.layers.107.post_attention_layernorm.weight": "model-00052-of-00081.safetensors", "model.layers.55.post_attention_layernorm.weight": "model-00052-of-00081.safetensors", "model.layers.107.self_attn.k_proj.weight": "model-00052-of-00081.safetensors", "model.layers.55.self_attn.k_proj.weight": "model-00052-of-00081.safetensors", "model.layers.107.self_attn.o_proj.weight": 
"model-00052-of-00081.safetensors", "model.layers.55.self_attn.o_proj.weight": "model-00052-of-00081.safetensors", "model.layers.107.self_attn.q_proj.weight": "model-00052-of-00081.safetensors", "model.layers.55.self_attn.q_proj.weight": "model-00052-of-00081.safetensors", "model.layers.107.self_attn.v_proj.weight": "model-00052-of-00081.safetensors", "model.layers.55.self_attn.v_proj.weight": "model-00052-of-00081.safetensors", "model.layers.108.input_layernorm.weight": "model-00052-of-00081.safetensors", "model.layers.56.input_layernorm.weight": "model-00052-of-00081.safetensors", "model.layers.108.mlp.down_proj.weight": "model-00052-of-00081.safetensors", "model.layers.56.mlp.down_proj.weight": "model-00053-of-00081.safetensors", "model.layers.108.mlp.gate_proj.weight": "model-00053-of-00081.safetensors", "model.layers.56.mlp.gate_proj.weight": "model-00053-of-00081.safetensors", "model.layers.108.mlp.up_proj.weight": "model-00053-of-00081.safetensors", "model.layers.56.mlp.up_proj.weight": "model-00053-of-00081.safetensors", "model.layers.108.post_attention_layernorm.weight": "model-00053-of-00081.safetensors", "model.layers.56.post_attention_layernorm.weight": "model-00053-of-00081.safetensors", "model.layers.108.self_attn.k_proj.weight": "model-00053-of-00081.safetensors", "model.layers.56.self_attn.k_proj.weight": "model-00053-of-00081.safetensors", "model.layers.108.self_attn.o_proj.weight": "model-00053-of-00081.safetensors", "model.layers.56.self_attn.o_proj.weight": "model-00053-of-00081.safetensors", "model.layers.108.self_attn.q_proj.weight": "model-00053-of-00081.safetensors", "model.layers.56.self_attn.q_proj.weight": "model-00053-of-00081.safetensors", "model.layers.108.self_attn.v_proj.weight": "model-00053-of-00081.safetensors", "model.layers.56.self_attn.v_proj.weight": "model-00053-of-00081.safetensors", "model.layers.109.input_layernorm.weight": "model-00053-of-00081.safetensors", "model.layers.57.input_layernorm.weight": 
"model-00053-of-00081.safetensors", "model.layers.109.mlp.down_proj.weight": "model-00054-of-00081.safetensors", "model.layers.57.mlp.down_proj.weight": "model-00054-of-00081.safetensors", "model.layers.109.mlp.gate_proj.weight": "model-00054-of-00081.safetensors", "model.layers.57.mlp.gate_proj.weight": "model-00054-of-00081.safetensors", "model.layers.109.mlp.up_proj.weight": "model-00054-of-00081.safetensors", "model.layers.57.mlp.up_proj.weight": "model-00054-of-00081.safetensors", "model.layers.109.post_attention_layernorm.weight": "model-00054-of-00081.safetensors", "model.layers.57.post_attention_layernorm.weight": "model-00054-of-00081.safetensors", "model.layers.109.self_attn.k_proj.weight": "model-00054-of-00081.safetensors", "model.layers.57.self_attn.k_proj.weight": "model-00054-of-00081.safetensors", "model.layers.109.self_attn.o_proj.weight": "model-00054-of-00081.safetensors", "model.layers.57.self_attn.o_proj.weight": "model-00054-of-00081.safetensors", "model.layers.109.self_attn.q_proj.weight": "model-00055-of-00081.safetensors", "model.layers.57.self_attn.q_proj.weight": "model-00055-of-00081.safetensors", "model.layers.109.self_attn.v_proj.weight": "model-00055-of-00081.safetensors", "model.layers.57.self_attn.v_proj.weight": "model-00055-of-00081.safetensors", "model.layers.110.input_layernorm.weight": "model-00055-of-00081.safetensors", "model.layers.58.input_layernorm.weight": "model-00055-of-00081.safetensors", "model.layers.110.mlp.down_proj.weight": "model-00055-of-00081.safetensors", "model.layers.58.mlp.down_proj.weight": "model-00055-of-00081.safetensors", "model.layers.110.mlp.gate_proj.weight": "model-00055-of-00081.safetensors", "model.layers.58.mlp.gate_proj.weight": "model-00055-of-00081.safetensors", "model.layers.110.mlp.up_proj.weight": "model-00055-of-00081.safetensors", "model.layers.58.mlp.up_proj.weight": "model-00055-of-00081.safetensors", "model.layers.110.post_attention_layernorm.weight": 
"model-00055-of-00081.safetensors", "model.layers.58.post_attention_layernorm.weight": "model-00055-of-00081.safetensors", "model.layers.110.self_attn.k_proj.weight": "model-00055-of-00081.safetensors", "model.layers.58.self_attn.k_proj.weight": "model-00055-of-00081.safetensors", "model.layers.110.self_attn.o_proj.weight": "model-00056-of-00081.safetensors", "model.layers.58.self_attn.o_proj.weight": "model-00056-of-00081.safetensors", "model.layers.110.self_attn.q_proj.weight": "model-00056-of-00081.safetensors", "model.layers.58.self_attn.q_proj.weight": "model-00056-of-00081.safetensors", "model.layers.110.self_attn.v_proj.weight": "model-00056-of-00081.safetensors", "model.layers.58.self_attn.v_proj.weight": "model-00056-of-00081.safetensors", "model.layers.111.input_layernorm.weight": "model-00056-of-00081.safetensors", "model.layers.59.input_layernorm.weight": "model-00056-of-00081.safetensors", "model.layers.111.mlp.down_proj.weight": "model-00056-of-00081.safetensors", "model.layers.59.mlp.down_proj.weight": "model-00056-of-00081.safetensors", "model.layers.111.mlp.gate_proj.weight": "model-00056-of-00081.safetensors", "model.layers.59.mlp.gate_proj.weight": "model-00056-of-00081.safetensors", "model.layers.111.mlp.up_proj.weight": "model-00056-of-00081.safetensors", "model.layers.59.mlp.up_proj.weight": "model-00057-of-00081.safetensors", "model.layers.111.post_attention_layernorm.weight": "model-00057-of-00081.safetensors", "model.layers.59.post_attention_layernorm.weight": "model-00057-of-00081.safetensors", "model.layers.111.self_attn.k_proj.weight": "model-00057-of-00081.safetensors", "model.layers.59.self_attn.k_proj.weight": "model-00057-of-00081.safetensors", "model.layers.111.self_attn.o_proj.weight": "model-00057-of-00081.safetensors", "model.layers.59.self_attn.o_proj.weight": "model-00057-of-00081.safetensors", "model.layers.111.self_attn.q_proj.weight": "model-00057-of-00081.safetensors", "model.layers.59.self_attn.q_proj.weight": 
"model-00057-of-00081.safetensors", "model.layers.111.self_attn.v_proj.weight": "model-00057-of-00081.safetensors", "model.layers.59.self_attn.v_proj.weight": "model-00057-of-00081.safetensors", "model.layers.6.input_layernorm.weight": "model-00057-of-00081.safetensors", "model.layers.6.mlp.down_proj.weight": "model-00057-of-00081.safetensors", "model.layers.6.mlp.gate_proj.weight": "model-00057-of-00081.safetensors", "model.layers.6.mlp.up_proj.weight": "model-00057-of-00081.safetensors", "model.layers.6.post_attention_layernorm.weight": "model-00057-of-00081.safetensors", "model.layers.6.self_attn.k_proj.weight": "model-00057-of-00081.safetensors", "model.layers.6.self_attn.o_proj.weight": "model-00057-of-00081.safetensors", "model.layers.6.self_attn.q_proj.weight": "model-00057-of-00081.safetensors", "model.layers.6.self_attn.v_proj.weight": "model-00057-of-00081.safetensors", "model.layers.112.input_layernorm.weight": "model-00057-of-00081.safetensors", "model.layers.60.input_layernorm.weight": "model-00057-of-00081.safetensors", "model.layers.112.mlp.down_proj.weight": "model-00058-of-00081.safetensors", "model.layers.60.mlp.down_proj.weight": "model-00058-of-00081.safetensors", "model.layers.112.mlp.gate_proj.weight": "model-00058-of-00081.safetensors", "model.layers.60.mlp.gate_proj.weight": "model-00058-of-00081.safetensors", "model.layers.112.mlp.up_proj.weight": "model-00058-of-00081.safetensors", "model.layers.60.mlp.up_proj.weight": "model-00058-of-00081.safetensors", "model.layers.112.post_attention_layernorm.weight": "model-00058-of-00081.safetensors", "model.layers.60.post_attention_layernorm.weight": "model-00058-of-00081.safetensors", "model.layers.112.self_attn.k_proj.weight": "model-00058-of-00081.safetensors", "model.layers.60.self_attn.k_proj.weight": "model-00058-of-00081.safetensors", "model.layers.112.self_attn.o_proj.weight": "model-00058-of-00081.safetensors", "model.layers.60.self_attn.o_proj.weight": "model-00058-of-00081.safetensors", 
"model.layers.112.self_attn.q_proj.weight": "model-00059-of-00081.safetensors", "model.layers.60.self_attn.q_proj.weight": "model-00059-of-00081.safetensors", "model.layers.112.self_attn.v_proj.weight": "model-00059-of-00081.safetensors", "model.layers.60.self_attn.v_proj.weight": "model-00059-of-00081.safetensors", "model.layers.113.input_layernorm.weight": "model-00059-of-00081.safetensors", "model.layers.61.input_layernorm.weight": "model-00059-of-00081.safetensors", "model.layers.113.mlp.down_proj.weight": "model-00059-of-00081.safetensors", "model.layers.61.mlp.down_proj.weight": "model-00059-of-00081.safetensors", "model.layers.113.mlp.gate_proj.weight": "model-00059-of-00081.safetensors", "model.layers.61.mlp.gate_proj.weight": "model-00059-of-00081.safetensors", "model.layers.113.mlp.up_proj.weight": "model-00059-of-00081.safetensors", "model.layers.61.mlp.up_proj.weight": "model-00059-of-00081.safetensors", "model.layers.113.post_attention_layernorm.weight": "model-00059-of-00081.safetensors", "model.layers.61.post_attention_layernorm.weight": "model-00059-of-00081.safetensors", "model.layers.113.self_attn.k_proj.weight": "model-00059-of-00081.safetensors", "model.layers.61.self_attn.k_proj.weight": "model-00059-of-00081.safetensors", "model.layers.113.self_attn.o_proj.weight": "model-00060-of-00081.safetensors", "model.layers.61.self_attn.o_proj.weight": "model-00060-of-00081.safetensors", "model.layers.113.self_attn.q_proj.weight": "model-00060-of-00081.safetensors", "model.layers.61.self_attn.q_proj.weight": "model-00060-of-00081.safetensors", "model.layers.113.self_attn.v_proj.weight": "model-00060-of-00081.safetensors", "model.layers.61.self_attn.v_proj.weight": "model-00060-of-00081.safetensors", "model.layers.114.input_layernorm.weight": "model-00060-of-00081.safetensors", "model.layers.62.input_layernorm.weight": "model-00060-of-00081.safetensors", "model.layers.114.mlp.down_proj.weight": "model-00060-of-00081.safetensors", 
"model.layers.62.mlp.down_proj.weight": "model-00060-of-00081.safetensors", "model.layers.114.mlp.gate_proj.weight": "model-00060-of-00081.safetensors", "model.layers.62.mlp.gate_proj.weight": "model-00060-of-00081.safetensors", "model.layers.114.mlp.up_proj.weight": "model-00060-of-00081.safetensors", "model.layers.62.mlp.up_proj.weight": "model-00061-of-00081.safetensors", "model.layers.114.post_attention_layernorm.weight": "model-00061-of-00081.safetensors", "model.layers.62.post_attention_layernorm.weight": "model-00061-of-00081.safetensors", "model.layers.114.self_attn.k_proj.weight": "model-00061-of-00081.safetensors", "model.layers.62.self_attn.k_proj.weight": "model-00061-of-00081.safetensors", "model.layers.114.self_attn.o_proj.weight": "model-00061-of-00081.safetensors", "model.layers.62.self_attn.o_proj.weight": "model-00061-of-00081.safetensors", "model.layers.114.self_attn.q_proj.weight": "model-00061-of-00081.safetensors", "model.layers.62.self_attn.q_proj.weight": "model-00061-of-00081.safetensors", "model.layers.114.self_attn.v_proj.weight": "model-00061-of-00081.safetensors", "model.layers.62.self_attn.v_proj.weight": "model-00061-of-00081.safetensors", "model.layers.115.input_layernorm.weight": "model-00061-of-00081.safetensors", "model.layers.63.input_layernorm.weight": "model-00061-of-00081.safetensors", "model.layers.115.mlp.down_proj.weight": "model-00061-of-00081.safetensors", "model.layers.63.mlp.down_proj.weight": "model-00061-of-00081.safetensors", "model.layers.115.mlp.gate_proj.weight": "model-00061-of-00081.safetensors", "model.layers.63.mlp.gate_proj.weight": "model-00061-of-00081.safetensors", "model.layers.115.mlp.up_proj.weight": "model-00062-of-00081.safetensors", "model.layers.63.mlp.up_proj.weight": "model-00062-of-00081.safetensors", "model.layers.115.post_attention_layernorm.weight": "model-00062-of-00081.safetensors", "model.layers.63.post_attention_layernorm.weight": "model-00062-of-00081.safetensors", 
"model.layers.115.self_attn.k_proj.weight": "model-00062-of-00081.safetensors", "model.layers.63.self_attn.k_proj.weight": "model-00062-of-00081.safetensors", "model.layers.115.self_attn.o_proj.weight": "model-00062-of-00081.safetensors", "model.layers.63.self_attn.o_proj.weight": "model-00062-of-00081.safetensors", "model.layers.115.self_attn.q_proj.weight": "model-00062-of-00081.safetensors", "model.layers.63.self_attn.q_proj.weight": "model-00062-of-00081.safetensors", "model.layers.115.self_attn.v_proj.weight": "model-00062-of-00081.safetensors", "model.layers.63.self_attn.v_proj.weight": "model-00062-of-00081.safetensors", "model.layers.116.input_layernorm.weight": "model-00062-of-00081.safetensors", "model.layers.64.input_layernorm.weight": "model-00062-of-00081.safetensors", "model.layers.116.mlp.down_proj.weight": "model-00062-of-00081.safetensors", "model.layers.64.mlp.down_proj.weight": "model-00062-of-00081.safetensors", "model.layers.116.mlp.gate_proj.weight": "model-00062-of-00081.safetensors", "model.layers.64.mlp.gate_proj.weight": "model-00063-of-00081.safetensors", "model.layers.116.mlp.up_proj.weight": "model-00063-of-00081.safetensors", "model.layers.64.mlp.up_proj.weight": "model-00063-of-00081.safetensors", "model.layers.116.post_attention_layernorm.weight": "model-00063-of-00081.safetensors", "model.layers.64.post_attention_layernorm.weight": "model-00063-of-00081.safetensors", "model.layers.116.self_attn.k_proj.weight": "model-00063-of-00081.safetensors", "model.layers.64.self_attn.k_proj.weight": "model-00063-of-00081.safetensors", "model.layers.116.self_attn.o_proj.weight": "model-00063-of-00081.safetensors", "model.layers.64.self_attn.o_proj.weight": "model-00063-of-00081.safetensors", "model.layers.116.self_attn.q_proj.weight": "model-00063-of-00081.safetensors", "model.layers.64.self_attn.q_proj.weight": "model-00063-of-00081.safetensors", "model.layers.116.self_attn.v_proj.weight": "model-00063-of-00081.safetensors", 
"model.layers.64.self_attn.v_proj.weight": "model-00063-of-00081.safetensors", "model.layers.117.input_layernorm.weight": "model-00063-of-00081.safetensors", "model.layers.65.input_layernorm.weight": "model-00063-of-00081.safetensors", "model.layers.117.mlp.down_proj.weight": "model-00063-of-00081.safetensors", "model.layers.65.mlp.down_proj.weight": "model-00063-of-00081.safetensors", "model.layers.117.mlp.gate_proj.weight": "model-00064-of-00081.safetensors", "model.layers.65.mlp.gate_proj.weight": "model-00064-of-00081.safetensors", "model.layers.117.mlp.up_proj.weight": "model-00064-of-00081.safetensors", "model.layers.65.mlp.up_proj.weight": "model-00064-of-00081.safetensors", "model.layers.117.post_attention_layernorm.weight": "model-00064-of-00081.safetensors", "model.layers.65.post_attention_layernorm.weight": "model-00064-of-00081.safetensors", "model.layers.117.self_attn.k_proj.weight": "model-00064-of-00081.safetensors", "model.layers.65.self_attn.k_proj.weight": "model-00064-of-00081.safetensors", "model.layers.117.self_attn.o_proj.weight": "model-00064-of-00081.safetensors", "model.layers.65.self_attn.o_proj.weight": "model-00064-of-00081.safetensors", "model.layers.117.self_attn.q_proj.weight": "model-00064-of-00081.safetensors", "model.layers.65.self_attn.q_proj.weight": "model-00064-of-00081.safetensors", "model.layers.117.self_attn.v_proj.weight": "model-00064-of-00081.safetensors", "model.layers.65.self_attn.v_proj.weight": "model-00064-of-00081.safetensors", "model.layers.118.input_layernorm.weight": "model-00064-of-00081.safetensors", "model.layers.66.input_layernorm.weight": "model-00064-of-00081.safetensors", "model.layers.118.mlp.down_proj.weight": "model-00064-of-00081.safetensors", "model.layers.66.mlp.down_proj.weight": "model-00065-of-00081.safetensors", "model.layers.118.mlp.gate_proj.weight": "model-00065-of-00081.safetensors", "model.layers.66.mlp.gate_proj.weight": "model-00065-of-00081.safetensors", 
"model.layers.118.mlp.up_proj.weight": "model-00065-of-00081.safetensors", "model.layers.66.mlp.up_proj.weight": "model-00065-of-00081.safetensors", "model.layers.118.post_attention_layernorm.weight": "model-00065-of-00081.safetensors", "model.layers.66.post_attention_layernorm.weight": "model-00065-of-00081.safetensors", "model.layers.118.self_attn.k_proj.weight": "model-00065-of-00081.safetensors", "model.layers.66.self_attn.k_proj.weight": "model-00065-of-00081.safetensors", "model.layers.118.self_attn.o_proj.weight": "model-00065-of-00081.safetensors", "model.layers.66.self_attn.o_proj.weight": "model-00065-of-00081.safetensors", "model.layers.118.self_attn.q_proj.weight": "model-00065-of-00081.safetensors", "model.layers.66.self_attn.q_proj.weight": "model-00065-of-00081.safetensors", "model.layers.118.self_attn.v_proj.weight": "model-00065-of-00081.safetensors", "model.layers.66.self_attn.v_proj.weight": "model-00065-of-00081.safetensors", "model.layers.119.input_layernorm.weight": "model-00065-of-00081.safetensors", "model.layers.67.input_layernorm.weight": "model-00065-of-00081.safetensors", "model.layers.119.mlp.down_proj.weight": "model-00066-of-00081.safetensors", "model.layers.67.mlp.down_proj.weight": "model-00066-of-00081.safetensors", "model.layers.119.mlp.gate_proj.weight": "model-00066-of-00081.safetensors", "model.layers.67.mlp.gate_proj.weight": "model-00066-of-00081.safetensors", "model.layers.119.mlp.up_proj.weight": "model-00066-of-00081.safetensors", "model.layers.67.mlp.up_proj.weight": "model-00066-of-00081.safetensors", "model.layers.119.post_attention_layernorm.weight": "model-00066-of-00081.safetensors", "model.layers.67.post_attention_layernorm.weight": "model-00066-of-00081.safetensors", "model.layers.119.self_attn.k_proj.weight": "model-00066-of-00081.safetensors", "model.layers.67.self_attn.k_proj.weight": "model-00066-of-00081.safetensors", "model.layers.119.self_attn.o_proj.weight": "model-00066-of-00081.safetensors", 
"model.layers.67.self_attn.o_proj.weight": "model-00066-of-00081.safetensors", "model.layers.119.self_attn.q_proj.weight": "model-00067-of-00081.safetensors", "model.layers.67.self_attn.q_proj.weight": "model-00067-of-00081.safetensors", "model.layers.119.self_attn.v_proj.weight": "model-00067-of-00081.safetensors", "model.layers.67.self_attn.v_proj.weight": "model-00067-of-00081.safetensors", "model.layers.120.input_layernorm.weight": "model-00067-of-00081.safetensors", "model.layers.68.input_layernorm.weight": "model-00067-of-00081.safetensors", "model.layers.120.mlp.down_proj.weight": "model-00067-of-00081.safetensors", "model.layers.68.mlp.down_proj.weight": "model-00067-of-00081.safetensors", "model.layers.120.mlp.gate_proj.weight": "model-00067-of-00081.safetensors", "model.layers.68.mlp.gate_proj.weight": "model-00067-of-00081.safetensors", "model.layers.120.mlp.up_proj.weight": "model-00067-of-00081.safetensors", "model.layers.68.mlp.up_proj.weight": "model-00067-of-00081.safetensors", "model.layers.120.post_attention_layernorm.weight": "model-00067-of-00081.safetensors", "model.layers.68.post_attention_layernorm.weight": "model-00067-of-00081.safetensors", "model.layers.120.self_attn.k_proj.weight": "model-00067-of-00081.safetensors", "model.layers.68.self_attn.k_proj.weight": "model-00067-of-00081.safetensors", "model.layers.120.self_attn.o_proj.weight": "model-00068-of-00081.safetensors", "model.layers.68.self_attn.o_proj.weight": "model-00068-of-00081.safetensors", "model.layers.120.self_attn.q_proj.weight": "model-00068-of-00081.safetensors", "model.layers.68.self_attn.q_proj.weight": "model-00068-of-00081.safetensors", "model.layers.120.self_attn.v_proj.weight": "model-00068-of-00081.safetensors", "model.layers.68.self_attn.v_proj.weight": "model-00068-of-00081.safetensors", "model.layers.121.input_layernorm.weight": "model-00068-of-00081.safetensors", "model.layers.69.input_layernorm.weight": "model-00068-of-00081.safetensors", 
"model.layers.121.mlp.down_proj.weight": "model-00068-of-00081.safetensors", "model.layers.69.mlp.down_proj.weight": "model-00068-of-00081.safetensors", "model.layers.121.mlp.gate_proj.weight": "model-00068-of-00081.safetensors", "model.layers.69.mlp.gate_proj.weight": "model-00068-of-00081.safetensors", "model.layers.121.mlp.up_proj.weight": "model-00068-of-00081.safetensors", "model.layers.69.mlp.up_proj.weight": "model-00069-of-00081.safetensors", "model.layers.121.post_attention_layernorm.weight": "model-00069-of-00081.safetensors", "model.layers.69.post_attention_layernorm.weight": "model-00069-of-00081.safetensors", "model.layers.121.self_attn.k_proj.weight": "model-00069-of-00081.safetensors", "model.layers.69.self_attn.k_proj.weight": "model-00069-of-00081.safetensors", "model.layers.121.self_attn.o_proj.weight": "model-00069-of-00081.safetensors", "model.layers.69.self_attn.o_proj.weight": "model-00069-of-00081.safetensors", "model.layers.121.self_attn.q_proj.weight": "model-00069-of-00081.safetensors", "model.layers.69.self_attn.q_proj.weight": "model-00069-of-00081.safetensors", "model.layers.121.self_attn.v_proj.weight": "model-00069-of-00081.safetensors", "model.layers.69.self_attn.v_proj.weight": "model-00069-of-00081.safetensors", "model.layers.7.input_layernorm.weight": "model-00069-of-00081.safetensors", "model.layers.7.mlp.down_proj.weight": "model-00069-of-00081.safetensors", "model.layers.7.mlp.gate_proj.weight": "model-00069-of-00081.safetensors", "model.layers.7.mlp.up_proj.weight": "model-00069-of-00081.safetensors", "model.layers.7.post_attention_layernorm.weight": "model-00069-of-00081.safetensors", "model.layers.7.self_attn.k_proj.weight": "model-00069-of-00081.safetensors", "model.layers.7.self_attn.o_proj.weight": "model-00069-of-00081.safetensors", "model.layers.7.self_attn.q_proj.weight": "model-00069-of-00081.safetensors", "model.layers.7.self_attn.v_proj.weight": "model-00069-of-00081.safetensors", 
"model.layers.122.input_layernorm.weight": "model-00069-of-00081.safetensors", "model.layers.122.mlp.down_proj.weight": "model-00070-of-00081.safetensors", "model.layers.122.mlp.gate_proj.weight": "model-00070-of-00081.safetensors", "model.layers.122.mlp.up_proj.weight": "model-00070-of-00081.safetensors", "model.layers.122.post_attention_layernorm.weight": "model-00070-of-00081.safetensors", "model.layers.122.self_attn.k_proj.weight": "model-00070-of-00081.safetensors", "model.layers.122.self_attn.o_proj.weight": "model-00070-of-00081.safetensors", "model.layers.122.self_attn.q_proj.weight": "model-00070-of-00081.safetensors", "model.layers.122.self_attn.v_proj.weight": "model-00070-of-00081.safetensors", "model.layers.123.input_layernorm.weight": "model-00070-of-00081.safetensors", "model.layers.123.mlp.down_proj.weight": "model-00070-of-00081.safetensors", "model.layers.123.mlp.gate_proj.weight": "model-00070-of-00081.safetensors", "model.layers.123.mlp.up_proj.weight": "model-00070-of-00081.safetensors", "model.layers.123.post_attention_layernorm.weight": "model-00070-of-00081.safetensors", "model.layers.123.self_attn.k_proj.weight": "model-00070-of-00081.safetensors", "model.layers.123.self_attn.o_proj.weight": "model-00071-of-00081.safetensors", "model.layers.123.self_attn.q_proj.weight": "model-00071-of-00081.safetensors", "model.layers.123.self_attn.v_proj.weight": "model-00071-of-00081.safetensors", "model.layers.124.input_layernorm.weight": "model-00071-of-00081.safetensors", "model.layers.124.mlp.down_proj.weight": "model-00071-of-00081.safetensors", "model.layers.124.mlp.gate_proj.weight": "model-00071-of-00081.safetensors", "model.layers.124.mlp.up_proj.weight": "model-00071-of-00081.safetensors", "model.layers.124.post_attention_layernorm.weight": "model-00071-of-00081.safetensors", "model.layers.124.self_attn.k_proj.weight": "model-00071-of-00081.safetensors", "model.layers.124.self_attn.o_proj.weight": "model-00071-of-00081.safetensors", 
"model.layers.124.self_attn.q_proj.weight": "model-00071-of-00081.safetensors", "model.layers.124.self_attn.v_proj.weight": "model-00071-of-00081.safetensors", "model.layers.125.input_layernorm.weight": "model-00071-of-00081.safetensors", "model.layers.125.mlp.down_proj.weight": "model-00071-of-00081.safetensors", "model.layers.125.mlp.gate_proj.weight": "model-00071-of-00081.safetensors", "model.layers.125.mlp.up_proj.weight": "model-00072-of-00081.safetensors", "model.layers.125.post_attention_layernorm.weight": "model-00072-of-00081.safetensors", "model.layers.125.self_attn.k_proj.weight": "model-00072-of-00081.safetensors", "model.layers.125.self_attn.o_proj.weight": "model-00072-of-00081.safetensors", "model.layers.125.self_attn.q_proj.weight": "model-00072-of-00081.safetensors", "model.layers.125.self_attn.v_proj.weight": "model-00072-of-00081.safetensors", "model.layers.126.input_layernorm.weight": "model-00072-of-00081.safetensors", "model.layers.126.mlp.down_proj.weight": "model-00072-of-00081.safetensors", "model.layers.126.mlp.gate_proj.weight": "model-00072-of-00081.safetensors", "model.layers.126.mlp.up_proj.weight": "model-00072-of-00081.safetensors", "model.layers.126.post_attention_layernorm.weight": "model-00072-of-00081.safetensors", "model.layers.126.self_attn.k_proj.weight": "model-00072-of-00081.safetensors", "model.layers.126.self_attn.o_proj.weight": "model-00072-of-00081.safetensors", "model.layers.126.self_attn.q_proj.weight": "model-00072-of-00081.safetensors", "model.layers.126.self_attn.v_proj.weight": "model-00072-of-00081.safetensors", "model.layers.127.input_layernorm.weight": "model-00072-of-00081.safetensors", "model.layers.127.mlp.down_proj.weight": "model-00072-of-00081.safetensors", "model.layers.127.mlp.gate_proj.weight": "model-00073-of-00081.safetensors", "model.layers.127.mlp.up_proj.weight": "model-00073-of-00081.safetensors", "model.layers.127.post_attention_layernorm.weight": "model-00073-of-00081.safetensors", 
"model.layers.127.self_attn.k_proj.weight": "model-00073-of-00081.safetensors", "model.layers.127.self_attn.o_proj.weight": "model-00073-of-00081.safetensors", "model.layers.127.self_attn.q_proj.weight": "model-00073-of-00081.safetensors", "model.layers.127.self_attn.v_proj.weight": "model-00073-of-00081.safetensors", "model.layers.128.input_layernorm.weight": "model-00073-of-00081.safetensors", "model.layers.128.mlp.down_proj.weight": "model-00073-of-00081.safetensors", "model.layers.128.mlp.gate_proj.weight": "model-00073-of-00081.safetensors", "model.layers.128.mlp.up_proj.weight": "model-00073-of-00081.safetensors", "model.layers.128.post_attention_layernorm.weight": "model-00073-of-00081.safetensors", "model.layers.128.self_attn.k_proj.weight": "model-00073-of-00081.safetensors", "model.layers.128.self_attn.o_proj.weight": "model-00073-of-00081.safetensors", "model.layers.128.self_attn.q_proj.weight": "model-00073-of-00081.safetensors", "model.layers.128.self_attn.v_proj.weight": "model-00073-of-00081.safetensors", "model.layers.129.input_layernorm.weight": "model-00073-of-00081.safetensors", "model.layers.129.mlp.down_proj.weight": "model-00074-of-00081.safetensors", "model.layers.129.mlp.gate_proj.weight": "model-00074-of-00081.safetensors", "model.layers.129.mlp.up_proj.weight": "model-00074-of-00081.safetensors", "model.layers.129.post_attention_layernorm.weight": "model-00074-of-00081.safetensors", "model.layers.129.self_attn.k_proj.weight": "model-00074-of-00081.safetensors", "model.layers.129.self_attn.o_proj.weight": "model-00074-of-00081.safetensors", "model.layers.129.self_attn.q_proj.weight": "model-00074-of-00081.safetensors", "model.layers.129.self_attn.v_proj.weight": "model-00074-of-00081.safetensors", "model.layers.130.input_layernorm.weight": "model-00074-of-00081.safetensors", "model.layers.130.mlp.down_proj.weight": "model-00074-of-00081.safetensors", "model.layers.130.mlp.gate_proj.weight": "model-00074-of-00081.safetensors", 
"model.layers.130.mlp.up_proj.weight": "model-00074-of-00081.safetensors", "model.layers.130.post_attention_layernorm.weight": "model-00074-of-00081.safetensors", "model.layers.130.self_attn.k_proj.weight": "model-00074-of-00081.safetensors", "model.layers.130.self_attn.o_proj.weight": "model-00075-of-00081.safetensors", "model.layers.130.self_attn.q_proj.weight": "model-00075-of-00081.safetensors", "model.layers.130.self_attn.v_proj.weight": "model-00075-of-00081.safetensors", "model.layers.131.input_layernorm.weight": "model-00075-of-00081.safetensors", "model.layers.131.mlp.down_proj.weight": "model-00075-of-00081.safetensors", "model.layers.131.mlp.gate_proj.weight": "model-00075-of-00081.safetensors", "model.layers.131.mlp.up_proj.weight": "model-00075-of-00081.safetensors", "model.layers.131.post_attention_layernorm.weight": "model-00075-of-00081.safetensors", "model.layers.131.self_attn.k_proj.weight": "model-00075-of-00081.safetensors", "model.layers.131.self_attn.o_proj.weight": "model-00075-of-00081.safetensors", "model.layers.131.self_attn.q_proj.weight": "model-00075-of-00081.safetensors", "model.layers.131.self_attn.v_proj.weight": "model-00075-of-00081.safetensors", "model.layers.8.input_layernorm.weight": "model-00075-of-00081.safetensors", "model.layers.8.mlp.down_proj.weight": "model-00075-of-00081.safetensors", "model.layers.8.mlp.gate_proj.weight": "model-00075-of-00081.safetensors", "model.layers.8.mlp.up_proj.weight": "model-00076-of-00081.safetensors", "model.layers.8.post_attention_layernorm.weight": "model-00076-of-00081.safetensors", "model.layers.8.self_attn.k_proj.weight": "model-00076-of-00081.safetensors", "model.layers.8.self_attn.o_proj.weight": "model-00076-of-00081.safetensors", "model.layers.8.self_attn.q_proj.weight": "model-00076-of-00081.safetensors", "model.layers.8.self_attn.v_proj.weight": "model-00076-of-00081.safetensors", "model.layers.132.input_layernorm.weight": "model-00076-of-00081.safetensors", 
"model.layers.132.mlp.down_proj.weight": "model-00076-of-00081.safetensors", "model.layers.132.mlp.gate_proj.weight": "model-00076-of-00081.safetensors", "model.layers.132.mlp.up_proj.weight": "model-00076-of-00081.safetensors", "model.layers.132.post_attention_layernorm.weight": "model-00076-of-00081.safetensors", "model.layers.132.self_attn.k_proj.weight": "model-00076-of-00081.safetensors", "model.layers.132.self_attn.o_proj.weight": "model-00076-of-00081.safetensors", "model.layers.132.self_attn.q_proj.weight": "model-00076-of-00081.safetensors", "model.layers.132.self_attn.v_proj.weight": "model-00076-of-00081.safetensors", "model.layers.133.input_layernorm.weight": "model-00076-of-00081.safetensors", "model.layers.133.mlp.down_proj.weight": "model-00076-of-00081.safetensors", "model.layers.133.mlp.gate_proj.weight": "model-00077-of-00081.safetensors", "model.layers.133.mlp.up_proj.weight": "model-00077-of-00081.safetensors", "model.layers.133.post_attention_layernorm.weight": "model-00077-of-00081.safetensors", "model.layers.133.self_attn.k_proj.weight": "model-00077-of-00081.safetensors", "model.layers.133.self_attn.o_proj.weight": "model-00077-of-00081.safetensors", "model.layers.133.self_attn.q_proj.weight": "model-00077-of-00081.safetensors", "model.layers.133.self_attn.v_proj.weight": "model-00077-of-00081.safetensors", "model.layers.134.input_layernorm.weight": "model-00077-of-00081.safetensors", "model.layers.134.mlp.down_proj.weight": "model-00077-of-00081.safetensors", "model.layers.134.mlp.gate_proj.weight": "model-00077-of-00081.safetensors", "model.layers.134.mlp.up_proj.weight": "model-00077-of-00081.safetensors", "model.layers.134.post_attention_layernorm.weight": "model-00077-of-00081.safetensors", "model.layers.134.self_attn.k_proj.weight": "model-00077-of-00081.safetensors", "model.layers.134.self_attn.o_proj.weight": "model-00077-of-00081.safetensors", "model.layers.134.self_attn.q_proj.weight": "model-00077-of-00081.safetensors", 
"model.layers.134.self_attn.v_proj.weight": "model-00077-of-00081.safetensors", "model.layers.135.input_layernorm.weight": "model-00077-of-00081.safetensors", "model.layers.135.mlp.down_proj.weight": "model-00078-of-00081.safetensors", "model.layers.135.mlp.gate_proj.weight": "model-00078-of-00081.safetensors", "model.layers.135.mlp.up_proj.weight": "model-00078-of-00081.safetensors", "model.layers.135.post_attention_layernorm.weight": "model-00078-of-00081.safetensors", "model.layers.135.self_attn.k_proj.weight": "model-00078-of-00081.safetensors", "model.layers.135.self_attn.o_proj.weight": "model-00078-of-00081.safetensors", "model.layers.135.self_attn.q_proj.weight": "model-00078-of-00081.safetensors", "model.layers.135.self_attn.v_proj.weight": "model-00078-of-00081.safetensors", "model.layers.136.input_layernorm.weight": "model-00078-of-00081.safetensors", "model.layers.136.mlp.down_proj.weight": "model-00078-of-00081.safetensors", "model.layers.136.mlp.gate_proj.weight": "model-00078-of-00081.safetensors", "model.layers.136.mlp.up_proj.weight": "model-00078-of-00081.safetensors", "model.layers.136.post_attention_layernorm.weight": "model-00078-of-00081.safetensors", "model.layers.136.self_attn.k_proj.weight": "model-00078-of-00081.safetensors", "model.layers.136.self_attn.o_proj.weight": "model-00079-of-00081.safetensors", "model.layers.136.self_attn.q_proj.weight": "model-00079-of-00081.safetensors", "model.layers.136.self_attn.v_proj.weight": "model-00079-of-00081.safetensors", "model.layers.137.input_layernorm.weight": "model-00079-of-00081.safetensors", "model.layers.137.mlp.down_proj.weight": "model-00079-of-00081.safetensors", "model.layers.137.mlp.gate_proj.weight": "model-00079-of-00081.safetensors", "model.layers.137.mlp.up_proj.weight": "model-00079-of-00081.safetensors", "model.layers.137.post_attention_layernorm.weight": "model-00079-of-00081.safetensors", "model.layers.137.self_attn.k_proj.weight": "model-00079-of-00081.safetensors", 
"model.layers.137.self_attn.o_proj.weight": "model-00079-of-00081.safetensors", "model.layers.137.self_attn.q_proj.weight": "model-00079-of-00081.safetensors", "model.layers.137.self_attn.v_proj.weight": "model-00079-of-00081.safetensors", "model.layers.138.input_layernorm.weight": "model-00079-of-00081.safetensors", "model.layers.138.mlp.down_proj.weight": "model-00079-of-00081.safetensors", "model.layers.138.mlp.gate_proj.weight": "model-00079-of-00081.safetensors", "model.layers.138.mlp.up_proj.weight": "model-00080-of-00081.safetensors", "model.layers.138.post_attention_layernorm.weight": "model-00080-of-00081.safetensors", "model.layers.138.self_attn.k_proj.weight": "model-00080-of-00081.safetensors", "model.layers.138.self_attn.o_proj.weight": "model-00080-of-00081.safetensors", "model.layers.138.self_attn.q_proj.weight": "model-00080-of-00081.safetensors", "model.layers.138.self_attn.v_proj.weight": "model-00080-of-00081.safetensors", "model.layers.139.input_layernorm.weight": "model-00080-of-00081.safetensors", "model.layers.139.mlp.down_proj.weight": "model-00080-of-00081.safetensors", "model.layers.139.mlp.gate_proj.weight": "model-00080-of-00081.safetensors", "model.layers.139.mlp.up_proj.weight": "model-00080-of-00081.safetensors", "model.layers.139.post_attention_layernorm.weight": "model-00080-of-00081.safetensors", "model.layers.139.self_attn.k_proj.weight": "model-00080-of-00081.safetensors", "model.layers.139.self_attn.o_proj.weight": "model-00080-of-00081.safetensors", "model.layers.139.self_attn.q_proj.weight": "model-00080-of-00081.safetensors", "model.layers.139.self_attn.v_proj.weight": "model-00080-of-00081.safetensors", "model.layers.9.input_layernorm.weight": "model-00080-of-00081.safetensors", "model.layers.9.mlp.down_proj.weight": "model-00080-of-00081.safetensors", "model.layers.9.mlp.gate_proj.weight": "model-00081-of-00081.safetensors", "model.layers.9.mlp.up_proj.weight": "model-00081-of-00081.safetensors", 
"model.layers.9.post_attention_layernorm.weight": "model-00081-of-00081.safetensors", "model.layers.9.self_attn.k_proj.weight": "model-00081-of-00081.safetensors", "model.layers.9.self_attn.o_proj.weight": "model-00081-of-00081.safetensors", "model.layers.9.self_attn.q_proj.weight": "model-00081-of-00081.safetensors", "model.layers.9.self_attn.v_proj.weight": "model-00081-of-00081.safetensors", "model.norm.weight": "model-00081-of-00081.safetensors"}}
tokenizer.json
ADDED
The diff for this file is too large to render.
See raw diff
tokenizer.model
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:59f95e28944c062244741268596badc900df86c7f5ded05088d2da22a7379e06
+size 587583
tokenizer_config.json
ADDED
The diff for this file is too large to render.
See raw diff