---
license: llama2
language:
- en
pipeline_tag: conversational
tags:
- Xwin
- Euryale 1.3
- Platypus2
- WinterGoddess
- frankenmerge
- dare
- ties
- 90b
---
# BigWeave v12 90B

<img src="https://cdn-uploads.huggingface.co/production/uploads/65a6db055c58475cf9e6def1/4CbbAN-X7ZWj702JrcCGH.png" width=600>

The BigWeave models aim to identify merge settings equaling or surpassing the performance of Goliath-120b. The version number merely tracks various attempts and is not a quality indicator. Only results demonstrating good performance are retained and shared.

This version is a DARE-TIES merge of two passthrough merges: Xwin-LM-70b-v0.1 + Euryale-1.3-70b ([BigWeave v6](https://huggingface.co/llmixer/BigWeave-v6-90b)) and Platypus2-70b-instruct + WinterGoddess-1.4x-70b ([BigWeave v8](https://huggingface.co/llmixer/BigWeave-v8-90b)). Both models individually show strong performance, and the merged model achieves even lower perplexity than either model on its own.

The 90b size allows 4bit quants to fit into 48GB of VRAM: at roughly 4 bits per weight, ~90B parameters come to about 45GB, leaving a few gigabytes of headroom for the cache and buffers.

# Prompting Format
Vicuna and Alpaca.
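
For reference, these follow the commonly used Vicuna v1.1 and Alpaca conventions; the exact system prompt below is a suggestion, not something fixed by this model:

```
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: {prompt} ASSISTANT:
```

```
### Instruction:
{prompt}

### Response:
```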

# Merge process
The models used in the merge are [Xwin-LM-70b-v0.1](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1), [Euryale-1.3-70b](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B), [Platypus2-70b-instruct](https://huggingface.co/garage-bAInd/Platypus2-70B-instruct) and [WinterGoddess-1.4x-70b](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2).

Merge configuration:
```
slices:
- sources:
  - model: Xwin-LM/Xwin-LM-70B-V0.1
    layer_range: [0,12]
- sources:
  - model: Sao10K/Euryale-1.3-L2-70B
    layer_range: [9,14]
- sources:
  - model: Xwin-LM/Xwin-LM-70B-V0.1
    layer_range: [12,62]
- sources:
  - model: Sao10K/Euryale-1.3-L2-70B
    layer_range: [54,71]
- sources:
  - model: Xwin-LM/Xwin-LM-70B-V0.1
    layer_range: [62,80]
merge_method: passthrough
dtype: float16
---
slices:
- sources:
  - model: garage-bAInd/Platypus2-70B-instruct
    layer_range: [0,12]
- sources:
  - model: Sao10K/WinterGoddess-1.4x-70B-L2
    layer_range: [9,14]
- sources:
  - model: garage-bAInd/Platypus2-70B-instruct
    layer_range: [12,62]
- sources:
  - model: Sao10K/WinterGoddess-1.4x-70B-L2
    layer_range: [54,71]
- sources:
  - model: garage-bAInd/Platypus2-70B-instruct
    layer_range: [62,80]
merge_method: passthrough
dtype: float16
---
models:
- model: llmixer/BigWeave-v8-90b
  parameters:
    weight: 0.5
    density: 0.25
merge_method: dare_ties
base_model: llmixer/BigWeave-v6-90b
dtype: float16
```
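
The three YAML documents above are separate merge steps: the first two passthrough merges produce BigWeave v6 and v8, and the final DARE-TIES step combines them. A minimal sketch of reproducing this with mergekit, assuming a current install and hypothetical file names (one YAML document per file):

```
# Hypothetical file names; each document above goes in its own config file.
mergekit-yaml bigweave-v6.yml ./BigWeave-v6-90b --cuda
mergekit-yaml bigweave-v8.yml ./BigWeave-v8-90b --cuda
mergekit-yaml bigweave-v12.yml ./BigWeave-v12-90b --cuda
```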

# Acknowledgements
[@Xwin-LM](https://huggingface.co/Xwin-LM) for creating Xwin.

[@Sao10K](https://huggingface.co/Sao10K) for creating Euryale and WinterGoddess.

[@garage-bAInd](https://huggingface.co/garage-bAInd) for creating Platypus2.

[@alpindale](https://huggingface.co/alpindale) for creating the original Goliath.

[@chargoddard](https://huggingface.co/chargoddard) for developing [mergekit](https://github.com/cg123/mergekit).