Update README.md
README.md CHANGED
@@ -9,7 +9,7 @@ First experimental merge of Noromaid 8x7b (Instruct) and dolphin 8x7b. The idea
Merged Dolphin 2.7 with Mixtral Base (Dolphin was at 1.0 weight) to get rid of ChatLM, and then I merged Noromaid 8x7b with the output, SLERP method.

-This model feels better on the IQ chart and has about the same average score as Noromaid 8x7b, but it's also softer and more prudish, and it has the typical Mixtral repetition issue at some point. Choose your poison.
+This model feels better on the IQ chart and has about the same average ERP score on Ayumi's bench as Noromaid 8x7b, but it's also softer and more prudish, and it has the typical Mixtral repetition issue at some point. Choose your poison.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/63ab1241ad514ca8d1430003/uZlU0PEPtKPZPLzXcoqJ_.png)
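For reference, the SLERP step named in the description above can be sketched as spherical linear interpolation between two weight tensors. This is only an illustration, not the tooling actually used for this merge: the layer tensors, shapes, and the 0.5 blend factor below are assumptions, and a real merge would run over every tensor in both checkpoints.

```python
# Minimal sketch of SLERP (spherical linear interpolation) between two
# model weight tensors. Illustration only; tensor names, shapes, and the
# 0.5 blend factor are assumptions, not the settings used for this model.
import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors.

    t = 0.0 returns w_a, t = 1.0 returns w_b.
    """
    a = w_a.flatten().float()
    b = w_b.flatten().float()
    a_norm = a / (a.norm() + eps)
    b_norm = b / (b.norm() + eps)
    # Angle between the two weight vectors on the unit sphere.
    omega = torch.arccos(torch.clamp(a_norm @ b_norm, -1.0, 1.0))
    so = torch.sin(omega)
    if so.abs() < eps:
        # Nearly colinear vectors: fall back to plain linear interpolation.
        out = (1.0 - t) * a + t * b
    else:
        out = (torch.sin((1.0 - t) * omega) / so) * a + (torch.sin(t * omega) / so) * b
    return out.reshape(w_a.shape).to(w_a.dtype)

# Example: blend one layer from the Dolphin-on-base model with the matching
# Noromaid 8x7b layer at t = 0.5 (random tensors stand in for real checkpoints).
dolphin_layer = torch.randn(4096, 4096)
noromaid_layer = torch.randn(4096, 4096)
merged_layer = slerp(dolphin_layer, noromaid_layer, t=0.5)
```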