Update README.md
The config looks like this... (the detailed version is in the files and versions):
- [macadeliccc/WestLake-7B-v2-laser-truthy-dpo](https://huggingface.co/macadeliccc/WestLake-7B-v2-laser-truthy-dpo) - expert #3
- [Kquant03/Samlagast-7B-laser-bf16](https://huggingface.co/Kquant03/Triunvirato-7b-laser) - expert #4
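The detailed config lives in the repo's files and versions. Purely for illustration, a mergekit-moe config for a model assembled this way generally takes the following shape; the base model and positive prompts below are placeholders, not this model's actual settings:

```yaml
# Hypothetical sketch of a mergekit-moe config -- placeholder values only;
# see the repo's files and versions for the real one.
base_model: path/to/base-model          # placeholder base model
gate_mode: hidden                       # gates initialised from hidden states of the prompts
dtype: bfloat16
experts:
  - source_model: macadeliccc/WestLake-7B-v2-laser-truthy-dpo
    positive_prompts:
      - "placeholder prompt describing this expert's specialty"
  - source_model: Kquant03/Samlagast-7B-laser-bf16
    positive_prompts:
      - "placeholder prompt describing this expert's specialty"
```

The `positive_prompts` steer which expert the router prefers for a given kind of input.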

# Huge improvement upon the base Buttercup model!

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/ZjNod8J9bmnhL9mM4znQv.png)

# Rank 2 in the world for Roleplay.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/L1AwFoaVbN-bO3CkuqW5Z.png)

# "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"

### (from the MistralAI papers... click the quoted question above to navigate to it directly.)
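The linked blog post covers the idea in depth. As a rough toy illustration only (this is not this model's actual routing code; the function names and stand-in experts are made up), a sparse top-k MoE layer scores every expert with a learned gate, runs only the k highest-scoring experts, and mixes their outputs with softmax weights:

```python
import math
import random

random.seed(0)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(xs):
    # Numerically stable softmax over a small list of scores.
    m = max(xs)
    es = [math.exp(v - m) for v in xs]
    s = sum(es)
    return [e / s for e in es]

def moe_forward(x, gate, experts, k=2):
    """Sparse MoE layer: score every expert, keep the top-k,
    and mix only those experts' outputs by softmax weights."""
    scores = [dot(g, x) for g in gate]                  # one gate score per expert
    top = sorted(range(len(scores)), key=scores.__getitem__)[-k:]
    weights = softmax([scores[i] for i in top])         # renormalise over the top-k
    out = [0.0] * len(x)
    for w, i in zip(weights, top):                      # only k experts actually run
        for j, v in enumerate(experts[i](x)):
            out[j] += w * v
    return out

d, n_experts = 4, 4
gate = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n_experts)]
# Toy experts: each just scales the input (stand-ins for full FFN blocks).
experts = [lambda x, s=s: [s * v for v in x] for s in (0.5, 1.0, 1.5, 2.0)]

y = moe_forward([1.0, -2.0, 0.5, 3.0], gate, experts, k=2)
print(len(y))  # 4
```

Only the selected experts execute per token, which is why an MoE can hold many experts' worth of parameters while paying roughly the compute cost of the k active ones.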