Update README.md
README.md
CHANGED
@@ -28,7 +28,7 @@ The config looks like this...(detailed version is in the files and versions):
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/f2kxGJNw2iVBExYHiydB0.png)

 # Rank 3 in the world for Roleplay.

-![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/w4-E6jF1nwd_HAO96Ldko.png)

 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"

 ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
|