Kquant03 committed
Commit 2fd7c43
1 Parent(s): 594e888

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -28,7 +28,7 @@ The config looks like this...(detailed version is in the files and versions):
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/f2kxGJNw2iVBExYHiydB0.png)

 # Rank 3 in the world for Roleplay.
-![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/L1AwFoaVbN-bO3CkuqW5Z.png)
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/w4-E6jF1nwd_HAO96Ldko.png)

 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
 ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)