Update README.md
README.md CHANGED
@@ -4,6 +4,9 @@
 
 # Yi-based MoE 2x34B with Mixtral architecture
 
+Highest-scoring model on the Open LLM Leaderboard (2024-01-11):
+* [Average Score 76.72](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+
 This is an English & Chinese MoE model, slightly different from [cloudyu/Mixtral_34Bx2_MoE_60B](https://huggingface.co/cloudyu/Mixtral_34Bx2_MoE_60B), and also based on:
 * [jondurbin/bagel-dpo-34b-v0.2]
 * [SUSTech/SUS-Chat-34B]
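
A minimal usage sketch may help readers of the card. This is an assumption-laden illustration, not part of the commit: the repo id below is a placeholder (the diff does not state the model's own id), and the dtype/device settings are ordinary `transformers` conventions for a checkpoint of this size, not instructions from the authors.

```python
# Minimal sketch of loading a 2x34B Mixtral-style MoE checkpoint with
# Hugging Face transformers. The repo id is a PLACEHOLDER (assumption);
# substitute the actual model id for this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cloudyu/Yi-MoE-2x34B"  # hypothetical repo id, replace with the real one

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~60B parameters total: expect multi-GPU memory needs
    device_map="auto",           # let accelerate shard the weights across devices
)

# The model is bilingual (English & Chinese), so prompts in either language work.
prompt = "Write a short poem about the sea."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```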