---
license: mit
language:
- en
pipeline_tag: text-generation
tags:
- moe
---

# FusionNet_7Bx2_MoE_v0.1

Fine-tuned model on the English language using the MoE method. An improved version of FusionNet_7Bx2_MoE_14B.

## Model description

FusionNet_7Bx2_MoE_v0.1 is an experiment with the MoE (Mixture of Experts) method, which can significantly improve the performance of the original model. FusionNet has 12.9B parameters, and this version is fine-tuned. Enjoy!
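Since the card's `pipeline_tag` is `text-generation`, a minimal usage sketch with the Hugging Face `transformers` library might look like the following. Note that the repo id `FusionNet_7Bx2_MoE_v0.1` below is a placeholder assumption, not the verified Hub path; substitute the model's actual repository name.

```python
# Minimal sketch: load the model for text generation with transformers.
# NOTE: the repo id is a placeholder -- replace it with the model's
# actual Hugging Face Hub path before running.
from transformers import AutoModelForCausalLM, AutoTokenizer


def generate_text(prompt: str, model_id: str = "FusionNet_7Bx2_MoE_v0.1") -> str:
    """Generate a completion for `prompt` with the fine-tuned MoE model."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # device_map="auto" spreads the 12.9B parameters across available devices.
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Loading a model of this size requires substantial GPU memory; quantized loading (e.g. `load_in_4bit` via `bitsandbytes`) is a common workaround on smaller hardware.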