OrionZheng committed
Commit 306c455
1 Parent(s): ec74ec4
Update README.md
README.md CHANGED
@@ -1,7 +1,7 @@
 ---
 license: apache-2.0
 ---
-[Here](https://colab.research.google.com/drive/1xIfIVafnlCP2XVICmRwkUFK3cwTJYjCY#scrollTo=bG1ed2WQfoU0) is
+[Here](https://colab.research.google.com/drive/1xIfIVafnlCP2XVICmRwkUFK3cwTJYjCY#scrollTo=bG1ed2WQfoU0) is an inference demo on Colab.
 
 Please note that openmoe-base is only a small, debugging-focused model and hasn't been extensively trained. Oftentimes, its output is not valid.
 If you want to see a decent performance on QAs, please try the [openmoe-8b](https://huggingface.co/OrionZheng/openmoe-8B-chat)(with compute comparable to a 1.6B LLaMA), or the openmoe-34B version later.
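The updated README points readers to the Colab notebook for inference. For readers skimming the diff, below is a minimal local sketch of the same idea, assuming the openmoe-8B-chat checkpoint can be loaded through the standard `transformers` `AutoModelForCausalLM` path with `trust_remote_code=True` (an assumption; the linked Colab demo is the authoritative example).

```python
# Hypothetical local equivalent of the linked Colab inference demo.
# Assumes the checkpoint ships a custom modeling file loadable via
# trust_remote_code=True; the Colab notebook is the reference workflow.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OrionZheng/openmoe-8B-chat"  # from the README link above

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # custom MoE architecture (assumption)
    device_map="auto",
)

prompt = "Question: What is a mixture-of-experts model?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```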