---
license: apache-2.0
---
[Here](https://colab.research.google.com/drive/1xIfIVafnlCP2XVICmRwkUFK3cwTJYjCY#scrollTo=bG1ed2WQfoU0) is an inference demo on Colab.
Please note that openmoe-base is only a small, debugging-focused model and has not been extensively trained; its output is often invalid.
If you want decent performance on question answering, please try [openmoe-8b](https://huggingface.co/OrionZheng/openmoe-8B-chat) (with compute comparable to a 1.6B LLaMA), or the upcoming openmoe-34B version.