---
license: apache-2.0
---

Here is an inference demo on Colab.
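If you want to try the model outside Colab, below is a minimal sketch of how it might be loaded; the repo id `OrionZheng/openmoe-base` and the use of `AutoModelForCausalLM` with `trust_remote_code=True` are assumptions here, so please refer to the Colab demo for the authoritative usage.

```python
# Minimal inference sketch (assumption: the checkpoint loads via the Hugging Face
# transformers API with trust_remote_code=True under the repo id OrionZheng/openmoe-base;
# see the Colab demo for the maintained example).
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "OrionZheng/openmoe-base"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short continuation; expect rough output, since this base model
# is a debugging checkpoint and has not been extensively trained.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```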

Please note that openmoe-base is only a small, debugging-focused model and has not been extensively trained, so its output is often invalid.
If you want decent performance on QA tasks, please try openmoe-8b (with compute comparable to a 1.6B LLaMA), or the openmoe-34B version to be released later.