---
license: bigcode-openrail-m
---
# OpenMoE
A family of open-source Mixture-of-Experts (MoE) large language models.
See the [OpenMoE GitHub repository](https://github.com/XueFuzhao/OpenMoE/tree/main) for detailed information, including training details and evaluation results.
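
If the checkpoint is published in a Transformers-compatible format, loading it might look like the minimal sketch below. The repository id and the `trust_remote_code` flag are assumptions made for illustration; consult the GitHub repository linked above for the officially supported loading path.

```python
# Minimal sketch of loading an OpenMoE checkpoint with Hugging Face Transformers.
# The repo id below is hypothetical, and trust_remote_code is an assumption;
# check the linked GitHub repository for the supported checkpoint and loader.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OpenMoE/openmoe-8b"  # hypothetical repo id; replace with the actual checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Simple generation example to verify the model loads and runs.
inputs = tokenizer("Mixture-of-Experts models route each token to", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```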