---
license: bigcode-openrail-m
---

# OpenMoE

A family of open-sourced Mixture-of-Experts (MoE) large language models.

Please see this link for detailed information on the models, training data, and evaluation results.