---
license: apache-2.0
language:
  - zh
  - en
tags:
  - moe
---

# Chinese-Mixtral

Chinese-Mixtral GitHub repository: https://github.com/ymcui/Chinese-Mixtral

This repository contains Chinese-Mixtral, which is further pre-trained on Mixtral-8x7B-v0.1.

Note: this is a foundation model and is not suitable for conversation, QA, or other instruction-following tasks.

Please refer to https://github.com/ymcui/Chinese-Mixtral/ for more details.
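As a minimal sketch of how this checkpoint could be loaded with Hugging Face `transformers` (the Hub model ID `hfl/chinese-mixtral` is an assumption inferred from the repository name; verify it on the Hub, and note that an 8x7B MoE model requires substantial GPU memory):

```python
MODEL_ID = "hfl/chinese-mixtral"  # assumed Hub ID -- confirm before use


def load_model(model_id: str = MODEL_ID):
    """Load the tokenizer and model for base-model (non-chat) text completion."""
    # Imported lazily so this sketch can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # device_map="auto" shards the experts across available GPUs;
    # torch_dtype="auto" picks the dtype stored in the checkpoint.
    model = AutoModelForCausalLM.from_pretrained(
        model_id, device_map="auto", torch_dtype="auto"
    )
    return tokenizer, model
```

Since this is a foundation model, use plain text completion rather than a chat template, e.g. `model.generate(**tokenizer("...", return_tensors="pt"))`.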