---
license: apache-2.0
language:
- zh
- en
tags:
- moe
---

# Chinese-Mixtral
This repository contains Chinese-Mixtral, which was further pre-trained on top of Mixtral-8x7B-v0.1.

Note: this is a foundation model and is not suitable for conversation, QA, etc.

For the LoRA-only model, please see: https://huggingface.co/hfl/chinese-mixtral-lora

For the GGUF models (llama.cpp compatible), please see: https://huggingface.co/hfl/chinese-mixtral-gguf
Please refer to https://github.com/ymcui/Chinese-Mixtral/ for more details.
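
Below is a minimal usage sketch for loading the full model with 🤗 Transformers. The repository id `hfl/chinese-mixtral` and the generation settings are assumptions for illustration and are not taken from this card; since this is a foundation model, the example uses plain text continuation rather than chat-style prompting.

```python
# Minimal sketch (assumptions noted above): load the base model and
# generate a plain continuation. Mixtral support requires a recent
# transformers release (>= 4.36) and substantial GPU memory;
# device_map="auto" lets accelerate shard the weights across devices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hfl/chinese-mixtral"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Foundation model: give it a prefix to continue, not an instruction.
prompt = "人工智能的未来发展方向是"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For chat or QA use cases, an instruction-tuned variant or further fine-tuning would be needed, as noted above.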