yunconglong/Mixtral_7Bx2_MoE_13B_DPO
Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · conversational · text-generation-inference · Inference Endpoints
License: cc-by-nc-4.0
Mixtral MOE 2x7B
A Mixture-of-Experts (MoE) model built from the following models with mergekit, then fine-tuned with DPO:
mistralai/Mistral-7B-Instruct-v0.2
NurtureAI/neural-chat-7b-v3-16k
jondurbin/bagel-dpo-7b-v0.1
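The exact merge recipe is not published on this card, but a 2x7B Mixtral-style merge of the models above could be expressed as a mergekit-moe config along these lines (a hypothetical sketch: the `gate_mode`, `positive_prompts`, and choice of base model are assumptions, not the author's actual settings):

```yaml
# Hypothetical mergekit-moe config for a 2x7B Mixtral-style merge.
# Base model supplies the shared (non-expert) weights; the two
# experts are routed per token. positive_prompts seed the router
# gates ("hidden" gate mode) and are illustrative only.
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: NurtureAI/neural-chat-7b-v3-16k
    positive_prompts:
      - "chat"
      - "conversation"
  - source_model: jondurbin/bagel-dpo-7b-v0.1
    positive_prompts:
      - "instruction"
      - "reasoning"
```

Such a config would typically be run with `mergekit-moe config.yaml ./output-model`, after which the merged checkpoint can be fine-tuned with DPO.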
Downloads last month: 84
Model size: 12.9B params (Safetensors)
Tensor type: BF16