---
license: cc-by-nc-4.0
tags:
- moe
---
# Mixtral MOE 2x7B
This model combines the following models into a Mixture of Experts (MoE) with mergekit and then fine-tunes the merge with DPO (sketches of both steps follow the list).
- mistralai/Mistral-7B-Instruct-v0.2
- NurtureAI/neural-chat-7b-v3-16k
- jondurbin/bagel-dpo-7b-v0.1
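
The merge configuration itself is not published on this card. A minimal `mergekit-moe` sketch of what a 2x7B merge over these models could look like, assuming Mistral-7B-Instruct-v0.2 serves as the base model and using placeholder routing prompts (both the base choice and the prompts are assumptions, not the author's actual config):

```yaml
# Hypothetical mergekit-moe config: the real base model, gate_mode, and
# routing prompts used for this merge are not stated on the card.
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden        # initialize router gates from hidden states for the prompts below
dtype: bfloat16
experts:
  - source_model: NurtureAI/neural-chat-7b-v3-16k
    positive_prompts:
      - "chat"
      - "assistant"
  - source_model: jondurbin/bagel-dpo-7b-v0.1
    positive_prompts:
      - "reason"
      - "explain"
```

A config like this would be run with `mergekit-moe config.yml ./merged` to produce the merged MoE checkpoint.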
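The DPO stage is likewise unspecified. A minimal sketch using TRL's `DPOTrainer` (targeting the trl 0.7.x API); the dataset name, output paths, and hyperparameters below are placeholders, not the settings used for this model:

```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

merged = "./merged"  # hypothetical path: output of the mergekit-moe step above
tokenizer = AutoTokenizer.from_pretrained(merged)
model = AutoModelForCausalLM.from_pretrained(merged)

# Any preference dataset with "prompt", "chosen", and "rejected" columns works;
# "your/preference-pairs" is a placeholder, as the actual data is not disclosed.
dataset = load_dataset("your/preference-pairs", split="train")

trainer = DPOTrainer(
    model,
    ref_model=None,  # trl creates a frozen reference copy of the model when omitted
    args=TrainingArguments(
        output_dir="./dpo",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        learning_rate=5e-6,
    ),
    beta=0.1,  # strength of the KL penalty keeping the policy near the reference
    train_dataset=dataset,
    tokenizer=tokenizer,
)
trainer.train()
```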
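Loading the published model for inference follows the standard transformers pattern; the prompt below is only an example:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yunconglong/Mixtral_7Bx2_MoE_13B_DPO"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Briefly, what is a mixture-of-experts model?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```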