ERROR: For each expert, `positive_prompts` must contain one or more example prompt reflecting what should be routed to that expert.
#1 by h2m - opened
"""
base_model: leveldevai/TurdusBeagle-7B
gate_mode: hidden
dtype: bfloat16
experts:
- source_model: leveldevai/TurdusBeagle-7B
positive_prompts: [""] - source_model: udkai/Turdus
positive_prompts: [""] - source_model: nfaheem/Marcoroni-7b-DPO-Merge
positive_prompts: [""] - source_model: Toten5/Marcoroni-neural-chat-7B-v2
positive_prompts: [""]
"""
How do you create that model?
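The error in the title is triggered by the empty `positive_prompts` lists: with `gate_mode: hidden`, mergekit needs at least one example prompt per expert to compute the hidden-state routing vectors. A minimal sketch of a config that passes this check is below; the prompt strings are illustrative placeholders, not prompts from the original post:

```yaml
base_model: leveldevai/TurdusBeagle-7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: leveldevai/TurdusBeagle-7B
    positive_prompts: ["Write a short story about a dragon."]   # placeholder prompt
  - source_model: udkai/Turdus
    positive_prompts: ["Answer this general-knowledge question."]  # placeholder prompt
  - source_model: nfaheem/Marcoroni-7b-DPO-Merge
    positive_prompts: ["Explain the following code snippet."]   # placeholder prompt
  - source_model: Toten5/Marcoroni-neural-chat-7B-v2
    positive_prompts: ["Continue this friendly conversation."]  # placeholder prompt
```

Assuming the config is saved as `config.yml`, the merge is typically produced with mergekit's MoE entry point, e.g. `mergekit-moe config.yml ./merged-model`.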