WizardLM-2-4x7B-MoE-exl2-3_0bpw / mergekit_moe_config.yml
# mergekit MoE config: four copies of WizardLM-2-7B combined into a
# Mixtral-style sparse mixture-of-experts model.
base_model: models/WizardLM-2-7B   # provides the shared (non-expert) weights
gate_mode: random                  # router/gate weights are initialized randomly
dtype: float16                     # precision of the merged output tensors
experts_per_token: 4               # all four experts are active for every token
experts:
  - source_model: models/WizardLM-2-7B
  - source_model: models/WizardLM-2-7B
  - source_model: models/WizardLM-2-7B
  - source_model: models/WizardLM-2-7B
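
For reference, a minimal sketch of how this config could be sanity-checked before passing it to mergekit's `mergekit-moe` tool. The local file path and the specific checks are illustrative assumptions, not part of the original repository:

```python
# Sanity-check sketch for a mergekit MoE config (assumed local path).
import yaml  # PyYAML

CONFIG_PATH = "mergekit_moe_config.yml"  # hypothetical local copy of this file

with open(CONFIG_PATH) as f:
    cfg = yaml.safe_load(f)

experts = cfg.get("experts", [])
print(f"base model:        {cfg['base_model']}")
print(f"gate mode:         {cfg['gate_mode']}")
print(f"experts:           {len(experts)}")
print(f"experts per token: {cfg['experts_per_token']}")

# experts_per_token should not exceed the number of experts defined above.
assert cfg["experts_per_token"] <= len(experts), "more active experts than defined"
```

Note that with `gate_mode: random` the router weights carry no learned routing signal, so the merged model's routing behavior is effectively uniform until it is further trained.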