Ph3della-14B / mergekit_config.yml
models:
  - model: jpacifico/Chocolatine-14B-Instruct-DPO-v1.2
    parameters:
      weight: 0.5
      density: 0.8
  - model: failspy/Phi-3-medium-4k-instruct-abliterated-v3
    parameters:
      weight: 0.5
      density: 0.8
merge_method: della_linear
base_model: jpacifico/Chocolatine-14B-Instruct-DPO-v1.2
parameters:
  epsilon: 0.05
  lambda: 1
  int8_mask: true
dtype: bfloat16
tokenizer_source: union
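For context, this config asks mergekit to combine two Phi-3-medium finetunes with the DELLA-linear method: each model contributes a task vector (its delta from the base model), `density: 0.8` controls how much of each delta survives pruning, the `weight` values mix the pruned deltas, and `epsilon`/`lambda` tune the drop-probability spread and the final delta scale. The following is a minimal, hypothetical sketch of the idea on plain Python lists; it uses simplified top-k magnitude pruning instead of DELLA's adaptive drop-probability sampling, and all toy numbers are made up:

```python
# Toy illustration of the della_linear idea (NOT mergekit's actual code):
# take each model's delta from the base, drop low-magnitude entries so
# roughly `density` of them survive, rescale survivors by 1/density,
# then add the weighted sum of deltas (scaled by lambda) back to the base.
# Real DELLA samples per-entry drop probabilities adaptively (spread
# controlled by epsilon); plain magnitude pruning is used here for brevity.

def prune_and_rescale(delta, density):
    """Keep the ~`density` largest-magnitude entries, rescaled by 1/density."""
    k = max(1, round(len(delta) * density))
    threshold = sorted((abs(x) for x in delta), reverse=True)[k - 1]
    return [x / density if abs(x) >= threshold else 0.0 for x in delta]

def della_linear(base, models, weights, density, lam=1.0):
    """Linear DELLA-style merge of flat parameter lists onto a base."""
    deltas = [prune_and_rescale([m - b for m, b in zip(model, base)], density)
              for model in models]
    return [b + lam * sum(w * d[i] for w, d in zip(weights, deltas))
            for i, b in enumerate(base)]

# Toy 4-parameter "models"; density=0.5 here so the pruning is visible.
base    = [1.00, 2.00, 3.00, 4.00]
model_a = [1.40, 1.95, 3.30, 4.01]
model_b = [1.02, 2.50, 2.90, 4.20]
merged = della_linear(base, [model_a, model_b], [0.5, 0.5], density=0.5)
# merged ≈ [1.4, 2.5, 3.3, 4.2]: each coordinate follows whichever model
# changed it most, because the smaller deltas were pruned away.
```

With mergekit installed, the real merge would typically be run as `mergekit-yaml mergekit_config.yml ./output-dir`.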