jp-gpt-1b-dareties / mergekit_config.yml
slices:
  - sources:
      - layer_range: [0, 24]
        model: ce-lery/dolly-japanese-gpt-1b-clone
        parameters:
          density: [1, 0.7, 0.1]
          weight: 1.0
      - layer_range: [0, 24]
        model: rinna/japanese-gpt-1b
        parameters:
          density: 0.33
          weight:
            - filter: mlp
              value: 0.5
            - value: 0
merge_method: dare_ties
base_model: rinna/japanese-gpt-1b
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
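
For context, a config like this is consumed by mergekit to produce the merged DARE-TIES model. The snippet below is a minimal sketch of loading and running this file through mergekit's Python entry points (MergeConfiguration, MergeOptions, run_merge), following the pattern in mergekit's documentation; the output path is a placeholder and option names may differ slightly between mergekit versions.

# Minimal sketch: run mergekit_config.yml via mergekit's Python API.
# Assumes `pip install mergekit`; paths below are placeholders.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "mergekit_config.yml"    # this config file
OUTPUT_PATH = "./jp-gpt-1b-dareties"  # placeholder output directory

# Parse the YAML into a mergekit MergeConfiguration object.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the dare_ties merge and write the merged model to OUTPUT_PATH.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use GPU if one is present
        copy_tokenizer=True,             # copy the base model's tokenizer to the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)

The equivalent command-line invocation would point the mergekit-yaml entry point at this config and an output directory.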