bigllama3.2-3b-to-7b / mergekit_config.yml
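# Passthrough "frankenmerge": five overlapping 12-layer slices of
# Llama-3.2-3B-Instruct (28 transformer layers) are stacked back to back,
# producing a 60-layer model in the 6-7B parameter range (hence the repo
# name). No weights are averaged; each slice is copied verbatim from the
# source model, so adjacent slices repeat the overlapping layers.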
slices:
  - sources:
      - layer_range: [0, 12]
        model: meta-llama/Llama-3.2-3B-Instruct
  - sources:
      - layer_range: [4, 16]
        model: meta-llama/Llama-3.2-3B-Instruct
  - sources:
      - layer_range: [8, 20]
        model: meta-llama/Llama-3.2-3B-Instruct
  - sources:
      - layer_range: [12, 24]
        model: meta-llama/Llama-3.2-3B-Instruct
  - sources:
      - layer_range: [16, 28]
        model: meta-llama/Llama-3.2-3B-Instruct
merge_method: passthrough
dtype: bfloat16
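# A typical invocation to build the merged model from this config (a sketch;
# the output path is a placeholder and assumes mergekit is installed):
#   mergekit-yaml mergekit_config.yml ./bigllama3.2-3b-to-7b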