# Maths-llm / mergekit_config.yml
base_model: unsloth/llama-3-8b
dtype: bfloat16
merge_method: task_arithmetic
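# task_arithmetic builds the merge by adding weighted task vectors (each
# source model's weights minus base_model) onto base_model's weights.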
slices:
  - sources:
      - layer_range: [0, 32]
        model: unsloth/llama-3-8b
      - layer_range: [0, 32]
        model: Kukedlc/LLama-3-8b-Maths
        parameters:
          weight: 0.75
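# Net effect: across all 32 layers, the merged weights are the base weights
# plus 0.75 times the task vector of Kukedlc/LLama-3-8b-Maths (its weights
# minus the unsloth/llama-3-8b base).
#
# Typical invocation (a sketch; the output directory name is an assumption,
# not part of this repo):
#   mergekit-yaml mergekit_config.yml ./merged-model --cuda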