---
base_model:
- EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1
- anthracite-org/magnum-v4-72b
library_name: transformers
tags:
- mergekit
- merge
---
# SteyrCannon-Qwen2.5-72b
This is a merge of pre-trained language models created using mergekit.
Thanks to Auri for the suggestion!
This recipe closely follows StarCannon-v3, but with the density for magnum-v4 reduced even further, since Magnum-v4 feels a bit overcooked.
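If you want to try the merged model with transformers, here is a minimal loading sketch. The repository ID below is a placeholder rather than a confirmed path, and the dtype/device settings are illustrative assumptions; a 72B model in bfloat16 needs substantial GPU memory.

```python
# Minimal loading sketch (repo ID is a placeholder; hardware settings are assumptions)
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/SteyrCannon-Qwen2.5-72b"  # placeholder, replace with the actual repo

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype below
    device_map="auto",           # shard across available GPUs
)

prompt = "Write a short scene set on a night train."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```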
## Quants & Hosts

## Merge Details

### Merge Method
This model was merged using the TIES merge method, with EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1 as the base.
### Models Merged

The following models were included in the merge:

* anthracite-org/magnum-v4-72b
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: anthracite-org/magnum-v4-72b
    parameters:
      density: 0.25
      weight: 0.5
  - model: EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1
    parameters:
      density: 0.75
      weight: 0.5
merge_method: ties
base_model: EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1
parameters:
  normalize: true
dtype: bfloat16
```
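To reproduce the merge, a rough sketch of driving mergekit from Python is below. It assumes mergekit is installed (`pip install mergekit`) and simply writes the configuration above to disk before calling the `mergekit-yaml` CLI; the output directory and extra flag are illustrative.

```python
# Rough reproduction sketch (assumes `pip install mergekit`; output path and flags are illustrative)
import subprocess
from pathlib import Path

config = """\
models:
  - model: anthracite-org/magnum-v4-72b
    parameters:
      density: 0.25
      weight: 0.5
  - model: EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1
    parameters:
      density: 0.75
      weight: 0.5
merge_method: ties
base_model: EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1
parameters:
  normalize: true
dtype: bfloat16
"""

Path("steyrcannon.yaml").write_text(config)

# mergekit-yaml <config> <output-dir>; --lazy-unpickle keeps peak RAM lower on large models
subprocess.run(
    ["mergekit-yaml", "steyrcannon.yaml", "./SteyrCannon-Qwen2.5-72b", "--lazy-unpickle"],
    check=True,
)
```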