---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- FelixChao/WestSeverus-7B-DPO-v2
- CultriX/Wernicke-7B-v9
- mlabonne/NeuralBeagle14-7B
---
# RandomMergeNoNorm-7B-DARETIES

RandomMergeNoNorm-7B-DARETIES is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [FelixChao/WestSeverus-7B-DPO-v2](https://huggingface.co/FelixChao/WestSeverus-7B-DPO-v2)
* [CultriX/Wernicke-7B-v9](https://huggingface.co/CultriX/Wernicke-7B-v9)
* [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)

## 🧩 Configuration
```yaml
models:
  - model: FelixChao/WestSeverus-7B-DPO-v2
    # No parameters necessary for base model
  - model: FelixChao/WestSeverus-7B-DPO-v2
    parameters:
      density: 0.45  # fraction of delta weights retained (DARE sparsification)
      weight: 0.35   # relative contribution to the weighted merge
  - model: CultriX/Wernicke-7B-v9
    parameters:
      density: 0.45
      weight: 0.35
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      density: 0.55
      weight: 0.3
merge_method: dare_ties
base_model: FelixChao/WestSeverus-7B-DPO-v2
parameters:
  int8_mask: true  # store intermediate masks in int8 to reduce memory use
dtype: float16
```
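To reproduce the merge, this configuration can be saved to a file and run with mergekit's `mergekit-yaml` CLI (e.g., `mergekit-yaml config.yaml ./merged-model`).

## 💻 Usage

A minimal sketch of loading the merged model with 🤗 Transformers. The repo id below is an assumption (substitute the actual Hugging Face repo where the merged weights are published); the rest is standard `transformers` usage and assumes `torch` and `accelerate` are installed.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Hypothetical repo id -- replace with the actual location of the merged weights.
model_id = "CultriX/RandomMergeNoNorm-7B-DARETIES"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's float16 dtype
    device_map="auto",          # requires accelerate; places layers automatically
)

prompt = "What is a large language model?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```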