---
base_model:
- TheDrummer/Anubis-70B-v1
- EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.0
- Blackroot/Mirai-3.0-70B
- Sao10K/L3.3-70B-Euryale-v2.3
library_name: transformers
tags:
- mergekit
- merge
---

# Prikol

> I don't even know anymore

![I need to be isolated from society](https://files.catbox.moe/x9t3zo.png)

### Overview

A merge of some Llama 3.3 models because, um, uh, yeah.

Prompt format: Llama 3

Samplers: [this preset kinda works](https://files.catbox.moe/1vtti7.json)

### Quants

* [Static](https://huggingface.co/mradermacher/L3.3-Prikol-70B-v0.1a-GGUF)
* [Imatrix](https://huggingface.co/mradermacher/L3.3-Prikol-70B-v0.1a-i1-GGUF)

## Merge Details

### Merge Method

This model was merged using the linear [DELLA](https://arxiv.org/abs/2406.11617) merge method, with [TheDrummer/Anubis-70B-v1](https://huggingface.co/TheDrummer/Anubis-70B-v1) as the base model.

### Models Merged

The following models were included in the merge:

* [EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.0](https://huggingface.co/EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.0)
* [Blackroot/Mirai-3.0-70B](https://huggingface.co/Blackroot/Mirai-3.0-70B)
* [Sao10K/L3.3-70B-Euryale-v2.3](https://huggingface.co/Sao10K/L3.3-70B-Euryale-v2.3)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: TheDrummer/Anubis-70B-v1
parameters:
  epsilon: 0.04
  lambda: 1.05
  int8_mask: true
  rescale: true
  normalize: false
dtype: bfloat16
tokenizer_source: union
merge_method: della_linear
models:
  - model: TheDrummer/Anubis-70B-v1
    parameters:
      weight: [0.2, 0.3, 0.2, 0.3, 0.2]
      density: [0.45, 0.55, 0.45, 0.55, 0.45]
  - model: Blackroot/Mirai-3.0-70B
    parameters:
      weight: [0.01768, -0.01675, 0.01285, -0.01696, 0.01421]
      density: [0.6, 0.4, 0.5, 0.4, 0.6]
  - model: Sao10K/L3.3-70B-Euryale-v2.3
    parameters:
      weight: [0.208, 0.139, 0.139, 0.139, 0.208]
      density: [0.7]
  - model: EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.0
    parameters:
      weight: [0.33]
      density: [0.45, 0.55, 0.45, 0.55, 0.45]
```
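The list-valued `weight` and `density` entries in the config are mergekit parameter gradients: the values in each list are spread evenly across the model's layers and interpolated linearly in between, so a single-element list like `[0.7]` applies uniformly. A minimal sketch of that interpolation, assuming evenly spaced anchor points (the helper name here is illustrative, not mergekit's actual API):

```python
def gradient_value(anchors, layer_idx, num_layers):
    """Resolve a mergekit-style gradient list to a value for one layer.

    `anchors` is the list from the YAML (e.g. [0.2, 0.3, 0.2, 0.3, 0.2]);
    anchor points are spread evenly over the layer range and intermediate
    layers receive linearly interpolated values.
    """
    if len(anchors) == 1:
        return anchors[0]  # one value applies to every layer
    # Map the layer index to a fractional position along the anchor list.
    pos = layer_idx / (num_layers - 1) * (len(anchors) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(anchors) - 1)
    frac = pos - lo
    return anchors[lo] * (1 - frac) + anchors[hi] * frac


# Anubis's weight gradient at the first and last of 80 layers:
weights = [0.2, 0.3, 0.2, 0.3, 0.2]
print(gradient_value(weights, 0, 80))   # 0.2
print(gradient_value(weights, 79, 80))  # 0.2
```

This is why a five-element gradient produces an oscillating per-layer weight profile rather than five discrete blocks.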