---
base_model:
- jeiku/Theory_of_Mind_Mistral
- jeiku/Gnosis_Reformatted_Mistral
- Undi95/Mistral-7B-small_pippa_limaRP-v3-lora
- jeiku/Theory_of_Mind_Roleplay_Mistral
tags:
- mergekit
- merge
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/626dfb8786671a29c715f8a9/uAfKDeavEBisJYmDxjJsE.png)

technicolor consists of the following merge, which was then merged with the LoRAs below to produce rainbow:

```yaml
slices:
  - sources:
      - model: paulml/OGNO-7B
        layer_range: [0, 32]
      - model: SanjiWatsuki/Kunoichi-DPO-v2-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: SanjiWatsuki/Kunoichi-DPO-v2-7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```

# rainbow

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, using technicolor as a base.
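In task arithmetic, each fine-tune contributes a "task vector" (its weights minus the base model's weights), and the merged model is the base plus a weighted sum of those task vectors. A minimal sketch on toy tensors, assuming `normalize: true` means dividing the combined task vector by the sum of the weights (the function name and shapes here are illustrative, not mergekit's internal API):

```python
import numpy as np

def task_arithmetic(base, finetunes, weights, normalize=True):
    """Merge fine-tunes into a base via task arithmetic on raw tensors."""
    # Each task vector captures what one fine-tune changed relative to base.
    task_vectors = [ft - base for ft in finetunes]
    combined = sum(w * tv for w, tv in zip(weights, task_vectors))
    if normalize:
        # Scale by the total weight so equal weights average the task vectors.
        combined = combined / sum(weights)
    return base + combined

# Toy example: two fine-tunes, each modifying a different parameter.
base = np.zeros(4)
ft_a = np.array([1.0, 0.0, 0.0, 0.0])
ft_b = np.array([0.0, 1.0, 0.0, 0.0])
merged = task_arithmetic(base, [ft_a, ft_b], weights=[1.0, 1.0])
print(merged)  # each task vector contributes half after normalization
```

With equal weights of 1 and normalization, each model's contribution is averaged rather than summed, which keeps the merged weights in the same magnitude range as the base.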
### Models Merged

The following models were included in the merge:
* technicolor + [jeiku/Theory_of_Mind_Mistral](https://huggingface.co/jeiku/Theory_of_Mind_Mistral)
* technicolor + [jeiku/Gnosis_Reformatted_Mistral](https://huggingface.co/jeiku/Gnosis_Reformatted_Mistral)
* technicolor + [Undi95/Mistral-7B-small_pippa_limaRP-v3-lora](https://huggingface.co/Undi95/Mistral-7B-small_pippa_limaRP-v3-lora)
* technicolor + [jeiku/Theory_of_Mind_Roleplay_Mistral](https://huggingface.co/jeiku/Theory_of_Mind_Roleplay_Mistral)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: task_arithmetic
base_model: technicolor
parameters:
  normalize: true
models:
  - model: technicolor+jeiku/Theory_of_Mind_Roleplay_Mistral
    parameters:
      weight: 1
  - model: technicolor+jeiku/Theory_of_Mind_Mistral
    parameters:
      weight: 1
  - model: technicolor+jeiku/Gnosis_Reformatted_Mistral
    parameters:
      weight: 1
  - model: technicolor+Undi95/Mistral-7B-small_pippa_limaRP-v3-lora
    parameters:
      weight: 1
dtype: float16
```
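The technicolor base itself was built with slerp, which interpolates along the great circle between two weight tensors instead of along a straight line, so intermediate points keep a sensible norm. A toy sketch on small vectors, assuming the standard slerp formula (mergekit applies this per layer with the `t` schedule shown in the config; this simplified version uses a single scalar `t`):

```python
import numpy as np

def slerp(a, b, t, eps=1e-8):
    """Spherical linear interpolation between vectors a and b at fraction t."""
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)  # angle between the two directions
    if theta < eps:
        # Nearly parallel vectors: fall back to ordinary linear interpolation.
        return (1 - t) * a + t * b
    return (np.sin((1 - t) * theta) * a + np.sin(t * theta) * b) / np.sin(theta)

# Orthogonal unit vectors: the t=0.5 midpoint stays on the unit sphere.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(a, b, 0.5)
```

The per-filter `t` values in the technicolor config mean self-attention and MLP weights follow different interpolation schedules across the layer stack, with `value: 0.5` as the default for everything else.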