---
base_model:
- jdqwoi/TooManyMixRolePlay-7B-Story_V3.5
- kasper52786/StoryWeaver-7b-Instruct-v0.1
- MaziyarPanahi/Mistral-7B-claude-instruct-Mistral-7B-Instruct-v0.2-slerp
- KnutJaegersberg/Mistral-7B-EssayWriter
- tdh87/StoryTeller7b-meh
- MrRobotoAI/Test001a
- ajibawa-2023/General-Stories-Mistral-7B
- luozhuanggary/GOAT-v0.2-Mistral-7B-Claude
- scribis/Fantastica-7b-Instruct-0.2-Italian_merged
- ajibawa-2023/Young-Children-Storyteller-Mistral-7B
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099)-[TIES](https://arxiv.org/abs/2306.01708) merge method, with [MrRobotoAI/Test001a](https://huggingface.co/MrRobotoAI/Test001a) as the base model.

### Models Merged

The following models were included in the merge:

* [jdqwoi/TooManyMixRolePlay-7B-Story_V3.5](https://huggingface.co/jdqwoi/TooManyMixRolePlay-7B-Story_V3.5)
* [kasper52786/StoryWeaver-7b-Instruct-v0.1](https://huggingface.co/kasper52786/StoryWeaver-7b-Instruct-v0.1)
* [MaziyarPanahi/Mistral-7B-claude-instruct-Mistral-7B-Instruct-v0.2-slerp](https://huggingface.co/MaziyarPanahi/Mistral-7B-claude-instruct-Mistral-7B-Instruct-v0.2-slerp)
* [KnutJaegersberg/Mistral-7B-EssayWriter](https://huggingface.co/KnutJaegersberg/Mistral-7B-EssayWriter)
* [tdh87/StoryTeller7b-meh](https://huggingface.co/tdh87/StoryTeller7b-meh)
* [ajibawa-2023/General-Stories-Mistral-7B](https://huggingface.co/ajibawa-2023/General-Stories-Mistral-7B)
* [luozhuanggary/GOAT-v0.2-Mistral-7B-Claude](https://huggingface.co/luozhuanggary/GOAT-v0.2-Mistral-7B-Claude)
* [scribis/Fantastica-7b-Instruct-0.2-Italian_merged](https://huggingface.co/scribis/Fantastica-7b-Instruct-0.2-Italian_merged)
* [ajibawa-2023/Young-Children-Storyteller-Mistral-7B](https://huggingface.co/ajibawa-2023/Young-Children-Storyteller-Mistral-7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
### This is the config.yml for ABC_Books/test002 ###
models:
  - model: KnutJaegersberg/Mistral-7B-EssayWriter
    parameters:
      weight: 0.1
      density: 0.9
  - model: MaziyarPanahi/Mistral-7B-claude-instruct-Mistral-7B-Instruct-v0.2-slerp
    parameters:
      weight: 0.1
      density: 0.9
  - model: ajibawa-2023/General-Stories-Mistral-7B
    parameters:
      weight: 0.1
      density: 0.9
  - model: ajibawa-2023/Young-Children-Storyteller-Mistral-7B
    parameters:
      weight: 0.1
      density: 0.9
  - model: jdqwoi/TooManyMixRolePlay-7B-Story_V3.5
    parameters:
      weight: 0.1
      density: 0.9
  - model: kasper52786/StoryWeaver-7b-Instruct-v0.1
    parameters:
      weight: 0.1
      density: 0.9
  - model: luozhuanggary/GOAT-v0.2-Mistral-7B-Claude
    parameters:
      weight: 0.1
      density: 0.9
  - model: scribis/Fantastica-7b-Instruct-0.2-Italian_merged
    parameters:
      weight: 0.1
      density: 0.9
  - model: tdh87/StoryTeller7b-meh
    parameters:
      weight: 0.1
      density: 0.9
  - model: MrRobotoAI/Test001a
    parameters:
      weight: 0.1
      density: 0.9
merge_method: dare_ties
### Of the merges so far, this model best matches the features needed in the final model, so it now serves as the base model for further merges ###
base_model: MrRobotoAI/Test001a
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
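At a high level, `dare_ties` builds a task vector (fine-tuned weights minus base weights) for each model, randomly drops a `1 - density` fraction of each vector's entries and rescales the survivors (DARE), then elects a per-parameter sign and sums only the entries agreeing with it, scaled by each model's `weight` (TIES). The following NumPy sketch is a toy illustration of that procedure under simplifying assumptions, not mergekit's implementation; it operates on single arrays and omits normalization and per-tensor details:

```python
import numpy as np

def dare_ties(base, finetuned, weights, density, seed=0):
    """Toy DARE-TIES merge over NumPy arrays (illustrative only)."""
    rng = np.random.default_rng(seed)
    # Task vectors: each fine-tune's difference from the base weights
    deltas = [ft - base for ft in finetuned]
    # DARE: randomly drop a (1 - density) fraction of each delta's entries,
    # rescaling the survivors by 1/density so the expected value is unchanged
    pruned = [np.where(rng.random(d.shape) < density, d / density, 0.0)
              for d in deltas]
    weighted = [w * d for w, d in zip(weights, pruned)]
    # TIES sign election: take the dominant per-parameter sign of the
    # weighted deltas...
    elected = np.sign(sum(weighted))
    # ...and merge only the entries that agree with the elected sign
    merged = sum(np.where(np.sign(d) == elected, d, 0.0) for d in weighted)
    return base + merged
```

With `density: 0.9` and ten models at `weight: 0.1`, as in the config above, roughly 10% of each task vector is zeroed before the sign election, which is what lets this many fine-tunes be combined without their parameter updates interfering head-on.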