---
base_model:
- Local-Novel-LLM-project/Yosegi-2
- Local-Novel-LLM-project/Ninja-V2-7B
- Local-Novel-LLM-project/Yosegi-0604
library_name: transformers
tags:
- mergekit
- merge
---
# Ninja-V3
This model was created by applying a Model Stock merge to Yosegi, a model produced via Evo merging, using [Local-Novel-LLM-project/Ninja-V2-7B](https://huggingface.co/Local-Novel-LLM-project/Ninja-V2-7B) as the base.
As a derivative, we have also released Shadows, an MoE model built from Ninja-V3:
[Local-Novel-LLM-project/Shadows-MoE](https://huggingface.co/Local-Novel-LLM-project/Shadows-MoE)
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method using [Local-Novel-LLM-project/Ninja-V2-7B](https://huggingface.co/Local-Novel-LLM-project/Ninja-V2-7B) as a base.
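Model Stock derives a per-layer interpolation ratio from the geometry of the fine-tuned checkpoints relative to the base. mergekit handles this internally, but the core idea for the two-model case can be sketched roughly as follows. This is a simplified illustration, not mergekit's actual code; the ratio follows the paper's formula t = k·cosθ / ((k−1)·cosθ + 1), where θ is the angle between the task vectors.

```python
import numpy as np

def model_stock_merge(base, finetuned):
    """Merge one weight tensor in the spirit of Model Stock:
    interpolate between the base weights and the centroid of the
    fine-tuned weights, with a ratio derived from the angle
    between the fine-tuned models' task vectors (two-model case)."""
    deltas = [w - base for w in finetuned]          # task vectors
    centroid = sum(finetuned) / len(finetuned)      # fine-tuned average
    # cosine of the angle between the two task vectors
    cos = float((deltas[0] * deltas[1]).sum()) / (
        np.linalg.norm(deltas[0]) * np.linalg.norm(deltas[1]) + 1e-12)
    k = len(finetuned)
    t = k * cos / ((k - 1) * cos + 1)               # interpolation ratio
    return t * centroid + (1 - t) * base
```

Intuitively, when the two fine-tuned models agree (small angle), the merge moves fully to their average; when their task vectors are orthogonal, it stays at the base weights.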
### Models Merged
The following models were included in the merge:
* [Local-Novel-LLM-project/Yosegi-2](https://huggingface.co/Local-Novel-LLM-project/Yosegi-2)
* [Local-Novel-LLM-project/Yosegi-0604](https://huggingface.co/Local-Novel-LLM-project/Yosegi-0604)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: "Local-Novel-LLM-project/Yosegi-2"
- model: "Local-Novel-LLM-project/Yosegi-0604"
merge_method: model_stock
base_model: Local-Novel-LLM-project/Ninja-V2-7B
dtype: bfloat16
```