---
license: apache-2.0
language:
- ja
tags:
- causal-lm
- not-for-all-audiences
- nsfw
pipeline_tag: text-generation
---
|
|
|
# SabbatH 2x7B
|
|
|
<img src="https://huggingface.co/Elizezen/SabbatH-2x7B/resolve/main/OIG4.jpg" alt="drawing" style="width:512px;"/>
|
|
|
## Model Description
|
|
|
SabbatH 2x7B is a Japanese language model created by combining two models, [Antler-RP-ja-westlake-chatvector](https://huggingface.co/soramikaduki/Antler-RP-ja-westlake-chatvector) and [Hameln-japanese-mistral-7B](https://huggingface.co/Elizezen/Hameln-japanese-mistral-7B), using a Mixture of Experts (MoE) approach, with [chatntq-ja-7b-v1.0](https://huggingface.co/NTQAI/chatntq-ja-7b-v1.0) as the base model.
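The exact merge recipe is not published here, but a MoE combination of this kind is commonly expressed as a mergekit-moe configuration. The sketch below is a hypothetical example only: the `gate_mode` and the `positive_prompts` routing hints are illustrative assumptions, not the settings actually used for this model.

```yaml
# Hypothetical mergekit-moe config sketch; gate_mode and
# positive_prompts are illustrative, not the actual recipe.
base_model: NTQAI/chatntq-ja-7b-v1.0
gate_mode: hidden          # route tokens using hidden-state similarity
dtype: bfloat16
experts:
  - source_model: soramikaduki/Antler-RP-ja-westlake-chatvector
    positive_prompts:
      - "ロールプレイ"      # roleplay-style prompts
  - source_model: Elizezen/Hameln-japanese-mistral-7B
    positive_prompts:
      - "小説"              # novel/story-writing prompts
```

With a config like this, `mergekit-moe config.yaml ./output` would assemble a 2x7B model whose router sends tokens to whichever expert better matches the prompt style.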
|
|
|
|
|
## Intended Use
|
|
|
The primary purpose of this language model is to assist in generating novels. While it can handle a variety of prompts, it may not excel at instruction-following tasks. Note that the model's output is not censored, and sensitive content may occasionally be generated.