---
base_model:
- huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated
- IlyaGusev/saiga_llama3_8b
- lightblue/suzume-llama-3-8B-multilingual
- lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full
- lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half
- lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25
- lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75
library_name: transformers
tags:
- mergekit
- merge
- bfloat16
- safetensors
- 8b
- chat
- conversational
language:
- de
- en
- es
- fr
- hi
- it
- ja
- pt
- ru
- th
- zh
---
# Multilingual-SaigaSuzume-8B
>Your words are like rain falling from heaven on a tower in a sinful land; can anyone in Babylon understand them?
![Multilingual-SaigaSuzume-8B-Logo256.png](https://cdn-uploads.huggingface.co/production/uploads/673125091920e70ac26c8a2e/aVbK8k3mUMBAOlUSXBK91.png)
This model was created to serve as a multilingual foundation for other models, and it should be useful as a building block in your own merges. Keep in mind that some censorship remains.
## Merge Details
### Method
This is a simple but useful merge of **7 cool models**, created using [mergekit](https://github.com/arcee-ai/mergekit).
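All four stages below use mergekit's `model_stock` method, which averages the fine-tuned models and then interpolates between that average and the base model, with the interpolation ratio derived from the angles between the models' task vectors. Here is a minimal NumPy sketch of the idea, assuming the ratio formula from the Model Stock paper; the function names are illustrative, not mergekit's actual implementation:

```python
import numpy as np

def model_stock_ratio(k: int, cos_theta: float) -> float:
    # Interpolation ratio toward the fine-tuned average, per the
    # Model Stock paper: t = k*cos(theta) / (1 + (k-1)*cos(theta))
    return k * cos_theta / (1 + (k - 1) * cos_theta)

def model_stock_merge(base: np.ndarray, finetuned: list[np.ndarray]) -> np.ndarray:
    """Toy per-layer model_stock merge over flat parameter vectors."""
    k = len(finetuned)
    avg = np.mean(finetuned, axis=0)
    deltas = [f - base for f in finetuned]  # task vectors
    # Average pairwise cosine similarity between task vectors.
    cos = np.mean([
        np.dot(deltas[i], deltas[j])
        / (np.linalg.norm(deltas[i]) * np.linalg.norm(deltas[j]))
        for i in range(k) for j in range(i + 1, k)
    ])
    t = model_stock_ratio(k, cos)
    # Move from the base model toward the fine-tuned average by ratio t.
    return t * avg + (1 - t) * base
```

Intuitively, when the task vectors agree (cosine near 1) the merge trusts the fine-tuned average; when they are near-orthogonal it stays close to the base model.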
### Models
The following models were included in the merge:
* [huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated](https://huggingface.co/huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated)
* [IlyaGusev/saiga_llama3_8b](https://huggingface.co/IlyaGusev/saiga_llama3_8b)
* [lightblue/suzume-llama-3-8B-multilingual](https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual)
* [lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full](https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full)
* [lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half](https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half)
* [lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25](https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25)
* [lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75](https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75)
### Configuration
The following YAML configurations were used to produce this model; the first three intermediate merges feed into the final one:
```yaml
# Multilingual-SaigaSuzume-8B-BFH
models:
  - model: lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full
  - model: IlyaGusev/saiga_llama3_8b
  - model: lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half
merge_method: model_stock
base_model: huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated
dtype: bfloat16
```
```yaml
# Multilingual-SaigaSuzume-8B-BTP
models:
  - model: lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75
  - model: IlyaGusev/saiga_llama3_8b
  - model: lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25
merge_method: model_stock
base_model: huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated
dtype: bfloat16
```
```yaml
# Multilingual-SaigaSuzume-8B-Classic
models:
  - model: IlyaGusev/saiga_llama3_8b
  - model: lightblue/suzume-llama-3-8B-multilingual
merge_method: model_stock
base_model: huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated
dtype: bfloat16
```
```yaml
# Multilingual-SaigaSuzume-8B
models:
  - model: Multilingual-SaigaSuzume-8B-BFH
  - model: Multilingual-SaigaSuzume-8B-BTP
merge_method: model_stock
base_model: Multilingual-SaigaSuzume-8B-Classic
dtype: bfloat16
```
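To reproduce a staged merge like this, each config can be run with mergekit's `mergekit-yaml` CLI, saving every intermediate merge to a local directory that the next config then references as a model path. This is a sketch assuming mergekit is installed (`pip install mergekit`); the config file names are illustrative:

```shell
# Run the three intermediate merges first (one config file per stage),
# then the final merge that points at their output directories.
mergekit-yaml bfh.yaml ./Multilingual-SaigaSuzume-8B-BFH
mergekit-yaml btp.yaml ./Multilingual-SaigaSuzume-8B-BTP
mergekit-yaml classic.yaml ./Multilingual-SaigaSuzume-8B-Classic
mergekit-yaml final.yaml ./Multilingual-SaigaSuzume-8B
```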
>My thanks to the authors of the original models; your work is incredible. Have a good time 🖤