
The prompt format is Vicuna/ShareGPT.
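Since the model ships as GGUF, a llama.cpp-based runtime is a natural fit. The sketch below uses llama-cpp-python with a standard Vicuna v1.1-style template; the model filename, system line, and sampling settings are assumptions for illustration, not something defined by this repo.

```python
# Minimal sketch, assuming llama-cpp-python and a downloaded GGUF quant.
from llama_cpp import Llama

# Hypothetical filename -- substitute whichever quantization you downloaded.
llm = Llama(model_path="basilisk-7b.Q4_K_M.gguf", n_ctx=4096)

# Vicuna/ShareGPT-style prompt: optional system line, then USER/ASSISTANT turns.
prompt = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed answers to the user's questions.\n\n"
    "USER: Write a short scene set in a candlelit tavern.\n"
    "ASSISTANT:"
)

out = llm(prompt, max_tokens=256, temperature=0.8, stop=["USER:"])
print(out["choices"][0]["text"])
```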

This model was trained on a subset of orca-best combined with most of freedom-rp. To put it as a ratio, the dataset is roughly 90% orca-best and about 10% degenerate reverse proxy logs. The goal was to create a model with the intelligence and capacity of orca-best, but with enhanced roleplay and adult content capabilities. If you are looking for a model trained purely on wanton degeneracy without any attempt to retain intelligence, check out my cockatrice model.

You can find the dataset used to train this model here:

https://huggingface.co/datasets/openerotica/basilisk-v0.2

If you like what I'm trying to do, please consider subscribing to my Patreon. I'm only asking for about tree fiddy.

https://patreon.com/openerotica

Format: GGUF
Model size: 7.24B params
Architecture: llama

Available quantizations: 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 16-bit
