---
base_model: LakoMoor/Silicon-Alice-7B
inference: false
language:
- en
- ru
library_name: transformers
license: cc-by-nc-4.0
merged_models:
- LakoMoor/Silicon-Masha-7B
model_creator: LakoMoor
model_name: Silicon-Alice-7B
model_type: mistral
pipeline_tag: text-generation
quantized_by: Suparious
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- not-for-all-audiences
- nsfw
---
# LakoMoor/Silicon-Alice-7B AWQ

- Model creator: [LakoMoor](https://huggingface.co/LakoMoor)
- Original model: [Silicon-Alice-7B](https://huggingface.co/LakoMoor/Silicon-Alice-7B)

![Silicon-Alice-7B](https://huggingface.co/LakoMoor/Silicon-Alice-7B/resolve/main/assets/alice.png)

## Model Summary

Silicon-Alice-7B is based on [Silicon-Masha-7B](https://huggingface.co/LakoMoor/Silicon-Masha-7B) and aims to be strong at RP, smart, **and** able to understand Russian, while following character maps very well. It understands Russian better than its predecessor and is suitable for RP/ERP and general use.

## Prompt Template (Alpaca)

I found the best SillyTavern results from using the Noromaid template, but please try other templates! Let me know if you find anything good.

SillyTavern config files: [Context](https://huggingface.co/LakoMoor/Silicon-Alice-7B/resolve/main/assets/context.json), [Instruct](https://huggingface.co/LakoMoor/Silicon-Alice-7B/resolve/main/assets/instruct.json).

Additionally, here is my highly recommended [Text Completion preset](https://huggingface.co/LakoMoor/Silicon-Alice-7B/resolve/main/assets/MinP.json). You can tweak it by raising the temperature or lowering min p to boost creativity, or by raising min p to increase stability. You shouldn't need to touch anything else!

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
```
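
For use outside SillyTavern, the sketch below shows one possible way to load a 4-bit AWQ quant of this model with `transformers` (with `autoawq` installed) and to generate using the Alpaca template above. This is a minimal sketch, not the canonical loading code for this repo: the `model_id` placeholder, the example instruction, and the sampling values are assumptions you should replace with your own.

```python
# Minimal sketch (assumptions noted inline): load an AWQ quant via transformers
# and sample with the Alpaca prompt format from this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/Silicon-Alice-7B-AWQ"  # assumption: replace with this quant's actual Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # AWQ weights are dequantized to fp16 at runtime
    device_map="auto",
)

# Fill the Alpaca template from the "Prompt Template" section above.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nПривет! Кто ты?\n\n"  # example instruction, in Russian
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling roughly in the spirit of the MinP preset advice: lower min_p or raise
# temperature for more creativity, raise min_p for more stability.
# (min_p requires a recent transformers release.)
output = model.generate(
    **inputs,
    do_sample=True,
    temperature=1.0,
    min_p=0.1,
    max_new_tokens=256,
)

# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```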