Support for oobabooga/text-generation-webui
I know this is not supported by oobabooga/text-generation-webui yet, because it's still using an old llama.cpp version. However, I'd like to create this thread for people to follow up on when oobabooga is ready.
I'm currently watching this issue: https://github.com/oobabooga/text-generation-webui/issues/2020
I quantized it with the older version of llama.cpp: https://huggingface.co/creative420/Wizard-Vicuna-13b-Uncensored_old_ggml
FYI, you can also download the version for the previous LLaMA format from the previous_llama branch:
https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-GGML/tree/previous_llama
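If you'd rather fetch a single file from that branch instead of browsing the tree, here's a minimal sketch. Hugging Face serves branch files under the `resolve/<branch>/<file>` URL pattern; the filename below is a placeholder, so pick a real one from the branch's file list:

```shell
# Branch files resolve under REPO/resolve/BRANCH/FILENAME on Hugging Face.
REPO=https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-GGML
BRANCH=previous_llama
FILENAME=your-chosen-file.bin   # placeholder: pick a real file from the branch
echo "$REPO/resolve/$BRANCH/$FILENAME"
# then download it with: curl -LO "$REPO/resolve/$BRANCH/$FILENAME"
```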
For those using oobabooga, you can follow these instructions to use the new llama format.

Current requirements.txt:
accelerate==0.19.0
colorama
datasets
flexgen==0.1.7
gradio_client==0.1.4
gradio==3.25.0
markdown
numpy
pandas
Pillow>=9.5.0
pyyaml
requests
rwkv==0.7.3
safetensors==0.3.1
sentencepiece
tqdm
git+https://github.com/huggingface/peft
transformers==4.28.1
bitsandbytes==0.38.1; platform_system != "Windows"
llama-cpp-python==0.1.45; platform_system != "Windows"
https://github.com/abetlen/llama-cpp-python/releases/download/v0.1.45/llama_cpp_python-0.1.45-cp310-cp310-win_amd64.whl; platform_system == "Windows"
- Change llama-cpp-python==0.1.45 to llama-cpp-python==0.1.50
- Run pip install -r requirements.txt
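Put together, the version bump can be sketched like this. It runs against a stand-in requirements.txt in a temp directory; in a real setup you'd edit the file in your text-generation-webui checkout instead:

```shell
# Work in a scratch directory so nothing real gets touched.
cd "$(mktemp -d)"
# Stand-in requirements.txt carrying the old pin (use the real file in practice).
printf 'transformers==4.28.1\nllama-cpp-python==0.1.45; platform_system != "Windows"\n' > requirements.txt
# Bump the pin from 0.1.45 to 0.1.50 (GNU sed; on macOS use: sed -i '' ...).
sed -i 's/llama-cpp-python==0.1.45/llama-cpp-python==0.1.50/' requirements.txt
cat requirements.txt
# Then reinstall the dependencies:
# pip install -r requirements.txt
```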
Then re-run oobabooga as usual with models in the new llama format.
Thanks Michael, but didn't text-gen-ui already update? I think you just need to update text-gen-ui normally and you'll get this commit:
https://github.com/oobabooga/text-generation-webui/commit/eee986348c3ef1cb0070b287ae865a682084922d
Oh yeah, I didn't notice that. I've been quite busy since yesterday; I'll share it as soon as I'm back. This world is moving blazingly fast.
Btw, thanks for sharing this great model. Really love it!