Model doesn't load
#5
by agonzalez - opened
I have an NVIDIA RTX 3060 with 12GB VRAM. I can load the 7B version of this model, which uses 6GB of RAM, but this model keeps loading forever and doesn't give any error. Any idea?
INFO:Loading TheBloke_Wizard-Vicuna-13B-Uncensored-GPTQ...
INFO:Found the following quantized model: models\TheBloke_Wizard-Vicuna-13B-Uncensored-GPTQ\Wizard-Vicuna-13B-Uncensored-GPTQ-4bit-128g.compat.no-act-order.safetensors
What version of GPTQ-for-LLaMa do you have in text-generation-webui/repositories?
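One way to answer this is to check which commit of GPTQ-for-LLaMa is checked out in the webui's `repositories` folder. This is a minimal sketch, assuming the standard text-generation-webui layout where GPTQ-for-LLaMa lives at `text-generation-webui/repositories/GPTQ-for-LLaMa`:

```shell
# Print the checked-out commit of GPTQ-for-LLaMa, if the repo exists
# (the path below is the usual text-generation-webui layout, adjust if yours differs)
cd text-generation-webui/repositories/GPTQ-for-LLaMa 2>/dev/null \
  && git log -1 --oneline \
  || echo "GPTQ-for-LLaMa not found"
```

Posting that commit hash (and whether it is the CUDA or Triton branch) usually makes this kind of loading issue much easier to diagnose.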