Request for Quantized versions
#1 by mandeepbagga - opened
Tagging @TheBloke for creating quantized versions of this model.
We can use this with Open Code Interpreter as a data-analysis tool.
OK, it's in my queue
Is there a simple script you've created to quantize models into different formats like AWQ, GPTQ, GGUF, EXL2, and so on?
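For context on what these formats have in common, here is a toy sketch of group-wise low-bit quantization, the core idea underlying AWQ, GPTQ, and GGUF's Q4 variants. This is a deliberate simplification (one shared scale per group, no zero-point, no calibration data); the function names are illustrative, not from any of those libraries.

```python
# Toy 4-bit group quantization: each group of weights shares one
# float scale, and each weight is stored as a small signed integer.
# Real formats (AWQ/GPTQ/GGUF) add zero-points, calibration, and
# per-format bit packing on top of this basic scheme.

def quantize_group(weights, bits=4):
    """Map a group of floats to signed ints with one shared scale."""
    qmax = 2 ** (bits - 1) - 1                    # 7 for signed 4-bit
    scale = max(abs(w) for w in weights) / qmax or 1.0
    q = [round(w / scale) for w in weights]       # ints in [-qmax, qmax]
    return q, scale

def dequantize_group(q, scale):
    """Recover approximate floats from the ints and the shared scale."""
    return [x * scale for x in q]

weights = [0.12, -0.5, 0.33, 0.07]
q, scale = quantize_group(weights)
restored = dequantize_group(q, scale)
```

The round-trip error per weight is bounded by half the scale, which is why smaller groups (more scales) trade file size for accuracy.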
pipizhao changed discussion status to closed