8bpw please
#2 by Wsdfsdf - opened
I don't know whether 8BPW will actually improve the output at all compared to this 6BPW quant, since this is a quant of a dequantized 5.65BPW model (Q5_K_M GGUF), but I'd still like to test it.
I had the same reluctance to make a >6.0bpw quant for that reason: the source weights were dequantized from Q5_K_M, so a higher-bpw quant can't recover information that was already lost. I'll add an 8.0bpw quant to the list.
Edit: uploading now: https://huggingface.co/LoneStriker/miqu-1-70b-sf-8.0bpw-h8-exl2
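For scale, a minimal sketch of the storage difference between the 6.0bpw and 8.0bpw quants, assuming a 70B parameter count and ignoring per-layer format overhead (both assumptions for illustration; real exl2 files carry extra metadata):

```python
# Rough weight-storage footprint of quants at different bits-per-weight.
# 70e9 params and the bpw values are illustrative assumptions; actual
# exl2 files also include per-layer overhead not modeled here.

def quant_size_gb(n_params: float, bpw: float) -> float:
    """Approximate weight storage in GB: params * bits / 8 bits-per-byte / 1e9."""
    return n_params * bpw / 8 / 1e9

params = 70e9  # miqu-1-70b
for bpw in (5.65, 6.0, 8.0):
    print(f"{bpw:>4} bpw ~ {quant_size_gb(params, bpw):5.1f} GB")
```

The gap between 6.0bpw (~52.5 GB) and 8.0bpw (~70 GB) is the cost of testing whether anything beyond the original 5.65bpw source's information survives into the larger quant.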
Wsdfsdf changed discussion status to closed