Yi-34B AQLM?
#3 by llama-anon - opened
It would be very nice if there were a Yi-34B AQLM model. DeepSeek 33B AQLM already fits on an RTX 3060 (12 GB) and works very well. It's also great that Mixtral 8x7B and Command R are already available, but many people have 12-16 GB VRAM GPUs, so other ~30B AQLM models would be very useful too.
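As a rough sanity check (my own back-of-envelope numbers, not measured figures), at AQLM's roughly 2 bits per weight a 34B model's weights alone come in well under 12 GB, while fp16 would need far more:

```python
# Back-of-envelope VRAM estimate for a ~34B-parameter model.
# The 2 bits/weight figure is an approximation of AQLM's compression;
# real usage adds KV cache, activations, and framework overhead.

def weight_memory_gib(n_params: float, bits_per_weight: float) -> float:
    """Approximate GiB needed just to store the weights."""
    return n_params * bits_per_weight / 8 / 1024**3

fp16 = weight_memory_gib(34e9, 16)  # unquantized half precision
aqlm = weight_memory_gib(34e9, 2)   # ~2-bit AQLM

print(f"fp16 weights: ~{fp16:.1f} GiB")  # ~63 GiB, far beyond a 12 GB card
print(f"2-bit AQLM:  ~{aqlm:.1f} GiB")   # ~8 GiB, leaves headroom on 12 GB
```

This is why a ~30B model at 2 bits is a sweet spot for 12-16 GB consumer GPUs, whereas even 4-bit quantization of the same model (~16 GiB of weights) would not fit on a 12 GB card.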
Thank you for such an amazing quantization method!