gguf
#2 by goodasdgood - opened
Can Reflection-Llama-3.1-70B-bnb-4bit be converted to GGUF?
Yes, of course, but there is currently an issue with the Reflection models, and the original creator, Matt, is trying to resolve it.
https://colab.research.google.com/drive/1_Hqdc6SWVMsQxsc5XIkC07BjFGF6sfJK#scrollTo=FduRpnbyNkMd
Does it need a TPU?
You don't need a TPU to convert it to GGUF on Google Colab.