granite and granitemoe model architecture not supported

#1
by FM-1976 - opened

Hi,
does anyone know how to run these models with llama-cpp-python?
I know Ollama has already announced full support for these models:
https://x.com/ollama/status/1848223852465213703
https://ollama.com/blog/ibm-granite

These architectures are relatively recent, so make sure to update your llama-cpp-python package first; a minimal sketch is below.
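After upgrading, something along these lines should work. This is just a sketch assuming you already have a Granite GGUF on disk; the file name below is a placeholder, and the context size / GPU settings are examples, not requirements.

```python
# Upgrade first so the bundled llama.cpp build includes the granite/granitemoe architectures:
#   pip install -U llama-cpp-python

from llama_cpp import Llama

# Placeholder path: point this at whichever Granite GGUF file you downloaded.
llm = Llama(
    model_path="./granite-3.0-8b-instruct-Q4_K_M.gguf",
    n_ctx=4096,       # example context window
    n_gpu_layers=-1,  # offload all layers to GPU if available; set 0 for CPU only
    verbose=False,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello, who are you?"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

If the "unknown model architecture" error still appears after upgrading, the installed wheel is probably pinned to an older llama.cpp that predates Granite support.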
