Serving this with llama.cpp
#1 · opened by qiisziilbash
Hi,
Is there an example of using this model, in particular with llama.cpp?
Thanks
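Not an authoritative answer, but the usual pattern with llama.cpp is to obtain (or convert to) a GGUF file and then run it with the `llama-cli` or `llama-server` binaries. A minimal sketch, assuming a GGUF build of this model exists; the file name below is a placeholder, not the actual repo file:

```shell
# Placeholder — replace with the actual GGUF file for this model.
MODEL=model-q4_k_m.gguf

# Quick interactive sanity check with the CLI:
llama-cli -m "$MODEL" -p "Hello, how are you?" -n 64

# Or serve an OpenAI-compatible HTTP API on port 8080:
llama-server -m "$MODEL" --port 8080
# Then query it, e.g.:
# curl http://localhost:8080/v1/chat/completions \
#   -d '{"messages":[{"role":"user","content":"Hi"}]}'
```

If the repo only ships safetensors weights, llama.cpp's `convert_hf_to_gguf.py` script can typically produce the GGUF file first, provided the architecture is supported.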