Acknowledgement
#1, opened by syubraj
The model might simply be broken, or not supported by llama.cpp
Which one do you suggest then to get the model running? Or could you check from your side, if possible?
Works fine for me in llama.cpp (to the extent I can test it, my Nepali is notoriously bad :)
$ llama-cli -m RomanEng2Nep-v2.Q4_K_S.gguf -p muskuraudai
कुंजलिसहित [end of text]
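If you'd rather call the model from Python than from the CLI, a minimal sketch using llama-cpp-python (an assumption on my side; any GGUF-capable runtime should do) would look something like this, with the quant file in the current directory and illustrative values for context size and token count:

```python
# Minimal sketch: load the Q4_K_S GGUF and run one prompt with llama-cpp-python.
# Assumes `pip install llama-cpp-python` and the file path below; adjust as needed.
from llama_cpp import Llama

# Small context is enough for single-word transliteration prompts.
llm = Llama(model_path="RomanEng2Nep-v2.Q4_K_S.gguf", n_ctx=512, verbose=False)

# Same prompt as the llama-cli run above; keep max_tokens small for one word.
out = llm("muskuraudai", max_tokens=16)
print(out["choices"][0]["text"])
```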
mradermacher changed discussion status to closed