People like me are more likely to try this out if there is a command line utility.
#5 · by DaniCar · opened
Would be great if this were supported in llama.cpp.
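As far as I know there is no official llama.cpp support yet, but if (or when) a GGUF conversion of this model becomes available, running it locally via the llama-cpp-python bindings would look roughly like the sketch below. The model path, context size, and prompt are placeholders, not values published for this model.

```python
# Rough sketch: assumes a GGUF conversion of this model exists locally.
# Uses the llama-cpp-python bindings (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(
    model_path="./model-q4_k_m.gguf",  # placeholder path to converted weights
    n_ctx=4096,                        # context window; adjust to the model's actual limit
)

# Simple one-shot completion; sampling parameters are illustrative defaults.
out = llm("Write a haiku about local inference.", max_tokens=128)
print(out["choices"][0]["text"])
```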
Is there a way to plug this into Emacs as well?
@DaniCar @JeremiahFoster if you have a Mac, you can follow these instructions:
Try LMStudio (not affiliated): https://lmstudio.ai/