Can't deploy on Inference Endpoint

#8
by goporo - opened

I cannot deploy this model on an Inference Endpoint.

These are GGUF model files meant to be run with llama.cpp; perhaps you are looking for https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3 instead?
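
For reference, here is a minimal sketch of running one of these GGUF files locally with llama-cpp-python. The filename, context size, and sampling values are only illustrative, not tied to a specific quant in this repo:

```python
# Minimal sketch: running a downloaded GGUF file locally with llama-cpp-python
# (pip install llama-cpp-python). The filename below is illustrative only.
from llama_cpp import Llama

llm = Llama(
    model_path="./Mistral-7B-Instruct-v0.3.Q4_K_M.gguf",  # path to the local GGUF file
    n_ctx=4096,        # context window to allocate
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain GGUF in one sentence."}],
    max_tokens=128,
    temperature=0.7,
)
print(response["choices"][0]["message"]["content"])
```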

Thanks. Do you know why the model responds poorly on the mistralai endpoint? I downloaded your GGUF version and ran it in LM Studio, and it worked fine.

You are welcome. I am not sure; you might want to ask the Mistral team and share your generation parameters with them.
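
If it helps, here is a rough sketch of querying your endpoint with explicit sampling parameters via huggingface_hub, so you can compare them against what LM Studio uses. The endpoint URL, token, and parameter values are placeholders:

```python
# Minimal sketch: querying a deployed Inference Endpoint with explicit
# sampling parameters so the same settings can be compared and shared.
# The endpoint URL, token, and parameter values are placeholders.
from huggingface_hub import InferenceClient

client = InferenceClient(
    model="https://<your-endpoint>.endpoints.huggingface.cloud",  # placeholder endpoint URL
    token="hf_xxx",  # placeholder access token
)

output = client.text_generation(
    "[INST] Explain GGUF in one sentence. [/INST]",  # Mistral instruct prompt format
    max_new_tokens=256,
    temperature=0.7,        # match whatever LM Studio was using
    top_p=0.95,
    repetition_penalty=1.1,
)
print(output)
```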
