host LLM model

#7
by zirkze - opened

Guys, I'm trying to host LLM models on a VPS, but I can't get it working. Can anyone help? When I host it, it hallucinates, and when it doesn't hallucinate, it takes 10 minutes to respond.

Is there a Discord where I can find someone who can help me? Anything helps.

@zirkze You need to make sure you're using the right chat template, regardless of how you run inference. Read the instructions on the model card page.
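To illustrate the point about chat templates: a template turns your list of role/content messages into the exact prompt string the model was fine-tuned on, and serving raw text without it is a common cause of rambling or hallucinated output. Here's a rough sketch assuming a Llama-3-style instruct template (the function name and the exact format are my assumption; always check the model card for the real template):

```python
# Minimal sketch of what a chat template does: it wraps each message
# in the special tokens the model was trained on. This follows the
# Llama 3 instruct format as an example -- verify against your model.

def apply_llama3_template(messages, add_generation_prompt=True):
    prompt = "<|begin_of_text|>"
    for m in messages:
        prompt += (
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    if add_generation_prompt:
        # Cue the model that it's the assistant's turn to speak.
        prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

msgs = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]
print(apply_llama3_template(msgs))
```

In practice you wouldn't hand-roll this: the `transformers` tokenizer ships the correct template with the model, so `tokenizer.apply_chat_template(msgs, add_generation_prompt=True)` does the same thing with the format the model actually expects.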

Hey @zirkze! I was able to host this LLM, reach out if you need any help :)

Orenguteng changed discussion status to closed
