ctransformers AutoTokenizer - Error on Tokenizer - Currently `AutoTokenizer.from_pretrained` only accepts a model object
I'm using TheBloke/Llama-2-13B-chat-GGUF with llama-2-13b-chat.Q5_K_M.gguf.
When I run the code below, I get an error.
Code:
from ctransformers import AutoModelForCausalLM, AutoTokenizer
...
model_name = "TheBloke/Llama-2-13B-chat-GGUF"
model_file = "llama-2-13b-chat.Q5_K_M.gguf"
llm = AutoModelForCausalLM.from_pretrained(model_name, model_file=model_file, model_type="llama", gpu_layers=50, hf=True)
tokenizer = AutoTokenizer.from_pretrained(model_name)
...
On the line tokenizer = AutoTokenizer.from_pretrained(model_name), I'm getting this error:
tokenizer = AutoTokenizer.from_pretrained(model_file)
File "/home/ubuntu/prjLlamaQuant/venv/lib/python3.10/site-packages/ctransformers/hub.py", line 262, in from_pretrained
raise TypeError(
TypeError: Currently AutoTokenizer.from_pretrained
only accepts a model object. Please use:
model = AutoModelForCausalLM.from_pretrained(..., hf=True)
tokenizer = AutoTokenizer.from_pretrained(model)
It turned out to be a basic mistake on my part, and the error message itself states the fix: in ctransformers, AutoTokenizer.from_pretrained does not accept a model name or path string. It only accepts the model object returned by AutoModelForCausalLM.from_pretrained(..., hf=True). Passing the model object instead of the name resolves the error.
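For reference, a minimal corrected sketch. This assumes ctransformers is installed and the GGUF file can be downloaded from the Hub; gpu_layers=50 is the value from the question and may need adjusting for your hardware:

```python
from ctransformers import AutoModelForCausalLM, AutoTokenizer

model_name = "TheBloke/Llama-2-13B-chat-GGUF"
model_file = "llama-2-13b-chat.Q5_K_M.gguf"

# hf=True wraps the model in a Hugging Face-compatible interface;
# this wrapped model object is what AutoTokenizer.from_pretrained expects.
llm = AutoModelForCausalLM.from_pretrained(
    model_name,
    model_file=model_file,
    model_type="llama",
    gpu_layers=50,
    hf=True,
)

# Pass the model object itself, not the repo name or the .gguf file name.
tokenizer = AutoTokenizer.from_pretrained(llm)
```

Note that without hf=True the loaded model is not a wrapped Hugging Face-style object, so AutoTokenizer.from_pretrained would still fail even when given the model.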