Tokenizer does not work
#1
by zokica - opened
Hello,
When I use this code:
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("yahma/llama-7b-hf")
##################################################################
ValueError: Couldn't instantiate the backend tokenizer from one of:
(1) a `tokenizers` library serialization file,
(2) a slow tokenizer instance to convert or
(3) an equivalent slow tokenizer class to instantiate and convert.
You need to have sentencepiece installed to convert a slow tokenizer to a fast one.
#################################################################
I do have sentencepiece installed.
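In my experience this error usually means sentencepiece is not visible to the interpreter that is actually running the code, for example because it was installed into a different environment or because a notebook kernel was not restarted after installing. A quick sketch to check which interpreter is running and whether the relevant packages are importable from it (the package names here are the usual ones the converter needs, but treat this as a diagnostic, not an official check):

```python
import importlib.util
import sys

# Show which Python is executing this script; compare it to the
# environment you ran `pip install sentencepiece` in.
print("interpreter:", sys.executable)

# sentencepiece is needed to convert the slow tokenizer; newer
# converters may also want protobuf (importable as google.protobuf).
for name in ("sentencepiece", "google.protobuf"):
    found = importlib.util.find_spec(name) is not None
    print(f"{name} importable: {found}")
```

If both report `True` and the error persists, restarting the Python process (or kernel) after installation is usually the next thing to try.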