SentencePiece tokenizer problem
#80 opened by Nahieli
I am having a problem downloading the tokenizer for Mistral-7B-Instruct-v0.3. When I run this:
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "mistralai/Mistral-7B-Instruct-v0.3",
)
I get this error:
Cannot instantiate this tokenizer from a slow version. If it's based on sentencepiece, make sure you have sentencepiece installed.
These are the versions I have installed:
sentence-transformers 3.0.1
sentencepiece 0.2.0
transformers 4.42.3
Python 3.12.3
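
In case it helps, here is a minimal check that could be run in the same environment to confirm which versions the interpreter actually sees (this assumes the script above runs in the same virtual environment where these packages were installed):

# Hypothetical sanity check, not part of my original script:
# confirms sentencepiece is importable by this interpreter and
# prints the versions the running Python actually picks up.
import sys
import sentencepiece
import transformers

print("python:", sys.version)
print("transformers:", transformers.__version__)
print("sentencepiece:", sentencepiece.__version__)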
Any ideas, @patrickvonplaten?
Thanks!