Cannot load the tokenizer with the newest transformers
#1 opened by conan1024hao
Traceback (most recent call last):
  File "train.py", line 188, in <module>
    main()
  File "train.py", line 134, in main
    tokenizer = AutoTokenizer.from_pretrained("abeja/gpt-neox-japanese-2.7b")
  File "/local/10511020.1.gpua/work/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 535, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "/local/10511020.1.gpua/work/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 725, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/local/10511020.1.gpua/work/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 432, in __getitem__
    raise KeyError(key)
KeyError: 'gpt_neox_japanese'
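The KeyError means the installed transformers release does not yet register the gpt_neox_japanese model type in its config mapping. A minimal sketch of a pre-flight check that fails with a clearer message than the raw KeyError; the 4.22.0 threshold is an assumption about when this model type first shipped in a stable release, and the helper names are illustrative:

```python
# Fail fast with a readable message when the installed transformers release
# is too old to know the "gpt_neox_japanese" model type.
# Assumption: 4.22.0 is the first stable release with support; at the time of
# this thread it was only available from the main branch on GitHub.
def version_tuple(version: str) -> tuple:
    """Parse a dotted version string, ignoring local/dev suffixes."""
    core = version.split("+")[0].split(".dev")[0]
    return tuple(int(part) for part in core.split(".")[:3])

def check_transformers_version(installed: str, required: str = "4.22.0") -> None:
    """Raise RuntimeError if `installed` is older than `required`."""
    if version_tuple(installed) < version_tuple(required):
        raise RuntimeError(
            f"transformers {installed} predates gpt_neox_japanese support; "
            f"upgrade to >= {required} or install from source with "
            "pip install git+https://github.com/huggingface/transformers"
        )

check_transformers_version("4.22.1")  # passes silently
```

In practice you would pass `transformers.__version__` as the `installed` argument before calling AutoTokenizer.from_pretrained.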
@conan1024hao
Thank you for raising an issue. Did you install via pip install git+https://github.com/huggingface/transformers, as in this colab?
@SO0529
I hadn't done that. Thank you for the response.
conan1024hao changed discussion status to closed