runtime error

Exit code: 1. Reason: 4.99k [00:00<00:00, 19.9MB/s]
model.safetensors:   0%|          | 0.00/982M [00:00<?, ?B/s]
model.safetensors:   1%|          | 10.5M/982M [00:03<04:52, 3.32MB/s]
model.safetensors: 100%|█████████▉| 982M/982M [00:04<00:00, 244MB/s]
generation_config.json:   0%|          | 0.00/179 [00:00<?, ?B/s]
generation_config.json: 100%|██████████| 179/179 [00:00<00:00, 834kB/s]
tokenizer_config.json:   0%|          | 0.00/234 [00:00<?, ?B/s]
tokenizer_config.json: 100%|██████████| 234/234 [00:00<00:00, 978kB/s]
merges.txt:   0%|          | 0.00/456k [00:00<?, ?B/s]
merges.txt: 100%|██████████| 456k/456k [00:00<00:00, 14.0MB/s]
special_tokens_map.json:   0%|          | 0.00/131 [00:00<?, ?B/s]
special_tokens_map.json: 100%|██████████| 131/131 [00:00<00:00, 834kB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 15, in <module>
    tokenizer = AutoTokenizer.from_pretrained(model_id)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 787, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
    return cls._from_pretrained(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2060, in _from_pretrained
    slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/models/gpt2/tokenization_gpt2.py", line 181, in __init__
    with open(vocab_file, encoding="utf-8") as vocab_handle:
TypeError: expected str, bytes or os.PathLike object, not NoneType
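Reading the traceback: the download log shows merges.txt being fetched but no vocab.json (and no tokenizer.json), so the slow GPT-2 tokenizer is constructed with vocab_file=None, which is exactly what open() then rejects. A minimal sketch of a guard for app.py, assuming the repo really is missing those tokenizer files and that the model was fine-tuned from base gpt2 so its tokenizer files apply (the model_id value is a hypothetical placeholder, not the repo from the log):

```python
from transformers import AutoTokenizer

model_id = "your-org/your-model"  # hypothetical placeholder; use the repo from app.py

try:
    # Fails with TypeError when the repo ships merges.txt but no vocab.json/tokenizer.json.
    tokenizer = AutoTokenizer.from_pretrained(model_id)
except TypeError:
    # Assumption: the model was fine-tuned from base gpt2, so its tokenizer matches.
    tokenizer = AutoTokenizer.from_pretrained("gpt2")
```

The durable fix is to upload the missing vocab.json (or a tokenizer.json) to the model repo; the fallback above only works if the fine-tune did not change the vocabulary.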
