Runtime error

Downloading pytorch_model.bin: 100%|██████████| 5.68G/5.68G [02:10<00:00, 43.5MB/s]
The argument `trust_remote_code` is to be used with Auto classes. It has no effect here and is ignored.
Traceback (most recent call last):
  File "/home/user/app/app.py", line 103, in <module>
    chat_bot = ChatBot(chroma_db)
  File "/home/user/app/app.py", line 67, in __init__
    self.reset_context()
  File "/home/user/app/app.py", line 74, in reset_context
    self.qa_chain = build_qa_chain()
  File "/home/user/app/app.py", line 44, in build_qa_chain
    instruct_pipeline = pipeline(model=model_name, torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto",
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 788, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model databricks/dolly-v2-3b with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.gpt_neox.modeling_gpt_neox.GPTNeoXForCausalLM'>).