KeyError: 'llama'

#110, opened by ronnief1

Getting this error when I run the following code:

from transformers import AutoModelForCausalLM

model_name = "meta-llama/Meta-Llama-3.1-8B-Instruct"  # use instruct model
llama_model = AutoModelForCausalLM.from_pretrained(model_name)

Here's the full error:
File ~/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:441, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
438 if kwargs_copy.get("torch_dtype", None) == "auto":
439 _ = kwargs_copy.pop("torch_dtype")
--> 441 config, kwargs = AutoConfig.from_pretrained(
442 pretrained_model_name_or_path,
443 return_unused_kwargs=True,
444 trust_remote_code=trust_remote_code,
445 **hub_kwargs,
446 **kwargs_copy,
447 )
448 if hasattr(config, "auto_map") and cls.__name__ in config.auto_map:
449 if not trust_remote_code:

File ~/.local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:917, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
915 return config_class.from_pretrained(pretrained_model_name_or_path, **kwargs)
916 elif "model_type" in config_dict:
--> 917 config_class = CONFIG_MAPPING[config_dict["model_type"]]
918 return config_class.from_dict(config_dict, **unused_kwargs)
919 else:
920 # Fallback: use pattern matching on the string.
921 # We go from longer names to shorter names to catch roberta before bert (for instance)

File ~/.local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:623, in _LazyConfigMapping.__getitem__(self, key)
621 return self._extra_content[key]
622 if key not in self._mapping:
--> 623 raise KeyError(key)
624 value = self._mapping[key]
625 module_name = model_type_to_module_name(key)

Thanks in advance.

Try updating your transformers version:

pip install -U transformers
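For context, the KeyError means the installed transformers is old enough that its CONFIG_MAPPING has no entry for model_type "llama", which is exactly the lookup failing in the traceback. After upgrading, a quick sanity check could look like this (just a sketch; the version comment assumes Llama 3.1 support landed around 4.43.0):

import transformers
from transformers import AutoConfig

print(transformers.__version__)  # Llama 3.1 needs a fairly recent release (I believe 4.43.0 or newer)

# If the upgrade worked, AutoConfig can now resolve model_type "llama" without a KeyError
config = AutoConfig.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
print(type(config).__name__)  # should print LlamaConfig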

Yep, thanks, I got it working now.
Upgraded to transformers==4.44.2.
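For anyone else landing here: if I remember right, any release at or above 4.43.0 should handle Llama 3.1 (that's when its rope_scaling format was added), so 4.44.2 is comfortably recent enough.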
