What should I do?

#1 opened by andre456

Please help me out.
I have oobabooga/text-generation-webui deployed on Windows.
Vicuna works fine.
But when I try to load IlyaGusev/llama_13b_ru_turbo_alpaca_lora_llamacpp, I get this error:
```
Traceback (most recent call last):
  File "G:\ChatGPT\text-generation-webui\server.py", line 84, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "G:\ChatGPT\text-generation-webui\modules\models.py", line 171, in load_model
    model = AutoModelForCausalLM.from_pretrained(checkpoint, **params)
  File "G:\ChatGPT\installer_files\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 441, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "G:\ChatGPT\installer_files\env\lib\site-packages\transformers\models\auto\configuration_auto.py", line 916, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "G:\ChatGPT\installer_files\env\lib\site-packages\transformers\configuration_utils.py", line 573, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "G:\ChatGPT\installer_files\env\lib\site-packages\transformers\configuration_utils.py", line 628, in _get_config_dict
    resolved_config_file = cached_file(
  File "G:\ChatGPT\installer_files\env\lib\site-packages\transformers\utils\hub.py", line 380, in cached_file
    raise EnvironmentError(
OSError: models\llama_13b_ru_turbo_alpaca_lora_llamacpp does not appear to have a file named config.json. Checkout 'https://huggingface.co/models\llama_13b_ru_turbo_alpaca_lora_llamacpp/None' for available files.
```

I'm attaching a screenshot of my settings: Screenshot_105.png

What can I do?

Don't use the llamacpp version: that repo holds weights converted for llama.cpp, so there is no config.json for transformers to load. Use the original LoRA instead: https://huggingface.co/IlyaGusev/llama_13b_ru_turbo_alpaca_lora
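
For reference, here is a minimal sketch of loading the original LoRA with transformers + peft. The base checkpoint name (huggyllama/llama-13b) is an assumption; verify the exact base model on the LoRA's model card.

```python
# Minimal sketch: apply the LoRA adapter to a LLaMA-13B base model.
# Assumptions: the `peft` package is installed, and huggyllama/llama-13b
# is a suitable base checkpoint (check the LoRA's model card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model = "huggyllama/llama-13b"  # assumed base; verify on the model card
lora_repo = "IlyaGusev/llama_13b_ru_turbo_alpaca_lora"

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    torch_dtype=torch.float16,
    device_map="auto",
)
model = PeftModel.from_pretrained(model, lora_repo)  # attaches the LoRA weights
model.eval()
```

In text-generation-webui the equivalent is roughly: put the base LLaMA-13B model under models/, put the adapter under loras/, and select both in the UI.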

IlyaGusev changed discussion status to closed