
Does not support device_map

#14
by HAvietisov - opened

Self-explanatory: to load the model in 8-bit I have to specify `device_map` (e.g. `AutoModelForCausalLM.from_pretrained("mosaicml/mpt-7b", torch_dtype=torch.float16, trust_remote_code=True, load_in_8bit=True, device_map='auto')`), but the model does not support it.
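A sketch of a common workaround when a custom-code model cannot be split across devices: pass an explicit single-device `device_map` instead of `'auto'`, so accelerate never tries to shard layers. This assumes bitsandbytes is installed for `load_in_8bit` and that one GPU has enough memory; whether it works for this particular model is not confirmed in the thread.

```python
import torch
from transformers import AutoModelForCausalLM

# "" denotes the root module, i.e. the entire model; mapping it to
# device 0 places everything on a single GPU with no layer splitting.
# (Assumption: bitsandbytes installed, single GPU with enough memory.)
device_map = {"": 0}

model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b",
    torch_dtype=torch.float16,
    trust_remote_code=True,
    load_in_8bit=True,
    device_map=device_map,
)
```

With `device_map='auto'`, accelerate computes a per-layer placement, which requires the model's custom code to declare which modules may not be split; pinning everything to one device sidesteps that requirement.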

HAvietisov changed discussion title from Cannot does not support device_map to Does not support device_map
abhi-mosaic changed discussion status to closed
