
module 'torch' has no attribute 'bfloat16'

#6
by Mozzipa - opened

Please advise me on how to solve the error `AttributeError: module 'torch' has no attribute 'bfloat16'`.

import transformers

model = transformers.AutoModelForCausalLM.from_pretrained(
  'mosaicml/mpt-7b-instruct',
  trust_remote_code=True,
)

For your reference, I have torch 2.0.0, torchaudio 2.0.1, torchvision 0.15.1, and transformers 4.28.1 (Apple Silicon).

BFloat16 is not supported on Apple Silicon...
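If that is the cause, one workaround (a sketch, not an official fix; the `safe_dtype` helper below is hypothetical and not part of llm-foundry or transformers) is to fall back to float32 whenever the installed torch build does not expose `bfloat16`:

```python
import torch

# Hypothetical helper: return the preferred dtype if this torch build
# exposes it, otherwise fall back to float32. getattr's default kicks
# in exactly when `torch` has no attribute with that name.
def safe_dtype(name="bfloat16"):
    return getattr(torch, name, torch.float32)

dtype = safe_dtype()  # torch.bfloat16 where supported, else torch.float32
```

The chosen dtype can then be passed explicitly, e.g. `transformers.AutoModelForCausalLM.from_pretrained('mosaicml/mpt-7b-instruct', torch_dtype=safe_dtype(), trust_remote_code=True)`, so the load never references `torch.bfloat16` directly on a build that lacks it.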

I believe that is correct, thanks @rjadr !

daking changed discussion status to closed

How did you manage to get around it?
