
What task does it support?

#10
by vshetty - opened

Currently it does not support "question-answering" or "text-generation".

You can use the MPT models with the "text-generation" pipeline. I believe a warning will pop up (because our architecture is custom and not in the transformers library), but you can ignore the warning and it should work fine.

```python
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mosaicml/mpt-7b",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
)
```
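As a quick sketch of what calling the pipeline looks like once it is constructed, here is a runnable example; it substitutes the tiny `sshleifer/tiny-gpt2` checkpoint so it downloads fast (the prompt and generation parameters are illustrative), but the call shape is the same for `mosaicml/mpt-7b` with `trust_remote_code=True` as above.

```python
from transformers import pipeline

# Tiny stand-in model so this example runs quickly; swap in
# "mosaicml/mpt-7b" (with trust_remote_code=True) for real use.
generator = pipeline("text-generation", model="sshleifer/tiny-gpt2")

# The pipeline returns a list of dicts, each with a "generated_text" key
# that includes the prompt followed by the model's continuation.
outputs = generator("MPT-7B is", max_new_tokens=20, do_sample=False)
print(outputs[0]["generated_text"])
```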
abhi-mosaic changed discussion status to closed
