## Usage
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("manu/optimus_v2", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("manu/optimus_v2", trust_remote_code=True)

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors="pt")
output = model(**encoded_input)
```
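The `output` of a masked-LM forward pass contains `logits` scoring every vocabulary token at every position; the prediction at a masked position is the argmax over the vocabulary. A minimal sketch of that decoding step, using an invented toy vocabulary and toy logits so it runs without downloading the model:

```python
# Toy vocabulary and logits (invented for illustration only; a real model
# returns logits of shape [batch, sequence_length, vocab_size]).
toy_vocab = ["the", "capital", "of", "France", "is", "Paris"]

# Scores for a single masked position: one logit per vocabulary entry.
mask_logits = [0.1, 0.2, 0.0, 1.5, 0.3, 4.2]

# The predicted token is the highest-scoring vocabulary entry.
predicted_id = max(range(len(mask_logits)), key=lambda i: mask_logits[i])
print(toy_vocab[predicted_id])  # prints "Paris"
```

With the real model, the same idea applies: locate the mask position via `tokenizer.mask_token_id`, then take the argmax of `output.logits` at that position and decode it with the tokenizer.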
Note: the serverless Inference API does not yet support model repos that contain custom code, so this model must be loaded locally as shown above.