
## Usage

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# trust_remote_code=True is required because this repo ships custom model code.
model = AutoModelForMaskedLM.from_pretrained("manu/optimus_v2", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("manu/optimus_v2", trust_remote_code=True)

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors="pt")
output = model(**encoded_input)
```
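To turn the raw forward pass into an actual prediction, you can read the masked-LM logits at the masked position. The snippet below is a minimal sketch, assuming the checkpoint's tokenizer defines a standard mask token (`tokenizer.mask_token`) and that the custom model returns masked-LM outputs with a `logits` field; check the repository's custom code if either assumption does not hold.

```python
import torch

# Assumption: the tokenizer exposes a mask token (not verified for this checkpoint).
masked_text = f"Paris is the {tokenizer.mask_token} of France."
inputs = tokenizer(masked_text, return_tensors="pt")

with torch.no_grad():
    # Assumption: the custom model returns standard MaskedLM outputs with .logits
    logits = model(**inputs).logits

# Locate the masked position and take the highest-scoring token there.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```

For quick experiments, the same flow can also be wrapped in a `fill-mask` pipeline, provided the custom code supports it.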
The checkpoint contains 516M parameters, stored as F32 in safetensors format.