## Model Details

### Model Description
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by: FrankL
- Language(s) (NLP): English
## Direct Use
```python
import torch
from transformers import AutoModel, AutoModelForCausalLM, AutoTokenizer

device = 'cuda'

# Load the model and tokenizer from the Hub
model = AutoModel.from_pretrained('FrankL/storytellerLM-v0', trust_remote_code=True, torch_dtype=torch.float16)
model = model.to(device)
tokenizer = AutoTokenizer.from_pretrained('FrankL/storytellerLM-v0', trust_remote_code=True)

def inference(
    model: AutoModelForCausalLM,
    tokenizer: AutoTokenizer,
    input_text: str = "Once upon a time, ",
    max_new_tokens: int = 16
):
    # Tokenize the prompt and move it to the same device as the model
    inputs = tokenizer(input_text, return_tensors="pt").to(device)

    # Sample a continuation of the prompt
    outputs = model.generate(
        **inputs,
        pad_token_id=tokenizer.eos_token_id,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_k=40,
        top_p=0.95,
        temperature=0.8
    )

    # Decode the generated token ids back into text
    generated_text = tokenizer.decode(
        outputs[0],
        skip_special_tokens=True
    )
    print(generated_text)

inference(model, tokenizer)
```
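
The same function can be called with a custom prompt and a larger token budget for longer stories; the sampling settings above (`top_k=40`, `top_p=0.95`, `temperature=0.8`) favor varied continuations. A minimal sketch, where the prompt and `max_new_tokens` value are only illustrative:

```python
# Generate a longer continuation from a custom prompt (example values)
inference(model, tokenizer, input_text="The dragon looked at the knight and ", max_new_tokens=128)
```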