---
license: mit
datasets:
  - AGBonnet/augmented-clinical-notes
language:
  - en
base_model:
  - BioMistral/BioMistral-7B
pipeline_tag: text-generation
tags:
  - clinical
  - biology
---

# Model Card for BioMistral-Clinical-7B

## How to use

Loading the model from Hugging Face:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("ZiweiChen/BioMistral-Clinical-7B")
model = AutoModelForCausalLM.from_pretrained("ZiweiChen/BioMistral-Clinical-7B")
```
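If a GPU is available, the model can also be placed on it automatically with `device_map="auto"` (this requires the `accelerate` package). The snippet below is a minimal sketch of that optional loading path, not part of the original instructions:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Optional: device_map="auto" spreads the weights across available GPU(s).
# Assumes the accelerate package is installed.
tokenizer = AutoTokenizer.from_pretrained("ZiweiChen/BioMistral-Clinical-7B")
model = AutoModelForCausalLM.from_pretrained(
    "ZiweiChen/BioMistral-Clinical-7B",
    device_map="auto",
)
```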

Lightweight model loading is also possible using 4-bit quantization:

```python
from transformers import AutoTokenizer, BitsAndBytesConfig, AutoModelForCausalLM
import torch

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16
)

tokenizer = AutoTokenizer.from_pretrained("ZiweiChen/BioMistral-Clinical-7B")
model = AutoModelForCausalLM.from_pretrained("ZiweiChen/BioMistral-Clinical-7B", quantization_config=bnb_config)
```
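Note that 4-bit loading with `BitsAndBytesConfig` relies on the `bitsandbytes` and `accelerate` packages in addition to `transformers`. If they are not already installed, something like the following should work (versions are not pinned here; adjust as needed):

```bash
pip install bitsandbytes accelerate
```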

## How to generate text

```python
import torch

# Find the device the model weights live on, so inputs can be moved there.
model_device = next(model.parameters()).device

prompt = """
How to treat severe obesity?
"""
model_input = tokenizer(prompt, return_tensors="pt").to(model_device)

with torch.no_grad():
    output = model.generate(**model_input, max_new_tokens=100)
    answer = tokenizer.decode(output[0], skip_special_tokens=True)
    print(answer)
```
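Generation can be tuned through the standard `generate` keyword arguments. The sketch below shows one way to get longer, less deterministic output; the sampling values are illustrative choices, not settings recommended by the model card:

```python
with torch.no_grad():
    output = model.generate(
        **model_input,
        max_new_tokens=200,      # allow a longer answer
        do_sample=True,          # sample instead of greedy decoding
        temperature=0.7,         # illustrative value, not an official recommendation
        top_p=0.9,               # nucleus sampling
        repetition_penalty=1.1,  # discourage repeated phrases
    )
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```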

## Model Details

### Model Description