---
license: apache-2.0
---
## Dataset

Japanese subset of the [mC4](https://huggingface.co/datasets/mc4) dataset.
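For reference, the training data can be inspected with the 🤗 `datasets` library. A minimal sketch, assuming the `mc4` dataset identifier on the Hub and streaming access (the exact split and filtering used for training are not specified in this card):

```python
from datasets import load_dataset

# Stream the Japanese subset of mC4 so the full corpus is not downloaded.
mc4_ja = load_dataset("mc4", "ja", split="train", streaming=True)

# Peek at the first document.
print(next(iter(mc4_ja))["text"][:200])
```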
## Training

Trained for 3,000 steps on top of the MPT-7B checkpoint [mosaicml/mpt-7b](https://huggingface.co/mosaicml/mpt-7b).
## How to load

Before running this model, please install the following pip package:

```bash
pip install einops
```
To run this model, you may need to load it in a lower precision for it to fit onto your GPU. We found that on a T4 GPU (16GB VRAM) the model must be loaded in 8-bit precision. To load the model in 8-bit, please install the following pip packages:

```bash
pip install bitsandbytes accelerate
```
Caution: you will also need enough RAM to load the model. We estimate that loading this model requires ~30GB.
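If you want to check free memory before loading, something like the following works (it uses `psutil`, an extra dependency not otherwise required by the model):

```python
import psutil

# Rough pre-flight check against the ~30GB estimate above.
free_gb = psutil.virtual_memory().available / 1024**3
print(f"Available RAM: {free_gb:.1f} GB")
if free_gb < 30:
    print("Warning: you may not have enough RAM to load this model.")
```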
### Auto type

```python
from transformers import AutoModelForCausalLM

model_name = "lightblue/japanese-mpt-7b"

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    trust_remote_code=True,
)
```
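Note that loaded this way the weights sit on the CPU; to run generation on a GPU, move the model afterwards (this assumes a single GPU with enough memory for the half-precision weights, roughly 13GB):

```python
import torch

# Move the model to the GPU if one is available; otherwise stay on CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
```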
### In 8-bit

```python
from transformers import AutoModelForCausalLM

model_name = "lightblue/japanese-mpt-7b"

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    load_in_8bit=True,
    device_map="auto",  # lets accelerate dispatch the quantized weights to the GPU
    trust_remote_code=True,
)
```
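If 8-bit is still too large for your GPU, bitsandbytes also supports 4-bit quantization via `BitsAndBytesConfig`. A sketch (we have not verified generation quality for this model at 4-bit):

```python
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),
    device_map="auto",
    trust_remote_code=True,
)
```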
## How to use

```python
from transformers import AutoTokenizer, pipeline

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Greedy decoding: temperature has no effect when do_sample=False,
# so it is omitted here.
pipe("こんにちは", do_sample=False, return_full_text=False, max_new_tokens=32)
```