
4-bit AWQ quantization of a full-parameter finetuned Mistral 7B with 32768 context length, trained on a Malaysian instructions dataset

Original model at https://huggingface.co/mesolitica/malaysian-mistral-7b-32k-instructions. Read more about the AWQ integration in transformers at https://huggingface.co/docs/transformers/main_classes/quantization#awq-integration
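
Loading an AWQ checkpoint through transformers requires the autoawq package (pip install autoawq) and a CUDA GPU; this is a requirement of the AWQ integration itself, not of this particular checkpoint.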

how-to

from transformers import AutoTokenizer, AutoModelForCausalLM

def parse_mistral_chat(messages):
    """Convert a list of {'role', 'content'} messages into the Mistral [INST] prompt format."""

    # The last message is the pending user query.
    user_query = messages[-1]['content']

    # Split the earlier turns into user and assistant messages.
    users, assistants = [], []
    for q in messages[:-1]:
        if q['role'] == 'user':
            users.append(q['content'])
        elif q['role'] == 'assistant':
            assistants.append(q['content'])

    # Each completed turn becomes '[INST] user [/INST]assistant</s> '.
    texts = ['<s>']
    for u, a in zip(users, assistants):
        texts.append(f'[INST] {u.strip()} [/INST]{a.strip()}</s> ')

    # The pending query has no closing </s>, so the model continues from here.
    texts.append(f'[INST] {user_query.strip()} [/INST]')
    prompt = ''.join(texts).strip()
    return prompt
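
For example, a two-turn conversation (hypothetical contents) renders to a single string:

messages = [
    {'role': 'user', 'content': 'hello'},
    {'role': 'assistant', 'content': 'hi there'},
    {'role': 'user', 'content': 'kwsp tu apa'},
]
print(parse_mistral_chat(messages))
# <s>[INST] hello [/INST]hi there</s> [INST] kwsp tu apa [/INST]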

tokenizer = AutoTokenizer.from_pretrained('mesolitica/malaysian-mistral-7b-32k-instructions-AWQ')
model = AutoModelForCausalLM.from_pretrained(
    'mesolitica/malaysian-mistral-7b-32k-instructions-AWQ',
    # requires the flash-attn package; drop this argument if it is not installed
    use_flash_attention_2 = True,
)
_ = model.cuda()
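
As a quick sanity check that the AWQ path was used, the quantization config that transformers attaches to the loaded model can be printed (assuming the standard quantization_config attribute described in the AWQ integration docs):

print(model.config.quantization_config)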

messages = [
    {'role': 'user', 'content': 'kwsp tu apa'}
]
prompt = parse_mistral_chat(messages)
# add_special_tokens=False because parse_mistral_chat already prepends <s>
inputs = tokenizer([prompt], return_tensors='pt', add_special_tokens=False).to('cuda')
generate_kwargs = dict(
    inputs,
    max_new_tokens=1024,
    top_p=0.95,
    top_k=50,
    temperature=0.9,
    do_sample=True,
    num_beams=1,
)
r = model.generate(**generate_kwargs)
tokenizer.decode(r[0])
Output:

<s> [INST] kwsp tu apa [/INST]KWSP bermaksud Kumpulan Wang Simpanan Pekerja. Ia adalah sebuah institusi simpanan persaraan yang ditubuhkan oleh Kementerian Kewangan Malaysia untuk tujuan mengumpul simpanan ahli untuk dibayar pada umur persaraan, penuh atau penuh persaraan penuh. KWSP ditubuhkan pada tahun 1951 dan mula beroperasi pada tahun 1952. KWSP adalah salah satu institusi simpanan persaraan terbesar di dunia, dengan pangkalan ahli sekitar 14 juta ahli.</s>

In English: KWSP stands for Kumpulan Wang Simpanan Pekerja (the Employees Provident Fund), a retirement savings institution established by Malaysia's Ministry of Finance to collect members' savings for payout at retirement age. It was established in 1951, began operating in 1952, and is one of the largest retirement savings institutions in the world, with a membership base of around 14 million members.
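
For interactive use, generation can also be streamed token by token. A minimal sketch using the TextStreamer utility from transformers, reusing the inputs and sampling parameters above (skip_prompt=True hides the prompt echo):

from transformers import TextStreamer

streamer = TextStreamer(tokenizer, skip_prompt=True)
_ = model.generate(
    **inputs,
    streamer=streamer,
    max_new_tokens=1024,
    top_p=0.95,
    top_k=50,
    temperature=0.9,
    do_sample=True,
)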