
Model Card for malhajar/Llama-2-7b-chat-dolly-tr

malhajar/Llama-2-7b-chat-dolly-tr is a fine-tuned version of Llama-2-7b-hf trained with SFT. The model can answer questions in Turkish, as it was fine-tuned on a Turkish dataset, specifically databricks-dolly-15k-tr.
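For orientation, below is a minimal sketch of how such SFT fine-tuning might be set up with TRL's SFTTrainer. This is not the author's exact recipe: the dataset repo id, column names, and hyperparameters are assumptions, and the SFTTrainer signature shown matches TRL ~0.7 (it varies across versions).

from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer

# Hypothetical Hub id for the Turkish Dolly dataset, assumed to expose
# "instruction" and "response" columns.
dataset = load_dataset("databricks-dolly-15k-tr", split="train")

def to_text(example):
    # Wrap each instruction/response pair in the prompt template used by this card.
    return {"text": f"<s>[INST] {example['instruction']} [/INST] {example['response']}</s>"}

dataset = dataset.map(to_text)

trainer = SFTTrainer(
    model="meta-llama/Llama-2-7b-hf",   # base model; SFTTrainer can load it by id
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=1024,
    args=TrainingArguments(output_dir="llama2-7b-dolly-tr", num_train_epochs=1),
)
trainer.train()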


Model Description

Prompt Template

<s>[INST] <prompt> [/INST] 
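A user question can be dropped into this template with a simple f-string (a minimal sketch; the question is the one used in the example below):

question = "Türkiyenin en büyük şehir nedir?"  # "What is the largest city in Turkey?"
prompt = f"<s>[INST] {question} [/INST]"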

How to Get Started with the Model

Use the code sample below to interact with the model.

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "malhajar/Llama-2-7b-chat-dolly-tr"
model = AutoModelForCausalLM.from_pretrained(model_id,
                                             device_map="auto",
                                             torch_dtype=torch.float16,
                                             revision="main")

tokenizer = AutoTokenizer.from_pretrained(model_id)

question = "Türkiyenin en büyük şehir nedir?"  # "What is the largest city in Turkey?"

# Build the prompt from the template above and generate a response.
prompt = f"<s>[INST] {question} [/INST]"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(model.device)
output = model.generate(inputs=input_ids,
                        max_new_tokens=512,
                        pad_token_id=tokenizer.eos_token_id,
                        top_k=50,
                        do_sample=True,
                        repetition_penalty=1.3,
                        top_p=0.95)
response = tokenizer.decode(output[0])

print(response)
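The decoded string echoes the prompt. If only the model's answer is needed, one convenient option (not part of the original card) is to split on the closing instruction tag:

# Keep only the text generated after the [/INST] tag.
answer = response.split("[/INST]")[-1].strip()
print(answer)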

Example Generation

<s>[INST] Türkiyenin en büyük şehir nedir? [/INST]
İstanbul, dünyanın en kalabalık ikinci ve Turuncu kütle'de yer almaktadır. Pek çok insandaki birçok ünlüsün bulundusuyla biliniyor.
(Rough English gloss of the generated answer: "Istanbul is the world's second most populous [city] and lies in the Orange mass. It is known for having many famous people.")