How can I get answers in Spanish from Llama 3?

#145
by JGuille

Right now I am asking in Spanish, but the answer comes back in English. I need the output to be in Spanish. Here is my setup:

import torch
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    pipeline,
)

model_name = "meta-llama/Meta-Llama-3.1-8B"

# 4-bit NF4 quantization so the model fits in less GPU memory
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_name, token=HF_TOKEN)
tokenizer.pad_token = tokenizer.eos_token  # Llama has no pad token by default

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    quantization_config=bnb_config,
    token=HF_TOKEN,
)

text_generator = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=128,
)

prompt = "Responde en español: ¿Qué es aprendizaje de máquina?"

response = text_generator(prompt)
response
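
One thing I noticed in the docs: the text-generation pipeline echoes the prompt back inside generated_text, which you can see in the output below. If I only want the continuation, passing return_full_text=False should drop the prompt (this parameter is documented for the text-generation pipeline):

response = text_generator(prompt, return_full_text=False)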

Output:

[{'generated_text': 'Responde en español: ¿Qué es aprendizaje de máquina? ¿Cuál es su utilidad?\nMachine learning is a subfield of artificial intelligence (AI) that deals with the construction and study of systems that can learn from data, recognize patterns and make decisions with minimal human intervention. Machine learning is an important tool for data analysis and prediction in various fields, including healthcare, finance, and e-commerce.\nMachine learning is a subfield of artificial intelligence (AI) that deals with the construction and study of systems that can learn from data, recognize patterns and make decisions with minimal human intervention. Machine learning is an important tool for data analysis and prediction in various fields, including healthcare, finance, and e'}]
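
From what I have read, Meta-Llama-3.1-8B is the base model, so it just continues the text rather than following the instruction, which might be why it drifts into English. Would switching to the Instruct variant and pinning the language with a system prompt be the right direction? A minimal, untested sketch of what I mean (the model id meta-llama/Meta-Llama-3.1-8B-Instruct and the system prompt wording are my guesses; bnb_config is the same 4-bit config as above):

instruct_name = "meta-llama/Meta-Llama-3.1-8B-Instruct"  # assumed Instruct variant

tokenizer = AutoTokenizer.from_pretrained(instruct_name, token=HF_TOKEN)
model = AutoModelForCausalLM.from_pretrained(
    instruct_name,
    device_map="auto",
    quantization_config=bnb_config,  # reusing the 4-bit config from above
    token=HF_TOKEN,
)

messages = [
    # System prompt pinning the reply language (wording is my guess)
    {"role": "system", "content": "Eres un asistente útil. Responde siempre en español."},
    {"role": "user", "content": "¿Qué es aprendizaje de máquina?"},
]

# apply_chat_template wraps the conversation in Llama 3's chat special tokens
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))

Is that the recommended way to control the answer language, or is there a generation setting I am missing?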
