Model Card for traclm-v3-7b-instruct-GGUF
This repo contains GGUF quantizations of TRAC-MTRY/traclm-v3-7b-instruct for running the model on low-resource hardware.
Read more about GGUF quantization here.
Read more about the unquantized model here.
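As a rough illustration, the sketch below shows how one of these quantizations might be loaded with llama-cpp-python. The filename, context size, and GPU settings are assumptions, not fixed properties of this repo; substitute the quantization level you actually downloaded.

from llama_cpp import Llama

# Load one of the quantized files; the filename here is hypothetical.
llm = Llama(
    model_path="traclm-v3-7b-instruct-Q4_K_M.gguf",  # assumed filename
    n_ctx=4096,           # context window; lower this on very constrained hardware
    n_gpu_layers=0,       # 0 = CPU only; raise if you can offload layers to a GPU
    chat_format="chatml", # matches the prompt format described below
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what GGUF quantization is."},
    ],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])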
Prompt Format
This model was fine-tuned with the chatml prompt format. It is highly recommended that you use the same format for any interactions with the model. Failure to do so will degrade performance significantly.
ChatML Format:
<|im_start|>system
Provide some context and/or instructions to the model.
<|im_end|>
<|im_start|>user
The user’s message goes here
<|im_end|>
<|im_start|>assistant
The ChatML format can easily be applied to text you plan to process with the model by using the chat_template included in the tokenizer. Read here for additional information.
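For example, assuming the tokenizer from the unquantized repo (TRAC-MTRY/traclm-v3-7b-instruct) ships the ChatML chat_template, a minimal sketch of building a prompt with it looks like this:

from transformers import AutoTokenizer

# Assumes the tokenizer in the unquantized repo carries the ChatML chat_template.
tokenizer = AutoTokenizer.from_pretrained("TRAC-MTRY/traclm-v3-7b-instruct")

messages = [
    {"role": "system", "content": "Provide some context and/or instructions to the model."},
    {"role": "user", "content": "The user's message goes here"},
]

# Produce a ChatML-formatted prompt string ending with the assistant header,
# ready to be passed to the model for generation.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)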
Model Card Contact
MAJ Daniel C. Ruiz (daniel.ruiz@nps.edu)