---
base_model: PatronusAI/Llama-3-Patronus-Lynx-8B-Instruct
language:
  - en
library_name: transformers
license: cc-by-nc-4.0
tags:
  - text-generation
  - pytorch
  - Lynx
  - Patronus AI
  - evaluation
  - hallucination-detection
  - openvino
  - nncf
  - 4-bit
---

This model is a 4-bit quantized version of [PatronusAI/Llama-3-Patronus-Lynx-8B-Instruct](https://huggingface.co/PatronusAI/Llama-3-Patronus-Lynx-8B-Instruct), converted to the OpenVINO format. It was obtained via the nncf-quantization space using optimum-intel.

First, make sure you have optimum-intel installed with the OpenVINO extras:

```bash
pip install optimum[openvino]
```

To load the model, you can do the following:

```python
from optimum.intel import OVModelForCausalLM

model_id = "elucidator8918/Llama-3-Patronus-Lynx-8B-Instruct-openvino-4bit"
model = OVModelForCausalLM.from_pretrained(model_id)
```
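
Below is a minimal inference sketch. It assumes the tokenizer files are available in this repository (otherwise load them from the base model id) and uses a placeholder prompt; for hallucination detection, follow the prompt format described on the base Lynx model card.

```python
from transformers import AutoTokenizer
from optimum.intel import OVModelForCausalLM

model_id = "elucidator8918/Llama-3-Patronus-Lynx-8B-Instruct-openvino-4bit"

# Load the tokenizer and the OpenVINO model (compiled for CPU by default)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)

# Placeholder prompt -- for hallucination detection, use the Lynx prompt
# template from the base model card instead.
prompt = "What is OpenVINO?"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```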