This model is a quantized version of echarlaix/distilgpt2-openvino, exported to the OpenVINO format using optimum-intel via the nncf-quantization space.
First, make sure you have optimum-intel installed with the OpenVINO extras:
pip install optimum[openvino]
To load the model, you can do the following:
from optimum.intel import OVModelForCausalLM
model_id = "echarlaix/distilgpt2-openvino-int4"
model = OVModelForCausalLM.from_pretrained(model_id)
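Once loaded, the model can be used for text generation through the standard transformers pipeline, as supported by optimum-intel. The sketch below is a minimal illustration; the prompt and generation settings are arbitrary and only meant as an example:

```python
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer, pipeline

model_id = "echarlaix/distilgpt2-openvino-int4"

# Load the quantized OpenVINO model and the matching tokenizer
model = OVModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Run text generation on an example prompt (prompt and max_new_tokens are illustrative)
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
print(generator("OpenVINO makes inference", max_new_tokens=30)[0]["generated_text"])
```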