---
title: Huggingface
---
To use Open Interpreter with Hugging Face models, set the `model` flag:
```bash Terminal
interpreter --model huggingface/<huggingface-model>
```
```python Python
from interpreter import interpreter
interpreter.llm.model = "huggingface/<huggingface-model>"
interpreter.chat()
```
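Replace `<huggingface-model>` with the full repository ID of the model you want to run. As a quick sketch (the repo ID below is only an illustration, substitute any text-based model you have access to):
```python Python
from interpreter import interpreter

# Illustrative repo ID -- any accessible text-based Hugging Face model works here
interpreter.llm.model = "huggingface/meta-llama/Meta-Llama-3-8B-Instruct"
interpreter.chat()
```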
You may also need to specify your Hugging Face API base URL:
```bash Terminal
interpreter --api_base "https://my-endpoint.huggingface.cloud"
```
```python Python
from interpreter import interpreter
interpreter.llm.api_base = "https://my-endpoint.huggingface.cloud"
interpreter.chat()
```
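Putting the two settings together, a minimal Python session against a dedicated endpoint might look like this (the endpoint URL and model ID are placeholders carried over from the examples above):
```python Python
from interpreter import interpreter

# Placeholder values -- use your own endpoint URL and model ID
interpreter.llm.model = "huggingface/<huggingface-model>"
interpreter.llm.api_base = "https://my-endpoint.huggingface.cloud"
interpreter.chat()
```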
# Supported Models
Open Interpreter should work with almost any text-based Hugging Face model.
# Required Environment Variables
Set the following environment variables [(click here to learn how)](https://chat.openai.com/share/1062cdd8-62a1-4aa8-8ec9-eca45645971a) to use these models.
| Environment Variable | Description | Where to Find |
| ---------------------- | --------------------------- | ---------------------------------------------------------------------------------- |
| `HUGGINGFACE_API_KEY` | Hugging Face account API key | [Hugging Face -> Settings -> Access Tokens](https://huggingface.co/settings/tokens) |
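As a quick sketch, the token only needs to be present in the process environment before Open Interpreter sends its first request, so you can also set it from Python rather than your shell profile (the token value below is a placeholder):
```python Python
import os

# Token from Hugging Face -> Settings -> Access Tokens (placeholder shown)
os.environ["HUGGINGFACE_API_KEY"] = "hf_..."

from interpreter import interpreter

interpreter.llm.model = "huggingface/<huggingface-model>"
interpreter.chat()
```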