---
title: Mistral AI API
---
To use Open Interpreter with a model from the Mistral API, set the `model` flag to the model's name prefixed with `mistral/`:
```bash Terminal
interpreter --model mistral/<model-name>
```
```python Python
from interpreter import interpreter

interpreter.llm.model = "mistral/<model-name>"
interpreter.chat()
```
# Supported Models
We support the following completion models from the Mistral API:
- mistral-tiny
- mistral-small
- mistral-medium
```bash Terminal
interpreter --model mistral/mistral-tiny
interpreter --model mistral/mistral-small
interpreter --model mistral/mistral-medium
```
```python Python
interpreter.llm.model = "mistral/mistral-tiny"
interpreter.llm.model = "mistral/mistral-small"
interpreter.llm.model = "mistral/mistral-medium"
```
# Required Environment Variables
To use these models, set the following environment variable [(click here to learn how)](https://chat.openai.com/share/1062cdd8-62a1-4aa8-8ec9-eca45645971a).
| Environment Variable | Description | Where to Find |
| -------------------- | -------------------------------------------- | -------------------------------------------------- |
| `MISTRAL_API_KEY` | Your Mistral API key | [Mistral API Console](https://console.mistral.ai/user/api-keys/) |
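
If you prefer not to export the variable in your shell, you can also set it from Python before importing Open Interpreter. A minimal sketch — the key value below is a placeholder, not a real key:
```python Python
import os

# Placeholder value; substitute your real key from the Mistral API Console.
os.environ["MISTRAL_API_KEY"] = "your-api-key-here"

# Libraries that need the key read it from the environment:
print(os.getenv("MISTRAL_API_KEY"))
```
Note that this must run before the first request is made, since the key is read from the environment at call time.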