---
title: Mistral AI API
---

To use Open Interpreter with the Mistral API, set the `model` flag:

<CodeGroup>

```bash Terminal
interpreter --model mistral/<mistral-model>
```

```python Python
from interpreter import interpreter

interpreter.llm.model = "mistral/<mistral-model>"
interpreter.chat()
```

</CodeGroup>

# Supported Models

We support the following completion models from the Mistral API:

- mistral-tiny
- mistral-small
- mistral-medium

<CodeGroup>

```bash Terminal
interpreter --model mistral/mistral-tiny
interpreter --model mistral/mistral-small
interpreter --model mistral/mistral-medium
```

```python Python
interpreter.llm.model = "mistral/mistral-tiny"
interpreter.llm.model = "mistral/mistral-small"
interpreter.llm.model = "mistral/mistral-medium"
```

</CodeGroup>

# Required Environment Variables

Set the following environment variable [(click here to learn how)](https://chat.openai.com/share/1062cdd8-62a1-4aa8-8ec9-eca45645971a) to use these models.

| Environment Variable | Description                                  | Where to Find                                      |
| -------------------- | -------------------------------------------- | -------------------------------------------------- |
| `MISTRAL_API_KEY`    | The Mistral API key from Mistral API Console | [Mistral API Console](https://console.mistral.ai/user/api-keys/) |
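If you prefer not to export the key in your shell, a minimal sketch below sets it from Python before starting a chat. The placeholder key string is an assumption; replace it with your own key from the Mistral API Console.

```python Python
import os
from interpreter import interpreter

# Placeholder value; substitute your real key from the Mistral API Console.
os.environ["MISTRAL_API_KEY"] = "your-mistral-api-key"

interpreter.llm.model = "mistral/mistral-tiny"
interpreter.chat()
```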