---
title: AWS Sagemaker
---
To use Open Interpreter with a model from AWS Sagemaker, set the `model` flag:

<CodeGroup>

```bash Terminal
interpreter --model sagemaker/<model-name>
```

```python Python
# Sagemaker requires boto3 to be installed on your machine:
!pip install boto3

from interpreter import interpreter

interpreter.llm.model = "sagemaker/<model-name>"
interpreter.chat()
```

</CodeGroup>
# Supported Models

We support the following completion models from AWS Sagemaker:

- Meta Llama 2 7B
- Meta Llama 2 7B (Chat/Fine-tuned)
- Meta Llama 2 13B
- Meta Llama 2 13B (Chat/Fine-tuned)
- Meta Llama 2 70B
- Meta Llama 2 70B (Chat/Fine-tuned)
- Your Custom Huggingface Model
<CodeGroup>

```bash Terminal
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-7b
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-7b-f
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-13b
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-13b-f
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-70b
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-70b-f
interpreter --model sagemaker/<your-huggingface-deployment-name>
```
```python Python
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-7b"
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-7b-f"
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-13b"
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-13b-f"
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-70b"
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-70b-f"
interpreter.llm.model = "sagemaker/<your-huggingface-deployment-name>"
```

</CodeGroup>
# Required Environment Variables

Set the following environment variables [(click here to learn how)](https://chat.openai.com/share/1062cdd8-62a1-4aa8-8ec9-eca45645971a) to use these models.

| Environment Variable    | Description                                     | Where to Find                                                                       |
| ----------------------- | ----------------------------------------------- | ----------------------------------------------------------------------------------- |
| `AWS_ACCESS_KEY_ID`     | The API access key for your AWS account.        | [AWS Account Overview -> Security Credentials](https://console.aws.amazon.com/)     |
| `AWS_SECRET_ACCESS_KEY` | The API secret access key for your AWS account. | [AWS Account Overview -> Security Credentials](https://console.aws.amazon.com/)     |
| `AWS_REGION_NAME`       | The AWS region you want to use.                 | [AWS Account Overview -> Navigation bar -> Region](https://console.aws.amazon.com/) |
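If you are using the Python interface, these variables can also be set programmatically before starting a chat. A minimal sketch (the key values below are placeholders, not real credentials, and the region is only an example):

```python
import os

# Placeholder values -- replace with your own AWS credentials and region.
os.environ["AWS_ACCESS_KEY_ID"] = "<your-access-key-id>"
os.environ["AWS_SECRET_ACCESS_KEY"] = "<your-secret-access-key>"
os.environ["AWS_REGION_NAME"] = "us-east-1"
```

Set these before `interpreter.chat()` is called so the Sagemaker client can pick them up.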