---
title: BeyondChatGPT Demo
emoji: 🚀
colorFrom: pink
colorTo: yellow
sdk: docker
pinned: false
---
# :wave: Welcome to Beyond ChatGPT!!
## Agenda

- Build 🏗️
  - Build and containerize your App
- Ship 🚢
  - Deploy your App on Hugging Face
- Share 🚀
  - Submit the link to your App for this assignment!
  - Submit the link to a Loom video walkthrough (<5 min) of your Interactive Development Environment (IDE) for LLM Ops and your first LLM application
  - Share 3 lessons learned
  - Share 3 lessons not yet learned
  - Make a social media post about your final application and tag @AIMakerspace
Here's a template to get your post started!
🚀🎉 Exciting News! 🎉🚀

🏗️ Today, I'm thrilled to announce that I've successfully built and shipped my first-ever LLM application using the powerful combination of Chainlit, Docker, and the OpenAI API! 🖥️
Here are my 3 main takeaways!
- Takeaway 1
- Takeaway 2
- Takeaway 3
Check it out 👇
[LINK TO APP]
A big shoutout to @**AI Makerspace** for making all of this possible. Couldn't have done it without the incredible community there. 🤖💪

Looking forward to building with the community! 🚀✨ Here's to many more creations ahead! 🔥🎉

Who else is diving into the world of AI? Let's connect! 🌐💡
#FirstLLM #Chainlit #Docker #OpenAI #AIMakerspace
## 🤖 Your First LLM App
If you need an introduction to `git`, or information on how to set up API keys for the tools we'll be using in this repository, check out our Interactive Dev Environment for LLM Development, which has everything you need to get started!

In this repository, we'll walk you through the steps to create a Large Language Model (LLM) application using Chainlit, then containerize it using Docker, and finally deploy it on Hugging Face Spaces.
Are you ready? Let's get started!
## 🖥️ Accessing `gpt-3.5-turbo` (ChatGPT) like a developer
Head to this notebook and follow along with the instructions!
Complete the notebook and try out your own system/assistant messages!
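To give you a feel for what "like a developer" means, here's a minimal sketch of the pattern the notebook walks through, using the `openai` Python client (v1+). The `build_messages` helper and the example prompts are our own illustrations, not part of the notebook:

```python
import os


def build_messages(system_prompt: str, user_prompt: str) -> list:
    """Assemble the chat message list the API expects: a system message
    that sets the assistant's behavior, followed by the user's message."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


messages = build_messages(
    "You are a concise assistant that answers in one sentence.",
    "What is Chainlit?",
)

# With `pip install openai` and a valid OPENAI_API_KEY set,
# the live call looks like this:
#
#   from openai import OpenAI
#   client = OpenAI()  # reads OPENAI_API_KEY from the environment
#   response = client.chat.completions.create(
#       model="gpt-3.5-turbo",
#       messages=messages,
#   )
#   print(response.choices[0].message.content)
```

Swapping out the system message is how you experiment with different assistant personas without changing any other code.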
That's it! Head to the next step and start building your application!
## 🏗️ Building Your First LLM App
1. Clone this repo:

   ```bash
   git clone https://github.com/AI-Maker-Space/Beyond-ChatGPT.git
   ```

2. Navigate inside this repo:

   ```bash
   cd Beyond-ChatGPT
   ```

3. Install the packages required for this Python environment from `requirements.txt`:

   ```bash
   pip install -r requirements.txt
   ```

4. Open your `.env.sample` file. Replace the `###` with your OpenAI key and save the file, then rename `.env.sample` to `.env`:

   ```
   OPENAI_API_KEY=sk-###
   ```

5. Let's try deploying it locally. Make sure you're in the Python environment where you installed Chainlit and OpenAI, then run the app using Chainlit (this may take a minute to run):

   ```bash
   chainlit run app.py -w
   ```
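Under the hood, the app needs the `OPENAI_API_KEY` from that `.env` file at runtime; Chainlit projects typically load it with the `python-dotenv` package. Purely to demystify what that loading amounts to, here's a hypothetical stdlib-only sketch (the `load_env` helper is our own name, not a library API):

```python
import os


def load_env(path: str = ".env") -> dict:
    """Parse simple KEY=VALUE lines from a .env-style file and export
    them into os.environ, skipping blank lines and # comments."""
    loaded = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip()
    os.environ.update(loaded)
    return loaded
```

In practice you'd just call `load_dotenv()` from `python-dotenv`; the point is only that the key ends up in the process environment, where the OpenAI client finds it.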
Great work! Let's see if we can interact with our chatbot.
Awesome! Time to throw it into a Docker container and prepare it for shipping!
## 🐳 Containerizing our App
Let's build the Docker image. We'll tag our image as `llm-app` using the `-t` parameter. The `.` at the end means we want all of the files in our current directory to be added to our image.

```bash
docker build -t llm-app .
```
You'll see a number of steps during the build process; each of those steps corresponds to an instruction outlined in our `Dockerfile`. If you'd like to learn more, check out Docker's reference documentation for `docker build`.
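For orientation, a Dockerfile for a Chainlit app often looks something like the sketch below. This is a hedged illustration of the usual shape, not necessarily the exact file in this repo (base image, flags, and ordering may differ):

```dockerfile
# Assumed base image; the repo's Dockerfile may pin a different Python version
FROM python:3.11-slim

WORKDIR /app

# Copy and install dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application code
COPY . .

# Hugging Face Spaces expects the app to listen on port 7860
EXPOSE 7860
CMD ["chainlit", "run", "app.py", "--host", "0.0.0.0", "--port", "7860"]
```

Installing `requirements.txt` before copying the rest of the code is a common layer-caching trick: dependency layers only rebuild when `requirements.txt` changes.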
Run and test the Docker image locally using the `run` command. The `-p` parameter maps the host port (the number to the left of the `:`) to the container port (the number on the right).

```bash
docker run -p 7860:7860 llm-app
```
Visit http://localhost:7860 in your browser to see if the app runs correctly.
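If you prefer checking from a script rather than the browser, here's a small illustrative helper (our own naming, not part of this repo) that reports whether anything is answering on that port:

```python
import urllib.error
import urllib.request


def check_app(url: str, timeout: float = 5.0):
    """Return the HTTP status code if the URL responds, else None."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except (urllib.error.URLError, OSError):
        return None


if __name__ == "__main__":
    status = check_app("http://localhost:7860")
    print("app is up" if status == 200 else "app is not responding")
```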
Great! Time to ship!
## 🚀 Deploying Your First LLM App
- Let's create a new Hugging Face Space. Navigate to Hugging Face and click on your profile picture on the top right, then click on `New Space`.
- Set up your Space as shown below:
  - Owner: Your username
  - Space Name: `llm-app`
  - License: `Openrail`
  - Select the Space SDK: `Docker`
  - Docker Template: `Blank`
  - Space Hardware: `CPU basic - 2 vCPU - 16 GB - Free`
  - Repo type: `Public`
- You should see something like this. We're now ready to send our files to our Hugging Face Space. After cloning the Space repo, move your files (including your existing Dockerfile) into it and push them. You DO NOT need to create a new Dockerfile. Make sure NOT TO push your `.env` file; it should automatically be ignored.
- After pushing all files, navigate to the settings in the top right to add your OpenAI API key.
- Scroll down to `Variables and secrets` and click on `New secret` on the top right.
- Set the name to `OPENAI_API_KEY` and add your OpenAI key under `Value`. Click save.
- To ensure your key is being used, we recommend you `Restart this Space`.
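A Space secret is exposed to your running container as an environment variable, which is why the app never needs the `.env` file in production. As a sketch, app code typically reads it like this (the `get_api_key` helper is illustrative, not from this repo):

```python
import os


def get_api_key() -> str:
    """Read the OpenAI key that Hugging Face injects as an environment
    variable, failing fast with a clear message if it's missing."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set. Add it under "
            "Settings -> Variables and secrets in your Space."
        )
    return key
```

Failing fast like this turns a confusing mid-request API error into an obvious startup message in the Space logs.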
- Congratulations! You just deployed your first LLM application! 🎉🎉🎉 Get on LinkedIn and post your results and experience using the provided template! Make sure to tag us at #AIMakerspace!