---
title: Chat with LLMs
emoji: 🤖💬
colorFrom: purple
colorTo: blue
sdk: gradio
sdk_version: 4.26.0
app_file: app.py
pinned: true
short_description: Chat with LLMs
---
## Running Locally
- Check pre-conditions:
  - Git Large File Storage (LFS) must be installed.
  - Run `python --version` to make sure you're running Python version 3.10 or above.
  - The latest PyTorch must be installed. Here is a sample `conda` command for Linux/WSL2:
    ```bash
    conda install -y pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
    ```
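The pre-conditions above can be verified with a short script before cloning. This is a sketch, not part of the repo; the `check_preconditions` helper name is made up here, and the checks mirror only the requirements listed above (Python 3.10+, Git with LFS on the PATH).

```python
import shutil
import sys

def check_preconditions() -> list[str]:
    """Return a list of unmet pre-conditions for running the Space locally."""
    problems = []
    # Python 3.10 or above is required per the pre-conditions above
    if sys.version_info < (3, 10):
        problems.append("Python 3.10 or above is required")
    # git and git-lfs must be on the PATH for the clone step below
    for tool in ("git", "git-lfs"):
        if shutil.which(tool) is None:
            problems.append(f"{tool} is not installed")
    return problems

if __name__ == "__main__":
    for problem in check_preconditions():
        print("missing pre-condition:", problem)
```

An empty list means the clone and install steps below should work as written.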
- Clone the repo:
  ```bash
  git lfs install
  git clone https://huggingface.co/spaces/inflaton-ai/llm-qa-bench
  ```
- Install packages:
  ```bash
  pip install -r requirements.txt
  ```
- Set up your environment variables:
  - By default, environment variables are loaded from the `.env.example` file.
  - If you don't want to use the default settings, copy `.env.example` to `.env`. You can then update it for your local runs.
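The fallback described above can be sketched as follows. This is a stdlib-only illustration, not the app's actual loading code (which may well use a library such as python-dotenv); only the `.env` / `.env.example` file names come from the steps above, and `load_env` is a hypothetical helper.

```python
import os
from pathlib import Path

def load_env(path: str) -> None:
    """Load KEY=VALUE lines from a dotenv-style file into os.environ."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blank lines, comments, and malformed lines
        key, _, value = line.partition("=")
        # setdefault: variables already set in the shell take precedence
        os.environ.setdefault(key.strip(), value.strip())

# Prefer a local .env, falling back to the checked-in defaults
env_file = ".env" if Path(".env").exists() else ".env.example"
if Path(env_file).exists():
    load_env(env_file)
```

Because `setdefault` is used, anything exported in your shell overrides values from the file.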
- Run automated test:
  ```bash
  python qa_chain_test.py
  ```
- Start the local server at http://localhost:7860:
  ```bash
  python app.py
  ```