---
title: Chat with LLMs
emoji: 🤖💬
colorFrom: purple
colorTo: blue
sdk: gradio
sdk_version: 4.26.0
app_file: app.py
pinned: true
short_description: Chat with LLMs
---
## Running Locally
1. Check pre-conditions:

   - Git Large File Storage (LFS) must be installed.
   - Run

     ```bash
     python --version
     ```

     to make sure you're running Python 3.10 or above.
   - The latest PyTorch with GPU support must be installed. Here is a sample `conda` command:

     ```bash
     conda install -y pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
     ```
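The Python version pre-condition above can also be checked programmatically. This is a small illustrative sketch using only the standard library, not part of the repo:

```python
import sys

# The README requires Python 3.10 or above; sys.version_info tuples
# compare element-wise, so (3, 10) is the documented minimum.
ok = sys.version_info >= (3, 10)
print("Python", ".".join(map(str, sys.version_info[:3])),
      "OK" if ok else "is too old; need 3.10+")
```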
2. Clone the repo:

   ```bash
   git lfs install
   git clone https://huggingface.co/spaces/inflaton-ai/llm-qa-bench
   ```
3. Install packages:

   ```bash
   pip install -r requirements.txt
   ```
4. Set up your environment variables:

   - By default, environment variables are loaded from the `.env.example` file.
   - If you don't want to use the default settings, copy `.env.example` to `.env`. You can then update it for your local runs.
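The fallback behavior described above can be sketched as follows. This is an illustrative stdlib-only parser for dotenv-style files, not the app's actual loading code (which likely uses a dotenv library):

```python
import os

def load_env_file(path: str) -> dict:
    """Parse simple KEY=VALUE lines from a dotenv-style file."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines, comments, and lines without an assignment.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

# Prefer .env if you created one; otherwise fall back to the checked-in defaults.
env_path = ".env" if os.path.exists(".env") else ".env.example"
```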
5. Run the automated test:

   ```bash
   python qa_chain_test.py
   ```
6. Start the local server at `http://localhost:7860`:

   ```bash
   python app.py
   ```