QueryYourDocs / Dockerfile
ARG PYTHON_VERSION=3.10.12
FROM python:${PYTHON_VERSION}-slim AS base
WORKDIR /app
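# Install Python dependencies first so this layer is cached when only the
# source code changes.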
COPY requirements.txt .
RUN python -m pip install --no-cache-dir -r requirements.txt
# Install ollama (currently disabled):
# RUN curl -fsSL https://ollama.com/install.sh | sh
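# Note: python:*-slim images do not ship curl, so re-enabling the line above
# would also require installing it first, e.g.
#   RUN apt-get update && apt-get install -y curl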
# Copy the source code into the container.
COPY . .
# Expose the port that the application listens on.
EXPOSE 7860
# Run the application.
CMD ["streamlit", "run", "app_inference.py", "--server.port", "7860"]
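# A minimal build-and-run sketch (the image tag "query-your-docs" is
# illustrative, not part of the project):
#   docker build -t query-your-docs .
#   docker run -p 7860:7860 query-your-docs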