---
title: FLUX Prompt Generator
emoji: 😻
colorFrom: blue
colorTo: gray
sdk: gradio
sdk_version: 4.40.0
app_file: app.py
pinned: true
license: apache-2.0
---

Windows installation (watch my YouTube video here for more info: )

git lfs install
git clone https://huggingface.co/Aitrepreneur/FLUX-Prompt-Generator

Inside the FLUX-Prompt-Generator folder, create a new virtual environment:

python -m venv env
env\Scripts\activate

Install the requirements:

pip install -r requirements.txt

Then run the application:

python app.py

Inside the app.py file, on line 337: self.groq_client = Groq(api_key="YOUR-GROQ-API-KEY")

Replace YOUR-GROQ-API-KEY with your Groq API key if you want to use Groq for text generation.
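Hardcoding the key works, but a safer pattern is to read it from an environment variable so it never ends up committed to the repo. A minimal sketch — the GROQ_API_KEY variable name and the get_groq_api_key helper are illustrative, not part of app.py:

```python
import os

# Illustrative helper: read the Groq key from an environment variable
# instead of hardcoding it in app.py. GROQ_API_KEY is an assumed name.
def get_groq_api_key():
    key = os.environ.get("GROQ_API_KEY")
    if not key:
        raise RuntimeError("Set the GROQ_API_KEY environment variable first")
    return key

# Line 337 of app.py would then become something like:
# self.groq_client = Groq(api_key=get_groq_api_key())
```

On Windows you would set the variable with `set GROQ_API_KEY=...` in the same cmd window before running `python app.py`.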

IF YOU WANT TO USE EVERYTHING LOCALLY:

1) Download and install Ollama: https://ollama.com/
2) Once Ollama is running in the background, download the Llama 3 8B model by running this command in a new cmd window: ollama run llama3
3) Once the Llama 3 8B model is downloaded, go to the FLUX Prompt Generator web UI, check the "Use Ollama (local)" checkbox, and you are good to go.
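For reference, the "Use Ollama (local)" option works because Ollama exposes a local HTTP API (by default on port 11434). A minimal sketch of the kind of request a client would POST to /api/generate — the build_ollama_request helper is illustrative, not code from app.py:

```python
import json

# Illustrative sketch of the JSON payload Ollama's local HTTP API accepts
# at POST http://localhost:11434/api/generate (default port).
def build_ollama_request(prompt, model="llama3"):
    return json.dumps({
        "model": model,    # the model pulled earlier with `ollama run llama3`
        "prompt": prompt,
        "stream": False,   # ask for one complete response instead of chunks
    })
```

Nothing leaves your machine: the web UI sends prompts like this to localhost, and the Llama 3 8B model answers locally.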