Multi-GPU Support?
#4
by
supernovastar
- opened
Does "Memory Requirements: 48GB+ VRAM" mean this VRAM size needs to be on a single GPU, or can it be the sum of multiple GPUs, where each GPU can have less than 48GB?
Actually, it's not a strict 48GB requirement. If you know how to modify the code, you can first generate the Qwen2VL image embeddings, then delete Qwen2VL to remove that model from GPU memory. Similarly, if you're using T5-XXL, you can delete it after generating its embeddings. That way, only the Flux model remains in GPU memory, and you don't actually need 48GB of VRAM at all.
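A minimal sketch of that encode-then-free pattern in PyTorch. The `load_*` functions and tensor sizes below are hypothetical placeholders standing in for the real Qwen2VL / T5-XXL / Flux checkpoints; the point is the ordering: encode, `del` the encoder, empty the CUDA cache, and only then have Flux resident.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the real models; in practice you would load
# Qwen2VL / T5-XXL / Flux from their checkpoints with their own APIs.
def load_qwen2vl():
    return nn.Linear(16, 64)   # placeholder image encoder

def load_t5_xxl():
    return nn.Linear(16, 64)   # placeholder text encoder

def load_flux():
    return nn.Linear(64, 8)    # placeholder generation model

device = "cuda" if torch.cuda.is_available() else "cpu"

# 1. Encode with Qwen2VL, then free it before loading Flux.
qwen2vl = load_qwen2vl().to(device)
with torch.no_grad():
    image_embeds = qwen2vl(torch.randn(1, 16, device=device))
del qwen2vl
if device == "cuda":
    torch.cuda.empty_cache()   # return the freed weights to the allocator

# 2. Same pattern for T5-XXL, if you use it.
t5 = load_t5_xxl().to(device)
with torch.no_grad():
    text_embeds = t5(torch.randn(1, 16, device=device))
del t5
if device == "cuda":
    torch.cuda.empty_cache()

# 3. Only Flux now occupies GPU memory; the embeddings themselves are tiny.
flux = load_flux().to(device)
with torch.no_grad():
    out = flux(image_embeds + text_embeds)
print(out.shape)
```

The peak VRAM use is then max(encoder, Flux) rather than their sum, which is what makes the 48GB figure avoidable.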
We could also split Qwen2VL and Flux across separate GPUs, and even tensor-parallelize Qwen2VL over two GPUs to make full use of a multi-GPU setup.
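For the simpler of those two ideas, placing each model on its own GPU, a hedged sketch (again with placeholder modules in place of the real checkpoints; real loaders like `from_pretrained` typically accept a `device_map` for this):

```python
import torch
import torch.nn as nn

# Hypothetical placeholders for Qwen2VL and Flux.
qwen2vl = nn.Linear(16, 64)
flux = nn.Linear(64, 8)

# Put the encoder on GPU 0 and Flux on GPU 1 when both exist;
# fall back gracefully otherwise so the sketch still runs.
n_gpus = torch.cuda.device_count()
enc_device = "cuda:0" if n_gpus >= 1 else "cpu"
gen_device = "cuda:1" if n_gpus >= 2 else enc_device

qwen2vl.to(enc_device)
flux.to(gen_device)

with torch.no_grad():
    embeds = qwen2vl(torch.randn(1, 16, device=enc_device))
    # Only the small embedding tensor crosses GPUs, not the weights.
    out = flux(embeds.to(gen_device))
print(out.shape)
```

Tensor-parallelizing Qwen2VL itself across two GPUs is more involved (sharding individual weight matrices), but this per-model placement already caps per-GPU memory at the larger single model.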