Hardware requirements

#10 opened by ZahirHamroune

Please,
[screenshot: out_of_memory.png]
Can someone tell me what GPU hardware (VRAM) is needed to load Qwen-VL-2B? The model weights are only about 4 GB, yet I get an OUT OF MEMORY error even on a Colab GPU with 15 GB of VRAM. Does anyone have experience or suggestions to share?
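One thing to check: by default `from_pretrained` loads weights in float32, so a 2B-parameter model already takes roughly 8 GB before any activations or image tokens. Below is a minimal sketch (assuming the Hugging Face transformers Qwen2-VL integration, the `Qwen/Qwen2-VL-2B-Instruct` checkpoint, and `accelerate` installed for `device_map`) that loads the model in bfloat16 to keep memory close to the on-disk weight size:

```python
# Minimal sketch: load Qwen2-VL-2B in half precision to reduce VRAM usage.
# Assumes transformers >= 4.45 and the Qwen/Qwen2-VL-2B-Instruct checkpoint
# (adjust the model id to the checkpoint you are actually using).
import torch
from transformers import Qwen2VLForConditionalGeneration, AutoProcessor

model_id = "Qwen/Qwen2-VL-2B-Instruct"

# torch_dtype=torch.bfloat16 halves memory vs. the default float32 load;
# device_map="auto" places weights on the GPU (requires the accelerate package).
model = Qwen2VLForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
processor = AutoProcessor.from_pretrained(model_id)

print(f"GPU memory after load: {torch.cuda.memory_allocated() / 1e9:.2f} GB")
```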

Same issue here. I can't seem to figure it out.

Do you get the error when loading the model or when running inference?
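A quick way to narrow that down is to print the allocator stats after each stage. With Qwen2-VL, inference-time usage also depends heavily on the input image resolution, since larger images produce more visual tokens. A rough sketch (assuming the model and processor from the snippet above are already on a CUDA device):

```python
# Sketch for locating the OOM: report GPU memory after load and after one generation.
import torch

def report(stage: str) -> None:
    # memory_allocated() counts live tensors; memory_reserved() is the caching allocator's pool.
    print(f"{stage}: allocated {torch.cuda.memory_allocated() / 1e9:.2f} GB, "
          f"reserved {torch.cuda.memory_reserved() / 1e9:.2f} GB")

report("after model load")

# ... run a single processor(...) + model.generate(...) call here ...
report("after first generate")

# If memory only blows up at generate time, downscaling the input image (or capping the
# processor's max_pixels, if your processor version supports it) is worth trying.
```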
