How much VRAM

#44
by Dizzl500 - opened

How much VRAM is needed to run it? Can I run it on a 3060 12GB?

Meta Llama org

Are you sure you want to run the Base model? You might want to use the Instruct model for chatting.

It needs about 28GB in bf16 precision (bf16 is a 16-bit floating-point format, not a quantization scheme).
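As a rough back-of-the-envelope check, the weights alone take roughly (parameter count) × (bytes per parameter); the helper below is a sketch of that estimate (the function name and the fixed 2 bytes/param for bf16 are assumptions, and real usage is higher once activations and the KV cache are included):

```python
def estimate_weight_vram_gb(num_params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate GB needed just to hold the model weights.

    bytes_per_param defaults to 2 (bf16/fp16); use 4 for fp32, 1 for int8.
    Does NOT include activations, KV cache, or framework overhead.
    """
    return num_params_billion * 1e9 * bytes_per_param / 1024**3

# An ~11B-parameter model in bf16:
print(round(estimate_weight_vram_gb(11), 1))  # ~20.5 GB for weights alone
```

The gap between this ~20.5 GB figure and the ~28 GB observed in practice is the runtime overhead (KV cache, activations, CUDA buffers), which is why a 12GB card falls well short.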

How much GPU memory is required to run the 11B model?
