PEFT inquiry

#13
by MightyStudent - opened

Hello, just wondering if the PEFT setup using QLoRA will change for this model?

So far I have been using the config below and it worked just fine; I am not sure whether anything needs to change.

```python
from peft import LoraConfig

config = LoraConfig(r=32, lora_alpha=256, target_modules=["q_proj", "v_proj"], lora_dropout=0.05, bias="none")
```

Also, can I attach a previously trained PEFT adapter to the turbo model?
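For context, this is roughly how I would try attaching the existing adapter — just a sketch, assuming a Whisper-style turbo checkpoint; the model ID and adapter path here are placeholders, not the actual names:

```python
from transformers import AutoModelForSpeechSeq2Seq
from peft import PeftModel

# Placeholder names: swap in the real turbo checkpoint and adapter path.
base = AutoModelForSpeechSeq2Seq.from_pretrained("openai/whisper-large-v3-turbo")
model = PeftModel.from_pretrained(base, "my-previous-lora-adapter")
```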
