@elsatch Thank you for tagging me in this conversation! I think that while the approach with LoRA adapters would, in my opinion, rely on a different technique, the results could indeed be favorable. Here are some recent papers that point to the high effectiveness of LoRA as a specific PEFT method across a variety of examples and application domains:
Happy to discuss! Thanks again!
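
P.S. For anyone following along who hasn't worked with LoRA adapters before, here is a minimal, illustrative sketch of how they are typically attached with the Hugging Face peft library. The base model name, rank, and target modules below are assumptions for illustration, not details from this thread:

```python
# Minimal sketch: attaching LoRA adapters to a base model with the `peft` library.
# The model name, rank, and target_modules are illustrative assumptions only.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")

lora_config = LoraConfig(
    r=16,                                  # rank of the low-rank update matrices
    lora_alpha=32,                         # scaling factor applied to the LoRA update
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections are a common choice
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```

The key point is that only the small adapter matrices are trained, so the base model weights stay frozen and the resulting adapters can be kept separate (unmerged) or merged back in later.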