How to export ONNX from a finetuned llava-interleave model

#8
by jasonwang110 - opened

Thank you for sharing; it has been very helpful to me. I have a question: after fine-tuning the model, how can we convert it to ONNX? Could you provide a tutorial or script to guide us through that?

Llava Hugging Face org

The conversion part is a bit tricky and has to be tailored for each llava model. Currently we don't have a clean, easy-to-run conversion script; maybe @Xenova can share his code. Note, however, that you may still need to make changes to the code for the conversion to succeed.
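
In the meantime, here is a rough idea of how such an export usually starts: split the model into its vision and language parts and export each piece separately with `torch.onnx.export`. This is an untested sketch; the checkpoint id, attribute names, image size, and opset are assumptions based on the public llava-interleave-qwen-0.5b-hf checkpoint, so adapt them to your finetuned model.

```python
# Untested sketch: export only the vision side (vision tower + projector) of a
# LLaVA-Interleave checkpoint to ONNX. The checkpoint id, image size and opset
# are assumptions -- replace them with your finetuned model's values.
import torch
from transformers import LlavaForConditionalGeneration

model_id = "llava-hf/llava-interleave-qwen-0.5b-hf"  # or your finetuned checkpoint
model = LlavaForConditionalGeneration.from_pretrained(model_id, torch_dtype=torch.float32)
model.eval()


class VisionEncoder(torch.nn.Module):
    """Maps pixel_values to the projected image features the language model consumes."""

    def __init__(self, model):
        super().__init__()
        self.vision_tower = model.vision_tower
        self.projector = model.multi_modal_projector
        self.feature_layer = model.config.vision_feature_layer
        self.select_strategy = model.config.vision_feature_select_strategy

    def forward(self, pixel_values):
        hidden_states = self.vision_tower(pixel_values, output_hidden_states=True).hidden_states
        features = hidden_states[self.feature_layer]
        if self.select_strategy == "default":  # drop the CLS token if the checkpoint expects it
            features = features[:, 1:]
        return self.projector(features)


vision = VisionEncoder(model)
# SigLIP in llava-interleave expects 384x384 inputs; check your image processor's size
dummy_pixels = torch.randn(1, 3, 384, 384)

torch.onnx.export(
    vision,
    dummy_pixels,
    "vision_encoder.onnx",
    input_names=["pixel_values"],
    output_names=["image_features"],
    dynamic_axes={"pixel_values": {0: "batch"}, "image_features": {0: "batch"}},
    opset_version=14,
)
```

The language model would be exported in a similar way, usually with past key values as extra inputs and outputs so generation stays fast; that is the part that tends to need per-model tailoring.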

Happy to receive your suggestions. Currently I plan to finetune llava-interleave-0.5b and llava-onevision and deploy them as quantized ONNX. If possible, please share your code for converting the model to an ONNX file, and then we could make modifications based on your great work. Right now I am not sure where to start, so I am looking forward to your good news @Xenova.
Thanks.
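
For the quantization step, a minimal sketch using onnxruntime's dynamic quantizer might look like the following; the file names are placeholders for whatever your conversion actually produces.

```python
# Untested sketch: dynamically quantize an exported ONNX file to int8 weights.
# "decoder_model_merged.onnx" is a placeholder name for one of the exported files.
from onnxruntime.quantization import QuantType, quantize_dynamic

quantize_dynamic(
    model_input="decoder_model_merged.onnx",
    model_output="decoder_model_merged_quantized.onnx",
    weight_type=QuantType.QInt8,  # QUInt8 is also common, depending on the target runtime
)
```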

Hello @Xenova,
Sorry to bother you. I'd like to ask whether there are any plans to share the code for converting the llava-interleave model to an ONNX file. If not, could you briefly describe the method or process for the conversion?
Thank you

Llava Hugging Face org

Hehe, I just remembered that @Xenova shared one of the conversion scripts with me earlier. Here you go; I haven't tested it, so I hope it works for you:

https://colab.research.google.com/drive/1IhC8YOV68cze0XWGfuqSclnVTt_FskUd?usp=sharing
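
Since the script is untested, it's worth sanity-checking each exported file against the PyTorch model before quantizing. A rough sketch for the vision encoder, where the file and tensor names are assumptions matching the export sketch above:

```python
# Untested sketch: compare the ONNX vision encoder's output with the PyTorch one.
# Assumes "vision_encoder.onnx" and the VisionEncoder wrapper `vision` from above.
import numpy as np
import onnxruntime as ort
import torch

pixel_values = torch.randn(1, 3, 384, 384)

session = ort.InferenceSession("vision_encoder.onnx", providers=["CPUExecutionProvider"])
onnx_out = session.run(None, {"pixel_values": pixel_values.numpy()})[0]

with torch.no_grad():
    torch_out = vision(pixel_values).numpy()

print("max abs diff:", np.abs(onnx_out - torch_out).max())  # should be ~1e-4 or smaller
```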

Hi @RaushanTurganbay,
Thank you for sharing. Based on your tip above, I have succeeded in converting the three parts of the LLaVA model into ONNX files.
Thanks.
