How to call the model

#1
by nenu - opened

When I use the Qwen2-VL calling script to load this model, the following errors are reported:
You are using a model of type qwen2 to instantiate a model of type qwen2_vl. This is not supported for all configurations of models and can yield errors.
Qwen2VLRotaryEmbedding can now be fully parameterized by passing the model config through the config argument. All other arguments will be removed in v4.46
Some weights of Qwen2VLForConditionalGeneration were not initialized from the model checkpoint at /b4-ai-hl/share_model_zoo/Aquila-VL-2B-llava-qwen and are newly initialized: ['visual.blocks.0.attn.proj.bias', 'visual.blocks.0.attn.proj.weight', 'visual.blocks.0.attn.qkv.bias', 'visual.blocks.0.attn.qkv.weight', ..., 'visual.merger.mlp.2.bias', 'visual.merger.mlp.2.weight', 'visual.patch_embed.proj.weight']
(The full list covers every parameter of the vision tower: all attention, MLP, and norm weights of visual.blocks.0 through visual.blocks.31, plus the patch embedding and merger.)
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Traceback (most recent call last):
File "/home/xin.jiang3/MMLM/baai/baai_2b.py", line 89, in
text = processor.apply_chat_template(
File "/home/xin.jiang3/.local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1867, in apply_chat_template
rendered_chat = compiled_template.render(
File "/usr/local/lib/python3.10/dist-packages/jinja2/environment.py", line 1301, in render
self.environment.handle_exception()
File "/usr/local/lib/python3.10/dist-packages/jinja2/environment.py", line 936, in handle_exception
raise rewrite_traceback_stack(source=source)
File "

Beijing Academy of Artificial Intelligence org
edited 10 days ago

Hi @nenu

We haven’t uploaded the corresponding model to this repo yet. You can try Aquila-VL-2B instead, which performs nearly the same as the model intended for this repo.

The model architecture is the same as LLaVA-OneVision, not Qwen2-VL, and the method for using Aquila-VL-2B is described here.
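Since the architecture is LLaVA-OneVision rather than Qwen2-VL, the checkpoint needs to be loaded with the matching `transformers` classes instead of `Qwen2VLForConditionalGeneration`. A minimal sketch, assuming the repo id `BAAI/Aquila-VL-2B-llava-qwen` and that the checkpoint is compatible with `LlavaOnevisionForConditionalGeneration` (both the id and the compatibility are assumptions, not confirmed in this thread):

```python
def build_conversation(question: str) -> list:
    """Build a chat message in the structured format that
    processor.apply_chat_template expects for vision-language models."""
    return [
        {
            "role": "user",
            "content": [
                {"type": "image"},
                {"type": "text", "text": question},
            ],
        }
    ]


if __name__ == "__main__":
    # Heavy imports kept inside the entry point so the sketch stays runnable
    # without a GPU environment.
    import torch
    from PIL import Image
    from transformers import AutoProcessor, LlavaOnevisionForConditionalGeneration

    model_id = "BAAI/Aquila-VL-2B-llava-qwen"  # assumed repo id

    # Use the LLaVA-OneVision classes, not the Qwen2-VL ones, to avoid the
    # "newly initialized weights" warning from the traceback above.
    processor = AutoProcessor.from_pretrained(model_id)
    model = LlavaOnevisionForConditionalGeneration.from_pretrained(
        model_id, torch_dtype=torch.float16, device_map="auto"
    )

    image = Image.open("example.jpg")  # placeholder input image
    conversation = build_conversation("Describe this image.")
    prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)
    inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)

    output_ids = model.generate(**inputs, max_new_tokens=128)
    print(processor.batch_decode(output_ids, skip_special_tokens=True)[0])
```

The key point is only the class choice: `AutoProcessor` plus `LlavaOnevisionForConditionalGeneration` matches the checkpoint's architecture, whereas the Qwen2-VL loader re-initializes the whole vision tower, producing the warning and the broken chat template seen above.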

Best
