Using openbmb/MiniCPM-2B-sft-bf16-llama-format fails

#3
by zjh6177 - opened

When I use the code shown in "https://github.com/OpenBMB/MiniCPM/tree/main?tab=readme-ov-file#minicpm-2b-llama-format", I get the error message below:
Some weights of LlamaForCausalLM were not initialized from the model checkpoint at openbmb/MiniCPM-2B-dpo-bf16-llama-format and are newly initialized: ['lm_head.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

Does anyone have the same error? How can it be fixed? Thanks.

It can be fixed with: `model.lm_head.weight = model.model.embed_tokens.weight`
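To illustrate why that one-liner works: the llama-format checkpoint omits `lm_head.weight` because MiniCPM ties the output head to the input embeddings, so after loading you can re-tie them manually. Here is a minimal sketch of the tying step, using a tiny randomly initialized `LlamaConfig` instead of downloading the real 2B checkpoint (the config values here are placeholders; in practice you would call `from_pretrained("openbmb/MiniCPM-2B-dpo-bf16-llama-format")` and then apply the same assignment):

```python
from transformers import LlamaConfig, LlamaForCausalLM

# Tiny stand-in config so the example runs without downloading the
# 2B checkpoint; the sizes are arbitrary placeholders.
config = LlamaConfig(
    vocab_size=128,
    hidden_size=32,
    intermediate_size=64,
    num_hidden_layers=1,
    num_attention_heads=4,
    tie_word_embeddings=False,  # mirrors the untied llama-format layout
)
model = LlamaForCausalLM(config)

# The fix from this thread: point lm_head at the embedding matrix so
# both layers share one tensor, as the original MiniCPM weights intend.
model.lm_head.weight = model.model.embed_tokens.weight

# Both parameters now reference the same underlying storage.
print(model.lm_head.weight.data_ptr() == model.model.embed_tokens.weight.data_ptr())
```

After this assignment the "newly initialized" `lm_head.weight` warning no longer matters, since the head uses the checkpoint's embedding weights instead of random values.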
