Fix https://github.com/huggingface/peft/issues/1974#issue-2437471248
The issue is also reported at https://github.com/THUDM/GLM-4/issues/493. `THUDM/glm-4-9b-chat` uses custom modeling code that is not compatible with PromptEncoder. As the traceback shows, this line fails:
> batch_size, seq_length = input_ids.shape
> AttributeError: 'NoneType' object has no attribute 'shape'
This is because the prompt encoder does not pass `input_ids`; it passes `inputs_embeds` directly. The issue can be patched by falling back to `inputs_embeds` when `input_ids` is absent, editing this line:
https://huggingface.co/THUDM/glm-4-9b-chat/blob/c24133cef34ff7a7010f1e97c113effdead0966b/modeling_chatglm.py#L875
```python
if input_ids is not None:
    batch_size, seq_length = input_ids.shape
else:
    batch_size, seq_length, _ = inputs_embeds.shape
```
modeling_chatglm.py (+4 −1)

```diff
@@ -872,7 +872,10 @@ class ChatGLMModel(ChatGLMPreTrainedModel):
         use_cache = use_cache if use_cache is not None else self.config.use_cache
         return_dict = return_dict if return_dict is not None else self.config.use_return_dict

-        batch_size, seq_length = input_ids.shape
+        if input_ids is not None:
+            batch_size, seq_length = input_ids.shape
+        else:
+            batch_size, seq_length, _ = inputs_embeds.shape

         if inputs_embeds is None:
             inputs_embeds = self.embedding(input_ids)
```
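Viewed in isolation, the added guard amounts to "derive `(batch_size, seq_length)` from whichever input is present": `input_ids` is `(batch, seq)` while `inputs_embeds` is `(batch, seq, hidden)`. A minimal sketch of that logic as a standalone helper (the `SimpleNamespace` objects below are hypothetical stand-ins for tensors, exposing only `.shape`, so the sketch runs without torch):

```python
from types import SimpleNamespace


def infer_batch_and_seq(input_ids=None, inputs_embeds=None):
    """Return (batch_size, seq_length) from input_ids or inputs_embeds.

    input_ids has shape (batch, seq); inputs_embeds has shape
    (batch, seq, hidden). Mirrors the guard added to modeling_chatglm.py.
    """
    if input_ids is not None:
        batch_size, seq_length = input_ids.shape
    elif inputs_embeds is not None:
        batch_size, seq_length, _ = inputs_embeds.shape
    else:
        raise ValueError("Either input_ids or inputs_embeds must be provided")
    return batch_size, seq_length


# Stand-in "tensors" that expose only .shape.
ids = SimpleNamespace(shape=(2, 16))          # (batch, seq)
embeds = SimpleNamespace(shape=(2, 16, 4096))  # (batch, seq, hidden)

print(infer_batch_and_seq(input_ids=ids))        # (2, 16)
print(infer_batch_and_seq(inputs_embeds=embeds))  # (2, 16)
```

With this shape inference in place, a PEFT PromptEncoder that supplies only `inputs_embeds` no longer trips the `'NoneType' object has no attribute 'shape'` error.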