Config.json missing model_max_length
#3 opened by MarsupialAI
Ran into an error trying to quantize this with LCPP. Adding
"model_max_length": 131072
to config.json resolved that error. There are other errors that are likely on the LCPP side of things, but you probably want to fix your config.
I'm assuming the max length I entered is correct?
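For reference, the fix above amounts to adding one key to the model's config.json. A minimal sketch of the relevant fragment (the surrounding keys are illustrative, not copied from this repo's actual config):

```json
{
  "model_type": "example",
  "model_max_length": 131072
}
```

The value 131072 (128K tokens) is the context length assumed in the report above; verify it against the model card before applying.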
ahmetustun changed discussion status to closed
ahmetustun changed discussion status to open
Thanks for noticing; I have updated the config.
ahmetustun changed discussion status to closed