Context Length?
#1 opened by kst23
Hey, just wanted to know what the context length of these models is. Is it still 4k, or does it now support 32k with function calls?
Thanks.
It's 32k according to config.json.
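For reference, the model's configured context window is typically exposed as the `max_position_embeddings` field in `config.json`. A minimal sketch of reading it, using a toy inline config rather than the actual file (the values below are illustrative, not taken from this model's repo):

```python
import json

# Toy stand-in for a model's config.json; a real one would be
# downloaded from the model repo (e.g. via huggingface_hub).
raw = '{"model_type": "llama", "max_position_embeddings": 32768}'

config = json.loads(raw)

# The configured context window (positional-embedding limit).
print(config["max_position_embeddings"])  # 32768
```

Note that this is only the architectural limit; the effective usable context depends on the sequence length the model was actually trained with, which is the distinction discussed below.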
Yeah, but with the last model I tried (I think it was medium v2.2), after a certain context size the model kept forgetting that it had function access... I'd guess that was only around 3-4k of context.
Hi, the context length we used to train the v2.4 models is 8k. You can refer here for more details about all our models.
That's cool... thanks for sharing!
kst23 changed discussion status to closed