Does the model support more than 64k tokens out of the box?
#1 by nullt3r - opened
Hi,
first of all, thank you for your work!
I have a question regarding the model's context length. Could you clarify if the model supports a context length greater than 64k tokens out of the box?
Practically speaking, can I set the context size to 80k or more? I suppose it should be supported, since the config says:
"max_position_embeddings": 131072
Just want to be sure.
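For reference, this is roughly how I plan to check the config and run a long prompt with transformers (a minimal sketch; the model ID and prompt are just placeholders):

```python
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

# Placeholder repo ID -- substitute the actual model name.
model_id = "org/model-name"

# Check the advertised maximum context length from config.json.
config = AutoConfig.from_pretrained(model_id)
print(config.max_position_embeddings)  # expecting 131072

# Load the model; if positions up to max_position_embeddings are supported
# out of the box, an ~80k-token prompt should fit without overriding anything.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "..."  # placeholder for a long (~80k-token) input
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```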
Thanks!