4096 context length - is that correct?
#2 · opened by smcleod
Just checking: is that perhaps a typo for 40960 or 409600?
Hey @smcleod, an extended context length version is forthcoming. Additionally, note that our models were trained with RoPE (rotary position embeddings), so they can accept inputs beyond 4K tokens even in their current state.
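As a rough sketch of why RoPE imposes no hard cap at the trained length: the rotation angles are computed on the fly from the position index, rather than looked up in a fixed-size learned table. The dimensions and base below are illustrative defaults, not necessarily this model's actual configuration.

```python
import numpy as np

def rope_angles(positions, dim, base=10000.0):
    """Rotation angles for RoPE at the given positions.

    Because the angles are a closed-form function of the position index,
    they are defined for any position, including ones past the training
    context window (extrapolation quality is a separate question).
    """
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    return np.outer(positions, inv_freq)  # shape: (n_positions, dim // 2)

# Angles for an 8K sequence are just as well defined as for a 4K one,
# and the first 4096 rows are identical in both cases.
short = rope_angles(np.arange(4096), dim=128)
long = rope_angles(np.arange(8192), dim=128)
print(short.shape, long.shape)
```

This is in contrast to learned absolute position embeddings, where positions past the table size simply have no embedding and the sequence length is a hard limit.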