Context-length #1
by ceoofcapybaras - opened
Hey, thanks for releasing this model! I'm curious why the title says 8K, and the description also says the model was trained to extend the context length from 2k to 8k, yet at the top of the model card it says "Context Length 2048", and config.json also has "seq_len": 2048.
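For reference, this is how I checked the config. A minimal sketch; the repo id `org/model-name` is a placeholder for this model, and `"seq_len"` is the field I saw in this repo's config.json (other architectures use `"max_position_embeddings"` or `"n_positions"` instead):

```python
import json
from huggingface_hub import hf_hub_download

# Download just the config file from the Hub (placeholder repo id).
config_path = hf_hub_download(repo_id="org/model-name", filename="config.json")
with open(config_path) as f:
    config = json.load(f)

# Prints 2048 here, even though the card title says 8K.
print(config.get("seq_len"))
```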
Thanks for catching this, I'll fix it.
Thanks!
ceoofcapybaras changed discussion status to closed