Context length

#1
by SporkySporkness - opened

I love Gemma 2 27b, but the short context is an issue for me. What is the context length of this fine-tune? Thanks!

Anthracite org

Gemma's max context is 8k, and we trained at 8k. You could try RoPE scaling the model to 16k, as I've heard some others have done.
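
For anyone who wants to try that, here's a minimal sketch of linear RoPE scaling with transformers. The repo id is a placeholder (this thread doesn't name one), and it assumes your transformers version's Gemma 2 code actually reads the `rope_scaling` config field; treat it as a starting point, not a tested recipe.

```python
# Minimal sketch: linear RoPE scaling from the trained 8k window to ~16k.
# Assumptions (not confirmed by this thread): the repo id below is a
# placeholder, and your transformers version's Gemma 2 implementation
# honors `rope_scaling`; if it doesn't, these settings are silently ignored.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "anthracite-org/<this-finetune>"  # placeholder; use the real repo id

config = AutoConfig.from_pretrained(model_id)
config.rope_scaling = {"type": "linear", "factor": 2.0}  # 8k trained length x2 ~= 16k
config.max_position_embeddings = 16384

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,
    torch_dtype="auto",
    device_map="auto",  # requires the `accelerate` package
)

# Quality usually degrades as you push past the trained 8k window,
# so sanity-check long-context outputs before relying on them.
```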

lucyknada changed discussion status to closed
