Is the context length of this model the same as its base Code Llama model, which is 4k tokens?
100k tokens for both models.
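For reference, a checkpoint's trained context window is usually recorded in its `config.json` under `max_position_embeddings` (long advertised limits like 100k typically come from RoPE extrapolation beyond that trained length). Below is a minimal sketch of reading that field; the JSON values here are illustrative, not taken from this model's actual config:

```python
import json

# Hypothetical excerpt of a model's config.json -- the real file ships
# alongside the checkpoint on the Hub. Llama-family configs expose the
# trained context window as "max_position_embeddings".
config_json = '{"max_position_embeddings": 16384, "rope_theta": 1000000.0}'

config = json.loads(config_json)
context_len = config["max_position_embeddings"]
print(context_len)  # trained context window, in tokens
```

To check the real value, open the model's `config.json` on its Hub page and look for the same key.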