This open-source model was created by the Qwen Team at Alibaba Cloud. You can find the release blog post here. The model is available on the Hugging Face Hub: https://huggingface.co/Qwen/Qwen2.5-32B-Instruct. The 32B model was pretrained on 18 trillion tokens spanning 29 languages. It supports a context length of up to 128K tokens and can generate up to 8K tokens.
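
Since the checkpoint is hosted on the Hugging Face Hub, it can be loaded with the standard `transformers` interface. Below is a minimal sketch, assuming you have `transformers` installed and enough GPU memory for a 32B model; the exact dtype, device mapping, and prompt are illustrative choices, not requirements of the model.

```python
# Minimal sketch: loading Qwen/Qwen2.5-32B-Instruct from the Hugging Face Hub
# and generating a chat reply with the transformers library.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-32B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # shard layers across available devices
)

# Build a chat prompt with the model's chat template and generate a reply.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize Qwen2.5 in one sentence."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The `device_map="auto"` setting lets `accelerate` place the weights across whatever GPUs are available, which is usually necessary for a model of this size on a single machine.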