Molmo Collection Artifacts for open multimodal language models. • 5 items • Updated 3 days ago • 276
Meta Llama 3 Collection This collection hosts the transformers and original repos of the Meta Llama 3 and Llama Guard 2 releases • 5 items • Updated Sep 25 • 683
Qwen1.5 Collection Qwen1.5 is the improved version of Qwen, the large language model series developed by Alibaba Cloud. • 55 items • Updated 3 days ago • 206
Canonical models Collection This collection lists all the historical (pre-"Hub") canonical model checkpoints, i.e. repos that were not under an org or user namespace • 68 items • Updated Feb 13 • 13
Switch-Transformers release Collection This release included various MoE (Mixture of Experts) models based on the T5 architecture. The base models use from 8 to 256 experts. • 9 items • Updated Jul 31 • 15
LLM in a flash: Efficient Large Language Model Inference with Limited Memory Paper • 2312.11514 • Published Dec 12, 2023 • 258
The Flan Collection: Designing Data and Methods for Effective Instruction Tuning Paper • 2301.13688 • Published Jan 31, 2023 • 8