New state-of-the-art open LLM! Databricks just released DBRX, a 132B-parameter MoE trained on 12T tokens. It claims to surpass OpenAI GPT-3.5 and to be competitive with Google Gemini 1.0 Pro.
TL;DR
- 132B MoE with 16 experts, 4 of which are active per token (see the routing sketch after this list)
- 32k-token context window
- Outperforms open LLMs on common benchmarks, including MMLU
- Up to 2x faster inference than Llama 2 70B
- Trained on 12T tokens
- Uses the GPT-4 tokenizer
- Custom license, commercially usable
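
The "16 experts, 4 active" bullet is the classic top-k MoE pattern: a router scores all experts for each token, and only the 4 highest-scoring feed-forward experts actually run, so the compute per token is far below the full 132B parameter count. A minimal PyTorch sketch of that idea (layer sizes are illustrative, not DBRX's real dimensions):

```python
# Hypothetical top-4-of-16 expert routing sketch; dims are illustrative only.
import torch
import torch.nn.functional as F
from torch import nn

class TopKMoE(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=16, k=4):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)  # scores every expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                       # x: (tokens, d_model)
        scores = self.router(x)                 # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)
        weights = F.softmax(weights, dim=-1)    # normalize over the chosen 4
        out = torch.zeros_like(x)
        for slot in range(self.k):              # only 4 of 16 experts run per token
            for e in idx[:, slot].unique():
                mask = idx[:, slot] == e
                out[mask] += weights[mask, slot, None] * self.experts[e](x[mask])
        return out

x = torch.randn(10, 512)
print(TopKMoE()(x).shape)  # torch.Size([10, 512])
```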
Collection: databricks/dbrx-6601c0852a0cdd3c59f71962
Demo: databricks/dbrx-instruct
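
If you want to try it locally instead of in the demo, here is a minimal sketch using the transformers Auto classes. It assumes the databricks/dbrx-instruct checkpoint works through the standard AutoModelForCausalLM/AutoTokenizer interface (it ships custom modeling code, hence trust_remote_code) and that you have enough GPU memory for a 132B model (bfloat16 weights alone are roughly 264 GB, so expect multi-GPU sharding or quantization):

```python
# Minimal sketch: generating with the DBRX instruct checkpoint via transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "databricks/dbrx-instruct"  # instruct-tuned variant from the post

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",       # shard weights across available GPUs
    trust_remote_code=True,  # DBRX ships custom modeling code
)

# Chat-style prompt via the tokenizer's chat template.
messages = [{"role": "user", "content": "What is a mixture-of-experts model?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```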
Kudos to the team at Databricks and MosaicML for this strong release for the open community!