Mar 28 (edited) in 💬 General
Introducing DBRX: A New State-of-the-Art Open LLM
Introducing DBRX, an open, general-purpose LLM created by Databricks. Across a range of standard benchmarks, DBRX sets a new state-of-the-art for established open LLMs.
  • mixture-of-experts (MoE) architecture with 132B total parameters, of which 36B parameters are active on any input
  • trained on 12 trillion tokens (Llama 2 was trained on 2T)
  • maximum context length of 32k tokens
  • Llama-like license: commercial use is restricted for products with over 700 million monthly active users, and outputs may not be used to train other models.
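The total-vs-active parameter split above is the defining property of a mixture-of-experts model: a gate routes each token to a small subset of experts, so only a fraction of the weights run per input. Below is a toy sketch of top-k expert routing in plain NumPy — illustrative only, not DBRX's actual implementation; the function names, dimensions, and expert count are made up for the example.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Route input x through the top_k experts chosen by a learned gate.

    Only the selected experts run, which is how an MoE model keeps
    total parameters high (132B for DBRX) while activating far fewer
    (36B) on any single input.
    """
    scores = gate_w @ x                        # one routing score per expert
    top = np.argsort(scores)[-top_k:]          # indices of the top-k experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                   # softmax over selected experts only
    # Weighted sum of only the chosen experts' outputs; unselected experts never run
    return sum(w * (expert_ws[i] @ x) for w, i in zip(weights, top))

# Tiny illustrative setup: 4 experts, 8-dim hidden state, 2 active per token
rng = np.random.default_rng(0)
d, n_experts = 8, 4
x = rng.normal(size=d)
gate_w = rng.normal(size=(n_experts, d))
expert_ws = [rng.normal(size=(d, d)) for _ in range(n_experts)]

y = moe_forward(x, gate_w, expert_ws, top_k=2)
print(y.shape)  # (8,)
```

With 2 of 4 experts active, half the expert weights are used per token; DBRX's 36B-of-132B ratio comes from the same mechanism at scale.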