Snowflake Arctic Instruct (128x3B MoE), largest open-source model
An efficient, intelligent, and truly open-source language model
Arctic is a dense-MoE hybrid transformer architecture pre-trained from scratch by the Snowflake AI Research Team. Please see our blog post, "Snowflake Arctic: The Best LLM for Enterprise AI — Efficiently Intelligent, Truly Open," for more information on Arctic and links to other relevant resources, such as our series of cookbooks covering topics like training your own custom MoE models, producing high-quality training data, and much more.

Arctic combines a 10B dense transformer model with a residual 128x3.66B MoE MLP, resulting in 480B total and 17B active parameters, chosen using top-2 gating.
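To make the "dense plus residual MoE" combination concrete, here is a minimal PyTorch sketch of one such hybrid MLP block. This is not Snowflake's implementation: the dimensions are toy values and the names (ResidualMoEMLP, dense, router, experts) are illustrative assumptions. Each token always passes through the dense MLP, a router selects its top-2 experts, and the gated expert outputs are added residually to the dense output. This routing is also why only 17B of Arctic's 480B parameters are active per token: the 10B dense backbone plus two 3.66B experts, 10 + 2 × 3.66 ≈ 17.3B.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualMoEMLP(nn.Module):
    """One hybrid block: output = dense_mlp(x) + top-2 gated MoE(x)."""

    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        # Dense MLP path: always active for every token
        # (the ~10B dense backbone in Arctic's case).
        self.dense = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
        )
        # Router scores each token against every expert; only the
        # top-k experts run per token (128 experts, top-2 in Arctic).
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )
        self.top_k = top_k

    def forward(self, x):
        # x: (n_tokens, d_model)
        dense_out = self.dense(x)
        scores = self.router(x)                    # (n_tokens, n_experts)
        top_scores, top_idx = scores.topk(self.top_k, dim=-1)
        gates = F.softmax(top_scores, dim=-1)      # normalize over top-k only
        moe_out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e in top_idx[:, k].unique().tolist():
                mask = top_idx[:, k] == e
                moe_out[mask] += gates[mask, k, None] * self.experts[e](x[mask])
        # Residual combination: the MoE MLP output is added on top of
        # the dense path rather than replacing it.
        return dense_out + moe_out
```

A quick usage check of the sketch: a batch of 10 token vectors goes in, and each comes back as the sum of the dense MLP output and its two gated expert outputs.

```python
block = ResidualMoEMLP()
y = block(torch.randn(10, 64))
print(y.shape)  # torch.Size([10, 64])
```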