On April 24, 2024, Snowflake launched Snowflake Arctic, its open-source large language model (LLM). Arctic is notable for its Dense-MoE hybrid transformer architecture: 480 billion total parameters spread across 128 experts, with a top-2 gating technique that activates roughly 17 billion parameters per token. Unlike many competitors, Arctic is truly open source, released under the Apache 2.0 license, which broadens its accessibility for enterprise AI. Designed for efficiency and effectiveness, Arctic targets enterprise tasks such as SQL generation and coding, where Snowflake reports it outperforms models like DBRX and Llama on industry benchmarks. Training used 3.5 trillion tokens and a three-stage curriculum. The initial release supports a 4,096-token attention window. To experience its capabilities, users can access live demos on various platforms, demonstrating its potential for driving enterprise innovation and collaboration in AI technology.
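Because top-2 gating routes each token to only two of the 128 experts, only a small slice of the 480 billion total parameters is active per token. The sketch below illustrates the general top-2 gating idea in PyTorch; the `Top2Gate` class name and the dimensions are illustrative assumptions, not Arctic's actual implementation.

```python
# Minimal sketch of top-2 expert gating, the routing idea behind an MoE layer.
# Names and sizes are illustrative, not taken from Snowflake Arctic's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2Gate(nn.Module):
    """Routes each token to its 2 highest-scoring experts."""
    def __init__(self, hidden_dim: int, num_experts: int = 128):
        super().__init__()
        self.router = nn.Linear(hidden_dim, num_experts, bias=False)

    def forward(self, x: torch.Tensor):
        # x: (tokens, hidden_dim) -> per-expert scores: (tokens, num_experts)
        logits = self.router(x)
        # Keep only the top-2 experts per token and renormalize their weights.
        top_vals, top_idx = logits.topk(2, dim=-1)
        weights = F.softmax(top_vals, dim=-1)
        return top_idx, weights  # which experts fire, and how much each contributes

if __name__ == "__main__":
    gate = Top2Gate(hidden_dim=64, num_experts=128)
    tokens = torch.randn(4, 64)          # 4 example token embeddings
    experts, weights = gate(tokens)
    print(experts)   # 2 expert indices per token (out of 128)
    print(weights)   # mixing weights summing to 1 per token
```

In a full MoE layer, each selected expert's feed-forward output would be combined using these weights, so compute scales with the number of active experts rather than the total parameter count.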
