Trillion Labs Launches Open-Source LLM ‘Tri-21B’: A Ground-Up Approach to Pre-Training


Trillion Labs has unveiled Tri-21B, a large language model (LLM) engineered for advanced language comprehension and reasoning. Built with from-scratch pre-training and the company’s cross-lingual learning technique XLDA, the model reduces training costs to roughly one-twelfth while delivering performance comparable to global models such as Alibaba’s Qwen 3 and Meta’s LLaMA 3. With 21 billion parameters, Tri-21B performs strongly on high-difficulty reasoning tasks such as mathematics and coding. It is particularly capable in Korean language comprehension, posting impressive scores on benchmarks such as HAE-RAE and KMMLU. CEO Jae-min Shin emphasizes that Tri-21B efficiently combines the strengths of larger models and lays the groundwork for the forthcoming full-size Tri-70B. The release underscores Trillion Labs’ ambition to lead in AI with models tailored for diverse industries.
