Upstage recently launched its open-source large language model, “Solar Open 100B,” marking a significant milestone in AI development. Funded by the Ministry of Science and ICT, the model was built from the ground up with a distinctive training methodology. Released on Hugging Face, Solar Open outperforms China’s DeepSeek-R1 on Korean, English, and Japanese benchmarks, scoring more than double on Korean cultural understanding. To compensate for the scarcity of Korean training data, Upstage employed synthetic datasets tailored to sectors such as finance and medicine.

Solar Open is built on a mixture-of-experts (MoE) architecture with 129 expert models, activating only a subset of experts per token to improve GPU efficiency: token throughput rose by about 80% and training time was cut in half, reducing GPU infrastructure costs by 12 billion KRW. CEO Kim Sung-hoon describes Solar Open as a pivotal development for Korean AI, one that better captures Korean emotional and linguistic nuance.
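For readers unfamiliar with the MoE idea the article mentions, the sketch below shows generic top-k expert routing: a router scores each token, and only the k highest-scoring experts run, so per-token compute stays small even as the total parameter count grows. This is purely illustrative, not Upstage’s implementation; the class name, layer sizes, expert count, and top-k value are all assumptions.

```python
# Minimal, generic sketch of a mixture-of-experts (MoE) layer with top-k
# routing. NOT Solar Open's actual architecture; all sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (batch, seq, d_model)
        scores = self.router(x)                        # (B, S, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1) # keep top-k experts
        weights = F.softmax(weights, dim=-1)           # renormalize over chosen k
        out = torch.zeros_like(x)
        # Each token is processed only by its top-k experts, so per-token
        # FLOPs scale with k rather than with the total expert count.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e             # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Toy forward pass.
layer = MoELayer()
tokens = torch.randn(2, 16, 512)
print(layer(tokens).shape)  # torch.Size([2, 16, 512])
```

Because only k of the N experts execute per token, an MoE model can hold far more parameters than a dense model of equal per-token cost, which is the kind of throughput and training-time gain the article attributes to Solar Open.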
![Photo courtesy of Upstage](https://site.server489.com/wp-content/uploads/2026/01/news-p.v1.20260106.5874e72d079b4db08e4108eee26ce9d6_P1.png)