
Sarvam Unveils Groundbreaking 105-Billion Parameter AI LLM Model


Sarvam AI, a pioneering startup based in New Delhi, has unveiled a state-of-the-art 105-billion-parameter large language model (LLM), marking a significant advancement in India’s AI landscape. Built for the specific needs of the Indian market, the model competes with both open and closed systems and offers strong agentic and tool-calling capabilities. Co-founder Pratyush Kumar said it outperforms the 600-billion-parameter DeepSeek R1 and Google’s Gemini 2.5 Flash on Indian language benchmarks. Sarvam’s choice of model size reflects a focus on practical deployment, with further performance gains expected. Parameters are the values an LLM learns during training and encode its knowledge; while leading global models run to trillions of parameters, Sarvam’s more compact design demonstrates India’s ability to build cutting-edge AI from the ground up. Selected under the IndiaAI Mission, Sarvam has received government support, including access to GPUs for efficient training.
