Tuesday, August 12, 2025

Ollama 0.10 Accelerates Local AI Models and Launches New Desktop App

Ollama is a command-line application for running generative AI models locally, and it now ships with a desktop app for easier use. It supports models such as DeepSeek-R1, Google’s Gemma 3, Meta’s Llama 3, and Microsoft’s Phi-4, offering a local, ChatGPT-like experience. The latest version, 0.10, introduces two new features: the ollama ps command now displays the context length of loaded models, and the OpenAI-compatible API accepts WebP images. Performance also improves, with Gemma 3n models reportedly running 2–3 times faster and multi-GPU workloads gaining 10–30%.

The new desktop app provides a chat interface with multimodal input and Markdown rendering, making Ollama approachable for users who prefer not to work in a terminal. The command-line version remains available through Homebrew and Docker Hub. For the latest updates and downloads, visit Ollama’s official website.
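As an illustration of the new WebP support, the sketch below sends a WebP image to a locally running, vision-capable model through Ollama’s OpenAI-compatible endpoint using the standard openai Python client. The model name "gemma3" and the file "photo.webp" are placeholders, and the sketch assumes Ollama 0.10+ is running on the default local port.

```python
# Sketch: pass a WebP image to Ollama's OpenAI-compatible API (Ollama 0.10+).
# Assumes Ollama is running locally and a vision-capable model has been pulled;
# "gemma3" and "photo.webp" are placeholder names, not values from the article.
import base64
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # any non-empty string; Ollama ignores it
)

# Read the WebP file and encode it as a base64 data URL.
with open("photo.webp", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="gemma3",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/webp;base64,{image_b64}"},
                },
            ],
        }
    ],
)

print(response.choices[0].message.content)
```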
