OpenAI has launched two new AI models, gpt-oss-20b and gpt-oss-120b, both released with open weights and capable of running locally on desktops or laptops. The gpt-oss-20b model, designed for consumer devices, requires at least 16GB of RAM, making it runnable on standard laptops. The larger gpt-oss-120b model needs roughly 80GB of memory, putting it in the territory of data centers and high-end workstations. Both models can be downloaded from Hugging Face and run with third-party tools such as Ollama. Users are free to customize the models, though built-in safety measures remain in place. For stronger gpt-oss-20b performance, a recommended configuration pairs an AMD Ryzen AI 300 series CPU with a Radeon RX 7000 or 9000 series GPU. Sam Altman has suggested the smaller model could eventually run on smartphones, but practical on-device execution remains limited for now. Overall, these releases hint at what local AI processing could look like in the near future.
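As a rough illustration of what local use looks like in practice, the sketch below queries gpt-oss-20b through the Ollama Python client. It assumes Ollama is installed, the model has already been pulled locally (for example via "ollama pull gpt-oss:20b"), and the ollama Python package is available; the "gpt-oss:20b" tag and the prompt are placeholders, not details from the article.

import ollama

# Minimal sketch: send one chat message to a locally running gpt-oss-20b
# model via Ollama. "gpt-oss:20b" is the assumed local model tag.
response = ollama.chat(
    model="gpt-oss:20b",
    messages=[{"role": "user", "content": "Explain what open-weight models are."}],
)

# Print the model's reply text from the chat response.
print(response["message"]["content"])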