Saturday, July 12, 2025

The Limitations of Generalist AI: Challenges in Scaling


The rising cost of training generalist AI models has sparked debate in the tech industry, particularly over how well they scale. Training a model of size N is taken to require on the order of N training examples, each of which costs O(N) computation to process, for a total training cost of O(N^2). This quadratic growth strains compute budgets and makes it difficult to generate enough data, especially for broad general knowledge.

To address these issues, some experts suggest replacing a single generalist with specialized models. Splitting the work across k models, each focused on a specific domain and roughly N/k in size, means each specialist costs about (N/k)^2 to train, so the total falls to k · (N/k)^2 = O(N^2 / k). This sacrifices some versatility, but the argument is that a pool of expert models is worth the trade-off: training becomes cheaper overall, and each query can be routed to the specialist best suited to handle it, improving overall performance. Specialization, it seems, is key to scaling AI development.
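As a rough back-of-the-envelope check on that argument, here is a minimal Python sketch. The model sizes and the cost model are hypothetical, assuming only that training cost scales as (number of examples) × (compute per example), as described above:

```python
def generalist_cost(n: int) -> int:
    # One model of size n: ~n training examples, each costing ~n compute.
    return n * n

def specialist_cost(n: int, k: int) -> int:
    # k specialists of size n/k each: k * (n/k)^2 = n^2 / k total compute.
    return k * (n // k) ** 2

if __name__ == "__main__":
    n, k = 1_000_000, 10  # hypothetical model size and number of specialists
    print(generalist_cost(n))      # 1_000_000_000_000 cost units
    print(specialist_cost(n, k))   # 100_000_000_000 cost units, a 10x reduction
```

Under these assumptions, splitting one generalist into ten specialists cuts total training cost by a factor of ten, which is the core of the scaling argument.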
