Hugging Face has launched a new integration that connects its Inference Providers with GitHub Copilot Chat in Visual Studio Code, giving developers access to open-source large language models such as Kimi K2, DeepSeek V3.1, and GLM 4.5 directly inside the editor. After installing the Hugging Face Copilot Chat extension and entering a Hugging Face token, users can switch between models from Copilot Chat's familiar model picker, with no need to leave VS Code. The integration requires VS Code version 1.104.0 or later.

Beyond convenience, the update broadens the range of AI tools available in the editor, making it easier to reach specialized models suited to specific tasks. Hugging Face highlights instant access to a wide catalog of models, zero vendor lock-in, and production-ready performance. Pricing is flexible as well: developers get a free tier with monthly inference credits, plus a straightforward pay-as-you-go model for usage beyond that.
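For developers who want to reach the same models programmatically rather than through Copilot Chat, the sketch below shows one way to query an Inference Providers model with the huggingface_hub client. The model ID and the HF_TOKEN environment variable are illustrative assumptions, not details from the announcement; check the model's page on the Hub for the exact identifier and provider availability.

```python
# Minimal sketch: calling a model served via Hugging Face Inference Providers.
# Assumes huggingface_hub is installed (`pip install huggingface_hub`) and that
# a valid Hugging Face token is exported as HF_TOKEN.
import os

from huggingface_hub import InferenceClient

client = InferenceClient(token=os.environ["HF_TOKEN"])

# chat_completion uses the familiar messages-based chat format.
response = client.chat_completion(
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
    model="moonshotai/Kimi-K2-Instruct",  # illustrative model ID
    max_tokens=256,
)

print(response.choices[0].message.content)
```

The token used here is the same kind of Hugging Face token the Copilot Chat extension asks for, so a single token can cover both the editor integration and scripted access.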