Microsoft just made its AI coding tools open source - a move that could reshape how developers work. The decision opens up GitHub Copilot's core features and adds an autonomous coding agent that writes and fixes code on its own. Here's what developers need to know.
Microsoft is transforming VS Code into an open source AI editor. The company announced at Build 2025 that it will release the code behind GitHub Copilot Chat under the MIT license and integrate AI capabilities directly into VS Code's core.
The move marks a shift from treating AI as an optional add-on to making it a fundamental part of the code editor. It's like VS Code finally admitting that AI is more than just a fancy party trick - it's becoming part of the family.
For developers, this means unprecedented access to VS Code's AI internals. They'll be able to inspect, modify, and enhance the same AI features they use daily. The change affects everything from code completion to the new autonomous coding agent that can tackle bugs and add features while developers grab coffee.
Meet your new AI teammate
This autonomous agent is particularly interesting. It works by spinning up a virtual machine, cloning your repository, and getting to work - sort of like hiring a very efficient intern who never needs sleep or snacks. When it's done, it tags you for review and stands ready to address any feedback.
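The real orchestration runs inside GitHub's infrastructure and isn't exposed as a public API, but the loop described above can be sketched in plain code. Everything here is hypothetical - the step names and the `agentRun` function are invented for illustration:

```typescript
// Hypothetical sketch of the coding agent's workflow as described:
// provision an isolated environment, clone the repo, make changes,
// then loop on review feedback. Names are invented, not a Copilot API.
type Step =
  | "provision-vm"
  | "clone-repo"
  | "apply-changes"
  | "request-review"
  | "address-feedback";

function agentRun(feedbackRounds: number): Step[] {
  // The agent always starts from a fresh, isolated environment.
  const log: Step[] = ["provision-vm", "clone-repo", "apply-changes", "request-review"];
  // Each round of reviewer feedback triggers another fix-and-re-request cycle.
  for (let i = 0; i < feedbackRounds; i++) {
    log.push("address-feedback", "request-review");
  }
  return log;
}
```

The property worth noticing is the last step of every cycle: the agent never merges its own work; it always ends by tagging a human for review.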
The timing isn't random. Microsoft says several factors pushed it toward openness. Large language models have improved so much that keeping prompting strategies secret no longer makes sense. It's like holding onto a recipe for boiling water - the technique is now common knowledge.
Security through transparency
Security played a role too. With AI tools increasingly targeted by bad actors, Microsoft is betting that sunlight is the best disinfectant. The open source community has a track record of spotting and squashing bugs faster than closed teams.
The move also responds to growing questions about data collection in AI coding tools. By opening the source code, Microsoft lets developers see exactly what data these tools gather. No more wondering if your code comments about your boss are being secretly archived somewhere.
Extension developers get full access
Extension developers stand to gain the most. Currently, building AI-powered VS Code extensions is like trying to dance with a partner while blindfolded - you can do it, but it's awkward and you'll probably step on some toes. Access to the Copilot Chat source code should make the process smoother.
Microsoft isn't just dumping code onto GitHub and calling it a day. They're also releasing their prompt testing infrastructure, making it easier for contributors to test AI features. Given how unpredictable large language models can be, this is like providing a safety net for the coding trapeze act.
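The announcement doesn't detail how that testing infrastructure works, but the core problem - verifying output that changes between runs - is usually handled by asserting stable properties of a response rather than exact strings. A minimal sketch, with the `checkCompletion` function and its rules invented for illustration:

```typescript
// Hypothetical sketch of a prompt regression check. Because an LLM's
// output varies run to run, assert properties instead of exact text.
function checkCompletion(output: string): string[] {
  const failures: string[] = [];
  // Structural property: the prompt asked for a function definition.
  if (!output.includes("function")) failures.push("expected a function definition");
  // Budget property: responses shouldn't balloon past a size limit.
  if (output.length > 2000) failures.push("response exceeds length budget");
  // Safety property: nothing that looks like a credential should appear.
  if (/api[_-]?key|secret/i.test(output)) failures.push("secret-like token in output");
  return failures; // empty array means every property check passed
}
```

A test harness like this stays green across model upgrades as long as the properties hold, which is exactly the safety net contributors need when tweaking prompts.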
Part of broader Copilot expansion
The announcement comes alongside other GitHub Copilot updates. The tool is getting PostgreSQL support, expanded agent capabilities, and new features for modernizing Java and .NET applications. It's currently used by 15 million developers, suggesting AI coding assistants have moved well beyond the experimental phase.
Why this matters:
This shifts AI coding tools from proprietary black boxes to community-driven projects. Developers can now peek under the hood of tools they use daily, potentially leading to faster improvements and better security.
Microsoft is betting that community collaboration beats secrecy in AI development. It's a notable stance from a company that once considered open source the enemy - kind of like Darth Vader deciding to join the Rebel Alliance.