
Microsoft is building its own AI model as it loosens ties to OpenAI

Microsoft is hedging its OpenAI bet by building its own models, chips, and leverage — so “Copilot everywhere” never depends on a single supplier


Microsoft $MSFT wants out of the “powered by someone else” business. Its AI chief, Mustafa Suleyman, told the Financial Times that Microsoft is pushing toward AI “self-sufficiency.” That means developing its own advanced foundation models and continuing to reduce its reliance on OpenAI, even as the two companies keep their relationship intact.

Microsoft’s October 2025 reset with OpenAI preserved the core perks: OpenAI remains its “frontier model partner,” Microsoft says, and its IP rights and Azure API exclusivity run “through 2032,” including models “post-AGI.” So this is Microsoft buying itself even more room to negotiate, route, and replace.

Because when your flagship AI product sits inside Microsoft 365, “single supplier” starts sounding like a vulnerability you have to explain on earnings calls. Microsoft can keep selling “Copilot everywhere,” but the real prize is making sure the underlying compute, security, and billing stay Microsoft-shaped, no matter which model is hot this quarter.

Microsoft is also trying to prove it’s not just talking. In August 2025, Microsoft AI previewed MAI-1-preview, calling it “an in-house mixture-of-experts model” that was “pre-trained and post-trained on ~15,000 NVIDIA H100 GPUs,” with plans to roll it into certain Copilot text use cases. 

That’s a clear marker of intent: Microsoft is building models, and it’s doing it at a meaningful scale, on the same hardware reality as everyone else.

Microsoft’s new Maia 200 chip is positioned as an inference accelerator “engineered to dramatically improve the economics of AI token generation” — or, essentially, to take aim at Nvidia $NVDA, pairing custom silicon with a software package meant to loosen CUDA’s grip. Inference is where the bills stack up — and where hyperscalers most want leverage.

Meanwhile, Microsoft is widening its menu on purpose, hosting models from xAI, Meta $META, Mistral, and Black Forest Labs in its data centers. It has also been willing to use Anthropic models in Microsoft 365 Copilot experiences after internal testing found them better for certain Office tasks, a shift that even involved paying AWS for access. 

So yes, Microsoft wants to be the place where every winning model runs — and it wants at least one winner to have a Microsoft badge on it.
