Microsoft, an early backer of OpenAI and a prominent user of its GPT models in products like Copilot, is reportedly developing its own AI models. The move aims to reduce reliance on OpenAI and establish a proprietary AI foundation for Copilot applications.
Microsoft is actively investing in building its own AI capabilities. Recent developments include the introduction of the Phi-4-multimodal and Phi-4-mini language models. Phi-4-multimodal processes text, speech, and visual input, similar to OpenAI’s ChatGPT and Google’s Gemini, while Phi-4-mini is a smaller, text-focused model. Microsoft’s AI chief, Mustafa Suleyman, is spearheading the effort to build a comprehensive AI stack within the company.
Building a Proprietary AI Ecosystem
The Phi-4 models are already accessible to developers through Microsoft’s Azure AI Foundry and platforms like Hugging Face and the NVIDIA API Catalog. Benchmarks released by Microsoft suggest that Phi-4 outperforms Google’s Gemini 2.0 models in several areas, and the company highlights Phi-4’s proficiency in speech summarization, where it achieves performance comparable to OpenAI’s GPT-4o. Microsoft plans to commercially release its “MAI” models, currently under development, via the Azure service.
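For developers who want to experiment, a minimal sketch of loading one of the Phi-4 models through the Hugging Face transformers library might look like the following. The model identifier "microsoft/Phi-4-mini-instruct" is an assumption based on Microsoft’s published naming; check the Hugging Face hub or Azure AI Foundry for the exact ID, license terms, and whether trust_remote_code is required.

```python
# Minimal sketch: run a Phi-4 family model locally via Hugging Face transformers.
# The model ID below is an assumption; verify it on the Hugging Face hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-4-mini-instruct"  # assumed model identifier

# Some Phi releases may require trust_remote_code=True; add it if loading fails.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Summarize: Microsoft is building its own family of AI models for Copilot."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```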
Microsoft emphasizes Phi-4’s robust reasoning capabilities across language, mathematics, and visual and scientific tasks. This focus on reasoning models marks the next phase of AI development, promising a more sophisticated understanding of queries, stronger logical deduction, and better problem-solving.
Exploring Third-Party AI Integrations and Reasoning Model Development
While developing in-house AI models, Microsoft is also evaluating third-party alternatives like DeepSeek, xAI, and Meta for Copilot integration. DeepSeek, known for its high performance and cost-effectiveness, has garnered significant attention, reportedly claiming a remarkable cost-to-profit ratio for its services.
Today, we are advancing our AI ambitions with the release of DeepSeek R1 7B & 14B distilled models for Copilot+ PCs via Azure AI Foundry. This is the next step on our journey to continue to make Windows the platform for AI, seamlessly integrating intelligence from the cloud to… pic.twitter.com/QaUYrlMIt6
— Pavan Davuluri (@pavandavuluri) March 3, 2025
Beyond replacing OpenAI’s GPT infrastructure for Copilot, Microsoft is actively developing its own reasoning AI models. This puts Microsoft in direct competition with OpenAI’s o1 and emerging Chinese competitors like DeepSeek, both of which specialize in reasoning capabilities. Tensions between Microsoft and OpenAI, particularly over technology sharing and transparency around o1’s inner workings, have reportedly accelerated Microsoft’s internal reasoning-model development.
Conclusion
Microsoft’s strategic shift towards developing its own AI models reflects a broader industry push for greater control over AI capabilities. By building a proprietary AI stack, Microsoft aims to strengthen its position in the AI landscape and reduce its reliance on external partners. The move positions Microsoft as a major player in the ongoing evolution of AI technology, particularly in the crucial area of reasoning models. These in-house models have the potential to significantly enhance Copilot and other Microsoft products, while also fostering competition within the AI industry.