Microsoft has unveiled Maia 200, a next-generation AI chip developed in partnership with TSMC and designed to accelerate large-scale AI workloads, including OpenAI’s GPT-5.2 models. The chip will power Microsoft 365 Copilot, Azure cloud services, and Microsoft’s internal AI research, offering faster and more cost-efficient performance than previous hardware.
Maia 200 is built to handle the intensive calculations required by modern AI. It processes billions of data points quickly, making applications like virtual assistants, chatbots, and AI-driven productivity tools more responsive and capable. Microsoft says Maia 200 delivers three times the performance of Amazon's third-generation Trainium chip on low-precision tasks and outpaces Google's latest TPU on mid-level workloads, all while using less energy, so AI runs faster for users at lower cost and power draw.
The chip is currently deployed in Microsoft’s datacenter in Des Moines, Iowa, with additional locations, including Phoenix, expected in the near future. Maia 200 is part of a multi-generational program, with future versions expected to set new benchmarks in performance and efficiency, helping Microsoft maintain its leadership in cloud AI infrastructure.
Microsoft is also releasing a software development kit (SDK) so developers, AI startups, and researchers can optimize their models for Maia 200. The SDK includes PyTorch integration, a compiler, and low-level programming options, making it easier to take advantage of the chip’s full capabilities. This allows AI teams to run large models more efficiently and experiment with new applications without waiting for hardware upgrades.
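As a rough illustration of what that developer workflow could look like, here is a minimal sketch assuming the Maia SDK plugs into PyTorch as a standard device backend. The device string "maia" is hypothetical and not taken from any official documentation; the sketch stays on CPU so it runs anywhere.

```python
# Sketch of targeting a new accelerator from PyTorch. The "maia" device
# string below is an assumption for illustration, not a documented API.
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(16, 32),
    torch.nn.ReLU(),
    torch.nn.Linear(32, 4),
)

# With the SDK installed, moving the model to the accelerator might look like
#   model = model.to("maia")   # hypothetical device string
# and the SDK's compiler could hook in via torch.compile with a vendor
# backend. We stay on CPU here so the sketch is runnable everywhere.
device = "cpu"
model = model.to(device).eval()

x = torch.randn(8, 16, device=device)
with torch.no_grad():
    y = model(x)

print(tuple(y.shape))  # (8, 4)
```

The point of such an integration is that teams keep their existing PyTorch code and only change where the model is placed and compiled, which is what lets them adopt new hardware "without waiting for hardware upgrades" elsewhere in their stack.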
Beyond running existing AI models, Maia 200 will support Microsoft’s internal research on synthetic data and reinforcement learning. By generating high-quality, domain-specific data more quickly, Microsoft can train future AI models with fresher, more targeted inputs, improving performance for products such as Microsoft 365 Copilot and other AI-powered cloud tools.
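The synthetic-data loop the article describes can be sketched as generate-filter-store. Everything below is a placeholder: the generator stands in for a large model that would actually run on accelerator hardware, and the quality filter stands in for a reward or critic model; no function here reflects a real Microsoft API.

```python
# Hedged sketch of a synthetic-data pipeline: a generator produces
# domain-specific examples, a filter keeps the high-quality ones, and the
# survivors form a training set. Both stages are stubs for illustration.
import random

def generate_example(domain: str, seed: int) -> dict:
    # Stub for a model call; in practice a large model would draft this.
    rng = random.Random(seed)
    return {
        "domain": domain,
        "prompt": f"{domain} question #{seed}",
        "answer": f"draft answer {rng.randint(0, 9)}",
    }

def quality_filter(example: dict) -> bool:
    # Real pipelines score candidates with a critic or reward model;
    # this placeholder keeps any example with a non-empty answer.
    return bool(example["answer"])

dataset = [
    ex for s in range(100)
    if quality_filter(ex := generate_example("spreadsheet-formulas", s))
]
print(len(dataset))  # number of examples kept for fine-tuning
```

Faster hardware matters here because the generation stage dominates the cost: the quicker the chip produces and scores candidates, the fresher and more targeted the resulting training data can be.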
Scott Guthrie, Microsoft’s executive overseeing cloud computing and AI, said Maia 200 will make AI faster, more reliable, and more cost-effective for millions of users. “This accelerator is designed from the ground up to support the next generation of AI,” he said. “It allows us to deliver high-quality AI services at cloud scale while keeping energy and operational costs down.”
Maia 200 highlights the growing importance of AI hardware in powering modern software experiences. By combining TSMC’s advanced 3nm process, OpenAI’s models, and Microsoft’s cloud infrastructure, the company is setting the stage for more powerful AI applications in everyday tools, from office productivity software to virtual assistants. As Microsoft continues expanding AI across its platforms, Maia 200 is a critical step in making advanced AI faster, more efficient, and widely accessible.