Microsoft recently announced its first homegrown artificial intelligence (AI) chip, Maia 100, at the annual Microsoft Ignite conference. The chip is designed to accelerate in-house AI workloads through Microsoft’s Azure cloud computing service.
Maia 100 is a custom AI accelerator designed specifically to run large language models like GPT-3.5 Turbo and GPT-4, which underpin Microsoft’s Azure OpenAI services and Microsoft Copilot (formerly Bing Chat). The chip packs 105 billion transistors and is manufactured on TSMC’s 5 nm process.
Microsoft’s decision to develop its own AI chip is a significant move that could disrupt the big tech industry, especially Nvidia, AMD, and Intel. By owning the silicon, the servers, the software, and the services that run on top of it, Microsoft controls every layer of the stack between itself and the customer.
Maia 100 is not the only chip that Microsoft has developed. The company also announced the Microsoft Azure Cobalt 100 CPU, which is an ARM-based chip designed to perform conventional computing tasks like powering Microsoft Teams.
Microsoft has no plans to sell either chip, preferring to keep them for internal use. The company wants to be “the Copilot company,” and it will need a lot of computing power to meet that goal.
According to Reuters, Microsoft and other tech firms have struggled with the high cost of delivering AI services, which can run 10 times more than conventional services like search engines. By developing its own AI chip, Microsoft can reduce the cost of delivering those services and increase its competitiveness in the market.
Overall, Maia 100 is a significant development in the field of AI and cloud computing. It will be interesting to see how Microsoft’s competitors respond to this move and whether it will lead to further innovation in the industry.