Microsoft accelerates AI with new data centre chips
Microsoft’s new chips promise greater data centre efficiency, better AI performance, and lower energy consumption.

Microsoft has unveiled two new data centre infrastructure chips designed to enhance AI operations and bolster data security. Announced at the company’s Ignite conference, the chips continue Microsoft’s push to develop in-house silicon tailored to advanced computing needs. By producing custom chips, Microsoft joins major players such as Amazon and Google in reducing its dependency on suppliers like Intel and Nvidia.
One of the new chips, the Azure Integrated HSM, is engineered to improve security by keeping sensitive encryption and signing keys within dedicated hardware, so they never leave the module. It will be installed in every new data centre server from next year. The second chip, a data processing unit (DPU), consolidates multiple server components into a single chip optimised for cloud storage workloads. Microsoft says it delivers four times the performance of existing hardware while consuming significantly less power.
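Microsoft has not released developer documentation for the Azure Integrated HSM, but the pattern the article describes, performing cryptographic operations inside dedicated hardware so keys never leave the device, is the same one applications already use through the generic PKCS#11 interface. The sketch below is a minimal illustration of that pattern using the open-source python-pkcs11 package and a SoftHSM token; the library, module path, token label, and PIN are assumptions for the example, not part of Microsoft’s product.

```python
# Minimal sketch of the HSM pattern: a key is generated and used inside the
# hardware token; the application only ever sees ciphertext, never raw key bits.
# Assumes the python-pkcs11 package and a SoftHSM2 token labelled "demo".
import pkcs11
from pkcs11 import Attribute, KeyType

# Load the PKCS#11 module for the token (path is an assumption for SoftHSM2).
lib = pkcs11.lib("/usr/lib/softhsm/libsofthsm2.so")
token = lib.get_token(token_label="demo")

with token.open(rw=True, user_pin="1234") as session:
    # Generate a 256-bit AES key inside the token; EXTRACTABLE=False means
    # the raw key material can never be read out by the application.
    key = session.generate_key(
        KeyType.AES, 256,
        template={Attribute.EXTRACTABLE: False},
    )

    # Encrypt and decrypt inside the hardware; only the IV and ciphertext
    # cross the boundary between the token and the host.
    iv = session.generate_random(128)
    ciphertext = key.encrypt(b"sensitive data", mechanism_param=iv)
    plaintext = key.decrypt(ciphertext, mechanism_param=iv)
```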
Microsoft’s investment in custom silicon fits its broader strategy of making data centres more efficient for AI-driven applications. Rani Borkar, corporate vice president of Azure Hardware Systems and Infrastructure, highlighted the importance of optimising every layer of the infrastructure stack to meet the demands of modern AI. The chips aim to speed up data movement within the data centre while keeping security processing isolated in dedicated hardware.
Alongside the chips, Microsoft introduced a new liquid cooling system for data centre servers, designed to handle the intensive computing demands of large-scale AI models. The rack-level heat exchanger, which can also be retrofitted into existing data centres, is intended to draw heat away from components more efficiently and support sustainability as AI workloads grow.