Microsoft Makes Waves with New AI and ARM Chips

Microsoft made waves this week by unveiling its first custom-designed data center chips at the Ignite conference. The tech giant introduced two new processors: the Maia 100 chip for artificial intelligence workloads and the Cobalt 100 chip for general-purpose computing. These new silicon offerings have the potential to shake up the chip industry and cloud computing markets.

The Maia 100 is Microsoft’s answer to the AI accelerators from rivals like Nvidia and Amazon. It is tailored to boost performance for AI tasks like natural language processing. During Ignite, Microsoft demonstrated Maia handling queries for its Bing search engine, powering the Copilot coding assistant, and running large OpenAI language models.

Microsoft has been collaborating closely with OpenAI and is a major investor in the AI research company. OpenAI’s popular ChatGPT was trained on Azure using Nvidia GPUs. By designing its own chip, Microsoft aims to reduce reliance on third-party silicon for AI workloads.

Though performance details remain unclear, Microsoft stated that Maia handles AI tasks with high throughput and low latency, and it emphasized efficiency as a key design goal. The chip was engineered in close consultation with Microsoft's internal AI teams to ensure it meets their requirements.

Microsoft has also developed a liquid-cooling system, dubbed "Sidekicks," designed to sit alongside Maia server racks. This thermal management approach lets Maia run at full processing capacity while avoiding the overheating issues that often plague GPU-powered data centers.

When available on Azure, Maia will provide customers access to specialized AI hardware on demand instead of buying dedicated GPUs. Microsoft did not provide a timeline for Maia’s availability or pricing. But offering it as a cloud service instead of a physical product sets Maia apart from AI chips from Intel, Nvidia and others.

The second new chip announced at Ignite was the Cobalt 100 ARM-based processor for general computing. It is expected to deliver a 40% performance boost over existing Azure ARM chips from Ampere.

Microsoft believes Cobalt will provide a compelling alternative to Intel's server CPUs for cloud workloads. Amazon has already demonstrated the viability of this approach, transitioning many of its cloud data center workloads from Intel processors to its custom ARM-based Graviton chips.

Virtual machines powered by Cobalt will become available on Azure in 2024. Microsoft is currently testing the chip with key services like Teams and Azure SQL Database. More efficient ARM servers can translate to lower costs for cloud customers.

The Cobalt announcement highlights Microsoft’s growing reliance on ARM architecture across its cloud infrastructure. ARM chips are known for power efficiency in mobile devices, but companies like Amazon, Microsoft and Apple now recognize their benefits for data centers too.

By designing its own server-class ARM processor, Microsoft can optimize performance and features specifically for its cloud services. With both Maia and Cobalt, Microsoft aims to give Azure a competitive edge over rivals like AWS and Google Cloud.

Microsoft has lagged behind in cloud infrastructure market share, but introducing unique silicon could help close the gap. Its vertically integrated approach produces chips tailor-made for AI and its cloud platform. With demand for AI compute and cloud services booming, Microsoft's bet on custom chips could soon pay dividends.

AMD’s Future Hinges on AI Chip Success

Chipmaker Advanced Micro Devices (AMD) offered an optimistic forecast this week for its new data center AI accelerator chip, predicting $2 billion in sales for the product in 2024. This ambitious target represents a crucial test for AMD as it seeks to challenge rival Nvidia’s dominance in the artificial intelligence (AI) chip market.

AMD's forthcoming MI300X is a data center GPU accelerator optimized for AI workloads (its sibling part, the MI300A, pairs CPU and GPU cores in a single package). The chipmaker claims the MI300X will deliver leadership performance and energy efficiency. AMD has inked deals with major hyperscale cloud customers to use the new AI chip, including Amazon Web Services, Google Cloud, Microsoft Azure and Oracle Cloud.

The $2 billion revenue projection for 2024 would represent massive growth considering AMD expects a modest $400 million from the MI300X this quarter. However, industry analysts caution that winning significant market share from Nvidia will prove challenging despite AMD’s technological advancements. Nvidia currently controls over 80% of the data center AI accelerator market, fueled by its popular A100 and H100 chips.

“The AI chip market is still in its early phases, but it’s clear Nvidia has built formidable customer loyalty over the past decade,” said Patrick Moorhead, President of Moor Insights & Strategy. “AMD will need to aggressively discount and wow customers with performance to take share.”

AMD’s fortunes sank earlier this year as the PC market slumped and excess inventory weighed on sales. Revenue from the company’s PC chips dropped 42% in the third quarter. However, AMD sees data center and AI products driving its future growth. The company aims to increase data center revenue by over 60% next year, assuming the MI300X gains traction.

But AMD faces headwinds in China due to new U.S. export rules limiting the sale of advanced AI chips there. "AMD's ambitious sales target could prove difficult to achieve given the geopolitical climate," said Maribel Lopez, Principal Analyst at Lopez Research. China is investing heavily in AI, and domestic players like Baidu will be courting the same hyperscale customers.

Meanwhile, Intel is mounting its own push into the data center GPU market with its Ponte Vecchio chip. Though still behind Nvidia and AMD, Intel boasts financial resources and manufacturing scale that shouldn't be underestimated. The AI chip market could get very crowded very quickly.

AMD CEO Lisa Su expressed confidence in meeting customer demand and hitting sales goals for the MI300X. She expects AMD’s total data center revenue mix to shift from approximately 20% today to over 40% by 2024. “The AI market presents a tremendous opportunity for AMD to grow and diversify,” commented Su.

With PC sales stabilizing, AMD's raised AI chip forecast came as a relief to investors. The company's stock rebounded from earlier losses after management quantified the 2024 sales target. All eyes will now turn to AMD's execution as it ramps production and drives adoption of the MI300X over the coming year. AMD finally has a shot at becoming a major player in the AI chip wars—as long as the MI300X lives up to the hype.