Arm launches 136-core AGI CPU for data centers, its first in-house chip

Enter Player Three.

Arm has just announced its new AGI CPU, the company’s first proper in-house data centre processor, built for what it calls “agentic AI”.

This is big news because Arm has spent almost its entire existence designing chip architectures that its partners turn into silicon (Qualcomm, for example), or licensing out the Arm architecture – a kind of “basic language” – to others who then design and build their own chips (as Apple does with its M-series). The AGI CPU marks the first time Arm is producing and selling chips of its own.

How this is important for Arm

While GPUs have hogged most of the AI conversation, it is important to remember that CPUs still handle a lot of the unglamorous work: scheduling tasks, feeding data to accelerators, and keeping everything in sync across massive clusters. As these AI systems grow more complex, that synchronisation layer starts to matter a lot more, and that’s where Arm is positioning this chip.

The AGI CPU is based on Arm’s Neoverse platform and built on TSMC’s 3nm process, scaling up to an impressive 136 cores per chip. In its press release, Arm says the emphasis with the new CPU isn’t peak performance in short bursts, but “sustained throughput across a huge number of concurrent workloads”. The company is also talking up rack-level efficiency, claiming “more than double the performance per rack compared to traditional x86 setups”, alongside the ability to scale to thousands of cores in a single deployment.

These all sound very impressive, but the bigger story here is what all this means for Arm as a company.

By building and selling its own chip, Arm is stepping into territory traditionally occupied by its partners – companies like Qualcomm, Nvidia, AMD and even hyperscalers designing their own silicon. On one hand, it gives Arm a slice of an AI industry pie that’s projected to grow to over US$2.4 trillion by 2032-2034, and more control over how its designs are implemented in real-world deployments. On the other, it means competing with key partners and risks complicating relationships with the very ecosystem that made it successful in the first place.

Meta as its first partner

Meta is the lead partner on this project and is expected to deploy the AGI CPU in its data centres, with names like OpenAI, Cloudflare, and SAP also involved. So we know there’s real interest in the AGI CPU, but it also highlights how concentrated this market is – Arm, like everyone else, is effectively building for just a handful of very large players.

And that’s the other thing. The data centre CPU space isn’t exactly empty. Intel and AMD are still deeply entrenched, and hyperscalers like Amazon and Google have already been pushing their own custom silicon for years. Arm isn’t entering a gap here, but trying to carve out a new layer in an already crowded stack.

Still, the timing could hardly be better. As AI workloads shift towards more autonomous, always-on systems, the role of the CPU has started to evolve as well. Arm’s bet is that this layer becomes important enough to justify its own silicon push, rather than leaving it to partners to figure out.
