Arm has unveiled its first self-produced chip, the Arm AGI CPU, marking a significant strategic shift after decades of primarily licensing its chip designs to other manufacturers. The new chip offers up to 136 cores per CPU and, according to Arm, delivers twice the performance per watt of traditional x86 chips.
The UK-based company's foray into direct chip production comes with a major customer announcement: Meta. The Arm AGI CPU is engineered specifically for inference workloads, supporting the intensive cloud processing required by advanced AI tools, such as AI agents that can spawn multiple tasks at once. Meta, which has reportedly struggled to develop its own AI chips, is positioned as the first adopter of the new technology.
Meta has confirmed its role as both a lead partner and co-developer, expressing intentions to collaborate on "multiple generations" of these datacenter CPUs. These chips are destined for integration alongside hardware from other leading vendors like Nvidia and AMD. The announcement garnered congratulatory messages from a host of existing Arm customers, including Amazon AWS, Microsoft, Google, Marvell, Nvidia, and Samsung. Notably absent from this list was Qualcomm, which previously declared a "complete victory" over Arm in a court ruling last fall regarding licensing agreement terms.
While the partnership details are substantial, specific financial terms of the deal, as well as the number of chips Meta plans to procure from SoftBank-owned Arm, remain undisclosed.
According to Arm, the AGI CPU is built on the Neoverse platform, which also underpins prominent AI chips such as AWS Graviton, Nvidia Vera, and Microsoft's offerings. The design supports up to 136 cores per CPU and can be configured with 64 CPUs per air-cooled server rack. Arm asserts that the AGI CPU achieves twice the performance per watt of conventional x86 CPUs while mitigating memory bottlenecks, capitalizing on the architecture's long-standing efficiency advantages.
Beyond Meta, a diverse array of additional customers is already lined up to adopt Arm's new chip, including Cerebras, Cloudflare, F5, OpenAI, Positron, Rebellions, SAP, and SK Telecom. Mohamed Awad, Arm's head of cloud AI, articulated the company's strategic objective to CNBC, stating that its aim is to provide a viable, competitive option for companies that lack the resources to develop their own in-house processors.
The Editorial Staff at AIChief is a team of professional content writers with extensive experience in AI and marketing. Founded in 2025, AIChief has quickly grown into the largest free AI resource hub in the industry.