Companies are increasingly training AI models and deploying them in their operations, fueling demand for general-purpose CPU chips designed to handle heavy workloads.
Under the agreement announced Thursday, Alphabet’s Google unit will continue to deploy Intel’s Xeon processors, which support a broad range of workloads such as predictive and general-purpose computing. Google will also use Intel’s latest Xeon 6 chips.
Intel and Google will also expand co-development of custom infrastructure processing units (IPUs), which offload tasks traditionally handled by CPUs, freeing those processors for other work and making computing more efficient.
“Scaling AI requires more than acceleration – it requires balanced systems. CPUs and IPUs are central to delivering the performance, efficiency and flexibility demanded by modern AI workloads,” said Intel CEO Lip-Bu Tan.
The growing demand for agentic AI systems – which perform complex, multi-step operations beyond simple chatbot functionality – has fueled the need for significantly more CPU processing power.
A surge in demand for CPUs could help Intel strengthen its balance sheet and win new customers after the chipmaker lost market share to rivals during the early years of the AI boom.
Intel said Tuesday it will join Elon Musk’s Terafab AI chip complex project with SpaceX and Tesla to power the billionaire’s robotics and data center ambitions.
Intel also plans to take full ownership of its Ireland manufacturing facility, where it makes Xeon server processors, by buying back the stake it sold to Apollo Global Management.