Apple is making its iPhones even smarter with Apple Intelligence, and Google is playing an important role
Apple has introduced a beta version of Apple Intelligence, revealing that its AI models were trained on Google’s TPU chips.
Apple recently launched the developer betas of iOS 18.1, iPadOS 18.1, and macOS Sequoia, along with the beta version of Apple Intelligence, a new suite of AI-powered features. This update introduces Apple’s proprietary AI capabilities across its ecosystem, giving the company a competitive edge in the AI space. However, with the release of the beta, Apple has also offered a rare glimpse into how these features were developed, revealing a surprise partnership with Google, whose chips were used to train Apple’s AI models.
Apple’s latest research paper, titled “Apple Intelligence Foundation Language Models,” provides an in-depth overview of the technical foundations of Apple Intelligence. The paper, primarily intended for researchers, reveals that Apple’s two foundation language models, AFM-on-device and AFM-server, were trained using Google’s Tensor Processing Units (TPUs) rather than Apple’s own silicon.
As per the report, first highlighted by 9to5Mac, the AFM-on-device model, designed to run directly on the iPhone, was trained on 2,048 TPUv5p chips. The larger AFM-server model, which handles more complex tasks from Apple’s servers, was trained on 8,192 TPUv4 chips. “These two foundation models are part of a larger family of generative models created by Apple to support users and developers; this includes a coding model (based on the AFM language model) for building intelligence in Xcode, as well as a diffusion model to help users express themselves visually, for example, in the Messages app,” the paper reads.
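The paper does not include Apple’s training code, but for readers curious what synchronous data-parallel training across thousands of TPU chips looks like in practice, the sketch below is a minimal, hypothetical JAX example (JAX is a framework commonly used for TPU workloads). The toy model, learning rate, and batch shapes are illustrative assumptions, not details from the paper.

import functools
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    # Toy linear model standing in for a real foundation language model.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@functools.partial(jax.pmap, axis_name="chips")
def train_step(params, x, y):
    loss, grads = jax.value_and_grad(loss_fn)(params, x, y)
    # Average gradients across every chip: the core of synchronous data parallelism.
    grads = jax.lax.pmean(grads, axis_name="chips")
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-3 * g, params, grads)
    return params, loss

n = jax.local_device_count()  # 2,048 or 8,192 on the TPU pods the paper describes
params = jax.device_put_replicated(
    {"w": jnp.zeros((16, 1)), "b": jnp.zeros((1,))}, jax.local_devices()
)
x = jnp.ones((n, 32, 16))  # leading axis shards the batch, one slice per chip
y = jnp.ones((n, 32, 1))
params, loss = train_step(params, x, y)

Each chip computes gradients on its own slice of the batch, and the pmean collective keeps the replicated parameters identical everywhere; at the scale the paper reports, this kind of data parallelism is typically combined with model-parallel sharding as well.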
Although the paper doesn’t dwell on the decision, Apple’s choice of Google’s chips over Nvidia’s is striking: Nvidia dominates AI training hardware, yet big technology companies are increasingly seeking alternatives to reduce their dependence on it.
Earlier, at the 2024 Worldwide Developers Conference (WWDC), Apple introduced Apple Intelligence as a core component of its upcoming software updates. The company described Apple Intelligence as a personal assistant deeply integrated into the experience across its ecosystem, bringing a suite of AI tools for crafting messages, creating visual content, and automating routine tasks.
Apple has started rolling out Apple Intelligence features to some users in the iOS and iPadOS 18.1 developer betas. However, the beta is available only on the iPhone 15 Pro and iPhone 15 Pro Max, and on iPads and Macs with M1 or later chips. To participate, users must join a waitlist after updating their devices to the latest beta versions of iOS, iPadOS, or macOS.
Meanwhile, the current beta of Apple Intelligence previews several features first showcased at Apple’s annual developers conference in June. The update includes a refreshed Siri interface that lights up the edges of the screen, an improved ability for Siri to understand commands even when the speaker stumbles over their words, and answers to troubleshooting questions about Apple products. There are also improvements to photo search and movie creation, AI-generated summaries for Mail, Messages, and voicemail transcriptions, and a new text-generation service called Writing Tools.
However, some features shown in June are still missing from the current preview and are expected to arrive by next year.