Your iPhone will soon understand what you’re saying from your facial expressions
Apple’s acquisition of AI startup Q.ai could allow future iPhones to understand whispered or mouthed words by reading subtle facial movements. Here’s the full story.

Apple may be getting closer to a future where your iPhone won’t just listen to you, but will also understand what you’re trying to say by looking at your face. The company has acquired Q.ai, an Israeli artificial intelligence startup that works on advanced audio and imaging technologies. Apple confirmed the acquisition on Thursday but did not share financial details. However, people familiar with the deal told Reuters that Q.ai was valued at about $1.6 billion, while another report put the figure closer to $2 billion, making it Apple’s second-largest acquisition after Beats.
Q.ai specializes in using machine learning to improve the way devices handle sound, especially in difficult conditions. The startup is working on technology that helps devices understand whispered speech and clean up audio in noisy environments. But what has drawn the most attention is Q.ai's work on detecting small movements in facial skin and interpreting words that are mouthed or spoken softly.
Last year, the company filed a patent describing how these “micro-movements of facial skin” could be used to read speech, identify a person, and even predict emotional state, heart rate, and breathing patterns. In simple terms, this could allow future Apple devices to detect what you are saying, even when your voice is barely audible, or possibly not heard at all.
Apple hasn’t said how it plans to use Q.ai’s technology. Still, the acquisition fits squarely with the direction the company is heading. Over the past year, Apple has steadily added AI-powered features to its products, especially in audio. Its AirPods already support live language translation, and Apple is exploring ways for the devices to adapt intelligently to real-world sound conditions.
This technology could play a role beyond the iPhone, too. Apple is developing systems that detect subtle facial muscle activity, something that could improve experiences on devices like the Vision Pro headset, where hands-free conversations are a priority.
About 100 of Q.ai’s employees, including CEO Aviad Maizels and co-founders Yonatan Wexler and Avi Baralia, will join Apple. For Maizels, this is familiar territory. In 2013 he sold his previous company, PrimeSense, to Apple. That acquisition later helped Apple move away from fingerprint sensors and create the facial recognition system now used in iPhones.
In a statement, Maizels said joining Apple opens up new possibilities for expanding Q.ai’s work and bringing it to a much broader audience. Johny Srouji, Apple’s hardware chief, described Q.ai as a company doing creative work at the intersection of imaging and machine learning and said Apple is excited about what’s next.
The deal also highlights how intense competition has become among big tech companies. Apple, Meta and Google are all racing to lead the next phase of AI, with a focus on hardware that feels more natural and intuitive to use. For Apple, gaining stronger control over audio and perception technologies could give it a quiet but meaningful advantage.