Explained: What is Visual Intelligence and what will it do on the Apple iPhone 16 and iPhone 16 Pro
Apple has introduced a new AI-powered feature called Visual Intelligence, similar to Google Lens. The feature is triggered via the new Camera Control button on both the iPhone 16 and iPhone 16 Pro.

Apple has finally unveiled the iPhone 16 series in full. While the refreshed design is drawing attention this season, the AI-powered features are generating the most discussion. During the event, the company highlighted a new feature called Visual Intelligence. It had long been known that the iPhone 16 series would be powered by AI, and at the launch held yesterday (September 9) Apple walked through every aspect of it. Visual Intelligence is one of the more interesting additions: it aims to help users learn something new simply by taking a picture, much like the well-established Google Lens.

What is Visual Intelligence?
Visual Intelligence is essentially a reverse image search engine combined with text recognition. If you want to identify something on the street, all you have to do is take a picture and your iPhone will tell you about it. In Apple's demo, a user pointed the camera at a restaurant, and Visual Intelligence surfaced the restaurant's hours and ratings, along with options to view the menu or make a reservation. In another example, a user who spots a flyer for an event can quickly add its title, time, date, and location to their calendar using this feature. According to Apple, iPhones use a combination of "on-device intelligence and Apple services that never store your images" to power Visual Intelligence.
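Apple has not explained how Visual Intelligence works internally, but the "text recognition" half of the flyer example can be roughly approximated with public iOS frameworks. The Swift sketch below is an illustration of the concept, not Apple's implementation: it uses the Vision framework to read the text on a flyer photo, NSDataDetector to find a date in that text, and EventKit to create a calendar entry. The function name addFlyerEventToCalendar and the assumed one-hour event duration are made up for this example.

```swift
import Vision
import EventKit
import UIKit

// Rough sketch: read a flyer photo, find a date in its text, save a calendar event.
// This only illustrates the concept; it is not how Visual Intelligence itself works.
func addFlyerEventToCalendar(from image: UIImage, fallbackTitle: String = "Event from flyer") {
    guard let cgImage = image.cgImage else { return }

    // 1. Recognize the text printed on the flyer.
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else { return }

        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        let fullText = lines.joined(separator: "\n")

        // 2. Look for a date/time anywhere in the recognized text.
        let detector = try? NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
        let range = NSRange(fullText.startIndex..., in: fullText)
        let eventDate = detector?.firstMatch(in: fullText, options: [], range: range)?.date

        // 3. Save a calendar event, using the first recognized line as a rough title.
        let store = EKEventStore()
        store.requestWriteOnlyAccessToEvents { granted, _ in
            guard granted, let start = eventDate else { return }
            let event = EKEvent(eventStore: store)
            event.title = lines.first ?? fallbackTitle
            event.startDate = start
            event.endDate = start.addingTimeInterval(60 * 60) // assume a one-hour event
            event.calendar = store.defaultCalendarForNewEvents
            try? store.save(event, span: .thisEvent)
        }
    }
    request.recognitionLevel = .accurate

    // Run the text-recognition request against the flyer image.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

In practice, a production version would also handle permission denials, ambiguous or missing dates, and multi-line titles; the point here is only to show how the pieces of the "photo to calendar" flow fit together.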
How to activate Visual Intelligence?
Remember the leaks about a new camera button on the iPhone 16 series? It seemed odd for Apple to add a button with no capabilities beyond the camera, but during the event the company revealed there is more to it. Users can activate Visual Intelligence using this new touch-sensitive button on the right side of the device. Also, it is not called the camera button but Camera Control. With a single click, Visual Intelligence can identify objects, provide information, and take actions based on what you point the camera at.

Apple hasn’t announced exactly when the feature will arrive. However, according to the press release, “later this year, Camera Control will unlock visual intelligence,” helping users learn about objects and locations faster than ever before.
Camera Control
Camera Control lets users quickly launch the camera, take a photo, and start recording video so they don’t miss the moment. A new camera preview helps users frame the shot, and sliding a finger along Camera Control adjusts options such as zoom, exposure, or depth of field to create a great photo or video. Additionally, developers will be able to bring Camera Control to third-party apps such as Snapchat, as sketched below.
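Apple has said third-party camera apps will adopt Camera Control through new capture-controls additions to AVFoundation. The Swift sketch below is a rough outline of what that adoption might look like, assuming the iOS 18-era APIs (supportsControls, canAddControl, addControl, AVCaptureSystemZoomSlider) behave as publicly documented; the CameraController class and its structure are illustrative and not code from Apple or Snapchat.

```swift
import AVFoundation

// Minimal sketch of a third-party app wiring the Camera Control button
// into its capture session. Session configuration and error handling are
// trimmed; treat this as an outline, not a drop-in implementation.
final class CameraController {
    private let session = AVCaptureSession()

    func configure(with device: AVCaptureDevice) throws {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        // Attach the camera input as usual.
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        // Only devices with the Camera Control hardware support controls,
        // so check before adding any.
        guard session.supportsControls else { return }

        // A system-provided slider that maps the button's slide gesture
        // to the camera's zoom factor.
        let zoomControl = AVCaptureSystemZoomSlider(device: device)
        if session.canAddControl(zoomControl) {
            session.addControl(zoomControl)
        }
    }

    func start() {
        // Capture sessions should be started off the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }
}
```

The appeal of this approach for developers is that the system control handles the hardware gesture itself; the app only declares which camera parameter the button should drive.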