iOS 18.2 brings visual intelligence to iPhone 16, 16 Pro: What it is and how it works
Apple’s iOS 18.2 is rolling out with several Apple Intelligence updates, and Visual Intelligence is one of the most exciting features of the release. Let’s find out what it is and how it works.

Apple has finally rolled out the second phase of Apple Intelligence with the iOS 18.2 update. The update brings much-awaited AI-powered features, including Image Playground, integrated ChatGPT, and more. While those features are shared by the iPhone 15 Pro models and the iPhone 16 series, Visual Intelligence is exclusive to the latest iPhones. It helps users quickly learn about objects and places, thanks to the new Camera Control button on the iPhone 16 lineup.
With Visual Intelligence, you can summarize and copy text, translate it between languages, and pick out phone numbers and email addresses with the option to add them to your contacts, among other actions. Let’s look at this feature in detail.
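Apple hasn’t published how these text actions work internally, but the kind of detection involved can be sketched with Foundation’s NSDataDetector, which finds phone numbers, links/email addresses, and dates in plain text. The sample text below is purely illustrative.

```swift
import Foundation

// Minimal sketch: given text already extracted from a photo, find phone
// numbers, links/emails, and dates so actions (call, email, add to
// calendar/contacts) can be offered. The text is a made-up example.
let extractedText = """
Dinner Pop-Up: Friday, Jan 17 at 7 PM
Reservations: +1 (555) 010-0199 or events@example.com
"""

let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link, .date]
let detector = try! NSDataDetector(types: types.rawValue)
let range = NSRange(extractedText.startIndex..., in: extractedText)

detector.enumerateMatches(in: extractedText, options: [], range: range) { match, _, _ in
    guard let match else { return }
    switch match.resultType {
    case .phoneNumber:
        print("Phone:", match.phoneNumber ?? "")
    case .link:
        print("Link/email:", match.url?.absoluteString ?? "")
    case .date:
        print("Date:", match.date.map { String(describing: $0) } ?? "")
    default:
        break
    }
}
```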
Visual Intelligence: What is it?
Apple’s Visual Intelligence feature is a new tool that combines image recognition and text extraction, allowing users to identify objects, extract information, and take action with just a photo. Whether it’s identifying a restaurant, pulling event details from a flyer, or recognizing objects on the street, Visual Intelligence uses a combination of on-device intelligence and Apple services to surface relevant information, while ensuring that images are never stored.
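Apple doesn’t document the pipeline behind Visual Intelligence, but the on-device text-extraction part of the idea can be approximated with Apple’s Vision framework. The following is a rough sketch, not Apple’s implementation; the function name is made up for illustration.

```swift
import Vision
import CoreGraphics

// Rough sketch of on-device text extraction using the Vision framework.
// Nothing here leaves the device; the image is processed locally.
func extractText(from cgImage: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate          // favor accuracy over speed
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])                // runs entirely on-device

    // Each observation carries ranked candidate strings; keep the best one.
    return (request.results ?? []).compactMap { $0.topCandidates(1).first?.string }
}
```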

Camera Control also integrates with Google Search, so you can find out where to buy an item, and with ChatGPT, whose problem-solving expertise can help decipher complex material such as class notes. The best part is that you control when third-party tools are used and what information is shared with them, keeping your privacy and security intact.
Visual Intelligence: How does it work?
Until now, the Camera Control button has mainly served as a shortcut to launch the Camera app, and even its extra controls were focused on taking photos. With iOS 18.2 installed, as Apple previewed earlier this year, the same button also unlocks Visual Intelligence on the iPhone 16. With a click, Visual Intelligence can identify objects, provide information, and offer actions based on whatever you point the camera at.
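Apple doesn’t expose Visual Intelligence itself as an API, but the “identify what the camera is pointed at” step can be approximated on-device with Vision’s built-in image classification. This is only a sketch; the function name and confidence threshold are illustrative choices, not Apple’s method.

```swift
import Vision
import CoreGraphics

// Illustrative only: classify a captured frame on-device and keep the most
// confident labels, roughly what "identify this object" boils down to.
func identifyObjects(in cgImage: CGImage, minimumConfidence: Float = 0.3) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    return (request.results ?? [])
        .filter { $0.confidence >= minimumConfidence }
        .map { "\($0.identifier) (\(Int($0.confidence * 100))%)" }
}
```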

Additionally, since iOS 18.2 also brings ChatGPT integration, you can use Visual Intelligence with it. How? Point your iPhone camera at an object in front of you, then click and hold the Camera Control button. Tap the Ask button to ask ChatGPT about the item. And voilà! Once the answer appears, you can type or dictate a follow-up question in the text field at the bottom of the screen.
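Apple’s ChatGPT hand-off happens inside iOS and needs no setup beyond enabling the integration, but the underlying idea of asking a question about a photo can be sketched against OpenAI’s public Chat Completions API. The model name, prompt, and error handling below are assumptions for illustration, you would need your own API key, and this is not how Apple’s built-in integration works under the hood.

```swift
import Foundation

// Illustrative sketch: ask a vision-capable ChatGPT model a question about a
// JPEG via OpenAI's public Chat Completions API. Model name and prompt are
// assumptions; supply your own API key. This is not Apple's integration.
func askChatGPT(about jpegData: Data, question: String, apiKey: String) async throws -> String {
    let body: [String: Any] = [
        "model": "gpt-4o-mini",
        "messages": [[
            "role": "user",
            "content": [
                ["type": "text", "text": question],
                ["type": "image_url",
                 "image_url": ["url": "data:image/jpeg;base64,\(jpegData.base64EncodedString())"]]
            ]
        ]]
    ]

    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Pull the first reply's text out of the JSON response.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let message = choices?.first?["message"] as? [String: Any]
    return message?["content"] as? String ?? ""
}
```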
In practice, it works much like Google Lens. Visual Intelligence, however, layers on extras such as ChatGPT integration. Overall, it looks like a fun feature that can save plenty of time by letting you snap a picture instead of typing out a search.