What is Apple “Visual Intelligence”?

Everything you need to know about Apple Visual Intelligence on iPhone 16 and iPhone 16 Pro

by Dave Johnson

Our iPhones are with us everywhere we go, and Apple has given us the ability to instantly learn about everything we see. Apple’s new “Visual Intelligence” is just a click away with the iPhone 16’s new Camera Control.

A person using Visual Intelligence from Apple’s iPhone 16 announcement video. The person is holding their iPhone in front of a restaurant and looking at what’s shown onscreen.

What is Apple “Visual Intelligence”?

Apple’s “Visual Intelligence” is an iPhone 16 feature that lets users scan the world around them through their device’s camera to identify a dog breed, copy event details off a poster, or look up just about anything they see.

How does Apple “Visual Intelligence” work?

To use “Visual Intelligence” on iPhone 16, users simply click and hold Camera Control to pull up the hours or ratings for a restaurant they pass, add an event from a flyer to their calendar, quickly identify a dog by breed, and more. Camera Control will also serve as a gateway into third-party tools with specific domain expertise, like when users want to search on Google to find where they can buy an item, or to benefit from ChatGPT’s problem-solving skills. Users are in control of when third-party tools are used and what information is shared.

An arrow points to the location of the Camera Control on the side of iPhone.  

Let’s look at some practical examples of how to use Visual Intelligence on iPhone 16 and iPhone 16 Pro:

Suppose you’re out for a stroll and you stumble upon a restaurant you haven’t been to before. Just click and hold the Camera Control and point your iPhone at it. Boom! Your iPhone instantly pulls up restaurant hours, ratings, and quick options to check out the menu or make a reservation. And you can learn more with just a tap. Awesome.
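For the technically curious: Apple hasn’t published a developer API for Visual Intelligence, but this kind of place lookup can be approximated with Apple’s public MapKit framework. Here’s a rough Swift sketch (the function name, search radius, and printed fields are just illustrative, not Apple’s implementation):

```swift
import MapKit

// Illustrative sketch: look up a nearby restaurant by name with MapKit,
// roughly the kind of place lookup Visual Intelligence surfaces.
func lookUpRestaurant(named name: String, near coordinate: CLLocationCoordinate2D) {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = name
    request.resultTypes = .pointOfInterest
    request.region = MKCoordinateRegion(
        center: coordinate,
        latitudinalMeters: 500,
        longitudinalMeters: 500
    )

    MKLocalSearch(request: request).start { response, error in
        guard let item = response?.mapItems.first else {
            print("No match found: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        // MKMapItem exposes details similar to what the feature shows.
        print("Name:", item.name ?? "n/a")
        print("Phone:", item.phoneNumber ?? "n/a")
        print("Website:", item.url?.absoluteString ?? "n/a")
    }
}
```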

And say you come across a flyer for an interesting event. Just click and you can add it to your calendar. Details such as the event title, time and date, and location are entered for you automatically.
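Developers can sketch the same basic pipeline with Apple’s public frameworks: Vision for text recognition, NSDataDetector to find the date, and EventKit to save the event. This is not Apple’s actual pipeline, just a minimal Swift illustration (the one-hour duration and function name are assumptions):

```swift
import Vision
import EventKit

// Rough sketch: OCR a flyer photo, detect a date, and create a calendar event.
func addEvent(fromFlyer cgImage: CGImage, title: String) throws {
    // 1. Recognize the text in the image with Vision.
    let textRequest = VNRecognizeTextRequest()
    textRequest.recognitionLevel = .accurate
    try VNImageRequestHandler(cgImage: cgImage).perform([textRequest])
    let lines = textRequest.results?.compactMap { $0.topCandidates(1).first?.string } ?? []
    let fullText = lines.joined(separator: "\n")

    // 2. Pull the first date mentioned anywhere in the recognized text.
    let detector = try NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
    let range = NSRange(fullText.startIndex..., in: fullText)
    guard let date = detector.firstMatch(in: fullText, range: range)?.date else { return }

    // 3. Save a one-hour event to the default calendar (requires calendar permission).
    let store = EKEventStore()
    let event = EKEvent(eventStore: store)
    event.title = title
    event.startDate = date
    event.endDate = date.addingTimeInterval(3600)
    event.calendar = store.defaultCalendarForNewEvents
    try store.save(event, span: .thisEvent)
}
```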

And say you see a cute little puppy at the park and are wondering, what kind of dog is that? Click and now you know.
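Apple hasn’t said which model powers this on device, but Vision’s built-in image classifier gives developers a feel for how that kind of identification works. A minimal, purely illustrative Swift sketch (the confidence threshold is an assumption):

```swift
import Vision

// Illustrative sketch: classify an image with Vision's built-in classifier
// and print the top labels, e.g. a likely dog breed.
func classifyAnimal(in cgImage: CGImage) throws {
    let request = VNClassifyImageRequest()
    try VNImageRequestHandler(cgImage: cgImage).perform([request])

    // Keep only reasonably confident labels and show the top three.
    let topMatches = (request.results ?? [])
        .filter { $0.confidence > 0.3 }
        .prefix(3)
    for observation in topMatches {
        print(observation.identifier, observation.confidence)
    }
}
```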

All of this is done privately using a combination of on-device intelligence and Apple services that never store your images.

And there’s more.

The Camera Control is also your gateway to third-party tools, making it super fast and easy to tap into their specific domain expertise. So if you come across a bike that looks exactly like the kind you’re in the market for, just tap to search Google for where you can buy something similar.
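As a trivial illustration of that hand-off (not Apple’s code), a developer could do something similar by building a Google search URL from a recognized product description and opening it:

```swift
import UIKit

// Hypothetical helper: hand a recognized product description off to a
// Google search in the browser.
func searchGoogle(for query: String) {
    var components = URLComponents(string: "https://www.google.com/search")!
    components.queryItems = [URLQueryItem(name: "q", value: query)]
    guard let url = components.url else { return }
    UIApplication.shared.open(url)
}

// Usage: searchGoogle(for: "steel frame commuter bike")
```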

And if you’re studying notes from your college lecture and get stuck on a particular concept, just use the Camera Control to ask ChatGPT for guidance. Nice.
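The built-in integration handles the ChatGPT hand-off for you. If you’re curious what a ChatGPT request looks like programmatically, here’s a hedged Swift sketch against OpenAI’s public Chat Completions API; the model name is an assumption, and the details of Apple’s own integration aren’t public:

```swift
import Foundation

// Sketch: ask ChatGPT a question via OpenAI's public Chat Completions API.
func askChatGPT(question: String, apiKey: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    // Assumed model name; swap in whichever model your account can access.
    let payload: [String: Any] = [
        "model": "gpt-4o-mini",
        "messages": [["role": "user", "content": question]]
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: payload)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Pull the assistant's reply out of the JSON response.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let message = choices?.first?["message"] as? [String: Any]
    return message?["content"] as? String ?? ""
}
```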

Of course, you’re always in control of when third-party tools are used and what information is shared.

So that’s Visual Intelligence, enabled by the new Camera Control on iPhone 16, helping you learn about your surroundings and get answers to your questions faster than ever before. It’s coming to Camera Control later this year.


Author: Dave Johnson

Dave Johnson is a tech writer at iGeekCentral covering news, how-tos, and user guides. Dave grew up in New Jersey before entering the Air Force to operate satellites, teach space operations, and do space launch planning. He then spent eight years as a content lead on the Windows team at Microsoft. As a photographer, Dave has photographed wolves in their natural environment; he's also a scuba instructor and co-host of several podcasts. Dave is a longtime Mac user and has contributed to many sites and publications including CNET, Forbes, PC World, How To Geek, and Insider.
