Visual Intelligence is a new Apple Intelligence feature that lets you get information on things around you, add events to your calendar from a poster, look up products and more. The iPhone 16 lineup’s new Camera Control makes this powerful functionality instantly available at the press of a button.
After showcasing previously announced Apple Intelligence features, Craig Federighi, Apple’s senior vice president of software engineering, introduced Visual Intelligence during Monday’s “It’s Glowtime” event.
“We wanted to give you the ability to instantly learn about everything you see,” said Federighi.
Visual Intelligence will be available on all iPhone 16 models.
Visual Intelligence coming exclusively to iPhone 16 models

Apple unveiled Apple Intelligence, its suite of AI-powered features coming to iPhone, Mac and iPad, in June at WWDC24. The company applies this umbrella term to all of the next-generation machine learning features coming to Apple’s biggest platforms.
The latest addition, Visual Intelligence, lets you point your camera at something, press the Camera Control button, and take action on it:
- Capture a business storefront to look up hours, photos, reviews and more on Apple Maps.
- Point the camera at a poster to add the event details to your calendar.
- Point the camera at a dog to look up what breed it is.
- Look up a product at the press of a button.
- Capture a page of notes to ask ChatGPT for more information.
Visual Intelligence is powered by the Apple silicon in the iPhone itself. When a request needs more processing power, Private Cloud Compute provides it without compromising privacy. That means Apple cannot access the images you capture using Visual Intelligence.
Other Apple Intelligence features

The first Apple Intelligence features will come to the United States in English in October, including:
- Writing tools that help you proofread, change tone and summarize text.
- Notification summaries that give a brief overview of a busy text conversation.
- An image cleanup tool in Photos that lets you erase elements from a picture by circling them.
- A redesigned Siri that can understand you even if you stumble over your words.
Apple will roll out additional Apple Intelligence features in later updates:
- A more capable version of Siri will be able to see your screen and take action on your behalf inside the apps you use.
- Integration with ChatGPT will help Siri answer general knowledge questions.
- Contextual knowledge of personal information based on your conversations and data inside your apps will help Apple Intelligence and Siri understand your life.
- Image Playground is a tool in Messages, Notes and other apps for creating images.
- Genmoji lets you create custom emoji based on a description or based on a person in your Photos library.
- Swift Assist will help programmers in Xcode write functions or add features to their code.
Apple Intelligence will become available in localized English for Australia, Canada, New Zealand, South Africa and the United Kingdom in December. Next year, it will add support for Chinese, French, Japanese and Spanish, Apple says.
Apple Intelligence is only for the latest and greatest hardware
These advanced features come with steep system requirements. Only the iPhone 15 Pro and the new iPhone 16 models can run Apple Intelligence. On a Mac or iPad, you need an M-series chip; iPads with the A12Z or A14 chip and Macs with an Intel processor don’t make the cut.
Apple will make Visual Intelligence available only on the iPhone 16 and iPhone 16 Pro models.