Apple’s much-maligned Siri voice assistant is getting a huge infusion of artificial intelligence. It’s part of a broad push to bring AI features to a wide range of iPhone, Mac and iPad applications.
“Thanks to the capabilities of Apple Intelligence, this year marks the start of a new era for Siri,” said Kelsey Peterson, Apple’s director of Machine Learning and AI, during Monday’s WWDC24 keynote.
Apple and Siri go all-in on AI
Microsoft and Google rushed artificial intelligence products to users in recent months, but Apple was slow to embrace the new trend. Monday’s keynote address for WWDC24 made it clear that’s now changed.
Apple’s new AI efforts focus on typical users. An average iPhone user doesn’t need AI to generate a heist movie starring puppets, so that’s not part of Siri. Instead, the voice-driven system will be better at understanding what users want and at performing complex everyday tasks. Apple calls it “AI for the rest of us.”
There’s no better example than Siri’s visual redesign: instead of a circle at the bottom of the display, the entire outer edge of the screen now lights up to show Siri is active.
An actual digital assistant
The old Siri could handle “Turn on the kitchen light.” The new Siri is up to “Send the photos from the cookout on Saturday to Malia.” This means the AI is capable of identifying which pictures in the Photos app are from a cookout held on Saturday, who Malia is, and how to send her the images.
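While the keynote didn’t dig into developer details, Apple’s existing App Intents framework is how third-party apps expose actions like this to Siri. As a minimal hypothetical sketch in Swift — the intent name, parameters and dialog below are invented for illustration, not Apple’s actual code — a photo app might declare something like:

```swift
import AppIntents

// Hypothetical example: a photo app exposing a "send photos" action
// that Siri could invoke. All names here are invented for illustration.
struct SendPhotosIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Photos"

    @Parameter(title: "Event")
    var event: String

    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would query its photo library for shots matching
        // the event, resolve the recipient in Contacts, and hand the
        // images to Messages. That logic is stubbed out here.
        return .result(dialog: "Sending photos from \(event) to \(recipient).")
    }
}
```

In this model, Siri’s job is to pull the parameters — which event, which person — out of a natural-language request and route them to the right action.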
It can also maintain context from one request to the next. As a simple example, it can handle “What time is it in San Francisco?” then “What’s the weather like there?”
Plus, the AI can respond to relatively vague requests, like “Play that podcast Jamie recommended.”

A highlight is the ability to access information in multiple applications to answer a question. In a demo during Monday’s WWDC24 keynote, Kelsey Peterson asked Siri, “When is Mom’s flight landing?” To get the answer, Siri needed to know who Mom is and what flight she’s on, then access real-time flight details to give an arrival time.
Staying (mostly) private
To protect the privacy of its customers, Apple handles as much of its new artificial intelligence processing as possible on the device. However, some tasks are beyond what an iPhone or even a Mac can handle. Those will be sent to a system Apple calls Private Cloud Compute.
“When you make a request, Apple Intelligence analyzes whether it can be processed on device. If it needs greater computational capacity, it can draw on Private Cloud Compute and send only the data that’s relevant to your task to be processed on Apple silicon servers. Your data is never stored or made accessible to Apple,” said Craig Federighi, Apple’s senior vice president of Software Engineering, during the WWDC24 keynote.
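Apple hasn’t published an API for this on-device-versus-cloud decision, but conceptually the routing Federighi described resembles the following Swift sketch. Every name here — RequestRouter, deviceCapacity, estimatedComplexity — is invented for illustration:

```swift
// Illustrative only: Apple exposes no such API. This sketches the
// decision Federighi described, with made-up names throughout.
enum ComputeTarget {
    case onDevice            // handled entirely by the local model
    case privateCloudCompute // sent to Apple silicon servers
}

struct RequestRouter {
    /// Rough measure of what the on-device model can handle.
    let deviceCapacity: Double

    func route(estimatedComplexity: Double) -> ComputeTarget {
        // Prefer local processing; escalate to Private Cloud Compute
        // only when a request exceeds on-device capacity. Per Apple,
        // only task-relevant data would be sent, and never stored.
        estimatedComplexity <= deviceCapacity ? .onDevice : .privateCloudCompute
    }
}
```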
Those who want capabilities beyond what Apple Intelligence offers can tie directly into the full OpenAI GPT feature set, which goes into areas Apple doesn’t, like video generation. Users will be explicitly asked whether they want to send a request outside Apple’s system.
All of Apple’s AI capabilities are free, as is access to the basic GPT functions. However, OpenAI charges for advanced features, and a way for users to pay for those is in development.
AI-powered Siri in iOS 18, macOS 15 and iPadOS 18
While Apple used Monday’s WWDC24 keynote address to unveil the new artificial intelligence features coming to Siri, the full launch is still months away. iOS 18, macOS 15 and iPadOS 18 with the new Siri built in will reach customers this autumn.
Those who just can’t wait can sign up for the Apple Beta Software program. It’s free, but also guarantees buggy software.
Just be aware that the OpenAI tie-in isn’t scheduled to be added until later.
Via: Apple