‘Apple Intelligence’ Revealed: 12 AI Features Coming To iPhones, Macs And iPads
Apple’s much-anticipated upcoming suite of AI features for select iPhones, Macs and iPads is called Apple Intelligence, and it will include an enhanced Siri experience, systemwide Writing Tools, message and notification prioritization, call transcription and summarization, and ChatGPT integration.
Apple Monday revealed Apple Intelligence, its much-anticipated suite of AI features that will enhance Siri’s natural language understanding and integration with apps, among several other features, to deliver what it’s calling a “brand-new standard for privacy in AI.”
Unveiled at the tech giant’s WWDC 2024 event, the suite of AI features will include integration with ChatGPT and is expected to arrive this fall in beta for Macs and iPad Pros with an M1 chip or later and the iPhone 15 Pro series as part of the upcoming MacOS Sequoia, iOS 18 and iPadOS 18 operating systems.
[Related: Apple Calls AI PC Rivals Laggards With M4-Based iPad Pro Reveal]
Tim Cook, Apple’s CEO, called Apple Intelligence “the new personal intelligence system that makes your most personal products even more useful and delightful.”
“We think Apple Intelligence is going to be indispensable to the products that already play such an integral role in our lives,” he said in Apple’s WWDC keynote.
The Cupertino, Calif.-based company made the announcement as vendors in the Windows-based PC market start to push a second generation of AI PCs—powered by Qualcomm’s Snapdragon X chips at launch with support to come later from Intel and AMD—that come with new Copilot+ capabilities from Microsoft.
Apple dedicated more than 40 minutes of its keynote to explaining Apple Intelligence’s capabilities, which include systemwide Writing Tools, message and notification prioritization, call transcription and summarization, image generation, and an upgraded Siri that comes with enhanced language understanding and system integration.
All these capabilities are based around the idea of combining generative AI capabilities with a “user’s personal context to deliver truly helpful intelligence,” according to Cook.
To deliver these capabilities while promising an unprecedented level of privacy, Apple is relying on in-house generative AI models that run on the device’s processor—the M-series chips for Macs and iPad Pros, the A17 for the iPhone 15 Pro series—and on Apple silicon running in the company’s new Private Cloud Compute infrastructure.
What follows are 12 features coming with Apple Intelligence this fall.
‘A New Era For Siri’
Thanks to generative models Apple developed for Apple Intelligence, the company’s 13-year-old Siri voice assistant is getting a substantial update in natural language understanding and systemwide integration. The assistant will also accept text commands when the new generative AI features debut this fall.
“We can make Siri more natural, more contextually relevant, and, of course, more personal to you,” said Kelsey Peterson, Apple’s director of machine learning and AI.
The boost in natural language understanding means Siri can accurately follow a user’s voice commands, even if the user stumbles and corrects themselves midway, according to Apple. It can also understand the context of commands and use information from previous conversations to extrapolate details.
For instance, in a demonstration during Apple’s keynote, Peterson used Siri to ask for tomorrow’s weather at Muir Woods National Monument. After Siri provided the forecast, Peterson then asked the AI assistant to “create an event for a hike there tomorrow at 9 a.m.,” and Siri knew she was referring to Muir Woods.
Apple plans to integrate Siri more deeply within the operating systems of the iPhone, iPad and Mac too, enabling the AI assistant to “take hundreds of new actions in and across Apple and third-party apps,” according to the company.
The new version of Siri will also have on-screen awareness, giving it the ability to understand commands in the context of an app that’s open. Apple indicated that not all apps may be supported at launch, but the list of supported apps will expand over time.
In addition, Siri will have the ability to provide tailored responses using personal information within apps. For instance, Apple said Siri could automatically locate and play an episode of a podcast after the user requested the AI assistant to “play that podcast that Jamie recommended” without specifying if this was in a text message or email.
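For context on how such app actions could be exposed, Apple’s existing App Intents framework is already the mechanism by which third-party apps surface actions to Siri and Shortcuts, and the expanded Siri actions are expected to build on it. Below is a minimal, hypothetical sketch of a third-party intent; the app, intent and parameter names are illustrative assumptions, not anything Apple announced:

```swift
import AppIntents

// Hypothetical intent a third-party app might define so Siri can
// "take actions" inside it. Names here are illustrative only.
struct OpenProjectIntent: AppIntent {
    // The phrase Siri and Shortcuts display for this action.
    static var title: LocalizedStringResource = "Open Project"

    // A parameter Siri can fill in from the user's spoken request.
    @Parameter(title: "Project Name")
    var projectName: String

    // Runs when Siri invokes the action; a real app would
    // navigate to the named project here.
    func perform() async throws -> some IntentResult & ProvidesDialog {
        return .result(dialog: "Opening \(projectName)")
    }
}
```

Apps that adopt intents like this one are what would let the upgraded Siri act “in and across Apple and third-party apps,” as the company describes.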
Systemwide Writing Tools
Apple Intelligence will give users the ability to rewrite, summarize and proofread text nearly everywhere they write with the systemwide Writing Tools feature, which will work with Mail, Notes, Pages and an undisclosed number of third-party apps.
The feature can rewrite text using a different tone, giving users the choice of friendly, professional or concise, according to Apple.
With proofreading, Writing Tools can check “grammar, word choice and sentence structure while also suggesting edits—along with explanations of the edits—that users can review or quickly accept,” the company said.
The feature’s summarization capability allows text to be summed up in a paragraph, bulleted key points, a table or a list.
Priority Messages, Smart Reply, Priority Notifications And More
In the Mail app, Apple plans to introduce a feature called Priority Messages that will use Apple Intelligence to automatically prioritize emails based on their content. Priority emails could range in subject from event invitations to boarding passes.
The Mail app is getting other AI-enhanced improvements, such as automatic summaries that replace the standard excerpts viewed from the inbox. There’s also Smart Reply, which suggests possible responses and identifies questions in an email for the user to answer.
Systemwide, Apple Intelligence is enabling a feature called Priority Notifications, which, like Priority Messages, automatically prioritizes notifications based on importance and provides summaries of each notification. It will also include a Reduce Interruptions option as part of Apple’s Focus settings that surfaces only the most important notifications.
Apple is also adding the ability to automatically record, transcribe and summarize phone calls with the Notes and Phone apps.
These are all made possible by the enhanced natural language understanding and generation capabilities of Apple Intelligence.
Image Playground, Genmoji And Photos App Update
Apple Intelligence is introducing an image creation feature called Image Playground, which automatically generates images from a combination of text prompts, concept suggestions and photos from the user’s personal library.
The images are generated using on-device processing, and they can be created using three styles: animation, illustration or sketch.
Image Playground will be integrated into the Messages app, where it will suggest images to create based on the contents of the conversation. It will also be made available in other Apple apps such as Notes, Keynote, Freeform and Pages as well as third-party apps.
Apple Intelligence is also introducing a way to create custom emojis. Called Genmoji, the feature can automatically create a Genmoji based on a text description or photos of friends and family. Users can then share the Genmoji in Messages and other apps.
The Photos app is getting a boost from Apple Intelligence too, giving users the ability to search for photos and videos with natural language queries and remove distracting objects from the backgrounds of photos, among other things.
A new feature called Memories will give users the ability to craft stories from photos and videos based on text descriptions. Using language and image understanding, Apple Intelligence will stitch these visuals together, assemble them into chapters and export the presentation into a movie that has its own narrative arc.
Private Cloud Compute
Apple said it developed Private Cloud Compute to handle “more complex requests” for Apple Intelligence and said that it offers a more secure and privacy-minded alternative to other cloud computing solutions.
“Traditionally, servers can store your data without you realizing it and use it in ways you did not intend. And since server software is only accessible to its owners, even if a company says it’s not misusing your data, you’re unable to verify their claim or if it changes over time,” said Craig Federighi, Apple’s senior vice president of software engineering.
By contrast, Private Cloud Compute draws upon the “security properties of the Swift programming language,” performs requests only when Apple Intelligence determines it’s necessary, receives only the data relevant to a computationally complex request, never stores personal data or makes it accessible to Apple, and allows independent experts to inspect the servers’ software to verify the company’s privacy claims, Federighi said.
“In fact, Private Cloud Compute cryptographically ensures your iPhone, iPad and Mac will refuse to talk to a server unless its software has been publicly logged for inspection. This sets a brand-new standard for privacy in AI and unlocks intelligence you can trust,” he said.
ChatGPT Integration With Siri, Writing Tools
For Siri prompts that go beyond the knowledge base of Apple Intelligence, the company plans to let users send queries to ChatGPT as part of a planned integration.
Apple said users can make queries to ChatGPT using text, images and documents, but Siri will ask the user’s permission every time before the request goes through. The answers are then provided directly within the Siri chat interface.
The company plans to integrate ChatGPT into Writing Tools as well, giving users the ability to tap into the cloud-based AI assistant to generate text content. Users will also be able to use ChatGPT to create images within documents.
To protect user privacy, Apple said it will hide users’ IP addresses from ChatGPT, and OpenAI won’t store user requests. While users will have access to ChatGPT for free without creating an account, ChatGPT subscribers will be able to connect their accounts and access paid features through Siri and Writing Tools.
Apple’s ChatGPT integration will use OpenAI’s GPT-4o large language model.