Article Summary:
- Apple introduces “Apple Intelligence” at WWDC 24.
- “Apple Intelligence” includes AI algorithms integrated into iPhones, iPads, and Macs.
- Partnership with OpenAI integrates ChatGPT into Siri using GPT-4o.
- Apple Intelligence aims to be powerful, intuitive, integrated, personal, and private.
- Key functions:
- Language: text rewriting, idea condensing, and semantic reviews in apps like Mail, Notes, Safari, and Pages.
- Images: Generates images from scratch and personalizes them based on the recipient.
- Actions: Performs tasks like photo searches based on context and details.
- Personal context: understands user context and screen content for commands.
- Emphasis on security and privacy with functions generated from the device.
- Criticism for lack of originality; similar functions to existing systems like Samsung’s.
- A private cloud computing system ensures security for larger AI models using Apple processors and encrypted data.
- Questions remain about potential AI issues and data usage.
- iOS 18 release in September will enable these features on iPhone 15 Pro, iPads, and Macs with M1 chips or later.
- Limited to North American English initially, expanding to more languages by 2025.
- Apple’s collaboration with OpenAI provides significant financial support to OpenAI.
- ChatGPT integration into all Apple platforms, improving Siri’s capabilities.
Table of Contents
- What is Apple Intelligence?
- Integration with OpenAI
- Key Functions
- Security and Privacy
- Market Impact
- Is Apple OpenAI’s savior?
What is Apple Intelligence?
Apple Intelligence is a suite of sophisticated algorithms and AI systems designed to enhance user experience across Apple devices. It seamlessly integrates into various functionalities, making everyday tasks easier and more intuitive. This innovation encompasses text processing, image generation, action execution, and personal context understanding.
Apple’s commitment to artificial intelligence comes with one condition: never calling it that. The company calls it “Apple Intelligence,” and presented it at WWDC 24, its global developer conference.
Apple is never the first to adopt a new technology, nor the last; it tends to wait until the technology is relatively mature and it makes sense to integrate it into one of its products. That is exactly what happened at its developer conference, where the company showed a great deal of what it calls “Apple Intelligence.”
Apple Intelligence is a set of algorithms and artificial intelligence systems built into iPhones, iPads, and Macs that make some tasks easier.
Integration with OpenAI:
In addition, the partnership between Apple and OpenAI (Sam Altman was at WWDC) has paid off in an almost total integration of ChatGPT into Siri. Siri will be able to use the full capabilities of GPT-4o for any query, as long as you approve sending that data to an OpenAI server, which, don’t forget, will be used to improve their algorithms.
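Apple has not published the programming interface behind this handoff, so the Swift sketch below is only a rough illustration of the flow described above: a query is answered on the device when possible and is forwarded to an external model only after the user explicitly approves it. Every type and function name here (ExternalModelClient, OnDeviceAssistant, and so on) is hypothetical.

```swift
import Foundation

// Hypothetical sketch only: none of these types are Apple or OpenAI APIs.
protocol ExternalModelClient {
    func answer(_ query: String) async throws -> String   // e.g. a ChatGPT-backed service
}

struct OnDeviceAssistant {
    let fallback: any ExternalModelClient

    // Nothing is sent off the device unless the user explicitly approves it.
    func handle(_ query: String,
                canAnswerLocally: (String) -> Bool,
                requestUserApproval: (String) async -> Bool) async throws -> String {
        if canAnswerLocally(query) {
            return "Answered on device: \(query)"          // placeholder for the local model
        }
        guard await requestUserApproval(query) else {
            return "Not sent: the user declined to share this query."
        }
        // Only reached with explicit consent; the query now goes to the external server.
        return try await fallback.answer(query)
    }
}
```

The key design point, as Apple described it, is that the external call sits behind an explicit approval step rather than happening silently.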
This is a review of what Apple presented at WWDC 24, its annual developer conference, and of its implications.
Apple Intelligence: Apple brings generative artificial intelligence to millions of mobiles, tablets, and computers.
Apple’s commitment to artificial intelligence is clear and fully integrated, but also somewhat disappointing.
Key Functions:
Powerful, intuitive, integrated, personal, and private. These are the five words Tim Cook used in his presentation to define Apple Intelligence and what an AI should be for the end user, you and me.
These are the functions that Apple Intelligence will have that, I’m sure, will sound familiar to you.
- Language: Apple Intelligence will be able to rewrite text, condense ideas, and review text for semantic issues in typical Apple apps such as Mail, Notes, Safari, and Pages, as well as in third-party apps.
- Images: It will use generative AI to create images from scratch from prompts, integrated into Apple’s own applications such as Messages, which is widely used in countries like the US. It will also take into account who you are writing to in order to personalize those images.
- Actions: Apple Intelligence will be able to review all the content on your phone to carry out requests. For a request such as “show me all the photos of my mother with my brother and me,” Apple’s AI can put together a selection of photos in which those people appear, drawing on context about who is who, locations, details within the photos, and more (see the sketch after this list).
- Personal context: Perhaps the most important function is that Apple Intelligence can understand the personal context you are in. In the Messages example Apple showed, it knows who you are writing to and what it has learned about that person from your conversations. It can also read what is on screen and use that information to carry out your command.
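Apple has not said how these requests are represented internally, so, purely as an illustration of the idea, here is a minimal Swift sketch that models a request like “show me all the photos of my mother with my brother and me” as structured data matched against an on-device index. All of the types (PersonalContext, PhotoRecord) and the sample names are hypothetical.

```swift
import Foundation

// Hypothetical types for illustration; Apple has not published such an API.
struct PersonalContext {
    let relationships: [String: String]   // e.g. ["my mother": "Anna", "my brother": "Luis"]
    let ownerName: String
}

struct PhotoRecord {
    let fileName: String
    let peopleRecognized: Set<String>     // people already identified on-device
}

// Resolve relationship words into names, then keep photos containing all of them.
func selectPhotos(matching relationshipTerms: [String],
                  context: PersonalContext,
                  library: [PhotoRecord]) -> [PhotoRecord] {
    let wanted = Set(relationshipTerms.map { term in
        term == "me" ? context.ownerName : (context.relationships[term] ?? term)
    })
    return library.filter { wanted.isSubset(of: $0.peopleRecognized) }
}

// "Show me all the photos of my mother with my brother and me" (sample data is made up).
let context = PersonalContext(relationships: ["my mother": "Anna", "my brother": "Luis"],
                              ownerName: "Dan")
let library = [PhotoRecord(fileName: "beach.jpg", peopleRecognized: ["Anna", "Luis", "Dan"]),
               PhotoRecord(fileName: "office.jpg", peopleRecognized: ["Dan"])]
print(selectPhotos(matching: ["my mother", "my brother", "me"],
                   context: context, library: library).map(\.fileName))   // ["beach.jpg"]
```

The point of the sketch is simply that “personal context” means translating relationship words into the actual people and data the device already knows about before running the search.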
Security and Privacy:
As Apple usually does, it puts security front and center: all of these functions run directly on the device.
So far, Apple has shown little originality with the possibilities of AI: these are essentially the same functions that other systems, such as Samsung’s, already offer. That is not a problem specific to Apple, but rather a limitation of what generative AI can do on-device today.
Where gray areas between security and artificial intelligence start to appear at Apple is when larger models are needed. Apple has created a special server security system called Private Cloud Compute. It runs only on servers with Apple processors, and the company says independent experts will be able to inspect and verify that the software’s code is safe. All data will be encrypted.
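Apple has only described Private Cloud Compute at a high level, so the following Swift sketch shows the kind of client-side routing that description implies, not Apple’s actual implementation: small requests stay on the device, and larger ones are encrypted before anything is sent to a server. The types, the size threshold, and the key handling are assumptions made up for the example.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of the routing Apple describes; not a published Apple API.
enum InferenceRoute {
    case onDevice(prompt: String)
    case privateCloud(encryptedPayload: Data)
}

struct RequestRouter {
    let onDeviceLimit: Int          // assumed threshold for what the local model handles
    let transportKey: SymmetricKey  // stands in for a negotiated session key

    func route(_ prompt: String) throws -> InferenceRoute {
        if prompt.count <= onDeviceLimit {
            return .onDevice(prompt: prompt)                 // never leaves the device
        }
        // Encrypt before anything is sent, matching Apple's "all data encrypted" claim.
        let sealed = try AES.GCM.seal(Data(prompt.utf8), using: transportKey)
        return .privateCloud(encryptedPayload: sealed.combined!)  // nonce + ciphertext + tag
    }
}

// Example: a long prompt is too big for the device and takes the encrypted cloud path.
let router = RequestRouter(onDeviceLimit: 200, transportKey: SymmetricKey(size: .bits256))
if case .privateCloud(let payload) = try! router.route(String(repeating: "summarize this report. ",
                                                              count: 20)) {
    print("encrypted payload: \(payload.count) bytes")
}
```

In Apple’s description, the server side would additionally run only on Apple silicon with software that outside researchers can inspect; this sketch captures only the client-side rule that nothing leaves the device unencrypted.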
Market Impact:
There are still many doubts about Apple Intelligence, such as… Is it an AI that can hallucinate? Will it use your data to improve its responses, or to improve the responses of all users?
What is clear is that starting this September, when iOS 18 is released, the iPhone 15 Pro, and iPads and Macs with M1 chips or later, will handle much more information, finally giving those AI cores a real reason to exist.
One thing is clear from Apple’s announcement, and a lot of it has to do with the incredible marketing machinery the company has: many people are scared about the future of artificial intelligence because of the news they read and see on TV, but who could be scared of “intelligence”? Nobody. Apple, once again, turning the tables.
Remember that Apple Intelligence can only be used on the iPhone 15 Pro and 15 Pro Max, and on iPads and Macs with an M1 chip or later, running the next versions of their operating systems. In addition, it is limited to North American English until more languages arrive in 2025.
Is Apple OpenAI’s savior? ChatGPT Integrated—free—on all Apple devices
If OpenAI was looking for a permanent way to stay afloat without relying on venture capital investments or premium versions of ChatGPT, Apple has just come to save them for several years.
It was already known that Apple and OpenAI had signed a collaboration agreement and that ChatGPT would be integrated, in some way, into Apple devices, but what Apple has shown is basically what you would get by buying the company and integrating it deeply.
When Siri can’t answer, it will ask ChatGPT, with your prior approval, to answer for it. This is a demonstration that, although Siri will improve with the new versions of iOS, iPadOS, and macOS, it is still a long way from what OpenAI has achieved.
We can only guess at the millions Apple will be paying OpenAI for access to what will surely be many millions of queries a day to ChatGPT’s systems, which will run on GPT-4o.