All the AI Features Apple Plans to Introduce This Year
While the rest of the tech world races to add as many artificial intelligence features as possible to every product and service imaginable, Apple has stayed quiet. In the eighteen months since OpenAI changed the game with the launch of ChatGPT, Apple has yet to bring any significant AI functionality to the iPhone, iPad, or Mac, even as Microsoft, Google, and Meta show no sign of slowing down.
But if rumors are to be believed, that will change this year, as Apple is expected to unveil new AI features for iOS 18 and macOS 15 at WWDC in June. Rumors about the trillion-dollar company’s artificial intelligence plans have been swirling for months, and the hints keep coming. According to Bloomberg’s Mark Gurman, who has a solid track record of covering Apple rumors, the company is planning an approach to artificial intelligence that isn’t as flashy as some rival efforts. Instead, Apple will introduce artificial intelligence features that integrate with apps iPhone and Mac users already know and use, such as Photos, Notes, and Safari. The initiative is known as Project Greymatter.
As an artificial intelligence skeptic, I like this plan. It lets Apple enter the AI space with features people can actually use, rather than spending resources on “revolutionary” AI capabilities that most users will ignore once the novelty wears off.
How Apple plans to implement its artificial intelligence features
Whatever Apple announces this year, its AI features will need to be backed by… something. Chances are your iPhone or Mac already has an NPU (neural processing unit), a piece of hardware designed specifically to handle artificial intelligence tasks. (Apple already ships some AI features, including Live Text, that use the NPU for processing.) The company put heavy emphasis on the NPU in the M4 chip that just debuted in the new iPad Pro, and chips like it in future iPhones and Macs will likely be the basis for many of Apple’s AI features.
However, not all features are ideal for on-device processing, especially on older iPhones and Macs. Gurman predicts that most of Apple’s AI features will be able to run locally on devices released in the last year or so. If your iPhone, iPad, or Mac is fairly new, the hardware should be up to the task. For older devices, or for features that are particularly power hungry, Apple plans to offload that processing to the cloud.
Apple has reportedly been in talks with OpenAI and Google to lease those companies’ cloud AI systems to run some of its new features, but it’s unclear if or when those deals will materialize. Gurman says Apple plans to run some cloud functionality on its own server farms built around M2 Ultra chips. If Apple can handle its cloud processing on its own, I’m sure it would prefer that to signing a deal with a competitor. We’ll likely see Apple’s grand AI plan come to fruition at WWDC.
Speaking of plans, here are the AI features Apple is rumored to reveal in June:
Generative AI emoji
Emoji are a big part of any iOS or macOS update (we may have already seen a few of the new emoji coming in iOS 18 and macOS 15). But Gurman suggests Apple is working on a feature that would use generative AI to create unique emoji based on what you’re currently typing. Done well, that sounds genuinely fun: there are already a ton of emoji to choose from, but if none matches your specific mood, one generated from what you’re actively talking about with a friend might be exactly what you need. Android users have had Emoji Kitchen for several years now, which lets you combine certain emoji to create something completely new; Apple seems ready to offer its own take on the idea.
AI-powered Siri
I don’t know about you, but I’ve never been crazy about Siri. The assistant often fails to fulfill my requests, either misunderstanding me or simply ignoring me. For some reason it’s especially bad on macOS, to the point that I don’t bother using Siri on my MacBook at all. If Apple can infuse Siri with artificial intelligence, or at least make it reliable, that would be a win in my book.
Earlier this month, we learned that Apple may integrate its AI work into Siri, based on information leaked to The New York Times by an unnamed source. A recent report from Gurman says Apple plans to make interacting with Siri “more natural,” and while it’s possible the company will outsource some of that work to Google or OpenAI, Gurman says Apple wants to use its own large language models (LLMs). A dedicated AI Siri may even be in development for the Apple Watch.
This doesn’t mean Apple is turning Siri into a chatbot, at least not according to Gurman, who reports that Apple wants to find a partner that can supply a chatbot for its platforms by WWDC, without Apple having to build one itself. For now, the company appears to be leaning toward OpenAI over Google, which is why we could see ChatGPT on the iPhone this year.
Smart Search in Safari
Earlier this month, we learned that Apple was planning at least one AI feature for Safari: Smart Search. The feature reportedly scans a web page, highlights key words and phrases, and uses AI to generate a summary of the site. While Gurman says Apple is working to improve Safari’s web search, there’s a lot we don’t yet know about how Smart Search will work.
AI in Spotlight Search
Speaking of improving search, Apple could use artificial intelligence to make on-device Spotlight search more useful. Spotlight has always given me mixed results, so I’d love for generative AI to not only speed up my searches but also return more relevant ones.
AI-powered accessibility features
Apple recently announced several new accessibility features coming “later this year,” which almost certainly means “coming in iOS 18 and macOS 15.” While not all of these features are powered by artificial intelligence, at least two of them stand out. First, Apple is using AI to let users control their iPhone and iPad with just their eyes, no external hardware required. There’s also a “Listen for Atypical Speech” feature that uses on-device AI to learn and recognize your specific speech patterns.
Automatic replies to messages
As of last year, iOS and macOS suggest words and phrases as you type to help you finish sentences faster. (Not to be confused with the three predictive-text options in Messages that have been around for years.) In iOS 18 and macOS 15, Apple may also introduce automatic replies: when you receive an email or text message, the app could suggest a complete response based on the message you’re replying to. Soon we’ll all be able to communicate with the click of a button. (Hmm, do I tap “Sounds good: meet me there” or “Sorry, I can’t. Rain check!”)
Voice Memos transcription
This would be a great use of AI: Apple already transcribes voice messages sent in the Messages app (pretty quickly, in my experience), so using AI to transcribe the recordings you make in Voice Memos is a natural next step.
Smart summaries
Gurman says one of Apple’s main focuses is “smart summaries”: AI-generated summaries of the information you may have missed while away from your iPhone, iPad, or Mac, including notifications, messages, web pages, articles, notes, and more. iOS already has a notification-summary feature, but these smart summaries sound more sophisticated.
Photo retouching
We don’t know much about this feature, but Gurman also says Apple is working on ways to use artificial intelligence to retouch photos. The company has already built an AI-powered image editor that takes natural-language prompts to perform edits; it’s possible some of that work will surface as an AI “photo enhancer” in the Photos app on iOS and macOS.