Here’s When Apple Plans to Unveil Its Biggest Apple Intelligence Features

Apple made a splash during its WWDC keynote last week when it announced Apple Intelligence, the company's official attempt to catch up on the flashy AI features most other tech companies already offer. But while Apple Intelligence may have generated the most headlines over the past week, many of its core features won't be available when you upgrade your iPhone, iPad, or Mac this fall.

According to Bloomberg's Mark Gurman, Apple is staggering the rollout of these long-awaited AI features, mainly because they simply aren't ready yet. Apple has been working to bring generative AI features to its products for more than a year, ever since the technology boomed in late 2022 thanks to ChatGPT. Many of these features are complex and will take more time to implement correctly.

That said, Apple could probably ship these features sooner and in larger batches if it wanted to, but there's a strategy here: by rolling out big AI features in limited numbers, Apple can root out major problems before adding more AI to the mix (after all, AI is prone to hallucination), and it can keep building out its cloud infrastructure without putting too much strain on the system. It also helps that the company has limited these features to a small pool of devices: the iPhone 15 Pro and 15 Pro Max (and likely the iPhone 16 line), as well as M-series Macs and iPads.

Apple Intelligence in 2024

If you installed the beta version of iOS 18 or macOS 15 right now, you might assume none of the Apple Intelligence features will be ready by fall. That's because Apple is holding these AI features back from beta testers until later this summer. With a public beta scheduled for July, it seems safe to assume Apple plans to let beta testers try Apple Intelligence next month, though we don't know for sure.

There are already some AI features in this first beta, even if they aren't strictly "Apple Intelligence" features: iOS 18 supports voice memo transcription and improved voicemail transcription, and it automatically solves equations as you type them. It's a limited experience, but since this is only the first beta, we'll see more features soon.

Apple is currently planning to include some flagship features in the first release of Apple Intelligence. These include summaries for web pages, voice memos, notes, and emails; AI-powered writing tools (such as rewriting and proofreading); and image generation, including AI-generated emoji, which Apple calls "Genmoji." You'll also get AI-generated notification summaries and see certain alerts first, based on what the AI thinks is most important.

Additionally, some new Siri updates will arrive with the first version of iOS 18. This fall, you should notice a new assistant interface, as well as a convenient new option to type your requests to Siri. But most of Siri's touted features won't be ready for some time. (More on that below.)

The timing of the ChatGPT integration is also a bit up in the air: it might not arrive with the first iOS 18 release in the fall, but Gurman believes it will arrive before the end of the year. For developers, Xcode's Swift Assist AI likely won't be released until later this year, either.

Apple Intelligence’s new Siri won’t arrive until 2025

The biggest delays appear to involve Siri's most prominent updates, many of which won't arrive on iOS and macOS until 2025. That includes contextual understanding and action: a prime example from the keynote had a presenter asking Siri when her mom's plane would land, and the digital assistant answering by pulling data from multiple apps. This kind of understanding, which could enable plenty of convenient actions without you having to spell out exactly what you want Siri to do, will take a little longer to bake.

Apple will also wait until next year to let Siri take actions inside apps using custom commands. Once that arrives, you'll be able to ask Siri to edit a photo and then add it to a message before sending it. Siri will finally feel like a smart assistant that can handle things for you across your iPhone, iPad, and Mac, but that will take time.

Siri also won't be able to analyze and understand what's happening on your screen until 2025. Next year, you'll be able to ask Siri a question based on what you're doing on your device, and the assistant will understand the context. If you're texting someone about going to see Inside Out 2, you could ask Siri, "When is it playing?" and Siri should analyze the conversation and return showtimes in your area.

Finally, Apple Intelligence will remain English-only until at least next year, as Apple needs more time to train its AI on other languages. As with the other AI features, though, it makes sense to wait until this one is fully ready.

AI may be the tech industry's current obsession, but big AI features often launch to disastrous results. (Just look at Google's AI Overviews or Microsoft's Recall feature.) The more time Apple gives itself to get the technology right, the better. In the meantime, we can enjoy the new features that are already available.

