Here’s when the best of Apple Intelligence arrives widely | Digital Trends

The Apple Intelligence rollout has been a staggered mix of delayed features and underwhelming perks. But the most promising set of AI tools that Apple revealed at WWDC earlier this year is right around the corner.

In the latest edition of his PowerOn newsletter, Bloomberg’s Mark Gurman writes that the iOS 18.2 update will start rolling out via the stable channel in the first week of December.

The aforementioned iteration of iOS is already in the beta testing phase and has introduced a trio of the most interesting AI upgrades destined for the iPhone. Among them is Siri’s integration with OpenAI’s ChatGPT, which seamlessly offloads complex queries to the chatbot.

Then there is Visual Intelligence, which lets users point their phone’s camera at the world around them and extract useful information about whatever appears in the frame. In its current iteration, it can accomplish tasks like identifying a dog breed or picking up details listed on a poster.

However, given the deep level of integration with OpenAI’s stack, it wouldn’t be surprising to see Visual Intelligence gain the same multimodal comprehension capabilities as the latest iteration of ChatGPT.


The iOS 18.2 beta has also introduced tricks like the standalone Image Playground app, which lets users create fun images based on their own pictures. The custom Genmoji system is also part of the update.

Gurman adds that with the iOS 18.4 update, Siri will finally attain the form that has fans excited for the virtual assistant’s true rebirth in the AI age. “It should let the digital assistant tap into people’s data and respond to queries based on the information on their screens,” he writes.


Right now, Google’s Gemini is able to parse local files, both text and media, and respond based on the information it has picked up from those files. The deep integration with Workspace is what allows the AI to take a glance at files in Google Drive and other key tools like Docs.

The NotebookLM tool is capable of doing the same, and can even make sense of YouTube videos and third-party webpage URLs without requiring a subscription.

In Apple’s case, Siri, supercharged by OpenAI’s tech stack, will be able to answer questions by pulling information from files and data stored locally on the phone. Imagine scenarios like pulling up travel information from email, scheduling details from calendar entries, and business data from PDFs and spreadsheets.

Visual Intelligence on iPhone.

However, the most utilitarian of those capabilities will be deeper integration with third-party apps, allowing the virtual assistant to execute tasks in those apps based solely on voice or text commands.

Notably, Apple will also let users tap into other third-party tools, such as Gemini, down the road. Moreover, Siri will be able to provide answers based on on-screen content.

iOS 18.4 is expected to arrive at some point in April next year. That’s the same window when Apple is expected to launch the next-gen iPhone SE, featuring a new design, faster silicon, and support for Apple Intelligence.
