Apple’s AI Features Could Change How an iPhone Is Used




Apple is set to host its “Let Loose” event on Tuesday, May 7, where it is expected to unveil new iPad Air and iPad Pro models and a new Apple Pencil. However, the Worldwide Developers Conference (WWDC) 2024 on June 10 could be the event to watch, as it may reshape the company's approach to its devices, especially the iPhone. The Cupertino tech giant is expected to unveil its artificial intelligence (AI) strategy and introduce new features with iOS 18. Based on papers published by Apple researchers, we can get a sense of the company's vision for these features.

A report from The Verge delved into research papers that Apple has recently released and highlighted that the company is focused on building a more efficient and intelligent Siri, the iPhone's virtual assistant. While this is a strong possibility, there are enough hints that AI features could completely reshape the way users interact with the iPhone. Although it is unlikely that all of the new capabilities will be rolled out at once, the changes could be phased in over the next few years.

Much of Apple's published research focuses on small language models (SLMs) that can run entirely on-device. For example, the company published a paper on an AI model called ReALM, which stands for Reference Resolution As Language Model. The model is described as resolving references and performing tasks requested using contextual language. The description suggests that this model could be used to upgrade Siri.

Another research paper describes Ferret-UI, a multi-modal AI model that is “designed to perform precise reference and grounding tasks specific to user interface screens, while interpreting and acting skillfully according to open language instructions.” In essence, it can read your screen and perform actions on any interface, be it the home screen or an app. This functionality could make it far more intuitive to control an iPhone with spoken commands rather than finger gestures.

Then there is Keyframer, which is claimed to generate animations from static images, along with another AI model that can edit images. These capabilities could significantly improve the Photos app and allow users to perform complex edits in a few simple steps, similar to what DALL-E and Midjourney offer.

However, it should be noted that these speculations are based on research papers published by Apple, and there is no guarantee that any of them will ship as features. Apple's vision for AI will become clearer after the WWDC 2024 keynote.

