Published on October 22, 2023, 11:38 am
Apple is responding to the rapid advancement of AI tools like OpenAI’s ChatGPT and is now determined to develop competitive features for its devices. While the company did not specifically mention AI at its recent developer conference, it heavily emphasized the “machine learning” that will be utilized in various Apple products. For instance, the improved autocorrect in iOS 17 is powered by a language model, and the Apple Vision Pro headset relies heavily on computer vision.
Contrary to the perception that Apple is lagging behind in AI, Bloomberg’s Mark Gurman reports that Apple has been working diligently since late last year to catch up with the tech industry’s AI focus. According to insider information shared by Gurman, Apple was caught off guard by the generative AI boom and feels it missed an important development; internally, there is a sense of anxiety about this perceived oversight.
Even Siri, which seems like an obvious candidate for AI enhancements given ChatGPT’s recent audio capabilities, has yet to fully benefit from advances in generative AI. However, Apple is making significant efforts to address this. Senior vice presidents John Giannandrea and Craig Federighi, along with services chief Eddy Cue, are leading Apple into the world of generative AI. They have an annual budget of $1 billion allocated to this endeavor, which may seem modest compared to the investments of Microsoft, OpenAI, or Google, but still represents a substantial commitment.
Giannandrea’s team primarily focuses on developing new AI technology and improving Siri’s intelligence. Although an improved version of Siri might be released next year, the technology is still under development and concerns about safety persist.
Federighi’s software development group is integrating AI into the next version of iOS, incorporating features based on Apple’s large language model (LLM). Simultaneously, Cue’s team is exploring ways to bring AI into as many applications as possible. Internal sources suggest that Apple’s software teams are even considering integrating generative AI into development tools like Xcode. This move could streamline the app development process and bring Apple’s tooling in line with Microsoft’s GitHub Copilot, which offers automatic code-completion suggestions for developers.
Another area of application for AI is productivity apps. For instance, AI could assist with writing in apps like Pages or automatically generate slide presentations in Keynote, similar to features introduced by Microsoft in its Word and PowerPoint apps.
Apple is also exploring how generative AI can improve the internal customer service tools used by its AppleCare group. By implementing AI there, Apple aims to enhance efficiency and responsiveness, ultimately improving the overall support experience.
Internally, there is an ongoing debate regarding how generative AI should be deployed: as an on-device capability, a cloud-based solution, or a combination of both. The on-device approach aligns with Apple’s commitment to privacy and would be faster to implement. However, utilizing cloud-based large language models (LLMs) would provide Apple with greater flexibility—an advantage given the rapid evolution of AI technologies. It is likely that a combined approach will be adopted.
According to Gurman’s earlier report, Apple has already been using an internal language model dubbed “Apple GPT.” The model, based on the Ajax framework, is used for prototyping, text summarization, and answering questions about its training data, although whether it will appear in customer-facing applications remains uncertain at this point.
Additionally, it was reported that Apple is investing millions of dollars daily in training its AI systems. A dedicated “Foundational Models” team, led by Giannandrea and consisting of 16 people, focuses on conversational AI similar to ChatGPT. The leaked “Ajax GPT” model reportedly has over 200 billion parameters, surpassing OpenAI’s GPT-3.5, and research conducted with it is expected to deliver more accurate voice commands for iPhone users.
For example, Apple plans to introduce a voice feature that allows users to send GIFs via voice commands through the Shortcuts app. This feature is anticipated to arrive with the next version of iOS, set to launch next year.