Apple is known for its innovative products and services, but in artificial intelligence (AI) it has lagged behind competitors such as Google, Amazon, and Microsoft. The company is trying to change that with a different approach: running AI natively on its devices rather than in the cloud. This means the AI models and data are stored and processed locally on the device, without relying on an internet connection or a remote server. In this article, we explore why Apple is pursuing this strategy, which recent innovations in AI technology make it possible, and what local AI means for Apple’s hardware.
Why Is Apple Focusing on Locally Hosted AI?
Apple is directing its efforts towards locally hosted artificial intelligence (AI), signaling a departure from the prevalent trend of relying on cloud-based solutions. In a recent paper titled “LLM in a Flash”, Apple researchers present an approach to the memory bottleneck of running large language models (LLMs) on devices with limited DRAM, such as iPhones. This move underscores Apple’s commitment to advancing generative AI and catching up with industry rivals in Silicon Valley.
The traditional approach involves housing AI models, like those powering ChatGPT, in expansive data centers with substantial computing power, a strategy embraced by major players like Microsoft and Google. However, Apple’s research reveals a strategic pivot—shifting the focus towards AI that can seamlessly operate directly on iPhones. The motive behind this shift becomes evident as Apple aims to enhance the capabilities of its virtual assistant, Siri, and compete in the emerging market of AI-focused smartphones.
While competitors like Samsung gear up to launch “AI smartphones,” Qualcomm’s CEO, Cristiano Amon, anticipates that integrating AI features into smartphones will breathe new life into the industry. He envisions a transformative user experience, with AI-powered devices launching in early 2024, offering innovative use cases and reversing the decline in mobile sales. Apple’s angle in this landscape is privacy: queries are answered on the device itself, so user data never has to be transmitted to the cloud.
The technical challenges of implementing large AI models on personal devices, primarily due to limited computing resources, have been a focal point for AI researchers. Apple’s experiment, optimizing inference efficiency on devices like the iPhone, opens the door to faster AI responses and potential offline functionality. The research, centered around models like Falcon 7B, signals Apple’s commitment to overcoming these challenges and harnessing the full potential of LLMs across various devices and applications.
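The core idea behind this line of research can be illustrated with a toy simulation. The sketch below is not Apple’s implementation; it is a minimal, hedged illustration of the general “LLM in a Flash” strategy described above: keep only a small cache of weight rows in fast memory (standing in for DRAM), fetch the rest on demand from slower storage (standing in for flash), and compute only the output rows predicted to be active, exploiting activation sparsity. All names, sizes, and the eviction policy here are illustrative assumptions.

```python
import numpy as np

DRAM_BUDGET_ROWS = 4  # pretend fast memory holds only 4 of 8 rows

rng = np.random.default_rng(0)
flash_weights = rng.standard_normal((8, 16))  # "flash": full weight matrix
dram_cache = {}                               # row index -> cached row


def load_row(i):
    """Return row i, from the DRAM cache if present, else from 'flash'."""
    if i not in dram_cache:
        if len(dram_cache) >= DRAM_BUDGET_ROWS:
            # Evict the oldest cached row (simple FIFO policy for the sketch)
            dram_cache.pop(next(iter(dram_cache)))
        dram_cache[i] = flash_weights[i]  # simulated slow flash read
    return dram_cache[i]


def sparse_matvec(x, active_rows):
    """Compute only the output rows predicted to be nonzero."""
    out = np.zeros(flash_weights.shape[0])
    for i in active_rows:
        out[i] = load_row(i) @ x
    return out


x = rng.standard_normal(16)
y = sparse_matvec(x, active_rows=[1, 3, 6])
```

Because only the predicted-active rows are touched, both the memory footprint and the amount of data read from slow storage scale with the sparsity of the activations rather than with the full model size, which is what makes on-device inference of large models plausible.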
Implications of Local AI on Apple’s Hardware
The use of local AI on Apple’s devices has several implications for its hardware, such as:
Improved User Experience
Local AI can make Apple’s products and services more responsive, reliable, and convenient. Siri, for example, can respond faster and more accurately to voice commands without depending on internet speed or availability. Similarly, the Photos app can create and display personalized albums and memories without requiring users to upload their photos to the cloud.
Potential For New Features And Capabilities
Local AI also enables features and capabilities that were previously impractical. Face ID, for example, can recognize a user’s face even when they are wearing a mask, glasses, or a hat, by adapting on-device to changes in appearance. Similarly, AirPods Pro can automatically switch between noise cancellation and transparency modes by using AI to infer the user’s intention and surroundings.
Increased Demand For Apple Hardware
Finally, local AI can increase demand for Apple’s hardware by differentiating it from rivals. The Translate app, for example, can translate text and speech offline without compromising privacy, a combination few competing devices match. And Apple silicon such as the M1 chip, which integrates the CPU, GPU, Neural Engine, and other components on a single chip, delivers the performance and efficiency that on-device AI requires.