There is no suspense in the competition over Apple AI's domestic large models

07/04 2024

Written by: Intelligent Relativity

Apple finally announced its latest AI progress.

A month ago, as expected, artificial intelligence was the focus of this year's WWDC. Of the 105-minute keynote, more than 40 minutes were devoted to introducing Apple's AI work.

It seems that Apple also intentionally played on the initials, redefining AI as "Apple Intelligence."

Regardless, the debut of Apple AI did answer many of the market's doubts.

Previously, there were rumors that Apple phones might have two versions, one for the domestic market and one for the overseas market, due to the different AI large models integrated into the system. Some even went so far as to call the domestic version a "crippled version" online.

Is that really the case? Perhaps not.

The fundamental base of AI is firmly in Apple's hands

Apple Intelligence, abbreviated as AI, is more than a play on the initials for Apple. It reflects the company's own logic for understanding and designing AI features.

When introducing Apple Intelligence, Tim Cook emphasized that they aim to create "Personal Intelligence" that goes beyond "Artificial Intelligence."

What does that mean? We can infer from Apple's past practices that Apple AI is highly integrated with its system, applications, and ecosystem to fully meet the intelligent needs of Apple users. For example, deep integration with services like Siri, FaceTime, iMessage, Apple Pay, and iCloud provides a seamless cross-device AI experience.

In simple terms, Apple AI aims to achieve the advantages of system-level applications.

From the information announced so far, we can see some preliminary applications. Take AI image generation: Apple AI is expected not only to generate images from prompts but also to create image content tied to users' personal lives and interests, drawing on their actual experiences and personalized needs.

Similar to the "Memories" feature on iPhone, Apple is attempting to offer users a brief AI video-editing service. This requires access to users' photo libraries so that the system can repeatedly analyze photo content and cut videos around different themes.

Apple's Memories feature

To achieve such or even better services and effects, Apple AI must possess system-level capabilities. Therefore, it is highly probable that Apple AI's underlying large model will adopt a self-developed solution.

Apple's technical documentation for the model confirms that it is indeed self-developed. Reportedly, Apple's devices will use a small model with 3 billion parameters to handle local computation and data processing, while also learning from users' personal habits and data to deliver more intimate, personalized services.

On the cloud side, Apple has not yet disclosed the parameter count of its self-developed cloud model, but it is said to be comparable to GPT-4 Turbo. The integration of third-party large-model products like ChatGPT is also confirmed, but it is not the only option.
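The three-tier arrangement described above can be sketched as a simple routing decision. This is a hypothetical illustration, not Apple's actual logic: the field names and the routing order are assumptions, grounded only in the announced facts that a ~3B on-device model handles local tasks, Apple's own cloud model handles heavier ones, and ChatGPT is an optional add-on that requires the user's consent.

```python
from dataclasses import dataclass

@dataclass
class Request:
    prompt: str
    needs_world_knowledge: bool    # open-ended questions beyond personal data
    user_allows_third_party: bool  # per the keynote, Apple asks before using ChatGPT

def route(req: Request) -> str:
    """Pick a tier for a request in a hypothetical Apple-style stack."""
    if not req.needs_world_knowledge:
        return "on-device ~3B model"               # personal, low-latency tasks
    if req.user_allows_third_party:
        return "third-party model (e.g. ChatGPT)"  # optional, with consent
    return "Apple cloud model (Private Cloud Compute)"

print(route(Request("summarize my notes", False, False)))
print(route(Request("explain quantum tunneling", True, False)))
```

The point of the sketch is the ordering: the self-developed tiers handle everything by default, and the third-party model is an opt-in extra rather than a dependency.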

According to Craig Federighi, Apple's senior vice president of software engineering, Apple plans to allow users to choose their preferred large models in the future, including Google's Gemini.

Furthermore, Apple has also released OpenELM, a set of small, open-source AI models designed specifically for running on devices. These models range from 270 million to 3 billion parameters and aim to improve AI task processing capabilities on mobile devices.

It is evident that Apple is already capable of building a closed-loop AI system on its own, and the fundamental base of AI is firmly in Apple's hands. Third-party large-model products like ChatGPT are merely a garnish.

The "crippled" Apple is a figment of the imagination

Judging from the current situation, neither OpenAI's ChatGPT nor Baidu's ERNIE Bot is sufficient to affect Apple AI's basic functions. The positioning of these third-party large model products is more like Apps in the App Store for Apple.

Regional differences may well lead to differences in which apps can be used, but that does not define a separate functional version of Apple's products. Just as we cannot call today's iPhone a "crippled version" simply because "X" (Twitter) is inaccessible domestically, the same applies to third-party large-model products.

Whether it is "X" or ChatGPT, if these products cannot be used domestically, that is an issue with the product itself, not with the distribution channel. Moreover, Apple's self-developed models, spanning device and cloud, already deliver a fairly complete AI experience.

From the current perspective, what will matter more for user experience is Apple's integration of small models on the device side. In line with Apple's user-experience-first philosophy, it emphasizes customized AI models, somewhat like the sparse activation mechanism of MoE models: by deploying and combining multiple small models, everyday tasks can be handled quickly during use.
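The sparse activation the text compares this to works as follows: a gating network scores every expert (here, think of each expert as a small task-specific model), but only the top-k actually run. A minimal sketch of top-k gating, using made-up scores purely for illustration:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def sparse_route(gate_scores, top_k=2):
    """Return (expert_index, weight) pairs for the top-k experts only.

    All other experts are skipped entirely -- that skipping is the
    'sparse activation' that keeps per-request compute low.
    """
    ranked = sorted(range(len(gate_scores)),
                    key=lambda i: gate_scores[i], reverse=True)
    chosen = ranked[:top_k]
    weights = softmax([gate_scores[i] for i in chosen])  # renormalize over chosen
    return list(zip(chosen, weights))

# Four "experts"; only two are activated for this input.
print(sparse_route([0.1, 2.3, -0.5, 1.7]))  # experts 1 and 3 carry all the weight
```

Whether Apple's small-model combination literally uses gating of this kind is not public; the analogy is only that, per request, a small subset of specialized capacity does the work.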

As we all know, large models require more computational resources, storage space, and network speed due to their larger parameter sizes. If hardware and network conditions cannot keep up, issues like high latency and packet loss are likely to occur, which cannot provide a good AI experience. In contrast, small models are suitable for running on terminals like smartphones due to their small size, low computational resource requirements, and high training efficiency.
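The size argument above can be made concrete with back-of-the-envelope arithmetic. The precisions below are assumptions for illustration (Apple has not published deployment details); the calculation only covers holding the weights, not activations or KV caches:

```python
def model_memory_gb(params_billions, bytes_per_param):
    """Rough memory needed just to hold the weights."""
    return params_billions * 1e9 * bytes_per_param / 2**30

# A 3B-parameter on-device model at a few assumed precisions:
for label, bytes_pp in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{label}: {model_memory_gb(3, bytes_pp):.1f} GB")

# By the same formula, a cloud model with hundreds of billions of
# parameters needs hundreds of GB -- far beyond any phone's RAM.
```

Even at fp16 a 3B model is about 5.6 GB, and quantization brings it under 2 GB, which is why the device/cloud split falls where it does.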

The MoE architecture now emerging in the large-model field mirrors Apple's device-to-cloud model deployment. The efficiency and adaptability of small models make them the natural choice on the device side, while large models, whether Apple's own or those supplied by third-party vendors at home or abroad, only add an advanced, personalized layer on top without affecting the core.

Of course, in "Intelligent Relativity's" own hands-on experience, domestic large models are actually not bad at all when put to real use. On the contrary, thanks to locally hosted compute, Chinese semantic understanding, and deep integration with the Chinese internet, domestic large models hold more advantages for local users.

On the one hand, there is language and cultural understanding: interaction with AI large models is conversational, and in our experience, "Intelligent Relativity" found that users' personal expressions are casual and often non-standard. Domestic large models therefore have a clear advantage in understanding users' meaning, and thus respond more accurately to their needs.

On the other hand, there is the issue of acquiring data resources. AI large models keep evolving, in particular by continuously acquiring new data from user interactions for training and learning. Here, domestic large models, backed by the vast Chinese internet and user base, can draw maximally on local data for learning and refinement, while OpenAI and Google face restrictions and find such data hard to access.

In short, when paying attention to large models, we should not only look at their performance in the lab but also at their application space and development prospects in the market and industry.

Therefore, it is not a bad thing for Apple AI to integrate domestic large models like Baidu's ERNIE Bot in China. For local users, based on the joint creation of Apple and Baidu, they can obtain services and experiences that are more in line with local needs.

Perhaps we need not judge Apple AI's choices by obsessing over a product like OpenAI's ChatGPT, which is difficult to even access here in the first place.

Final Thoughts

Overall, there is no doubt that Apple AI's fundamental base is supported by self-developed large models.

In addition, the performance of domestic large models in local applications is not bad, and that is something to look forward to in the time ahead.

Apple AI's mission is not to introduce OpenAI's ChatGPT to China but to create a more efficient, convenient, and powerful AI system based on its own system and ecosystem to provide advanced services to its users.

Therefore, controversies surrounding the "crippled version" and large models are not that important. As revealed by Apple, the integration of third-party large model products is not the only option, and Apple will find ways to allow users to choose their preferred large models.

In the future, we look forward to Apple fulfilling its promise and giving users the right to choose.

*All images in this article are sourced from the internet.

Disclaimer: the copyright of this article belongs to the original author. It is reprinted only for the purpose of sharing more information. If the author information is marked incorrectly, please contact us promptly so we can amend or delete it. Thank you.