The Overlooked Chip War: The 'Core Battle' Behind AI/AR Glasses

01/30/2026

Meta's Ray-Ban smart glasses are selling more than 2 million units a year, Alibaba's Kuake AI glasses topped the bestseller lists across major platforms within just 4 hours of release, Xiaomi has entered the sub-1,000 yuan AI glasses market with a device weighing a mere 40 grams, and even Li Auto has unveiled its own Livis AI glasses... Suddenly, it seems the era of AI+AR glasses has truly arrived, hasn't it?

Press conferences follow in quick succession, products launch one after another, and the 'Hundred Glasses Battle' is in full swing. Yet amidst this fervor, the true battleground that determines victory or defeat remains quietly overlooked.

It's not under the glaring lights of press conferences, nor concealed within flashy marketing rhetoric. Instead, it resides within a frame weighing less than 50 grams, nestled inside a chip no larger than a fingernail.

This is a battle of extreme trade-offs between computing power, power consumption, and size—a severely underestimated 'core battle': whoever masters the chip tailored for AI/AR glasses will truly hold the key to the future.

Why Can't Smartphone Chips Save AR Glasses?

Many people's initial reaction is: 'Isn't it just a mini smartphone? Stuff in a Snapdragon chip, add some RAM, and you're good to go!' Sounds reasonable, right? After all, modern smartphones can run large models—why can't glasses?

The idea is appealing, but the reality is harsh.

Let's take a look at what a mainstream pair of AI+AR glasses actually needs to do:

Real-time spatial awareness (SLAM)

Multi-camera visual processing

Eye-tracking

Local AI voice assistant operation

Driving high-refresh-rate Micro-OLED displays

Long battery life, no overheating, comfortable to wear

These tasks may seem 'smartphone-like,' but here's the catch: glasses aren't miniaturized phones. They're always-on, highly parallel, severely constrained edge computing devices—no fans, no heat dissipation, 24/7 readiness, and must remain comfortable to wear all day.

Simply put: being able to run doesn't mean it's usable, being usable doesn't mean it's good, and being good doesn't mean it's sustainable.

A phone that warms up while playing video is no big deal, but on glasses even a slight temperature rise against the skin can feel 'scalding' to the wearer, especially over a full day of use. Simply transplanting smartphone solutions won't work.
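To see how tight the constraints are, a back-of-the-envelope power calculation helps. The sketch below is a minimal Python illustration: the 160 mAh battery and the 3 W sustained draw of a phone-class SoC under AI load are hypothetical placeholder figures, and only the roughly 8-hour mixed-usage target (cited later in this article for the Ray-Ban Meta) comes from the text. The point is the order of magnitude, not the exact numbers.

```python
# Back-of-the-envelope power budget for a pair of smart glasses.
# All figures marked "assumed" are illustrative placeholders, not vendor specs.

BATTERY_MAH = 160            # assumed tiny in-temple battery (hypothetical)
BATTERY_VOLTAGE = 3.85       # typical Li-ion nominal voltage
TARGET_RUNTIME_H = 8         # the "around 8 hours of mixed usage" cited in the article

battery_wh = BATTERY_MAH / 1000 * BATTERY_VOLTAGE       # ~0.62 Wh of energy on board
avg_power_budget_w = battery_wh / TARGET_RUNTIME_H      # ~0.077 W whole-device average

PHONE_SOC_SUSTAINED_W = 3.0  # assumed sustained draw of a phone-class SoC under AI load

print(f"Whole-device average budget: {avg_power_budget_w * 1000:.0f} mW")
print(f"Phone-class SoC sustained draw: {PHONE_SOC_SUSTAINED_W * 1000:.0f} mW")
print(f"Gap: ~{PHONE_SOC_SUSTAINED_W / avg_power_budget_w:.0f}x over budget")
```

Even with generous assumptions, a phone-class SoC overshoots the whole-device budget by more than an order of magnitude before the display, cameras, and radios have drawn a single milliwatt.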

Which Mainstream Chip Architecture Will Dominate the Future?

Without a perfect general-purpose chip solution, players in the market must innovate. Three mainstream chip architecture routes have emerged:

SoC (System-on-Chip) 'Lone Wolf' Approach: This scheme follows smartphone logic, cramming CPU, GPU, NPU, and ISP into a single SoC to pursue 'one chip for all.' The advantages are clear: low development barriers, mature ecosystems, and rapid mass production.

For example, the first-generation Ray-Ban Meta used Qualcomm's Snapdragon Wear 4100, and the second generation upgraded to the Snapdragon AR1, designed specifically for smart glasses. Even then, Meta had to bring in an NXP coprocessor to offload tasks, and the result was still only around 8 hours of mixed usage (music + calls + photos).

This reveals a truth: even giants like Qualcomm admit that 'a single SoC' struggles in glasses scenarios.

SoC + MCU (Microcontroller Unit) 'Collaborative' Approach: The core idea is 'don't let one chip do everything.' This clear division of labor finds a better balance between performance and power consumption.

A prime example is Xiaomi's 2025 AI glasses, which adopt a Qualcomm AR1 + BES2700 dual-chip solution: Qualcomm handles complex AI tasks, while BES focuses on audio processing, significantly optimizing power consumption. Skyworth's A6 series AI glasses follow a similar route with Snapdragon AR1 + BES2800.
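As a purely illustrative sketch of that division of labor, the Python snippet below routes always-on, lightweight tasks to a milliwatt-class MCU and wakes the power-hungry SoC only for bursty, heavy work. The task names, routing table, and power states are hypothetical; real firmware on these chips sits far below this level of abstraction.

```python
from enum import Enum, auto

class Chip(Enum):
    MCU = auto()   # always-on, milliwatt-class (e.g. an audio/connectivity MCU)
    SOC = auto()   # powerful but power-hungry; kept asleep whenever possible

# Hypothetical routing table: which chip owns which class of task.
TASK_ROUTING = {
    "wake_word":      Chip.MCU,   # continuous listening with a tiny model
    "audio_playback": Chip.MCU,   # codec work, no heavy compute
    "photo_capture":  Chip.SOC,   # bursty: wake the SoC, shoot, sleep again
    "ai_assistant":   Chip.SOC,   # ASR/LLM inference needs the NPU
    "slam_tracking":  Chip.SOC,   # multi-camera, high compute
}

def dispatch(task: str, soc_awake: bool) -> tuple[Chip, bool]:
    """Return the chip that should handle `task` and the SoC's new power state."""
    chip = TASK_ROUTING[task]
    if chip is Chip.SOC and not soc_awake:
        print(f"[power] waking SoC for '{task}'")
        soc_awake = True
    elif chip is Chip.MCU:
        print(f"[power] '{task}' stays on the MCU; SoC can remain asleep")
    return chip, soc_awake

# Example: a slice of a day in the life of the glasses.
soc_awake = False
for task in ["wake_word", "audio_playback", "ai_assistant", "photo_capture"]:
    _, soc_awake = dispatch(task, soc_awake)
```

The win comes from duty cycling: as long as the everyday path (wake word, music, calls) never touches the SoC, the expensive silicon spends most of the day asleep.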

MCU + ISP (Image Signal Processor) 'Minimalist' Approach: Compared to the 'stacking' strategies above, this school embraces minimalism. The advantages include lower costs, lighter weight, and longer battery life, but with higher tuning complexity.

The most notable example is Li Auto's Livis glasses, which use a BES2800 main controller + Yanjiwei's standalone ISP. Instead of blindly stacking computing power, they precisely match hardware to core scenarios (e.g., vehicle-glasses linkage, voice interaction), creating a differentiated edge.

At CES 2026, Infinix (under Transsion) showcased its AI Glasses Pro, and Germany's L’Atitude 52°N also adopted the 'BES2800 + Yanjiwei ISP' combo, signaling the rise of a new technical path.

Meanwhile, a new force is emerging: standalone ISP chips tailored for AI wearables, led by domestic players like Gravichip and Rockchip, are becoming key variables in balancing performance and power. Their strength lies in 'specialization': deep optimizations for image processing, low-latency transmission, and low-power operation, making them ideal for energy-efficient terminals like smart glasses.
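To make that 'specialization' concrete, here is a minimal conceptual sketch of the MCU + standalone ISP split. The 16 MP sensor resolution echoes the ISP specs cited later in this article; the 10-bit RAW depth, the encoded frame size, and the class names are hypothetical illustrations, not any vendor's API.

```python
# Conceptual sketch: a standalone ISP turns raw sensor data into a compressed
# stream, so the low-power main controller only shuttles small encoded chunks
# instead of raw pixels. All classes and numbers are hypothetical.

RAW_BYTES_PER_FRAME = 16_000_000 * 10 // 8   # 16 MP sensor at 10-bit RAW ≈ 20 MB
ENCODED_BYTES_PER_FRAME = 250_000            # assumed compressed frame ≈ 0.25 MB

class StandaloneISP:
    """Handles the heavy pixel work: demosaic, 3A, noise reduction, encoding."""
    def process(self, raw_frame_bytes: int) -> int:
        # In real silicon this runs in fixed-function hardware at milliwatt cost.
        return ENCODED_BYTES_PER_FRAME

class LowPowerMCU:
    """Moves compressed data and handles control; never touches raw pixels."""
    def transmit(self, encoded_bytes: int) -> None:
        print(f"MCU sends {encoded_bytes / 1e3:.0f} kB over the radio")

isp, mcu = StandaloneISP(), LowPowerMCU()
encoded = isp.process(RAW_BYTES_PER_FRAME)
mcu.transmit(encoded)
print(f"Data the MCU sees: {RAW_BYTES_PER_FRAME / encoded:.0f}x smaller than RAW")
```

Because the fixed-function ISP absorbs the pixel-heavy work, the main controller can stay small, cool, and cheap, which is exactly the trade the minimalist route is betting on.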

This 'chip-level war' has evolved from 'availability' to 'specialization.'

Chip Players: A Clash of Old and New Forces

Now, let's examine the key players in this invisible 'chip war.'

Traditional Giants: Steady Yet Evolving

Qualcomm: Remains the 'big brother' in AR/XR chips. In June 2025, it released the Snapdragon AR1+ Gen 1, designed for high-end smart glasses, officially going 'all in' on smart glasses.

Huawei HiSilicon: It released an XR chip platform as early as 2020; its first chip supported 8K decoding, integrated a GPU and NPU, and was used in Rokid Vision glasses. Since then, however, it has gone relatively quiet.

Domestic Self-Developed Forces: The 'Disruptors' Charge

Gravichip: With a core team drawn from Apple and Meta, it entered the market with deep spatial computing expertise. In November 2025, it launched the 'Genmu G-VX100' ISP chip for AI glasses, which reportedly supports 16 MP ultra-clear shooting, 4K/30 fps video recording, spatial video capture, and eye-tracking. The chip features an ultra-low-power dedicated subsystem and the world's first multi-modal activation technology, 'MMA (Multi-Modal Activation).'

Rockchip: With long-standing strengths in audio, video, display, and ISP technology, Rockchip has seen its RV-series chips designed into multiple AI glasses projects, and its next-generation flagship chip is reportedly being oriented heavily toward AI glasses.

XREAL: It makes not only glasses but also chips. Its self-developed X1 spatial computing chip is the world's first tailored for spatial displays, and the XREAL 1S uses the X1 to achieve the world's first system-level real-time 2D-to-3D conversion.

BES Technology: Its BES2800 chip holds over 30% of the global market and is the exclusive supplier for the Meta Ray-Ban. The new BES3000 series cuts power consumption by 40%, lifts computing power to 12 TOPS, and targets standalone glasses.

Hexagonal Semiconductor: Specializing in high-performance chips, it released the 'Tianxiang Core' SoC for AI+AR glasses in January 2026, emphasizing low power and high integration.

Some players choose to 'bypass'—they don't build chips but define the rules.

For example, Thunderbird Innovation partnered with Pixelworks to launch the AR-specific display chip 'Vision 4000,' optimizing wearable screen color, contrast, and immersion to address the 'clear but unappealing' dilemma.

Summary

In this 'Hundred Glasses Battle,' while hardware forms may vary, the decisive factor is clear:

Whoever controls the chip 'born for AR' will truly hold the key to the next-generation gateway.

Now, the flames burn bright—only dawn awaits.

By Vivi

