05/15 2026
Article by | Intelligent Relativity
As 2026 nears its halfway mark, one of the tech world's most talked-about concepts from the first half of the year—'OpenClaw'—has taken on a clearer shape: its peak hype has passed, but the war rages on.
Star growth on GitHub has slowed, deployment screenshots in WeChat Moments have dwindled, and the price of proxy installation services on Xianyu has plummeted from 2,000 yuan to 200 yuan. These signals were once interpreted as the 'lobster' tide receding. However, industry developments over the past month show that the frenzy hasn't ended—it has simply evolved from a mass hysteria into a silent industrial war.
Just as outsiders thought the 'Hundred Shrimp Battle' would cool down, the conflict intensified.
Alibaba and Tencent both further stressed AI's importance in their earnings calls, taking different routes but with a firmer direction. Before that, the giants' turf war over the 'lobster' had already reached fever pitch.
Alibaba CEO Wu Yongming personally took charge, establishing the Alibaba Token Hub (ATH) business group to integrate all AI forces, including Tongyi Labs, Qianwen, and Wukong, aiming for 'cloud and AI commercialization revenue to exceed $100 billion annually within five years.'
Tencent simultaneously announced the full launch of its all-scenario AI agent, WorkBuddy, eliminating deployment steps entirely—download and use immediately.
Baidu rolled out its zero-deployment service, DuClaw, with a first-month promotional price of just 17.8 yuan, slashing the barrier to 'out-of-the-box' usability.

ByteDance's TRAE SOLO evolved beyond its IDE form into a standalone product, attempting to accelerate its grab for AI-assistant user mindshare.
Big tech's synchronized push is clear in intent. This isn't incremental competition—it's a resource-intensive war. Whoever can first popularize the 'lobster' path may secure the last ticket to the AI consumer era.
Meanwhile, looking back at the 'lobster farming' craze of the past six months, a core question demands deeper exploration: who fell into the pit, and who found the way?
Three Deployment Models, Three Distinct Fates
Revisiting the OpenClaw boom in the first half of 2026, the market spawned three main deployment models: local deployment, cloud instances, and big-tech encapsulation. Each model serves different user needs, sparked different controversies, and shapes who holds the initiative in this competition.
1. Local Deployment: Data Sovereignty and Technical Freedom, but with a High Barrier.
The local deployment model initially attracted tech enthusiasts in the first half of the year. Its selling points were clear: private data stays local, no monthly membership fees, and greater autonomy in AI use. However, most soon found that environment configuration alone could deter the majority. More concerning was the AI's access to the entire file system, browser history, and even terminal commands. Once hijacked by malicious skills, a user's AI could become a tool for attackers, compromising the machine itself.
To address these pain points, some vendors tried to lower the barriers and security risks. For example, Zhipu AutoClaw compressed environment setup to 'one minute.' Tencent QClaw, built on PC Manager 18.0, created a dedicated 'isolation zone' for the AI, restricting it to specified folders. NVIDIA NemoClaw went further and sold all-in-one machines preloaded with hundreds of skills, ready to plug and play.
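The 'isolation zone' idea, confining an agent's file access to specified folders, can be sketched as a simple path check run before every read or write. This is a minimal illustration of the concept, not any vendor's actual implementation; the sandbox root and file names below are hypothetical:

```python
from pathlib import Path

# Hypothetical sandbox root; the folder name is illustrative only.
ALLOWED_ROOT = Path("/home/user/agent-workspace").resolve()

def is_path_allowed(requested: str) -> bool:
    """Permit a file operation only if the resolved path stays inside the sandbox."""
    resolved = (ALLOWED_ROOT / requested).resolve()
    # '..' escapes and absolute-path tricks resolve outside the root and are rejected.
    return resolved == ALLOWED_ROOT or ALLOWED_ROOT in resolved.parents

print(is_path_allowed("notes/todo.md"))     # inside the sandbox -> True
print(is_path_allowed("../../etc/passwd"))  # escape attempt -> False
```

The key detail is resolving the path before checking it, so that `..` sequences and absolute paths can't smuggle the agent outside its assigned folders.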
This model holds appeal, but for mass users, it's not enough. Environmental setup already blocks most, while maintaining security boundaries requires higher user awareness—two hurdles too high for the general public.
Of course, this model will always have a market. Tech geeks, privacy purists, and high-end users willing to buy all-in-one machines will prefer local deployment. But in the first half of 2026, it was destined to remain niche.
2. Cloud Instances: 24/7 Online and Elastic Computing, but Still Imperfect.
Local deployment solved autonomy issues but introduced another reality: ordinary computers struggle to keep 'lobster' AI continuously online. Users need it to monitor markets while sleeping, browse web pages while working, but most laptops can't handle this 'always-on' load—fans rage, batteries drain, and network/power outages cripple it.
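The 'always-on' workload described above boils down to a supervised loop that runs tasks around the clock and survives individual failures, exactly the kind of process a cloud box keeps alive more comfortably than a laptop. A minimal sketch, with a hypothetical stand-in for the agent's real work:

```python
import time

def run_agent_cycle(task: str) -> str:
    # Placeholder for one unit of agent work; a real agent would call a
    # model API here and act on the result (hypothetical stand-in).
    return f"checked: {task}"

def supervise(tasks, cycles: int, interval_s: float = 0.0) -> list:
    """Run every task each cycle and keep going after failures."""
    log = []
    for _ in range(cycles):
        for task in tasks:
            try:
                log.append(run_agent_cycle(task))
            except Exception as exc:
                # Swallow the error so the loop survives; retry next cycle.
                log.append(f"deferred: {task}: {exc}")
        time.sleep(interval_s)
    return log

print(supervise(["market watch", "inbox triage"], cycles=2))
```

On a laptop, a loop like this dies with every sleep, reboot, or network drop; on a rented cloud instance it simply keeps running, which is the whole pitch of the model.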
Thus, cloud instances surged in popularity in the first half. Alibaba Cloud, Tencent Cloud, Baidu Intelligent Cloud, and Huawei Cloud all launched 'one-click deployment images,' allowing 'lobster' AI to be deployed on cloud servers for 24/7 online access. However, users soon noticed awkwardness.
First, cloud-based 'lobster' AI is isolated from local computers. It can execute cloud commands but can't directly operate local mice/keyboards or access local files. To organize local documents via cloud AI, users must fiddle with cloud disk mounting or file syncing. Second, users face dual costs: server rental fees plus token invocation fees. Finally, once automation workflows rely on cloud storage and scheduled tasks, migrating data or reconfiguring environments becomes as complex as local deployment.
Clearly, cloud solutions solved the 'computer can't handle it' problem but introduced new headaches: double costs and local-cloud isolation. For users needing long-term AI task execution, this feels transitional, not final. First-half user feedback confirms this.
3. Big Tech Encapsulation: Ushering in 'Zero-Barrier AI.'
The core conflict of the first two models boils down to one thing: users don't want to be tech operators or financial accountants. Local deployment offers 'freedom but hassle,' while cloud instances offer 'convenience but cost and isolation.'
Most users just want a 'black box' with a fixed monthly fee—'I pay, you deliver.'
Big tech sensed this demand and moved in the closing weeks of the first half of 2026. Tencent WorkBuddy promoted 'download and use,' while Alibaba's 'Wukong' embedded directly into DingTalk, serving 20 million organizations. These products share traits: out-of-the-box usability, with model invocation fees bundled into membership. Users avoid Docker setup, environment configuration, or security isolation worries, because big tech built the 'isolation zones' into the product layer. Local-cloud isolation is resolved too: via one-click authorization in the official client, the AI can directly access local files without manual cloud disk mounting.
Users needn't know if it's Hunyuan or DeepSeek behind the scenes, or care about token price fluctuations—monthly deductions ensure 'lobster' AI is always ready. More importantly, these big tech versions don't lock users into a single model; WorkBuddy, DuClaw, and Wukong all support multi-model switching.

Users aren't buying a model's capabilities but 'an access point'—where they can choose any model they want. The first two models force users to pick models, pay tokens, and bear risks alone; big tech versions bundle everything into a subscription while bridging local-cloud gaps.
Of course, this model has trade-offs. Big tech is still in land-grab mode, so prices are in flux; future membership fees will likely rise tier by tier. But the mass market doesn't chase the absolute lowest price. What it wants is predictable costs and a hassle-free experience, and the question is which vendor satisfies that first.
2026 'Lobster' Battle Lines Drawn: The Access Point War Decides the Final Ticket
Looking back from mid-2026, the divergence among the three deployment models is clear.
Local deployment retains tech geeks and privacy-sensitive users but fails to break into the mass market. Its value lies in proving that user demand for data sovereignty and subscription-free AI is real. The cost? Shifting technical barriers and security responsibilities entirely to users.
As a result, those who master local deployment remain few. First-half hype curves in tech communities confirm this: explosive initial growth, then rapid decline, leaving a stable core but no scale effect.
Cloud instances followed a different trajectory. They solved the 'computer can't handle it' pain point and were initially seen as local deployment's upgrade. However, users quickly calculated another cost: server rentals plus token fees—double spending; local files versus cloud operations—two systems. Critically, once workflows depend on cloud storage and scheduled tasks, migration costs rival redeployment.
By the end of the first half, cloud-instance growth had slowed sharply. Many users stayed in 'trial' mode without converting to long-term payments, indicating that cloud solutions filled a transitional need but lacked stickiness.
In contrast, the big-tech encapsulation model, though the latest to arrive, went from 0 to 1 within a month. Tencent WorkBuddy hit 1 million downloads in its first week; Alibaba's 'Wukong' reached over 30% activation within DingTalk; Baidu DuClaw rapidly acquired users with its 17.8 yuan promo.
These numbers signal a clear truth—mass users aren't against paying but against 'paying after hassle.' They prioritize 'hassle-free' over 'control.' Big tech encapsulation delivers exactly that: out-of-the-box usability, predictable billing, and customer support for issues.
Objectively, this divergence isn't accidental; it is driven by a shift in the focus of AI product competition. By 2026, the gaps in underlying model capability among the players have narrowed to the point that ordinary users can't perceive them. GPT-5, Gemini Ultra, Tongyi Qianwen Max, Hunyuan Turbo: they trade blows in benchmarks but are indistinguishable in real use.
AI products now compete on three simpler metrics: onboarding cost, payment willingness, and retention. The product that gets users started fastest and retains them longest wins. And that's big tech's forte.
The mass market isn't won by the technically strongest but by the most convenient. Big tech's weapons aren't model parameters but channels, brands, payment systems, and customer service networks. DingTalk, WeChat, Baidu App, Douyin—these billion-user platforms are the best distribution channels. Paired with proven subscription models (video, cloud storage, office suites), big tech encapsulation is tailor-made for the mass market.
Now, the answer to 'who's silently making a fortune' is half-clear. Short-term, big tech encapsulation will dominate market share. Long-term, the winner won't be a single big tech firm but whoever first turns 'access points' into 'ecosystems.'
Why? Because AI agent competition isn't about models—it's about scenarios and data flywheels. The more you use it, the better it understands you; the better it understands you, the harder you are to leave. Once users accumulate workflows, personal data, and automation habits in one access point, migration costs become prohibitive.
This is the AI version of the super-app logic from the mobile internet era: grab access points first, collect rent later, and lock users in via ecosystems.
That's why Alibaba, Tencent, Baidu, and ByteDance all struck simultaneously. Traditional internet businesses—advertising, e-commerce, gaming—have hit growth ceilings. For these giants, AI is the only visible 'second growth curve.' In Q4 2025, Alibaba Intelligent Cloud grew 36%, with AI-related revenue maintaining triple-digit growth for 8–10 quarters. Baidu's 2025 AI revenue surpassed 40 billion yuan, far exceeding forecasts.
Meanwhile, all four giants hold vast private data—ByteDance owns Chinese short-video preferences, Alibaba holds decades of transaction records, Tencent possesses everyone's social graphs, and Baidu has accumulated 20 years of search history. If they don't leverage this data soon, competitors or startups with advanced tech could erode their advantages faster than expected. The AI access point war is essentially a defensive battle.
Will local deployment and cloud instances vanish? No. They'll resemble niche markets—serving users willing to spend time on setup and with hard requirements for data sovereignty. Think Linux's position in desktop OS—always a loyal user base but never mainstream.
Most ordinary users won't choose these paths, not out of ignorance but preference. They want tools that work out-of-the-box, with predictable billing and accountability. Big tech encapsulation delivers exactly that.
After the hype fades, those left behind aren't the big tech firms collecting membership fees but the sellers of tutorials teaching users to set up Docker, configure MCP, or build self-hosted environments. They never offered what users truly wanted.
The real winners are those who make users forget the word 'deployment.'