05/15 2026

What caught my attention the most in Alibaba's recent conference call wasn't the 38% growth in cloud business or how much AI revenue had increased again. It was this statement: 'Almost no card is idle during its service period.'
This phrase is short but carries a lot of information. In simpler terms, it means: Don't just focus on how much money we're spending. The cards we buy aren't gathering dust in a warehouse—they're in high demand.
Over the past six months, one of the hottest debates in global capital markets has been: Has AI infrastructure been overbuilt?
Companies like Microsoft, Google, Amazon, and Meta are pouring hundreds of billions of dollars a year into data centers, GPUs, and power systems. Their income statements still look decent, but free cash flow has already collapsed.
This has Wall Street on edge: Could this be a repeat of the 2021 cloud computing bubble? Back then, everyone assumed cloud demand would keep growing indefinitely, but servers were deployed before customers caught up. When depreciation hit, stock prices took a beating.
Now, a popular trading logic in overseas markets is that AI Capex has peaked. The idea is that AI capital spending is nearing its limit, and pushing further would hurt shareholder returns.
But Alibaba's signal this time points in the opposite direction.
Wu Yongming compared AI data centers to factories, making a straightforward point: There's a shortage of training centers, a shortage of inference centers, GPUs are rarely idle, and investment returns over the next 3 to 5 years are highly certain. The scale of data centers will expand more than tenfold compared to 2022.
This isn't just another routine management statement. It's a message to the market: China's AI industry isn't even close to 'overcapacity'—in fact, much of the underlying infrastructure is still being built out.
The biggest disconnect in expectations lies here: The U.S. AI market has already gone through its first wave of aggressive expansion, and now everyone's doing the math: When will all these cards, this electricity, and these data centers pay off?
In China, the problem is more fundamental: Many companies haven't truly adopted AI yet, many industries haven't integrated AI into their workflows, and many companies still lack stable, affordable, and scalable computing power.
So while the U.S. market debates 'Is there too much computing power?' the real question in China is 'Is there enough computing power?'
Don't be fooled by the flurry of large language model announcements over the past two years, the constant reshuffling of leaderboards, and the ever-growing parameter counts. Most of this excitement is still in the first phase: model training, technical catch-up, and proof of concept.
The next phase is where the real money, cards, and commercial value lie: inference.
Enterprise agents need inference, AI-powered office tools need inference, customer service systems need inference, marketing campaigns need inference, e-commerce recommendations need inference, smart hardware needs inference, and coding tools need inference. Training is like a massive concert—high peak demand, but not an everyday occurrence. Inference is like urban water supply—as long as businesses are running, the taps must stay on.
This is what Alibaba is truly betting on.
It's not betting on how long a specific model will stay popular or where it ranks on some leaderboard. It's betting that after Chinese enterprises fully adopt AI, society will generate sustained, ongoing demand for inference. At that point, cloud providers won't just be 'renting servers'—they'll become the underlying operators of the AI era, like power grids or water networks.
Is this expensive? Absolutely. Over 380 billion yuan in capital expenditures over the next three years is no small sum for any company. Especially since Alibaba has been labeled a 'slow-growth' company in recent years. E-commerce has matured, platform dividends are gone, local services still require subsidies, and international businesses need investment. Continuing to spend heavily on AI at this stage won't make the income statement look pretty.
But the problem is, capital markets aren't afraid of spending—they're afraid of spending without direction.
If a company's cash flow deteriorates because its core business is failing, demand has collapsed, or profits have been eroded by competitors, its valuation will naturally suffer. But if cash flow deteriorates because it's preemptively securing capacity, entry points, and infrastructure in a new cycle, the market will take a second look.
Alibaba is telling the second story. It wants the market to stop viewing it solely as a mature e-commerce platform and instead see it as a core player in China's AI infrastructure.
In the past, investors focused on Alibaba's GMV, advertising revenue, commission rates, consumption recovery, and user growth. Ultimately, these are all traditional internet platform metrics.
But with AI, Alibaba's valuation anchor will gradually shift. The market will care more about how Alibaba Cloud's revenue is trending, what percentage of revenue comes from AI, whether GPU utilization remains high, whether inference demand is growing, and whether capital expenditures can eventually translate into higher cloud revenue.
With this metric shift, Alibaba's story changes.
It used to be an 'e-commerce platform stock.' Now it wants to become a 'China AI infrastructure stock.'
Of course, this transition isn't just about making announcements. Alibaba's greatest strength isn't Tongyi Qianwen (its LLM) itself. Models matter, but they're not everything. What truly matters is whether Alibaba can connect cloud, models, applications, scenarios, and customers.
This is where Alibaba differs from many pure-play model companies.
Pure-play model companies often face the problem of strong technology but insufficient scenarios, cool products but unstable revenue, and hot financing but ugly gross margins. If cloud providers just sell computing power, they'll end up in a price-war data center business. Alibaba is positioning itself in the middle: It has computing power at the bottom, models on top, and real business scenarios like Taobao, Tmall, 1688, DingTalk, Kuake, Gaode, and Cainiao on the side.
These businesses already consume massive amounts of inference demand. E-commerce search, recommendations, customer service, and ad generation; office document processing, meetings, and workflow automation; local services dispatch and marketing; logistics route optimization and fulfillment prediction—these aren't just flashy demos; they're long-term, real-world consumption.
This means Alibaba's AI investments don't rely solely on external customers for ROI. It has internal super-scenarios that can first adopt AI, then productize its capabilities, and finally sell them to more enterprise customers.
This is crucial. Microsoft relies more on enterprise AI budgets, Google balances search, cloud, and AI investments, and Amazon needs to drive AWS into a new growth phase. While Alibaba also faces profit pressure, it occupies a unique position in the Chinese market: It has cloud, models, super-apps, and is backed by China's complete industrial system.
One of the biggest issues with U.S. AI infrastructure today is rising costs. GPUs are expensive, electricity is expensive, data center construction is slow, land approvals are slow, grid upgrades are slow, and environmental pressures are high. AI data centers are essentially power-hungry factories—whoever can secure cheap, stable electricity, build data centers faster, and coordinate servers, liquid cooling, power supplies, and optical modules will have the staying power for a prolonged battle.
This is where China excels. China's greatest strength over the past few decades hasn't been making PPTs—it's been executing complex engineering projects at low cost, large scale, and with replicability. Servers, optical modules, liquid cooling, power supplies, network equipment, data center construction, and power infrastructure—when it comes down to it, these are all areas where China's industrial system excels.
So Alibaba's 380 billion yuan isn't just another internet company chasing the AI trend; it's a bet on using China's low-cost industrial system to fight a long-term war over AI infrastructure. That's why 'not a single card is idle' is so important.
Because it answers the capital market's most pressing question: Is there demand to absorb all the money you're spending? If cards sit idle, that's inventory. If cards run at full capacity daily, that's productive capacity.
What Alibaba wants to prove now is that it's not just buying expensive equipment—it's acquiring the means of production for China's AI era.
Of course, a good story doesn't mean there are no risks.
Will AI demand keep growing at a high rate? Will enterprise customers keep paying? Can domestic GPUs keep up? Will cloud providers restart price wars? Can inference revenue cover electricity, depreciation, and operational costs? Will free cash flow deteriorate for too long?
These questions can't be avoided. So what the market should really watch going forward isn't Alibaba's new model announcements, parameter counts, or leaderboard rankings—but three harder metrics:
First, can GPU utilization remain high? Second, can the proportion of AI-related revenue in cloud revenue continue to rise? Third, can capital expenditures and AI revenue gradually align?
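Why utilization is the first metric on that list comes down to simple unit economics: a card's depreciation clock runs whether or not it earns revenue. A rough sketch of that arithmetic, using entirely hypothetical placeholder figures (card price, depreciation schedule, power draw, electricity price, and per-hour inference revenue are illustrative assumptions, not Alibaba's actual numbers):

```python
# Back-of-envelope unit economics for one accelerator card.
# All inputs are hypothetical placeholders for illustration only.

HOURS_PER_YEAR = 365 * 24  # 8,760

def annual_card_profit(card_cost, depreciation_years, power_kw,
                       electricity_price, utilization, revenue_per_hour):
    """Annual profit (currency units) for a single card.

    Assumes straight-line depreciation and that the card draws power
    only during busy hours -- a simplification.
    """
    busy_hours = HOURS_PER_YEAR * utilization
    revenue = busy_hours * revenue_per_hour
    depreciation = card_cost / depreciation_years      # accrues even when idle
    electricity = busy_hours * power_kw * electricity_price
    return revenue - depreciation - electricity

# Same card, same prices; only utilization differs.
# card_cost=100,000; 4-year depreciation; 1 kW draw; 0.6/kWh; 6.0/GPU-hour.
busy = annual_card_profit(100_000, 4, 1.0, 0.6, 0.90, 6.0)  # high utilization
idle = annual_card_profit(100_000, 4, 1.0, 0.6, 0.40, 6.0)  # low utilization

# With these placeholder numbers, the 90%-utilized card is profitable
# and the 40%-utilized card loses money: depreciation dominates.
```

Under these toy assumptions the high-utilization card clears its depreciation and power bill while the low-utilization card does not, which is exactly the 'inventory versus productive capacity' distinction: the fixed cost of the card is sunk either way, so utilization is the lever that decides which side of break-even it lands on.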
As long as these three things keep happening, Alibaba's valuation logic will continue to shift. It won't just be about consumer internet recovery—it will be about AI infrastructure revaluation.
Personally, I think companies like Alibaba are easiest to misjudge. They're too complex, with too many businesses, and their short-term financials often look awkward. One quarter it's e-commerce competition, the next it's local services investment, then international expansion, and now cloud spending. If you only look at quarterly profits, it's easy to think they're not executing cleanly.
But complexity has its advantages. In the AI era, what's most scarce isn't a pretty demo—it's scenarios, customers, engineering capabilities, and capital endurance. Alibaba has a bit of each. While none is perfect, combined, they make it hard to simply label Alibaba as a 'legacy internet company.'
So the most important signal from Alibaba's conference call wasn't the 38% cloud growth or the rise in AI revenue share—it was that Alibaba is starting to treat AI as the industrial infrastructure of the next decade.
Behind the phrase 'not a single card is idle' is management's aggressive judgment on China's AI demand: Computing power isn't oversupplied—it's chronically scarce. AI isn't a passing trend—it's the new utilities (power, water, gas) that all industries will need to tap into.
While U.S. cloud giants grapple with cash flow squeezes from high capital expenditures, Alibaba aims to use China's lower electricity costs, faster buildout speed, and more complete supply chain to fight this AI infrastructure war longer.
The next two years could be a critical window for China's AI cloud landscape. Whoever deploys computing power first, locks in enterprise customers first, and closes the loop between models, cloud, and applications first will likely secure the gateway to the next generation of the internet.
Alibaba isn't just betting 380 billion yuan.
It's betting on a judgment: What China's AI industry truly lacks isn't more stories—it's a thick, affordable, and stable computing power foundation.
If this bet pays off, Alibaba won't be valued just as an e-commerce company anymore. It will be re-evaluated at a different table—as China's AI infrastructure operator.
Alibaba has only just begun to play this card.