Divergent IPO Trajectories for Large Models

January 16, 2026

Author: Wu Kunyan

Editor: Wu Xianzhi

Both Zhipu and MiniMax have made their public debuts, presenting the market with a rare, side-by-side comparative analysis within the same sector.

Dubbed the "Little Dragons" in the primary market, these two large model companies experienced contrasting receptions during their IPO launches. Zhipu, touted as the "world's first large model IPO," initially saw its stock price dip before surging to close at HK$131.5, marking a 13.2% increase from its issue price. Conversely, MiniMax enjoyed a stellar first trading day, finishing at HK$345, a staggering 109.1% rise from its issue price.

While the primary market invests in potential, the secondary market focuses more on the tangibility of commercialization. At this early stage of monetization, the narrative's appeal to the capital market dictates which company garners more favor.

As a large model company approaches the IPO stage, its product vision and strategy shape distinct valuation models for investors. Zhipu opts to transform AI capabilities into "deliverable engineering" by integrating models into B2B services. In contrast, MiniMax initially converts capabilities into "consumable products" by developing applications and generating revenue through the C-end consumer market.

Consequently, we observe two contrasting market temperatures. Although short-term valuations of tech firms are often swayed by market sentiment, stepping outside the "short-term = sentiment" framework, this unprecedented comparison resembles a public pricing experiment.

What commercialization paths are capital markets rewarding when large models enter the IPO phase?

Does Zhipu Lack "Compelling Stories"?

The secondary market's assessment of a newly listed company's growth potential often stems from its prospectus. The most visible indicator of growth is the revenue structure outlined in the document.

The financial models of the two companies exhibit significant disparities: one is a "technical marathoner" making steady progress, while the other is an "efficient doer" achieving high impact.

In 2024 and the first half of 2025, Zhipu's localized deployment revenue constituted over 80% of its total revenue, indicating a clear B-end focus. This structure essentially defines Zhipu's business model: it resembles an "AI solutions provider" that embeds large model capabilities into enterprise data centers, dedicated environments, and specialized processes.

The previous AI wave centered around "selling algorithmic capabilities." The delivery targets for industry intelligence were primarily single-point demands in specific scenarios. Projects were isolated, with limited marginal reuse of algorithmic capabilities, and scaling relied more on expanding sales reach and project density.

In contrast, Zhipu is currently navigating a cycle where general-purpose large models serve as infrastructure. The model itself is not tailored to a specific industry but forms a "general capability pool" that can be repeatedly deployed across various scenarios through parameter scale, reasoning ability, and Agent frameworks. Under this logic, while localized deployment may appear "project-based" in terms of revenue structure, its underlying capabilities are not one-time deliveries but continuously evolving, reusable, and upgradable models and toolchains.

This is why Zhipu repeatedly emphasizes model iteration rhythm, Token usage scale, and platform capabilities in its prospectus, rather than just the contract amounts of individual projects. The scalable replication story that the secondary market seeks may not lie in the rise of cloud deployment ratios but in the deepening of commercial interactions with the same clients.

For Zhipu, projects serve as entry points for commercialization, not endpoints. Once models are integrated into organizational processes, subsequent capability upgrades, Agent expansions, computing power consumption, and tool subscriptions will naturally grow within the same client.

In other words, if previous project deliveries resembled "replicating engineering" across different clients, Zhipu aims to "replicate capabilities" within the same client. The former relies on horizontal expansion for scaling, while the latter depends more on vertical penetration.

It is at this juncture that true market divergence emerges. For some investors, what they see is a familiar path—high margins, project-based, localized deployment. However, for others, Zhipu appears to be using a project-based shell to house a platform-based business model that has not yet fully materialized.

Judging by its first-day performance, a dip followed by a rebound, market expectations and the company's actual business model are not yet aligned, and Zhipu still needs to prove itself to investors.

Before general-purpose capabilities are fully unlocked through cloud-based usage, project-based revenue remains a pragmatic solution. Whether Zhipu can transition from "delivery-driven" to "usage-driven" will determine if it can avoid the pitfalls of previous AI companies.

The "ByteDance" of the Large Model Era?

Compared to Zhipu, which is valued within a familiar framework, MiniMax finds itself in a nearly opposite scenario—people struggle to find a similar template from the previous AI wave.

It is not that MiniMax's technological approach is more "revolutionary," but rather that its founder, Yan Junjie, emerged from the previous AI wave with a clearer and earlier understanding of AI commercialization than external observers.

From the outset, he steered the company away from a path similar to Zhipu's. He recognized the friction in customized delivery, which is why he allocated resources to products, platforms, and users earlier. For the secondary market, the imagination space of customized B-end services still needs expansion, but the native product model of the large model era appears more "enticing," which may explain its rapid market enthusiasm during the IPO.

Reflected in its revenue structure, MiniMax exhibits traces of an Internet-style "product + platform" company. Its prospectus discloses revenues of $13.552 million in 2024 and $53.432 million in the first nine months of 2025, with its product matrix of AI-native applications like Hailuo AI and Xingye contributing over 71% of the revenue.

Under the B-end narrative, the model itself is the deliverable. Under the C-end narrative, the underlying model is wrapped into a product delivered to consumers, and the model shifts from being the deliverable to being a cost center.

The benefits of correcting course early are evident. MiniMax aligns with investor expectations and has not experienced the turbulence Zhipu faced recently. Even from a business model perspective, MiniMax is easier to compare with consumer Internet companies, using the ARPU × MAU model to judge growth potential.
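The ARPU × MAU lens mentioned above can be sketched in a few lines. All inputs below are hypothetical placeholders for illustration, not figures disclosed by MiniMax:

```python
# Minimal sketch of the consumer-internet valuation lens: revenue ~ MAU x ARPU.
# The MAU and ARPU values here are made-up assumptions, not company disclosures.

def annual_revenue(mau: int, monthly_arpu: float) -> float:
    """Implied annual revenue: monthly active users x monthly ARPU x 12 months."""
    return mau * monthly_arpu * 12

# Example: a hypothetical 10 million MAU at US$0.50 monthly ARPU.
rev = annual_revenue(10_000_000, 0.50)
print(f"Implied annual revenue: ${rev:,.0f}")  # → Implied annual revenue: $60,000,000
```

The appeal of this model to the secondary market is precisely that both inputs are observable and comparable across consumer Internet companies.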

At this point, the market's judgment hinges on whether its growth is sustainable and whether costs can be diluted through scale.

It should be noted that this path may not be straightforward. Compared to project-based revenue, consumer products and platform usage expose costs earlier, with user growth and computing power expenditure rising in tandem. In this context, efficiency directly determines gross margins and the health of the overall business.

The prospectus points the same way: MiniMax emphasizes that it is still in the early stages of commercialization and carries a high loss ratio, and its detailed disclosure of historical cash consumption carries the implicit message of "burning money for scale and growth."

This represents a more concentrated strategic bet. As long as the product flywheel keeps spinning, it can dilute model costs through scale and grow platform revenues. Conversely, if growth stalls, the cost curve will immediately become a pressure curve—under high-intensity R&D and computing power investment, if commercialization cannot continue to accelerate, valuation anchors will shift from revenue to cash flow.

From this perspective, MiniMax bears some resemblance to ByteDance, which aggressively purchased servers over a decade ago to enhance personalized algorithm efficiency. Interestingly, the founders of both companies have expressed similar views on different occasions, such as both focusing on globalization from the outset.

Compared to Zhang Yiming, Yan Junjie's entrepreneurial environment is more favorable. When Zhang Yiming started his business, the logic of "information finding people" was only vaguely reflected in a few niche overseas "algorithmic news" products. He had to explain the value of information and distribution to every investor. Yan Junjie, however, operates in a market where AI is largely a consensus. No wonder the market has shown its preference, giving MiniMax higher expectations.

Where Does the "Lucrative Opportunity" Lie?

AI is a technology-driven, hierarchical industry. The divergence in the commercialization paths of the two large model companies still needs time to validate, but beneath market capitalization lies a colder reality: computing power is not in their hands.

Take MiniMax as an example. In the first nine months of 2025, it spent $58.3 million purchasing cloud computing services from Alibaba Cloud. It is expected to set upper budget limits for cloud service procurement from Alibaba Cloud at $115 million, $125 million, and $135 million annually from 2026 to 2028.
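A back-of-envelope check on the figures above shows how steep the implied ramp is. Annualizing the nine-month 2025 spend assumes an even monthly run rate, which is my assumption, not something the prospectus states:

```python
# Rough arithmetic on the disclosed Alibaba Cloud figures (US$ millions).
# Annualizing nine-month spend assumes an even monthly run rate (an assumption).

spend_9m_2025 = 58.3                      # cloud spend, first nine months of 2025
annualized_2025 = spend_9m_2025 / 9 * 12  # ≈ 77.7

caps = {2026: 115.0, 2027: 125.0, 2028: 135.0}  # disclosed annual budget caps

prev = annualized_2025
for year, cap in caps.items():
    growth = (cap - prev) / prev * 100
    print(f"{year}: cap ${cap:.0f}M, {growth:+.1f}% vs prior year")
    prev = cap
```

Read this way, the budget caps imply a large jump in 2026 and then single-digit growth, consistent with a business that expects its computing bill to keep rising regardless of which commercialization narrative wins.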

Beneath the market sentiment reflected by market capitalization, truly stable, predictable, and continuously growing cash flows are flowing back to cloud providers through computing power bills. The so-called "debate over commercialization paths" is more akin to two companies paying for the same underlying bill in different ways.

MiniMax surfaces computing power costs early in its financial reports through user scale and product growth. Zhipu packages those same costs into overall solutions via project delivery and localized deployment, passing them on indirectly to clients.

Perhaps, this round of large model IPOs has another layer of implicit conclusion: regardless of which narrative the capital market favors in the short term, those truly standing upstream in the industry chain with bargaining power are still cloud platforms that control computing power, networks, and scheduling capabilities.

The answer to "Where does the lucrative opportunity lie?" is not complicated. After all, in this race for commercialization in the AI industry chain, computing power, as a production factor, has already been priced in advance. This may be the most certain fact in this round of the large model wave.
