March 6, 2026

Author: Xiaofang
Source: Bowang Finance
In the field of large-scale AI models, personnel changes often make headlines. Yet, for real R&D and strategic planning, what truly counts is not the drama but continuous investment, seamless teamwork, and a steadfast commitment to open-source principles.
Alibaba's recent personnel reshuffle at Tongyi Laboratory marks a significant enhancement in resource allocation. Notably, the creation of a foundational model support team, led by Alibaba CEO Wu Yongming to orchestrate group resources, highlights Alibaba's top-tier support for foundational model innovation. Internally, this move is characterized as a 'full commitment'.
01
Large Model Upgrade: Project No. 1
On March 5, 2026, Wu Yongming announced in an internal email: 'The company has accepted Lin Junyang's resignation. We express our gratitude to Lin for his contributions. Jingren will continue to oversee Tongyi Laboratory's ongoing projects. Meanwhile, we will establish a foundational model support team, with myself, Jingren, and Fan Yu collaborating to allocate group resources for foundational model advancement.'
The email also clarified that developing foundational large models stands as Alibaba's primary strategic focus for the future. While maintaining its open-source model approach, the company will keep ramping up R&D investment and step up efforts to recruit top-tier talent.
The crux of large model R&D lies not in concepts but in having enough top-notch talent to bring them to life. Alibaba's focus on ongoing talent acquisition signals a steady expansion in AI development and higher standards for talent quality.
Wu Yongming's tone in the email was firm: 'Technological progress is a zero-sum game.' The declaration serves as both a rallying cry and a pledge: foundational model development will remain a long-term priority with sustained resource allocation.
Large model R&D requires substantial resources and talent. Talent density is not merely a matter of headcount; it is a question of whether key roles cover the essential areas. Foundational model training demands continuous research, lower inference costs, greater stability, and real-world application by product and industry teams. A gap in any of these areas creates pressure on both cost and user experience.
Wu's email explicitly lists 'adhering to open-source principles,' 'increasing R&D investment,' and 'attracting top talent' as parallel priorities, clarifying long-term commitments. Organizational adjustments are common, but open-source principles, R&D investment, and talent acquisition will remain constant.
More importantly, Wu stressed that 'technological progress is a zero-sum game' and positioned foundational large models as 'a key strategic initiative for the future.' This underscores that foundational models are long-term projects where no team can maintain leadership solely through past achievements. Heightened competition demands greater talent density to shorten iteration cycles and more efficient organizational support to reduce resource fragmentation.
02
Large Model Team Steps into 'Elite' Mode
As competition intensifies, it is natural for Alibaba's AI organization to seek greater talent density and organizational agility.
Historically, Alibaba has rarely restructured its large model team, maintaining stability longer than its peers. Adjustments at this stage reflect adaptability. Such stability fosters research continuity, engineering accumulation, and long-term team cohesion. However, as the industry shifts from capability demonstration to scale competition, stability requires upgrades in collaboration, resource coordination, and product-model alignment.
The establishment of the foundational model support team enhances cross-team collaboration at the group level, shortening decision chains, streamlining resource allocation, and prioritizing critical tasks.
For Alibaba, past stability is not discarded but built upon. Stability deepens technical expertise, ensures reliable delivery, and solidifies open-source reputation. The support team adds layered resource coordination, enabling seamless R&D-product collaboration, rapid market responsiveness, and sustained long-term investment.
Wu's email also reaffirmed the commitment to open-source principles and increased R&D investment. Open-source serves as ecosystem cultivation: it attracts developers and spurs toolchain development, while enabling private deployments and customization for enterprise clients. A larger ecosystem yields more real-world feedback, which in turn demands continuous engineering refinement. The support team ensures robust resource backing for this ecosystem and smoother collaboration between open-source work and product teams.
03
Adhering to Open-Source Strategy
With the organizational upgrade, the Tongyi QianWen large model enters a new phase characterized by stronger group-level resource coordination, clearer open-source pathways, sustained R&D investment, and aggressive talent acquisition.
To transition from 'leading' to 'sustained leadership' in foundational models, competition hinges on long-term value. The support team's practical significance lies in facilitating continuous resource access for R&D and prioritizing critical projects.
Models, products, and ecosystems must align tightly amid fierce competition. For Tongyi QianWen, open-source expands developer reach and ecosystem appeal. However, in prolonged competition, seamless open-source-product integration is vital to convert reputation into sustained user and client retention.
Notably, despite significant turnover in OpenAI's founding team, its technological progress remained uninterrupted. This underscores that in fast-evolving fields like foundational models, talent mobility is common; the key differentiator is organizational continuity in direction and resources. Wu's email outlines clear leadership succession and resource coordination mechanisms to ensure core R&D sustainability.
Alibaba's talent strategy is also growing more open, with Wu explicitly pledging 'continued intensification of top talent acquisition.' This is essential for scaling foundational models: breakthroughs, engineering implementation, toolchain development, and industry solutions all require elite talent, and a more open strategy with higher talent density sustains the team's capacity to innovate.
From ByteDance to Tencent, from Google to OpenAI, AI leadership adjustments are routine among major firms. In contrast, Alibaba's large model team maintained rare long-term stability until this strategic adjustment. In the large model era, true stability stems from better collaboration, faster resource organization, and clearer priority management. The support team's establishment and Wu's involvement ensure 'stability' through effective resource coordination.
While organizational adjustments may spark casual conversation, a firm's long-term AI competitiveness hinges on management, investment, open-source principles, and talent density. QianWen's new phase involves translating existing capabilities into scalable impact across broader fields, letting more users and developers experience the value of sustained investment in real-world scenarios.