Early Tuesday morning, Beijing time, OpenAI and Nvidia jointly unveiled a strategic cooperation letter of intent, outlining plans to deploy at least 10 gigawatts of Nvidia systems to bolster OpenAI's next-generation AI infrastructure, with Nvidia intending to invest up to $100 billion as the systems come online.

September 23, 2025

What does this landmark collaboration signify?

01 Unprecedented Scale

The 10-gigawatt project marks an unparalleled ambition in AI infrastructure development.

Nvidia CEO Jensen Huang, in an interview with CNBC, stated that the planned 10 gigawatts correspond to the power consumption of 4 to 5 million graphics processing units (GPUs), roughly Nvidia's total GPU shipments for this year and double last year's volume.

Deployment will begin in the second half of 2026 and proceed in phases, with Nvidia's investment tranches tied to each deployment milestone.
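Huang's figures imply a facility-level power budget per GPU. The sketch below is a back-of-envelope check inferred from the numbers he cited, not a figure stated in the deal; the implied per-GPU draw would include cooling and networking overhead, not just the chip itself.

```python
# Back-of-envelope check of Huang's figures (inference, not deal terms)
total_power_w = 10e9             # 10 gigawatts, in watts
gpus_low, gpus_high = 4e6, 5e6   # "4 to 5 million GPUs"

# Implied facility-level power budget per GPU (chip + cooling + networking)
per_gpu_high = total_power_w / gpus_low    # fewer GPUs -> more watts each
per_gpu_low = total_power_w / gpus_high
print(f"Implied power per GPU: {per_gpu_low:.0f}-{per_gpu_high:.0f} W")
```

Two to two and a half kilowatts per accelerator, all-in, is in line with what rack-scale systems of this class are reported to draw.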

02 OpenAI's 'Risk Management'

OpenAI's computing demands mirror the harsh realities of AI development. With 700 million weekly active users on ChatGPT, OpenAI must continually push technological boundaries to justify its $500 billion valuation. Training next-generation AI models necessitates massive computing power, reshaping how AI companies finance and allocate resources.

This partnership addresses OpenAI's immediate needs. Traditional financing would dilute shareholder equity, especially as OpenAI contemplates restructuring. Securing priority access to chips requires more than mere purchases; suppliers also need incentives.

However, the financial structure is somewhat circular. OpenAI receives up to $100 billion from Nvidia in stages, and much of that capital flows back to Nvidia through chip purchases. Whether or not OpenAI turns a profit, Nvidia stands to benefit.

The timing also reflects fierce competition for AI infrastructure. OpenAI recently committed roughly $350 billion to cloud services, primarily with Oracle, alongside Microsoft and Google. The Nvidia deal underscores OpenAI's desire to diversify supplier risk while securing access to cutting-edge chips.

03 Nvidia's 'Grand Strategy'

For Nvidia, this transcends chip sales; it is about dominating the AI ecosystem. Nvidia has been investing broadly across AI: $5 billion in Intel to co-develop custom CPUs, $700 million in UK data center company Nscale, and over $900 million to acquire the team and technology of AI startup Enfabrica.

Nvidia is evolving from a hardware supplier into an 'overseer' of AI infrastructure.

This investment also helps Nvidia counter competition. AMD is launching more powerful chips, cloud providers are developing their own silicon, and OpenAI plans to mass-produce its own custom chips next year. By holding stakes in its customers, Nvidia raises the cost of switching suppliers.

The partnership also grants Nvidia insight into OpenAI's technological roadmap. Both will jointly optimize OpenAI's models and Nvidia's hardware and software, enabling Nvidia to steer its R&D based on OpenAI's direction. Even if OpenAI reduces Nvidia chip usage, Nvidia benefits as a shareholder.

04 Computing Infrastructure as the Foundation of the Future Economy

Both companies frame the partnership in transformative terms.

Altman emphasizes that 'computing infrastructure will become the bedrock of the future economy.' Huang describes supporting 'next-generation intelligence' through unprecedented data center construction.

Such rhetoric is compelling amid economic uncertainty.

Last week, the Federal Reserve cut interest rates amid signs of a cooling labor market. Chair Jerome Powell noted, 'The labor market is indeed cooling.' Traditional economic indicators advise caution, while AI investments are seen as the sole hedge against recession.

As an open-ended letter of intent, the partnership's longevity hinges on progress toward artificial general intelligence—a goal still unclear in definition and timeline. The collaboration assumes that scaling infrastructure will accelerate capability development, creating economic returns that justify the investment.

05 AI's Endgame Is Electricity

To grasp the scale of this power demand: 10 gigawatts roughly equals the output of 10 nuclear reactors, since a typical reactor produces about 1 gigawatt. Today's data centers consume anywhere from 10 megawatts to 1 gigawatt, with most large facilities in the 50 to 100 megawatt range.

OpenAI's planned infrastructure will far surpass existing facilities, requiring power equivalent to multiple major cities.
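The equivalences above reduce to simple arithmetic. The sketch below uses only the figures cited in this section, taking the midpoints of the stated ranges as rough yardsticks.

```python
# Scale comparison using the figures cited above (rough equivalences)
project_gw = 10.0          # planned OpenAI-Nvidia deployment
reactor_gw = 1.0           # typical nuclear reactor output
large_dc_mw = (50, 100)    # typical large data center range, in megawatts

reactors = project_gw / reactor_gw               # reactor equivalents
dc_high = project_gw * 1000 / large_dc_mw[0]     # vs. 50 MW facilities
dc_low = project_gw * 1000 / large_dc_mw[1]      # vs. 100 MW facilities
print(f"~{reactors:.0f} reactors, or {dc_low:.0f}-{dc_high:.0f} large data centers")
```

In other words, the project is on the order of a hundred or two of today's large data centers consolidated into one buildout.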

Altman's ambition for hyperscale data center deals dates back a year. Last September, Constellation Energy CEO Joe Dominguez told Bloomberg he heard Altman wanted to build 5 to 7 data centers, each 5 gigawatts. Digiconomist's Alex de Vries told Fortune that seven 5-gigawatt units would consume 'twice New York State's electricity.'
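De Vries's "twice New York State" claim can be sanity-checked. The sketch below assumes New York State consumes roughly 150 terawatt-hours of electricity per year, an outside figure not given in the article, and that the sites run at full load around the clock.

```python
# Checking de Vries's comparison: seven 5 GW sites vs. New York State
# Assumption: NY State uses ~150 TWh/year (outside figure, not from the article)
sites, gw_each = 7, 5
hours_per_year = 8760
ny_state_twh = 150

annual_twh = sites * gw_each * hours_per_year / 1000   # GW x h -> TWh
ratio = annual_twh / ny_state_twh
print(f"~{annual_twh:.0f} TWh/year, about {ratio:.1f}x New York State")
```

At full continuous load the seven sites would draw about 307 TWh a year, which is indeed roughly double that assumed state-wide consumption.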

Other large-scale AI infrastructure projects are emerging across the U.S. In July, Cheyenne, Wyoming, officials announced plans for an AI data center eventually reaching 10 gigawatts—even its initial 1.8-gigawatt phase would exceed the state's total household electricity consumption. Whether this project relates to OpenAI's plans remains unclear.

06 Computing Resources Edge Toward Monopoly

Against the backdrop of U.S. advanced semiconductor export controls, the Nvidia-OpenAI partnership represents a bet on maintaining U.S. AI leadership through superior infrastructure access while concentrating critical capabilities among few players.

This collaboration intensifies existing concerns about the AI industry's structure.

Nvidia, Microsoft, and OpenAI increasingly control key infrastructure layers through cross-investments. Microsoft holds a significant stake in OpenAI while providing cloud services. Nvidia deepens ties through direct investment and hardware supply.

The Nvidia-OpenAI deal crystallizes a trend: hardware and software giants deepen technical cooperation through financial bundling. This is both an infrastructure gamble and shrewd risk management.

The partnership may accelerate AI development but concentrates market power; it may justify investments but introduces systemic risks; it may solidify U.S. technological leadership but restricts competition.

The phased structure offers flexibility, but the core logic is unavoidable—AI development demands massive scale, and scale requires super-collaboration. Initially, this seemed temporary coordination during rapid growth. Now, it appears a permanent industry fixture.
