04/10 2026
On April 10, 2026, the global AI industry witnessed two significant developments: OpenAI officially suspended its “Stargate” supercomputing center project in the UK, and Maine became the first U.S. state to pass legislation restricting the construction of large data centers. While these events may seem isolated, they reveal a common trend—global AI infrastructure is facing a “double squeeze” from escalating energy costs and increasingly stringent regulatory policies.
The impact of these events extends far beyond mere “project suspension” or “legislative restrictions.” They signal a fundamental shift in the logic of global computing power distribution, presenting China's computing power industry with a rare strategic opportunity.
■ The Suspension of “Stargate”: An Inevitable Retreat
High Costs Compel OpenAI to Pause
The direct catalyst for OpenAI's suspension of the UK “Stargate” project is clear-cut: the UK boasts one of the highest industrial electricity prices globally, with grid connection timelines extending up to 18 months. Coupled with ambiguous regulatory rules following the implementation of the EU AI Act and setbacks in copyright legislation, these factors have collectively caused the project's long-term return on investment to plummet below expectations.
This decision was driven by sound commercial logic. In September 2025, OpenAI partnered with UK data center startup Nscale and chip giant NVIDIA to launch the UK version of the “Stargate” project, planning to deploy up to 8,000 NVIDIA GPUs. This initiative was seen as a cornerstone of the UK's ambition to become a global AI hub, included in the country's £31 billion tech investment plan.
However, reality fell short of ambition. The UK's energy cost issues proved far more severe than anticipated. According to 2025 data from the International Energy Agency (IEA), the UK's industrial electricity price stands at approximately £258 per megawatt-hour (i.e., £0.258 per kWh, or about RMB 2.3649 per kWh), having surged by over 124% since 2023—the highest level among G7 nations.
For an AI supercomputing center with a design capacity of hundreds of megawatts, annual electricity costs would be astronomical. Even more daunting is the 18-month grid connection timeline, meaning that even if construction began immediately, the project would not be operational for another year and a half.
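To see why the bill is "astronomical," a back-of-envelope estimate helps. The 200 MW capacity and 90% utilisation below are illustrative assumptions (the article only says "hundreds of megawatts"); the £258/MWh price is the IEA figure cited above.

```python
# Back-of-envelope annual electricity cost for an AI supercomputing
# centre at UK industrial rates. Capacity and utilisation are assumed
# illustrative values, not figures from the article.
PRICE_GBP_PER_MWH = 258      # UK industrial price (IEA, cited above)
CAPACITY_MW = 200            # assumed facility capacity
UTILISATION = 0.90           # assumed average load factor
HOURS_PER_YEAR = 8760

annual_mwh = CAPACITY_MW * UTILISATION * HOURS_PER_YEAR
annual_cost_gbp = annual_mwh * PRICE_GBP_PER_MWH
print(f"Annual consumption: {annual_mwh:,.0f} MWh")
print(f"Annual electricity cost: £{annual_cost_gbp / 1e6:,.0f} million")
```

Under these assumptions the electricity bill alone exceeds £400 million a year, before hardware, staffing, or cooling infrastructure.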
In its statement, OpenAI emphasized advancing “when regulatory environments and energy costs are conducive to long-term infrastructure investment”—a diplomatic way of saying “temporary abandonment.” Unless UK energy costs drop significantly or regulatory policies become clearer, this project is likely to remain on hold indefinitely.
Maine's “Construction Ban”: Balancing Livelihood and Environment
Concurrently, the Maine State Legislature passed a landmark bill imposing a moratorium on the construction of data centers larger than 20 megawatts through the end of November 2027. This makes Maine the first U.S. state to halt large data center construction through legislation.
What does 20 megawatts represent? It is equivalent to the electricity consumption of 20,000 households. A medium-sized data center typically consumes between 5 and 50 megawatts. Maine's bill targets large data centers, reflecting deep concerns from local environmental groups and lawmakers about “energy competition”: high-energy-consuming data centers could drive up residential electricity costs and squeeze out power resources for civilian use.
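The household equivalence can be sanity-checked with a quick sketch. The ~1 kW average household load used here is a common rule of thumb, not a figure from the bill:

```python
# Sanity check: how many households does a 20 MW data centre equal?
# Assumes an average household load of ~1 kW (about 8,760 kWh/year),
# a common rule of thumb rather than a figure from the legislation.
DATA_CENTER_MW = 20
AVG_HOUSEHOLD_LOAD_KW = 1.0

households = DATA_CENTER_MW * 1000 / AVG_HOUSEHOLD_LOAD_KW
print(f"{DATA_CENTER_MW} MW is roughly {households:,.0f} households")
```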
Notably, the tech industry did not remain silent. Industry representatives argued that data centers would bring jobs, tax revenue, and investment, filling the void left by the decline of traditional industries in the region. The legislative outcome, however, shows that lawmakers prioritized balancing livelihoods and the environment over attracting business.
Maine's case sets a precedent, and more U.S. local governments may follow suit with similar energy consumption control policies in the future. For AI companies, this means that selecting locations for data centers will become even more challenging.
The Deeper Logic Behind the “Double Dilemma”
When viewed together—OpenAI's suspension of its UK project and Maine's legislative ban—a clear picture emerges: global AI infrastructure is facing a “double squeeze” from rigid energy cost constraints and tightening regulatory policies.
In recent years, tech giants have aggressively deployed AI computing power worldwide, following a simple logic—build data centers where there is demand and a market. But now, this logic is being disrupted. Rigid energy cost constraints mean that AI supercomputing centers are no longer something that can be built “on a whim,” while tightening regulatory policies have significantly increased uncertainty in investment returns.
The UK and U.S. cases are not isolated. The implementation of the EU AI Act has subjected AI companies to stricter compliance requirements, while uncertainties in copyright legislation have increased legal risks associated with training data usage. For AI giants pursuing global expansion, “stability” is becoming as important a location parameter as “performance.”
■ Global Computing Power Landscape Reshuffled: China's Strategic Window of Opportunity
China's “Proactive Deployment” in Computing Power Policy
Against the backdrop of a global AI infrastructure “cooldown,” China is demonstrating unique policy advantages. Since 2026, China has introduced a series of stringent policies, establishing the world's most comprehensive regulatory and support system for computing power.
The most symbolic policy is the “80% Green Power Hard Constraint.” The National Data Administration mandates that new and expanded data centers in the eight “East Data, West Computing” hub nodes must achieve a green power consumption ratio of ≥80%, with green certificates as the sole accounting voucher, implementing a “one-vote veto”—projects that fail to meet this standard will not receive energy consumption approvals or be allowed to connect to the grid. This requirement has been upgraded from a 2023 “target” to a 2026 “entry threshold,” eliminating the possibility of companies “cutting corners.”
The PUE (Power Usage Effectiveness) tiered red lines are equally strict. Western smart computing centers must maintain a PUE ≤1.20, eastern centers ≤1.25, and newly built large data centers ≤1.1, far below the national average of 1.38. PUE is the ratio of a data center's total energy consumption to its IT equipment energy consumption; lower values indicate higher energy efficiency. A PUE of 1.1 means that for every 1 kWh consumed by IT equipment, only an additional 0.1 kWh goes to non-IT loads such as cooling and lighting.
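The PUE arithmetic can be made concrete with a small sketch; the facility figures below are illustrative assumptions, not data from any specific center.

```python
# PUE = total facility energy / IT equipment energy.
# Illustrative figures: a centre whose IT gear draws 10,000 kWh while
# cooling, lighting, etc. add 1,000 kWh sits exactly at the 1.1 red line.
it_energy_kwh = 10_000    # assumed IT equipment consumption
overhead_kwh = 1_000      # assumed cooling/lighting/other overhead
total_kwh = it_energy_kwh + overhead_kwh

pue = total_kwh / it_energy_kwh
print(f"PUE = {pue:.2f}")
print(f"Meets the <=1.1 red line: {pue <= 1.1}")
```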
“Computing-Power-Electricity Coordination” is a more forward-looking initiative. The 2026 government work report included “computing-power-electricity coordination” in new infrastructure projects for the first time, promoting deep integration of computing power and power systems through direct green power supply and integrated source-grid-load-storage systems to achieve “strengthening computing with electricity and promoting electricity with computing.”
China's policy design thinking is clear: rather than making passive adjustments after problems arise, it is better to set high standards and strict requirements in advance to drive industry upgrades. This “proactive deployment” approach has positioned China favorably in the global computing power competition.
“Natural Advantages” in Energy Endowment
China's western provinces (Inner Mongolia, Guizhou, Gansu, Ningxia, etc.) possess vast wind, solar, and hydroelectric resources, providing low-cost and stable energy support for the computing power industry.
The cost advantage is significant. Green power direct supply prices in western hub nodes generally sit below RMB 0.4 per kWh, and in some areas as low as RMB 0.28–0.36, far lower than overseas markets like the UK and the U.S. This means that for the same computing output, operating costs in China could be half or even less than those overseas.
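A rough comparison of annual electricity cost for the same computing load makes the gap concrete. The prices come from figures cited in this article; the 10 MW load is an illustrative assumption, and electricity is only one component of total operating cost.

```python
# Rough electricity-cost comparison for the same computing load in
# western China vs. the UK. Prices are figures cited in the article;
# the 10 MW continuous load is an illustrative assumption.
CHINA_RMB_PER_KWH = 0.32   # mid-range of the 0.28-0.36 green direct supply band
UK_RMB_PER_KWH = 2.36      # UK industrial price converted to RMB
LOAD_MW = 10
HOURS_PER_YEAR = 8760

annual_kwh = LOAD_MW * 1000 * HOURS_PER_YEAR
china_cost = annual_kwh * CHINA_RMB_PER_KWH
uk_cost = annual_kwh * UK_RMB_PER_KWH
print(f"China: RMB {china_cost / 1e6:.0f}M   UK: RMB {uk_cost / 1e6:.0f}M")
print(f"China's electricity bill is about {china_cost / uk_cost:.0%} of the UK's")
```

On electricity alone the ratio is closer to one-seventh; "half or lower" overall is conservative once land, labor, and equipment are added in.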
Resource matching is equally precise. The “East Data, West Computing” initiative aligns eastern computing demand with western green power supply, avoiding the “energy mismatch + high costs” dilemma seen in overseas markets. Eastern regions face land scarcity and high energy costs but have massive computing demand, while western regions boast abundant energy and land but lack sufficient consumption scenarios. “East Data, West Computing” achieves a win-win situation.
The consumption value is even more critical. The computing power industry has become a key scenario for green power consumption, helping to address the issue of “wind and solar curtailment” in the west. Wind and solar power generation are intermittent and volatile, with nighttime wind curtailment and daytime solar curtailment being common. However, if data centers are built alongside these resources, they can “consume” this otherwise wasted electricity, forming a virtuous cycle of “computing power + green power.”
“Full-Chain Advantages” in Industrial Ecosystem
At the equipment and technology level, China has formed a complete industrial chain. In areas such as liquid-cooled servers, intelligent temperature control, and data center infrastructure, leading companies have reached international advanced levels. Tech giants like Tencent, Alibaba Cloud, and Huawei operate facilities with PUE below 1.1, with leading projects as low as 1.06.
In terms of implementation experience, hub clusters like Gui'an and Horinger have built large-scale computing centers with green power accounting for over 90%, establishing a replicable “computing-power-electricity coordination” model. These cases provide templates for nationwide promotion and prove the feasibility of the “Chinese approach.”
In terms of policy support, various localities have introduced “computing power vouchers,” energy-saving renovation subsidies, and other support policies to reduce corporate landing costs and accelerate the aggregation of the computing power industry. For example, Inner Mongolia not only offers low electricity prices but also provides tax incentives and land guarantees, creating a complete business environment.
■ Industrial Opportunities: Three Sectors Enter a Golden Period
1. Green Power Sector: From “Passive Consumption” to “Active Supply”
The setbacks in overseas AI infrastructure are fundamentally due to the unsustainability of high-energy-consuming models. This will directly benefit China's green power industry, driving a shift from “passive consumption” to “active supply.”
Rigid demand is surging. According to data from the China Academy of Information and Communications Technology, China's data centers consumed approximately 200 billion kWh of electricity in 2025, accounting for about 2% of total societal electricity consumption. With the rigid requirement of 80% green power consumption, new computing power projects will directly drive significant growth in green power consumption. Based on an average annual new computing capacity of 5 million kW across the eight hub nodes and 8,000 operational hours per year, annual new green power demand would reach approximately 40 billion kWh.
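The demand estimate can be reproduced with a quick calculation; note that the 40 billion kWh figure corresponds to roughly 5 million kW of annual new capacity running 8,000 hours a year.

```python
# Reproducing the annual new green power demand estimate for the
# eight hub nodes. 40 billion kWh implies ~5 million kW of new
# capacity at 8,000 operational hours per year.
NEW_CAPACITY_KW = 5_000_000   # annual new computing capacity
HOURS_PER_YEAR = 8_000        # assumed operational hours

annual_demand_kwh = NEW_CAPACITY_KW * HOURS_PER_YEAR
print(f"Annual new demand: {annual_demand_kwh / 1e9:.0f} billion kWh")
```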
Business models are accelerating innovation. Direct green power connections and aggregated green power supply will become mainstream models. Western green power resources will continuously flow eastward through “computing power corridors,” opening up new spaces for green power consumption. More and more energy companies will realize that instead of struggling to sell electricity on spot markets, they can sign long-term agreements with computing power companies to secure stable orders.
Policy dividends continue to be released. Computing-power-electricity coordination policies deeply bind green power with computing power, providing green power companies with stable long-term contracts and enhancing profit certainty. For wind and solar operators, computing power customers are becoming new “prized assets.”
2. Energy Storage Sector: From “Supporting Role” to “Standard Configuration”
Energy storage is the “stabilizer” for green power and a “standard component” of computing power infrastructure.
Rigid supporting demand is rising. Given the intermittency of green power, computing centers need energy storage to achieve “peak shaving and valley filling” and ensure stable computing operations. Energy storage has shifted from a past “optional accessory” to a “must-have.” This value is even more pronounced in regions with significant electricity price fluctuations.
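The economics behind "peak shaving and valley filling" can be sketched as a simple price-arbitrage calculation. All figures below are illustrative assumptions, not data from the article: charge at valley prices, discharge at peak prices, net of round-trip losses.

```python
# Illustrative storage arbitrage: buy at valley prices, sell at peak,
# net of round-trip losses. All figures are illustrative assumptions.
VALLEY_RMB_PER_KWH = 0.30
PEAK_RMB_PER_KWH = 0.90
ROUND_TRIP_EFFICIENCY = 0.88   # typical for lithium-ion systems
CYCLED_KWH_PER_DAY = 100_000   # assumed daily cycled energy

daily_revenue = CYCLED_KWH_PER_DAY * ROUND_TRIP_EFFICIENCY * PEAK_RMB_PER_KWH
daily_cost = CYCLED_KWH_PER_DAY * VALLEY_RMB_PER_KWH
daily_margin = daily_revenue - daily_cost
print(f"Daily arbitrage margin: RMB {daily_margin:,.0f}")
```

The wider the peak-valley spread, the stronger the case for co-locating storage with the data center, which is why the text notes the value is most pronounced where prices fluctuate sharply.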
Technological iterations are accelerating. Liquid-cooled energy storage, compressed air energy storage, and other technologies are being deployed rapidly, improving energy storage efficiency and safety while reducing overall costs. By 2025, energy storage cell prices had dropped by over 60% from 2022 levels, significantly improving economic viability. In 2026, as scale effects further materialize, energy storage costs are expected to continue declining.
Policy support is intensifying. Computing-power-electricity coordination pilots explicitly encourage the integrated construction of energy storage and computing centers, with local governments providing subsidies to further reduce corporate investments. In regions like Inner Mongolia and Gansu, the co-construction of energy storage projects and data centers has become the norm.
3. Computing Power Network Sector: From “Decentralized Layout” to “Regional Aggregation”
Global AI computing power resources will accelerate their aggregation in China, particularly in western hub nodes with green power and compliance advantages.
The reshaping of the landscape is accelerating. Overseas computing power deployments are facing obstacles, leading to a reconfiguration of global AI computing power resources. Leveraging its stable policies, low energy costs, and mature industrial ecosystem, China is poised to absorb more international computing demand. Western nodes like Gui'an, Ulanqab, Horinger, and Zhongwei are becoming significantly more attractive.
The division of labor is becoming increasingly refined. The eastern region will concentrate on low-latency computing and AI applications, whereas the western region will undertake high-energy-consuming computing training and data storage, creating a “coordinated east-west, nationwide” computing power framework. This division is not merely a relocation of “backward production capacity”; rather, it is a precise alignment based on comparative advantages.
Technological advancements are accelerating. The intelligent scheduling of computing power networks and cloud-edge-end collaboration are being rapidly implemented, enhancing computing efficiency and reducing overall energy consumption. By 2025, a national integrated computing power network had begun to take shape, and cross-regional scheduling channels are set to open further in 2026.
■ Comparison and Insights: China's Comparative Advantages
Differences in Policy Approaches
The UK and the U.S. have adopted a clearly “market-oriented approach” toward AI infrastructure, with governments rarely intervening and leaving decisions to enterprises. However, the drawback of this model is that when energy costs rise and regulatory environments worsen, companies can only “vote with their feet” by withdrawing or postponing projects.
China's approach differs. The government plans proactively and guides the orderly development of the computing power industry through national-level strategies like “East Data, West Computing.” While less flexible than a purely market-oriented model, this “planning-first” approach demonstrates greater resilience in addressing systemic risks.
Differences in Energy Strategies
The UK's challenge lies in its high and volatile energy costs. Industrial electricity prices are steep, grid connection timelines are lengthy, and green power ratios are low—factors that collectively put the UK at a disadvantage in the AI computing power race.
Maine, in the U.S., has taken the opposite extreme—restricting data center construction due to concerns about energy competition. While this “one-size-fits-all” approach protects civilian electricity use, it also hampers industrial development to some extent.
China's strategy is one of “balance”—promoting green and efficient industry development through stringent policies like green power ratios and PUE controls, while achieving precise matching of energy supply and demand through initiatives like “East Data, West Computing.” This “carrot-and-stick” approach is more sustainable.
Differences in Industrial Ecosystems
Globally, China is one of the few countries with a complete computing power industrial chain. From chips to servers, data centers to network equipment, software to applications, China possesses the corresponding industrial foundations. This full-chain advantage gives China stronger resilience when facing external shocks.
The UK's issue is its fragmented industrial ecosystem. It relies on imports for chips and equipment and, while it has talent, it lacks the capacity to support a complete industrial chain. The U.S., despite its technological edge, faces mounting energy and regulatory constraints. China's strengths lie in its stable policies, low energy costs, and complete industrial ecosystem—all of which are essential.
■ Opportunities Amid Crisis
OpenAI's suspension of the UK “Stargate” project and Maine's legislative ban on large data centers may seem like “local issues,” but they are, in fact, pivotal events marking a new phase in global AI infrastructure development. Stringent energy cost constraints and tightening regulatory policies are reshaping the fundamental logic of global computing power distribution.
For China, this presents both challenges and opportunities.
The challenge lies in the need to continuously enhance technological capabilities to meet increasingly stringent environmental and energy consumption standards. The opportunity is that, leveraging its three advantages—policy coordination, energy endowment, and industrial ecosystem—China is well-positioned to secure a more favorable standing in the global computing power competition.
The author believes that the next 3–5 years will be a critical window of opportunity. China's computing power companies need to focus on three key directions:
First is green development. Early arrangements should be made for energy-saving technologies and green power support to comply with new regulations on energy consumption and green power supervision. A PUE below 1.1 and a green power ratio exceeding 80% will become standard requirements.
Second is regionalization. Priority should go to the green-power-rich western regions and the “East Data, West Computing” hub nodes, reducing both costs and policy risks. The model of computing in the west for use in the east will become mainstream.
Third is collaboration. Strengthening cooperation with energy and power companies is essential to build an integrated “computing power + green power + energy storage” ecosystem. A single enterprise can hardly navigate the complex interplay of stakeholders on its own; ecosystem alliances are the way forward.
The reconstruction of the global computing power landscape has just begun. In this competition without gunpowder smoke, China has already secured a favorable position. However, whether it can ultimately prevail depends on policy implementation and industrial development in the coming years.