03/10 2026
*The content is solely for presenting different market perspectives and research viewpoints and does not imply that this official account endorses the views or conclusions in the article.
When a technological system starts to look to "space" as the answer to industrial bottlenecks, it has often reached the end of its efficiency dividends.
Looking back through history, the decline of a major power's technological system rarely begins with the failure of a specific technology but rather with the misallocation of capital and the locking up of resources.
In the 1970s and 1980s, the Soviet Union had the world's top fluid dynamicists and materials scientists, who devoted their most expensive industrial resources and brightest minds to the space race and the military-industrial complex.
However, this excessive fascination with "grand projects" caused them to miss out on the personal computer, consumer electronics, and internet revolutions. While Silicon Valley engineers were writing code in garages, Moscow's elite were calculating how to get more tonnage into low-Earth orbit.
History often repeats itself in similar ways. Today, there is an intriguing sign emerging in U.S. tech capital: to satisfy the seemingly insatiable hunger for AI computing power, some capital is beginning to seriously discuss—moving data centers into space. This is not just a technological idea but also a profound metaphor for capital cycles, industrial bottlenecks, and the end of narratives.
Computing Power Anxiety:
AI is Pushing for a "Space Solution"
In the current AI frenzy, there is a widespread misconception in the market that the bottleneck for computing power lies in chips. However, for those deep in the industry, the real shackles are not GPU production capacity but energy and land.
The training and inference of large AI models are essentially battles of energy consumption. A large-scale data center with a capacity of 1 GW (gigawatt) consumes as much power as a medium-sized city. In the U.S., as the AI arms race escalates, the construction of large data centers has begun to encounter unprecedented "triple constraints."
The first is the lengthy process of power approvals. In the U.S., building a new large substation or connecting to the grid often requires years of environmental impact assessments and administrative approvals. The second is land and environmental restrictions. Data centers are sources of noise and heat emissions, making site selection increasingly difficult in densely populated states with strict environmental regulations. Finally, the cost of cooling systems has skyrocketed. As chip power density increases, traditional air cooling is nearing its limits. While liquid cooling is efficient, it places higher demands on water resources and infrastructure.
It is against this backdrop of dwindling ground resources that some extreme technological paths are being seriously considered—sending data centers into space.
This idea is not confined to the realm of science fiction. In 2025, Starcloud, a company backed by NVIDIA, sent a satellite equipped with an H100 GPU into orbit aboard a SpaceX Falcon 9 rocket. Weighing just 60 kilograms and about the size of a small refrigerator, this satellite embodies a sweeping industrial ambition: future data centers will no longer sit in Silicon Valley basements or Nevada deserts but in Earth orbit.
In orbit, a data center could in theory tap nearly unlimited solar energy, freeing it entirely from the fluctuations and limits of ground-based power grids. The vacuum of space also avoids atmospheric drag and eliminates the headaches of land acquisition, noise complaints, and environmental regulation. Starcloud has set an even more extreme goal: an orbital data center cluster with a total power of 5 GW, whose physical dimensions would reach the kilometer scale. This is equivalent to moving an entire "AI computing city" into space.
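The "kilometer scale" claim can be sanity-checked with a back-of-envelope solar-array calculation. The solar constant is a physical fact; the 20% cell efficiency is an illustrative assumption, not a Starcloud figure:

```python
# Back-of-envelope check of the "kilometer scale" claim for a 5 GW
# orbital cluster. Cell efficiency is an illustrative assumption.
SOLAR_CONSTANT_W_M2 = 1361      # solar irradiance in Earth orbit
CELL_EFFICIENCY = 0.20          # assumed photovoltaic efficiency
TARGET_POWER_W = 5e9            # 5 GW target

power_per_m2 = SOLAR_CONSTANT_W_M2 * CELL_EFFICIENCY   # ~272 W/m^2
array_area_m2 = TARGET_POWER_W / power_per_m2          # ~1.8e7 m^2
side_km = array_area_m2 ** 0.5 / 1000                  # side of a square array

print(f"Required array area: {array_area_m2 / 1e6:.1f} km^2")
print(f"Equivalent square side: {side_km:.1f} km")
```

Under these assumptions the collecting surface alone comes out around 18 km², a square more than four kilometers on a side, which is consistent with the article's kilometer-scale characterization.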
This vision sounds exhilarating, as if a new physical space has been found for the infinite growth of AI. However, when the industry begins to seriously discuss such "ultra-high-cost, ultra-high-difficulty" solutions, it often means that the conventional growth space on the ground is no longer sufficient. The economics of this dream are almost entirely untenable, but as a signal, it is deafening.
Cost Realities:
The Economics of Space-Based Computing Power Are Almost Impossible to Justify
If we strip away emotion and vision and look at the issue purely through a financial and engineering lens, the numbers for space-based data centers are disheartening.
BNP Paribas once conducted a detailed calculation, and the conclusion was very direct: current space launch costs range from $1,500 to $3,600 per kilogram. Although SpaceX's Starship promises to significantly reduce launch costs, for orbital data centers to be economically feasible, costs must fall below $300 per kilogram. In other words, launch costs need to drop by more than 80%, and a very high launch frequency must be maintained.
At current costs, building a 1 GW space-based data center could run to more than $100 billion. Of this, launch costs alone would range from $30 billion to $75 billion, while the satellites and hardware systems would cost about $50 billion to manufacture. By contrast, a ground-based data center of the same scale costs only $35 billion to $50 billion to build.
This means the space solution is not only technically more complex but also more than twice as expensive. And this is just the construction cost (CAPEX), not including operating costs (OPEX).
More importantly, the space environment places extreme demands on hardware. On the ground, servers run in a constant-temperature, normal-pressure, low-radiation environment, and failed hardware is easily swapped out. In orbit, electronics must be radiation-hardened to aerospace grade to resist high-energy cosmic rays; otherwise bit flips will corrupt computations or crash the system outright. Cooling is an equally enormous challenge: in a vacuum there is no convection, so heat can only be shed by radiation, which demands a huge radiator area and complex fluid loops. Orbital maintenance, meanwhile, is all but impossible. Once hardware fails, it must either be masked in software or the entire satellite written off.
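The radiator problem can be quantified with the Stefan-Boltzmann law, P = εσAT⁴, which governs how much heat a surface can shed by radiation alone. The radiator temperature and emissivity below are illustrative assumptions, and absorbed sunlight is ignored for simplicity:

```python
# Why vacuum cooling needs "a huge cooling area": with no convection,
# a radiator sheds heat only per the Stefan-Boltzmann law, P = e*sigma*A*T^4.
# Temperature and emissivity are illustrative assumptions.
SIGMA = 5.670e-8                # Stefan-Boltzmann constant, W/(m^2*K^4)
EMISSIVITY = 0.9                # assumed radiator emissivity
RADIATOR_TEMP_K = 300.0         # assumed radiator temperature (~27 C)
HEAT_LOAD_W = 1e9               # 1 GW of waste heat to reject

flux_w_m2 = EMISSIVITY * SIGMA * RADIATOR_TEMP_K ** 4   # ~413 W/m^2
area_m2 = HEAT_LOAD_W / flux_w_m2                       # ~2.4e6 m^2

print(f"Radiative flux: {flux_w_m2:.0f} W/m^2")
print(f"Radiator area for 1 GW: {area_m2 / 1e6:.1f} km^2")
```

Under these assumptions, rejecting 1 GW of waste heat at room temperature requires on the order of 2.4 km² of radiator surface, before accounting for sunlight falling on the panels, which only makes the requirement larger.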
Taken together, these factors push the cost per unit of computing power for space-based servers far above that of ordinary data centers. In other words, AI has not solved the computing-power cost problem; it has simply pushed the costs somewhere even more extreme.
In commercial logic, when an industry begins to discuss "cost-irrelevant" solutions, it usually means that the industry's marginal returns are sharply declining. When the industry starts to rely on such "ultra-high-cost solutions" to sustain growth narratives, it often means that a technological cycle is nearing its end. Capital is willing to pay for it not because it is a good business but because the ground-based story can no longer be told.
Historical Metaphor:
U.S. Tech Capital is Retracing the Soviet Path
Looking at the bigger picture, we will find that this phenomenon hides a profound historical metaphor. U.S. tech capital is inadvertently retracing the Soviet path of resource misallocation.
The Soviet technological system once commanded world-class scientific capabilities. It achieved the first artificial satellite, the first human in space, and the first space station, and its engineering feats in aerospace and military technology were awe-inspiring. Yet at the same time, the Soviet Union missed personal computers, the software industry, and the internet. The reason is simple: the best engineering resources, the cheapest energy, and the highest-priority supply chains were all locked up in "national-level technological projects." These projects were grand and politically significant but commercially inefficient, unable to be converted into civilian productivity or to sustain a self-reinforcing commercial ecosystem.
Today, the U.S. AI industry is showing a similar tendency. Capital is increasingly concentrated in ultra-large-scale infrastructure projects: hundreds of billions of dollars in AI computing power investments, clusters of millions of GPUs, nuclear-powered data centers, and now even space-based computing power infrastructure.
The common characteristics of these projects are their enormous scale, capital intensity, and extremely long return cycles. They resemble "national projects" more than commercial products. From an investment perspective, this is a typical cyclical signal: when industry growth begins to rely on giant infrastructure investments, it means that technological dividends are marginally declining.
In the 1840s, Britain experienced the "Railway Mania"; in 2000, the U.S. experienced the "Dot-com Bubble." The common feature of these two periods was that the technology was real, and the demand was real, but the scale of capital investment far exceeded the future realizable returns. Railways did transform transportation, and fiber optics did carry the internet, but at the peak of the bubbles, the railway lines built would never be profitable, and the fiber optics laid would never be fully utilized.
Today, more and more institutions on Wall Street are beginning to worry that the AI industry is entering a similar phase. Space-based data centers may be the extreme symbol of this round of AI capital narratives. They represent capital's almost desperate outward push in the face of bottlenecks on the ground.
If even computing power has to fly into orbit, it means the growth space on the ground is no longer sufficient. This does not necessarily mean the end of the AI revolution: AI technology will keep advancing, and real applications will keep arriving. But it may mean that the narrative of U.S. tech stocks is nearing the peak of a cycle.
When capital no longer pursues cost reductions per unit of computing power but instead pursues the physical expansion of computing power, it is usually a sign that efficiency dividends have been exhausted. The Soviet Union did not lose in technology but in economic efficiency; the challenge U.S. tech giants face now may not be technological bottlenecks but bottlenecks in capital returns.
In this high-stakes gamble, space-based data centers may never become mainstream, but as a milestone, they mark the turning point of AI enthusiasm from "pragmatic optimization" to "grand narratives." For investors, when they hear stories like "moving data centers into space," perhaps they should be wary: this is not just a technological leap but possibly the final chapter of the bubble.