November 21, 2025
On November 21, media reports cited a recent Barclays research report indicating that the capital expenditure cycle in the AI sector will persist, predicting a substantial spike in computing power demand between 2027 and 2028, driven by breakthroughs in key technologies.
The key rationale behind this forecast is OpenAI's stronger-than-expected recent revenue performance. OpenAI CEO Sam Altman disclosed that the company's actual 2025 revenue came in roughly 15% above its internal mid-year projections, and that its revenue forecast for 2027 has been raised from $60 billion to $90 billion. OpenAI is also expected to hit its annual recurring revenue (ARR) target of $100 billion a year earlier than planned, in 2027.
The upward revision in revenue expectations has been accompanied by improvements in key operating metrics, including inference computing costs, weekly active users, and average annual revenue per paying user. These data points reflect the accelerating commercialization of AI technology and reduce the risk of an industry bubble bursting.
Behind these revenue figures lies OpenAI's heavy investment in computing resources. Barclays' analysis shows that OpenAI's budgeted computing operating expenditure from 2024 to 2030 exceeds $450 billion, with a projected peak of $110 billion in 2028.
To secure a steady supply of computing power, OpenAI has signed ten-year computing power leasing contracts worth approximately $650 billion with tech firms such as Oracle and Microsoft. These partnerships not only provide infrastructure support for OpenAI's business expansion but also spur capital investment in AI computing power by its partners.
Furthermore, OpenAI's strategic investment in next-generation research and development has further escalated computing power demand. The company is pushing ahead with its GPT-6 large language model and its Sora 3 video generation model. More critically, 'recursive self-improvement' technology, expected to be deployed between 2027 and 2028, would allow AI models to optimize and upgrade themselves autonomously, substantially boosting performance and efficiency while requiring massive computing power.
To this end, OpenAI has earmarked approximately $43 billion in monetizable computing power funding to meet a potential surge in demand once the technology is deployed. Barclays projects that, driven by demand from companies like OpenAI, total global AI data center capacity will double between 2024 and 2030, with OpenAI alone requiring its partners to undertake more than $600 billion in capital expenditures to build and upgrade computing clusters.
OpenAI's rapid development has also pushed competitors to ramp up their investments, further reinforcing the sustainability of the AI capital expenditure cycle. To keep pace with OpenAI's technological lead, tech giants such as Google and Meta are actively expanding their user bases and accelerating model iteration. For instance, Google DeepMind continues to refine its Gemini series of multimodal models, while Meta is rolling out more customized versions based on its LLaMA open-source model; both efforts hinge on substantial computing power investment.
Meanwhile, the high priority that tech leaders like Larry Page place on long-term AI competition has also pushed companies to keep investing despite market fluctuations, aiming to seize the technological high ground. This strategic resolve keeps the entire industry committed to high-intensity capital expenditure, while heavy investment by internet giants and hyperscale cloud service providers continues to fuel robust demand across related supply chains, such as semiconductors.
From an industry-wide perspective, OpenAI's trajectory is merely a microcosm of the sustained AI capital expenditure cycle. Recent industry developments bear this out: global semiconductor manufacturers are actively expanding AI chip production capacity, with the AI chip market maintaining year-on-year growth of more than 60% since 2025; Google announced a 30% increase in computing power investment for its Gemini model to support more complex multimodal tasks; and Meta plans to triple its AI data center capacity by 2026.
As key technologies such as 'recursive self-improvement' gradually mature between 2027 and 2028, the AI industry's demand for computing power is expected to surge again, further extending a capital expenditure cycle driven by continuous technological advances.
