After analyzing OpenAI's AI demand, Barclays concluded that the AI capital expenditure cycle will continue, and technological breakthroughs may trigger a surge in computing power demand in 2027/2028

Wallstreetcn
2025.11.20 12:40

The Barclays report shows that OpenAI's performance continues to exceed expectations, confirming that the AI capital expenditure cycle will continue over the medium to long term. Revenue growth directly drives the company's computing investment, continuous model iteration keeps pushing up computing power demand, and computing partners are being forced to accelerate infrastructure deployment. Barclays expects 2027-2028 to be the key window for achieving "recursive self-improvement," which would further increase demand for computing power.

According to Barclays' latest research report, OpenAI's revenue performance significantly exceeded its internal expectations, indicating that AI demand is rapidly growing and the large-scale capital expenditure cycle will not end in the short term. As long as OpenAI can maintain its current strong growth momentum, the risk of a bubble burst in the AI sector will remain low.

Analysis shows that OpenAI's 2025 revenue is about 15% higher than mid-year internal forecasts, while its 2027 revenue forecast has been raised by 50% from previous estimates.

For the capital markets, this trend means that internet giants and hyperscale cloud service providers will continue to maintain high levels of capital investment, and semiconductor demand will remain strong. Barclays expects that by 2028, OpenAI's computing expenditure will reach a peak level of approximately $110 billion, at which point technological breakthroughs are likely to trigger a new surge in computing power demand.

Barclays' report attempts to answer a core question: How far are we from a slowdown in AI investment? The answer is: still a long way off.

Revenue Significantly Exceeds Expectations

The Barclays report shows that OpenAI's revenue performance continues to exceed internal expectations, with actual revenue in 2025 about 15% higher than mid-year forecasts, and the expected revenue for 2027 raised by 50%.

Specific data shows that OpenAI's total revenue expectation for 2027 has been raised from $60 billion to $90 billion, with inference computing costs increasing from $21 billion to $30 billion, weekly active users (WAU) rising from 1.4 billion to 1.8 billion, and the annual average revenue per paid user (ARPU) increasing from $748 to $880.
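The scale of these revisions can be checked with simple arithmetic; a minimal sketch using the figures quoted above (all numbers taken from the report as cited, not independently verified):

```python
# Percentage change implied by each of Barclays' 2027 forecast revisions
# (old value, new value) pairs as quoted in the article.
revisions = {
    "total revenue ($B)": (60, 90),
    "inference compute cost ($B)": (21, 30),
    "weekly active users (B)": (1.4, 1.8),
    "paid-user ARPU ($)": (748, 880),
}

for name, (old, new) in revisions.items():
    pct = (new / old - 1) * 100
    print(f"{name}: {old} -> {new} (+{pct:.0f}%)")
```

The headline "+50%" applies to the total revenue line; the other inputs were revised by smaller percentages.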

OpenAI CEO Sam Altman recently said in a public interview that the company expects to reach $100 billion in annual recurring revenue (ARR) by 2027, a full year earlier than previously forecast.

AI Capital Expenditure Cycle Will Continue

Barclays' latest research report points out that OpenAI's continued outperformance confirms that the AI capital expenditure cycle will continue in the medium to long term.

First, all of OpenAI's revenue ultimately runs on compute, so revenue growth directly drives the company's computing investment. OpenAI's four main revenue sources are the paid version of ChatGPT, the free (advertising-supported) version of ChatGPT, agents, and APIs; each places different demands on computing resources, but all run on the same underlying computing architecture. OpenAI's total budget for computing operating expenses from 2024 to 2030 exceeds $450 billion and is expected to peak at around $110 billion in 2028.

Second, continuous model iteration is driving up computing power demand, forcing computing partners to accelerate infrastructure deployment. OpenAI is advancing next-generation models such as GPT-6 and Sora 3, and each model upgrade brings significant increases in training and inference costs, continuously driving investment in underlying computing facilities. Barclays reports that OpenAI expects the key window for achieving "recursive self-improvement" to be 2027-2028, which would further increase demand for computing power. This approach would autonomously develop next-generation models (such as GPT-6, Sora, etc.) through "drop-in AI researchers," forming a closed loop of "AI developing AI." The company has reserved approximately $43 billion in additional "Monetizable Compute" specifically for 2028 to support the rollout of this technology.

At the same time, OpenAI has signed approximately $650 billion in computing power leasing contracts with multiple partners, covering the next ten years. Among them, the Oracle OCI contract totals $300 billion, starting in 2027 for a duration of 5 years, averaging $60 billion per year; the Microsoft Azure contract totals $250 billion, starting in mid-2026 for a duration of 7 years, averaging $36 billion per year. Additionally, the Google GCP contract totals $40 billion for 7 years, and Amazon AWS provides a $38 billion contract for 7 years, while CoreWeave offers a $22.4 billion contract for 5 years.
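The per-year averages quoted for these contracts follow directly from dividing each total by its term; a quick sketch using only the contract values and durations given above:

```python
# Compute-leasing contracts cited in the article: (total value $B, term in years).
contracts = {
    "Oracle OCI": (300, 5),
    "Microsoft Azure": (250, 7),
    "Google GCP": (40, 7),
    "Amazon AWS": (38, 7),
    "CoreWeave": (22.4, 5),
}

total = sum(value for value, _ in contracts.values())
for name, (value, years) in contracts.items():
    print(f"{name}: ${value}B over {years}y ~= ${value / years:.0f}B/yr")
print(f"Total contracted: ${total:.1f}B")
```

The totals sum to roughly $650 billion, matching the aggregate figure cited, and the Oracle and Microsoft annualized values reproduce the $60 billion and $36 billion per year stated in the report.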

Third, intensified industry competition has triggered an "arms race." In response to OpenAI's current 6 to 12-month technological lead, competitors such as Google and Meta are forced to simultaneously expand their user bases and accelerate model iteration. From 2024 to 2030, total global AI data center capacity is expected to grow from 114.3 GW to 236 GW, roughly doubling. For OpenAI alone, partners such as Oracle and Microsoft need to bear over $600 billion in capital expenditures to build computing power clusters.

Finally, the long-term strategic determination of the tech giants further locks in high investment. Their founders place great weight on the long-term AI race; Larry Page, for example, has said he would "rather go bankrupt than give up," indicating that even amid market fluctuations they are willing to keep investing to seize market share, keeping industry capital expenditures at a high level.