Is the trillion-dollar AI investment return exaggerated? Now everyone is asking: How many years does a GPU last?

Wallstreetcn
2025.11.14 14:13

The shorter a piece of equipment's depreciation cycle, the faster it erodes corporate profits. Whether expensive NVIDIA GPUs will lose value rapidly as the technology iterates has become the focus of market controversy. Companies such as Google, Oracle, and Microsoft expect their servers to remain usable for up to six years, though Microsoft's latest disclosure puts the lifespan of its computing equipment at two to six years. Noted short seller Michael Burry believes the actual lifespan of server equipment is closer to two to three years, and claims these companies have therefore overstated their earnings.

As global tech giants prepare to invest $1 trillion in building AI data centers over the next five years, a mundane accounting issue is becoming the focus for management and investors: how should the depreciation period for GPUs be determined? This seemingly technical question is actually central to the calculations of corporate profit performance and investment returns.

Infrastructure giants like Google, Oracle, and Microsoft expect their servers to last up to six years. However, Microsoft disclosed in its latest annual report that the lifespan of its computing equipment ranges from two to six years, a notably wide span. Whether these expensive NVIDIA GPUs will lose value rapidly in the near term because of technological iteration has become a focal point of market controversy.

The length of the depreciation period directly affects a company's financial performance. The longer the equipment retains its value, the more years the company can spread the depreciation cost over, and the smaller the hit to reported profits in any given year. However, as a relatively new asset class, AI GPUs lack the historical track record needed to support these estimates, creating uncertainty for the investors and lenders financing large-scale AI construction.

Concerns about AI spending have already surfaced in the market. CoreWeave's stock has fallen 57% from its June peak, and Oracle's stock has dropped 34% from its September peak, reflecting growing investor doubts about over-investment in AI.

The Estimation Dilemma of GPU Depreciation: Lack of Sufficient Lifespan Records

The depreciation assessment of AI chips faces unique challenges. NVIDIA launched its first AI processors aimed at data centers around 2018, while the current AI boom began with the release of ChatGPT at the end of 2022. Since then, NVIDIA's annual data center revenue has skyrocketed from $15 billion to $115 billion in the fiscal year that ended this January.

Compared to other heavy equipment that companies use for decades, GPUs lack a sufficient lifespan track record. Haim Zaltzman, Vice Chair of Emerging Companies and Growth Business at Latham & Watkins, who works on GPU financing, said in a media interview: "Is it three years, five years, or seven years? From a financing perspective, that's a huge difference."

Depreciation is the accounting practice of spreading the cost of hard assets over their expected lifespan. This concept is becoming increasingly important in the tech industry as companies need to predict how long the hundreds of thousands of NVIDIA graphics processors they purchase will remain useful or valuable.
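To make the arithmetic concrete, below is a minimal sketch of straight-line depreciation, assuming a hypothetical $10 billion GPU fleet with no salvage value; the figures are illustrative and not taken from any company's filings.

```python
# A minimal sketch of straight-line depreciation.
# The fleet cost and salvage value below are hypothetical,
# not drawn from any company's disclosures.

def annual_depreciation(cost: float, salvage: float, useful_life_years: int) -> float:
    """Spread (cost - salvage) evenly over the assumed useful life."""
    return (cost - salvage) / useful_life_years

fleet_cost = 10_000_000_000   # $10B of GPUs, purely illustrative
salvage_value = 0             # assume no residual value, for simplicity

for life_years in (3, 6):
    expense = annual_depreciation(fleet_cost, salvage_value, life_years)
    print(f"{life_years}-year schedule: ${expense / 1e9:.2f}B depreciation expense per year")
```

Under these assumptions, a three-year schedule recognizes roughly $3.3 billion of expense per year versus about $1.7 billion under a six-year schedule. That doubling of annual expense is the mechanism behind Burry's claim that stretched lifespan assumptions overstate reported earnings.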

Actual Lifespan Less Than Three Years?

There are starkly different opinions in the market regarding how well GPUs retain their value. CoreWeave, which purchases GPUs and rents them to customers, has used a six-year depreciation cycle for its infrastructure since 2023. The company's CEO Michael Intrator said after this week's earnings report that CoreWeave evaluates GPU lifespan in a "data-driven" manner. Intrator noted that CoreWeave's NVIDIA A100 chips (released in 2020) are fully booked, and added that a batch of 2022-era NVIDIA H100 chips, freed up when a contract expired, was immediately re-booked at 95% of the original price. "All the data points I have indicate that the infrastructure retains its value," Intrator said.

However, CoreWeave's stock plummeted 16% after the earnings report, as delays from third-party data center developers affected the full-year guidance. Notable short seller Michael Burry recently disclosed his short positions on NVIDIA and Palantir. Burry hinted this week that companies including Meta, Oracle, Microsoft, Alphabet, and Amazon have overestimated the lifespan of AI chips and underestimated depreciation. He believes the actual lifespan of server equipment is about two to three years, and that these companies have therefore exaggerated their earnings.

Pressure from Accelerating Technological Iteration

Analysts point out several reasons AI chips may lose value within six years: they may wear out or become damaged, or they may be rendered obsolete by newer GPUs. They may still be usable for certain workloads, but their economics decline significantly.

NVIDIA CEO Jensen Huang has hinted at this. Earlier this year, when NVIDIA released its new Blackwell chips, he joked that the value of its predecessor, Hopper, would decline. "When Blackwell starts shipping in large quantities, no one will want Hopper," Huang said in March at the NVIDIA AI conference. "In some cases, Hopper is still okay, but not many."

NVIDIA now releases new AI chips annually, whereas it previously had a two-year cycle. Its closest GPU competitor, AMD, has also adopted the same approach. NVIDIA will announce its quarterly results next week.

In a February filing, Amazon said it had shortened the useful life of some servers from six years to five because its research found that "the pace of technological development is accelerating, especially in the fields of artificial intelligence and machine learning." Meanwhile, other large cloud service providers have been extending the estimated lifespan of new server equipment.

Strategies of Tech Giants

Despite Microsoft's plans to heavily invest in AI infrastructure, CEO Satya Nadella stated this week that the company is trying to diversify its AI chip procurement to avoid over-investing in a single generation of processors. He added that the biggest competitor to any new NVIDIA AI chip is its predecessor.

"One of the biggest lessons we've learned from NVIDIA is that their pace of migration has accelerated," Nadella said. "This is an important factor. I don't want to carry four or five years of depreciation on a single generation of products."

Depreciation expert and Emrydia Consulting founder Dustin Madsen said that depreciation is a financial estimate made by management, and that rapid progress in fast-moving industries such as technology can change the initial forecasts. Depreciation estimates, Madsen said, typically rest on assumptions about technological obsolescence, maintenance, the historical lifespans of similar equipment, and internal engineering analysis. "You must convince the auditors that the useful life you propose is indeed its true life," Madsen said. "They will look at all these factors, such as engineering data indicating that these assets have a lifespan of about six years, and will audit at a very detailed level."