Where Money Talks & Markets Listen

AI chip boom raises new questions over depreciation

November 14, 2025

As the world’s largest technology companies prepare to spend an estimated one trillion dollars over the next five years on artificial intelligence data centers, a single accounting issue is shaping investor sentiment: the useful life of AI chips. Depreciation, the method by which companies allocate the cost of hardware over time, has become central to assessing the true economics of hyperscale AI investment.

Firms such as Google, Microsoft and Oracle have said servers can remain useful for up to six years. But rapid technological turnover may shorten that timeline, affecting how profits are reported and how lenders value the billions being deployed into new GPU infrastructure. Microsoft, for example, says its hardware lasts between two and six years, a wide range that leaves considerable room for interpretation.
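The stakes of that range are easy to see with straight-line depreciation, the most common method: annual expense is simply cost divided by assumed useful life. A minimal sketch, using an illustrative $10 billion fleet cost (the dollar figure is hypothetical, not from any company's filings):

```python
def annual_depreciation(cost: float, useful_life_years: float) -> float:
    """Straight-line depreciation: spread cost evenly over the useful life."""
    return cost / useful_life_years

fleet_cost = 10_000_000_000  # $10B of GPU servers (illustrative figure)

# Compare the annual expense across the useful-life range Microsoft cites.
for life in (2, 3, 6):
    expense = annual_depreciation(fleet_cost, life)
    print(f"{life}-year life: ${expense / 1e9:.2f}B per year")
```

On the same hardware, a two-year assumption books three times the annual expense of a six-year one, which is why the choice moves reported profit so much.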

AI GPUs create new challenges for investors

Nvidia’s data center business illustrates how quickly the AI landscape has shifted. Since the launch of ChatGPT in 2022, the company’s annual data center revenue has climbed from 15 billion dollars to 115 billion. Growth that fast leaves little historical precedent for judging how long high-performance GPUs retain economic value.

Legal and financial advisors say the lack of a track record makes depreciation a significant risk factor. “Is it three years, is it five, or is it seven?” said Haim Zaltzman of Latham & Watkins, noting that the difference has major implications for financing models. Much of the industry’s current optimism is based on the idea that GPUs will remain useful across multiple generations of AI workloads.

Cloud provider CoreWeave has used a six-year depreciation cycle since 2023 and says demand for older chips remains strong. CEO Michael Intrator said the company’s A100 processors, launched in 2020, are fully booked, and recently released H100 units immediately resold at roughly 95 percent of their original value. “All of the data points that I’m getting are telling me that the infrastructure retains value,” he said.

Market volatility reflects concerns over overspending

Despite such confidence, market reactions have been mixed. CoreWeave shares fell 16 percent after delays at a partner data center developer reduced its full-year guidance. Oracle’s stock has dropped more than a third since reaching record highs in September as investors reassess the sustainability of heavy AI capital spending.

Some prominent skeptics argue companies are overstating the useful life of their chips. Short seller Michael Burry has taken positions against Nvidia and Palantir, suggesting that hyperscalers are underestimating depreciation and inflating earnings. He believes server equipment may last only two to three years in real economic terms, far shorter than some corporate filings indicate.
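Burry’s argument reduces to arithmetic: if the true useful life is shorter than the filing assumption, depreciation is understated and pre-tax profit is overstated by the difference. A hedged sketch with entirely hypothetical revenue, cost, and capex figures (none are from any company’s filings):

```python
def pretax_profit(revenue: float, other_costs: float,
                  capex: float, life_years: float) -> float:
    """Pre-tax profit with straight-line depreciation of capex."""
    return revenue - other_costs - capex / life_years

# Illustrative figures for a hypothetical hyperscaler, in dollars.
revenue, other_costs, capex = 50e9, 30e9, 30e9

reported = pretax_profit(revenue, other_costs, capex, 6)  # filing assumption
economic = pretax_profit(revenue, other_costs, capex, 3)  # shorter "real" life

print(f"reported: ${reported / 1e9:.1f}B, "
      f"economic: ${economic / 1e9:.1f}B, "
      f"overstatement: ${(reported - economic) / 1e9:.1f}B")
```

With these made-up numbers, halving the assumed life from six years to three doubles the depreciation charge and cuts pre-tax profit from $15B to $10B, a one-third overstatement under the longer assumption.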

While several companies declined to comment on depreciation assumptions, the debate underscores the challenge of scaling AI infrastructure in an industry where chip refresh cycles are accelerating. Nvidia now releases new AI processors annually, compared with a two-year cadence in the past. AMD has adopted a similar approach.

Rapid chip cycles increase the risk of obsolescence

Nvidia CEO Jensen Huang has hinted at how quickly hardware can lose value. Introducing the Blackwell architecture earlier this year, he joked that older Hopper chips would become far less appealing once the new generation ships widely. “When Blackwell starts shipping in volume, you couldn’t give Hoppers away,” he said.

Obsolescence is not the only risk. Chips can fail, degrade or simply become uneconomical to run, even if technically functional. Amazon recently shortened the useful life of part of its server fleet from six years to five, citing a faster pace of development driven by AI and machine learning.

Other cloud providers are taking the opposite approach. Microsoft says it is staggering GPU purchases to avoid locking in multi-year depreciation on a single hardware cycle. CEO Satya Nadella said migration speeds have increased, making it risky to be “stuck with four or five years of depreciation on one generation.”

Auditors ultimately evaluate depreciation assumptions based on engineering data, historical use patterns and projected technological shifts. Dustin Madsen of the Society of Depreciation Professionals said companies must demonstrate their estimates are realistic. “You’re going to have to convince an auditor that what you’re suggesting is actually its life,” he said.