The AI infrastructure gold rush has delivered a $371 billion reality check. What began as a seductively simple narrative (artificial intelligence demands unprecedented computational infrastructure, creating a digital boom that would lift all boats in the data centre ecosystem) has now collided with the immutable laws of physics.
In other words, an unstoppable force meets an immovable object.
The sector requires very high levels of expenditure - Source: Deloitte
Recent Q2 2025 earnings from industry titans like Equinix and Amazon reveal an uncomfortable truth: you cannot download more electricity, you cannot venture-capital your way around transformer shortages, and the euphoria is giving way to operational constraints that billions in capital expenditure cannot immediately resolve.
Our analysis reveals the first visible cracks in what many believed was an unstoppable infrastructure boom, with market leaders facing margin compression despite record capital deployment and hyperscaler spending reaching historically unprecedented levels.
Few energy sources match the timelines of data centers - Source: Deloitte
This week, we analyse the forces behind this great recalibration: companies like Equinix post record revenues yet see their stock prices tumble on elevated CapEx guidance, AWS admits to capacity constraints despite $100+ billion in planned spending, and the industry grapples with power grid bottlenecks, exemplified by seven-year connection delays in Northern Virginia, that cannot be solved through financial engineering alone.
Equinix shows the first crack
Equinix's Q2 2025 results presented a paradox that perfectly captures the industry's predicament. The company delivered what should have been a victory lap: revenue of $2.26 billion (up 5% year-over-year), adjusted EBITDA margins hitting 50% for the first time in company history, and $345 million in annualized gross bookings. The company closed 4,100 deals across more than 3,300 customers, with interconnection revenues crossing the $400 million threshold for the first time.

Yet the forward guidance reveals the underlying strain. While Equinix raised its 2025 revenue guidance by $58 million, capital expenditure has ballooned to an unprecedented $3.8-4.3 billion range, representing approximately 42-47% of projected revenues. Compare this to the 20-25% of revenue that data centre operators typically spend on capex, and the margin compression becomes clear. Traditional data centre economics assumed 8-10 kilowatt racks. AI demands 120 kilowatts or more. We’re now talking about an entire infrastructure overhaul.
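As a back-of-the-envelope check on that ratio, the sketch below assumes projected 2025 revenue of roughly $9.1 billion; that figure is implied by the 42-47% range rather than taken from the company's guidance:

```python
# Back-of-the-envelope check of Equinix's capex intensity.
# ASSUMPTION: ~$9.1bn of 2025 revenue, implied by the article's 42-47% range.
capex_low, capex_high = 3.8, 4.3   # guided 2025 capex range, $bn
revenue = 9.1                      # assumed projected 2025 revenue, $bn

ratio_low = capex_low / revenue
ratio_high = capex_high / revenue
print(f"capex intensity: {ratio_low:.0%} to {ratio_high:.0%}")  # 42% to 47%

# Versus the midpoint of the 20-25% historical norm for operators:
typical_mid = 0.225
excess = (capex_low + capex_high) / 2 - typical_mid * revenue
print(f"spend above the historical norm: ~${excess:.1f}bn")  # ~$2.0bn
```

On these assumptions, Equinix is spending roughly $2 billion a year more than a historically "normal" capex budget would allow, which is the strain the guidance cut into the stock.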
Timeline of NVIDIA GPU generations with projected peak and average rack densities, showing rising power demands across AI pods and industry installations. Source: Vertiv
CEO Adaire Fox-Martin acknowledged the challenge directly, stating "We were built for this moment" while simultaneously revealing the company has 59 major projects underway globally, including 12 xScale projects.
The company's non-recurring capex surged 54.9% YoY to $934 million in Q2 alone, with approximately $450 million of xScale-related spend expected to be reimbursed later in 2025. These are not the metrics of a company riding a smooth growth wave; they are the symptoms of an industry grappling with fundamental physics constraints and the desperate need to double capacity by 2029 to meet AI infrastructure demands.

Equinix - Source: MarketScreener - ProRealTime
AWS brings scaling challenges to the forefront
If Equinix represented the first visible crack, Amazon's Q2 2025 earnings call delivered what amounted to a confession from the cloud computing titan. AWS posted $30.4 billion in quarterly revenue, maintaining its position as the world's largest cloud provider. Yet investors heard CEO Andy Jassy deliver an uncomfortable admission: "AWS continues to build a large, fast growing, triple digit year over year percentage, multibillion dollar business with more demand than we have supplied for at the moment".
The numbers are underwhelming. AWS revenue grew just 17.5% year-over-year in Q2 2025, significantly trailing Microsoft Azure's 31% and Google Cloud's 32% growth. For a business that once routinely posted 40%+ growth rates, this deceleration stings. Jassy cited three specific constraints: chips from third-party partners arriving "a little bit slower than before," power constraints, and supply chain bottlenecks for components like motherboards.
Amazon's response reveals the desperation: throw money at the problem. The company plans to spend $105 billion on capital expenditure in 2025, up from $83 billion in 2024. Yet even this astronomical sum comes with caveats. Amazon is shortening server depreciation timelines from six years to five, acknowledging that AI development is rendering infrastructure obsolete faster than anticipated. This accounting change alone will reduce 2025 operating income by $700 million.
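The $700 million figure also lets us back out the scale of the hardware affected. A minimal sketch, assuming simple straight-line depreciation and ignoring mid-year in-service dates and salvage value:

```python
# Sketch: what server base does Amazon's $700M depreciation hit imply?
# Simplified straight-line model; ignores in-service timing and salvage value.
def annual_depreciation(cost, useful_life_years):
    """Straight-line depreciation expense per year."""
    return cost / useful_life_years

# Extra expense per dollar of cost when life drops from 6 to 5 years:
# 1/5 - 1/6 = 1/30 of the asset's cost per year.
extra_per_dollar = annual_depreciation(1.0, 5) - annual_depreciation(1.0, 6)

stated_income_hit = 0.7  # $bn, per Amazon's guidance
implied_asset_base = stated_income_hit / extra_per_dollar
print(f"implied affected server base: ~${implied_asset_base:.0f}bn")  # ~$21bn
```

Under these simplifying assumptions, roughly $21 billion of servers is being written off one year faster, a hint at how much hardware AI is rendering obsolete ahead of schedule.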

Source: Synergy Research Group - Statista
When physics meets market demand
The bottlenecks cascade through the entire system like an electric shock. According to Goldman Sachs Research, global data centre power demand will increase 165% by 2030, reaching 165 gigawatts - roughly equivalent to Argentina's entire power consumption. Gartner predicts that 40% of AI data centres will face operational constraints due to power availability by 2027.
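A quick sanity check on the Goldman Sachs projection, assuming the 165% figure is growth over today's demand:

```python
# A +165% increase that lands at 165 GW implies today's baseline
# demand of roughly 62 GW.
projected_gw = 165.0
growth = 1.65  # +165%

baseline_gw = projected_gw / (1 + growth)
print(f"implied current demand: ~{baseline_gw:.0f} GW")  # ~62 GW
```

In other words, the industry needs to find more than 100 gigawatts of new supply this decade, against grids that already cannot connect facilities in their biggest markets.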
Historical and projected data center occupancy rates (2022–2031), showing supply sufficiency peaking mid-decade before gradually declining. Source: Goldman Sachs
The power generation crisis is already visible. New connections in Northern Virginia, the world's largest data centre market, now face delays of up to seven years. Dublin and Amsterdam have implemented moratoriums on new data centre construction. Singapore only recently lifted a four-year ban. These are not temporary bottlenecks; they represent structural constraints that billions in capital expenditure cannot immediately resolve.
Source: Bain & Co “AI Changes Big & Small” Report
Lead times for critical components like transformers have stretched to nearly two years. The specialised equipment needed for liquid cooling systems, essential for high-density AI workloads, faces similar delays. The US construction industry has hundreds of thousands of job openings, with acute shortages in electrical trades. Building AI-ready infrastructure requires specialised expertise that simply does not exist at scale.
The semiconductor supply chain presents another vulnerability. Over 60% of advanced semiconductors are manufactured in Taiwan, creating dangerous concentration risk. Manufacturing delays are becoming endemic: data centre construction projects run late in 9 out of 10 cases, with schedule extensions averaging 34%. For a typical two-year data centre project stretching to three years, delay costs can reach $14.2 million per month for a standard 60MW facility.
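Taking those figures at face value, the delay bill is easy to tally:

```python
# Cost of the example delay: a two-year project stretching to three
# years adds 12 months, at $14.2M per month for a 60 MW facility.
delay_months = 12
cost_per_month = 14.2  # $M, per the figures above

print(f"total delay cost: ~${delay_months * cost_per_month:.0f}M")  # ~$170M
```

A roughly $170 million penalty per stalled facility, before counting the revenue the unbuilt capacity would have earned.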
The scale of investment has reached unprecedented and potentially unsustainable levels. Microsoft ($80 billion), Amazon ($100 billion), Meta ($67.5 billion), and Google ($75 billion) have committed over $322 billion in combined 2025 capital expenditure. This represents hyperscaler CapEx as a percentage of revenue crossing 22% - nearly double the historical average of 11-16%. An interesting analysis from Bernie Ahkong, CIO of UBS O’Connor, one of UBS's largest in-house multistrategy funds, concludes that tech companies with the fastest CapEx growth rates have underperformed their sector peers by approximately 7% annually since 2022, roughly the start of the AI CapEx gold rush.
Source: Morgan Stanley
Companies to Watch
Broadcom Inc. (AVGO) - The AI Networking Brains
Broadcom has quietly become the networking brain of the AI revolution. Q2 2025 AI semiconductor revenue surged 46% year-over-year to $4.4 billion, with AI networking sales increasing 70% due to growing demand for routers and switches.
The company's newly launched Jericho4 Ethernet fabric router can interconnect over one million XPUs across multiple data centres, handling roughly four times more data than its predecessor.
Management projects AI revenue could reach $60-90 billion by fiscal 2027, representing a potential 5x increase from current levels. The company's complete AI networking portfolio - Tomahawk switches, Jericho routers, and network interface cards - provides end-to-end solutions from individual racks to distributed data centre networks.
As infrastructure operators struggle with power constraints, Broadcom's technology enables the distributed computing model that physical limitations are forcing.
Amphenol Corporation (APH) - The Omnipresent Player
Amphenol is the quintessential "picks and shovels" play in this AI gold rush. The company's Q2 2025 results were exceptional: revenue of $5.65 billion (surpassing estimates by 13.55%) and earnings per share of $0.81 (beating forecasts by 22.73%).
The communications segment delivered $2.91 billion in net sales, posting 91% year-over-year growth and accounting for over half of total revenue.
Every AI server, regardless of manufacturer, requires Amphenol's high-speed connectors and interconnect systems. Content per AI server is increasing 3-4x versus traditional computing, and as data rates jump from 56 Gbps today to 224 Gbps and beyond, Amphenol's signal integrity expertise becomes mission-critical.
CEO Adam Norwitt remarked: "We're in the early innings of the adoption of AI on a broad basis across the economy".
Fujikura Ltd. (5803.T) - The Optical Infrastructure Enabler
Fujikura represents the optical backbone enabling AI's massive data flows, with exceptional execution: revenue of ¥979.4 billion (+22.5% YoY) and a robust 9.3% net profit margin.
Their Telecommunications Systems Business Division, which includes optical fibers and cables for data centers, has been the primary growth driver amid surging AI infrastructure demand.
Fujikura's Wrapping Tube Cable (WTC) with SpiderWeb Ribbon (SWR) technology enables impressive fiber density - their latest 13,824-fiber cable is specifically designed for hyperscale data center applications.
The company's competitive advantage lies in vertical integration and manufacturing excellence. Their new Sakura Works plant, which started production in February 2025, increased production capacity by 30%.
Fujikura's optical solutions have become mission-critical for enabling the copper-to-optical transition that power constraints are accelerating.
MarketScreener - Smaller Companies in AI Infra
To dig further into smaller companies bound to benefit from the current AI Infrastructure reshuffle, create your own StockScreener like we did:

Conclusion: The Great Recalibration
We are witnessing the end of AI infrastructure's naive optimism phase. The assumption that throwing money at data centres would seamlessly scale to meet AI's demands has collided with the harsh realities of power grids, supply chains, and physics. This is not a bear case for AI; it is a reality check on implementation timelines and cost structures.
The winners in this next phase will not be those who build the fastest or biggest. They will be those who enable others to build smarter, who provide the critical components that make 224 gigabit connections possible, who solve thermal dissipation at 120 kilowatts per rack, who manufacture the optical highways that move exabytes of data efficiently.
For investors, this recalibration offers genuine opportunity. The market is beginning to differentiate between companies that merely ride the AI wave and those that provide fundamental building blocks. As capital expenditure balloons and margins compress for pure-play operators, real value accrues to component suppliers, connection specialists, and optical enablers who collect tolls on every bit of data flowing through the AI economy.
The AI revolution is not ending. But the easy money phase is over. What comes next will separate the infrastructure tourists from the component suppliers who make the whole system possible. In a world where you cannot download more electricity, those who master the connections will master the profits.
The gold rush continues, but it now belongs to the smart money.
Stay invested, cautiously.