The Best AI Stocks Are Falling for the Wrong Reason

Last week, Big Tech reset the scoreboard for the 2026 fiscal year. Collectively, the "Hyperscale Five" – Amazon (AMZN), Alphabet (GOOGL), Meta (META), Microsoft (MSFT), and Oracle (ORCL) – revealed they are on track to spend over $700 billion on AI infrastructure in 2026.

To put that in perspective, that is nearly $2 billion every single day being poured into silicon, steel, and power.

Predictably, Wall Street is having a panic attack. Investors are looking at these massive checks and asking, "Where is the ROI?" They fear we are at "Peak Capex" – that once the 2026 build-out is complete, orders for AI supply chain stocks will vanish into a decade-long "digestion" period.

So, naturally, AI stocks are crashing.

But I'm here to tell you that the bears are wrong – because they're misunderstanding where AI spending is actually headed. What's happening right now isn't a temporary construction binge. It's a fundamental shift in how AI compute gets consumed across the economy.

We believe this $700 billion-plus in AI capex will prove to be a floor, not a peak, for annual spending – because AI compute is shifting from something companies build once to something they consume forever.

We're entering what I call the Inference Inversion.

AI Capex Is Shifting From Training to Inference

The biggest misunderstanding in the market today is the difference between AI training and inference. For the last two years, the bull market was driven by training, as companies spent billions to build AI's "brain." Bears seem to believe that once the models are trained, the spending stops.

But the data from this February 2026 earnings season shows the opposite: Inference compute volume has officially exceeded training compute.

- Training is a CapEx event: You build it, and you're done for a while.
- Inference is an OpEx utility: It scales linearly with every single user.
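The distinction above can be sketched as a toy cost model. Every number here is a hypothetical placeholder for illustration only – the point is simply the shape of the curves: training is a fixed one-time outlay, while inference spend grows linearly with every additional user and query.

```python
# Toy model: training as one-time CapEx vs. inference as usage-scaling OpEx.
# All dollar figures are illustrative assumptions, not actual company data.

def total_compute_cost(training_cost, cost_per_query, queries_per_user, users):
    """One-time training spend plus inference spend that scales with usage."""
    inference_cost = cost_per_query * queries_per_user * users
    return training_cost + inference_cost

# Hypothetical inputs: $1B training run, $0.002 per query, 1,000 queries/user/year.
base = total_compute_cost(1e9, 0.002, 1_000, 10_000_000)     # 10M users
doubled = total_compute_cost(1e9, 0.002, 1_000, 20_000_000)  # 20M users

# Doubling the user base doubles the inference bill, while the
# training cost stays fixed – recurring demand, not one-time demand.
```

Under these made-up numbers, the training line never grows again after the model ships, but the inference line compounds with adoption – which is the "recurring demand" argument in miniature.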
In other words, training creates one-time demand. Inference creates recurring demand. And Wall Street consistently underestimates businesses that turn capital spending into ongoing utilities.

As more advanced "reasoning" models become the standard, they rely on something called test-time scaling – meaning the model deliberately runs more compute per query to arrive at better answers. That's crucial: unlike training, this compute load never shuts off. Every additional user, prompt, and interaction permanently increases infrastructure demand. Test-time scaling turns AI from a bursty workload into a 24/7 industrial process.

Inference is the real deal. And while it shows up as ongoing operating demand, it forces continuous capital investment to keep up.

Why AI Hardware Upgrade Cycles Are Accelerating

In AI, falling one hardware generation behind isn't a nuisance; it's an existential risk. Pre-AI, data centers upgraded servers about every five years. That cycle has collapsed to roughly 12 months.

AI hardware is no longer improving incrementally. It's improving exponentially – and that has broken the old data-center upgrade cycle. Nvidia's (NVDA) roadmap – moving from Hopper to Blackwell and now to the Vera Rubin architecture – has forced a "death march" on the hyperscalers. The Rubin GPU (shipping late 2026) promises a 10x reduction in token cost. If Google moves its stack to Rubin and cuts its AI operating costs by 90%, Microsoft and Amazon have no choice but to follow – or risk being structurally uncompetitive on price, margins, and latency.

The ROI Signal Wall Street Is Ignoring

The "Where's the ROI?" crowd is ignoring the most important number in Google's latest earnings report: $240 billion. This is Google's Cloud backlog – demand that cannot be deferred, canceled, or "digested away" – and it grew 55% year-over-year.
Google isn't spending $180 billion on AI-related capex because it "hopes" people use AI; it's spending because it already has $240 billion in signed contracts it cannot fulfill without more chips. It is supply-constrained, not demand-constrained. And when demand is locked in via contracts, spending accelerates.

Alphabet just delivered its strongest Google Cloud growth since the pandemic, accelerating from roughly 30% to nearly 50% – driven largely by Gemini. Amazon's AWS posted one of its best quarters in years while retail margins quietly expanded. Microsoft's cloud backlog has ballooned to roughly $625 billion. And Meta's ad machine is still growing at a mid-20s percentage clip despite being one of the most mature ad platforms on Earth.

Yet the market is pricing these companies as if spending is about to fall off a cliff – even as every fundamental signal suggests the opposite. And when markets misprice a structural shift like this, the opportunity shows up first in the companies closest to the spending.

AI Stocks to Buy on the 'Peak Capex' Dip

For all these reasons – the inference shift, the existential pressure to stay competitive, the rapid upgrade cycles, and the massive contracted ROI – we are confident that we have not reached "peak AI capex" here in 2026. Instead, we believe that – as large as $700 billion is – the Hyperscale Five's capex budgets will be at least that big for the next 10-plus years.

Because Big Tech isn't just racing to build entirely new AI infrastructure. It's also being forced to upgrade and retrofit legacy data centers – all while early AI deployments are already translating into measurable revenue growth. As returns begin to materialize, the incentive to keep spending only increases.

That means that AI supply chain stocks – the direct beneficiaries of all this spending – are great long-term plays.
Yet those stocks are crashing right now on fears of AI overspending… which, of course, is an opportunity for those who know this isn't "peak capex."

We have a few strong AI supply chain candidates on our "buy-the-dip" list.

Nvidia: The AI Capex Kingmaker

Nvidia (NVDA) is the ultimate AI toll booth. With the DGX B200 retailing for approximately $300,000 to $400,000 per unit, and the Rubin platform already in production, Nvidia's roadmap ensures it captures the lion's share of the hyperscalers' $700 billion.

What matters most isn't just that Nvidia sells the chips; it's that Nvidia controls the cadence of the entire AI upgrade cycle. Each new architecture resets the economic bar for inference. When Blackwell halves cost per token and Rubin cuts it again, hyperscalers don't slow spending. They redeploy the savings into more usage and more models. That dynamic turns Nvidia's roadmap into a self-reinforcing demand engine, not a one-time upgrade.

- The Play: Buy the dip as the market realizes that 2027 guidance will likely come in higher than 2026's due to the Rubin rollout.
Micron: The Memory Bottleneck

AI chips are useless without High-Bandwidth Memory (HBM). Micron's (MU) HBM capacity is effectively sold out through 2026, giving the firm immense pricing power.

As AI shifts from training to inference, memory intensity rises. Reasoning models must keep more parameters, context, and intermediate states active at once. That makes HBM the choke point. You can't cheap out on it without slowing everything down, which is why Micron's HBM is suddenly one of the scarcest and most valuable parts of the AI stack.

- The Play: Micron is the "beta" to Nvidia's "alpha." When the hyperscalers hike capex, they are essentially handing Micron a blank check for more memory.
Wesco: The Physical Infrastructure Play

Every new AI rack requires high-voltage power distribution, advanced thermal management, and dense networking just to stay online. In many regions, it's the electrical and cooling infrastructure – not the chips – that determines how fast capacity can be deployed.

Wesco (WCC) is a massive beneficiary of this data center "build" phase, acting as a primary distributor of the electrical and network infrastructure that turns a shell into a data center.

- The Play: This is a "boring" stock that wins no matter which AI model (Gemini, Llama, GPT) wins.
AI Capex Is a Floor, Not a Peak

We are in the "installation phase" of a new era. The $700 billion-plus being spent in 2026 is the foundation for a multi-trillion-dollar AI economy. Don't let the short-term noise distract you. The "peak capex" narrative is a gift – because it's creating rare entry points in the picks-and-shovels companies powering the AI revolution.

Now, here's the part it seems most investors are still missing. The U.S. government is now directing capital into the exact choke points of the AI supply chain you just read about. And when Uncle Sam steps in as a buyer or partner, stocks don't drift higher. They surge.

I recently put together a briefing on what I call the President's Market – how it works, why it's accelerating now, and how investors can position before Washington's next wave of capital hits the tape. If you want exposure, this is where to start.

Sincerely,