Every once in a while the market hands you a moment when the smartest people in the room are basically shouting “this train hasn’t even left the station yet.” Right now, that moment feels like artificial intelligence chips.
I’ve been watching the semiconductor space for years, and I can’t remember the last time a single analyst note made me sit up quite like this one did. A major Wall Street firm just told the world they see the dominant player in AI silicon climbing another 40%+ from current levels, and its closest peer still has double-digit upside. If you’ve been on the fence about the AI trade, this might be the nudge.
Why the Street Suddenly Got Even More Bullish
Let’s cut through the noise. The core thesis hasn’t changed: demand for AI training and inference hardware is exploding, supply is tight, and the company that owns roughly 90% of the high-end market is printing cash faster than the Fed ever dreamed.
What has changed is the degree of confidence. Analysts are no longer hedging with “if demand stays strong…” They’re saying customers are panicking because they literally can’t secure enough product for 2026 plans. That’s a very different conversation.
Nvidia: From “Expensive” to “Still Cheap on Forward Numbers”
Look, I get it — Nvidia’s valuation looks scary at first glance. But when you dig into the actual numbers coming out of hyperscalers and enterprise buyers, something interesting happens: the forward earnings power starts to make today’s multiple look almost reasonable.
“Customers’ biggest anxiety for the next 12 months is their ability to procure enough product generally, and the next-gen architecture specifically.”
Lead semiconductor analyst, major investment bank
That single sentence tells you everything. The conversation has flipped from “will Nvidia keep its moat?” to “how fast can they ramp production before customers start tearing their hair out?”
And before anyone says “Blackwell delays,” the latest checks suggest those issues are largely in the rear-view mirror. The Vera Rubin architecture (the generation after Blackwell) is already seeing pre-orders that make Blackwell look modest.
Broadcom: The Quiet AI Giant Everyone Sleeps On
While Nvidia grabs headlines, Broadcom has been quietly building one of the most profitable AI-adjacent businesses on earth. Their custom silicon deals with hyperscalers — especially Google’s TPU program — are scaling faster than almost anyone expected.
Here’s the part that blew my mind: multiple supply-chain sources report Google has been increasing TPU orders aggressively, even as some people assumed custom chips would eat Nvidia’s lunch. Turns out both can win when total compute demand is growing 70-100% annually.
- Google TPU ramp accelerating
- Hyperscalers building hybrid fleets (Nvidia + custom ASICs)
- Broadcom networking business also riding the AI wave
- Margins expanding as mix shifts to higher-value silicon
Yes, there’s some cannibalization — Meta reportedly pushed out its own ASIC program and is using more TPUs instead — but the pie is growing so fast that Broadcom still comes out ahead.
The Supply Crunch Is Real (And Bullish)
I’ve spoken to several investors who worry the rally has gone too far. Fair concern. But then you talk to actual enterprises trying to build AI clusters and the story changes completely.
Companies aren’t asking “should we buy Nvidia?” anymore. They’re asking “how do we get on the list?” Allocation has become the new currency in Silicon Valley. If you have guaranteed supply for 2026, you basically have a competitive advantage for years.
That dynamic creates pricing power most industries can only dream of. And it’s showing up in the numbers — average selling prices climbing, lead times stretching, premium pricing for next-gen parts accepted without pushback.
When Could Competition Finally Matter?
Everyone wants to know when AMD, Intel, or the hyperscalers’ internal teams will dent Nvidia’s dominance. Honest answer? Probably not in 2026, maybe not even 2027.
The software moat — CUDA — remains enormous. Every major AI framework is built on it. Porting costs are measured in hundreds of millions and years of engineering time. Most companies look at that math and decide it’s cheaper to just pay Nvidia’s “tax.”
Custom ASICs make sense for the biggest players at inference scale, but training clusters? Still overwhelmingly Nvidia. And training is where the real capex dollars flow.
Valuation Reality Check
Let’s do some quick math (feel free to follow along):
- Current run-rate revenue heading toward $130B+ annualized
- Growth still accelerating, not decelerating
- Gross margins north of 75%
- Free cash flow margins that would make Apple blush
When you have that combination, paying 35-40x forward earnings suddenly doesn’t look crazy. Especially when the alternative is missing another leg of a once-in-a-generation tech platform shift.
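For readers who want to follow the math literally, here is a minimal back-of-envelope sketch. The revenue run-rate and the 35-40x multiple come from the figures above; the forward growth rate and net margin are illustrative assumptions I'm plugging in, not reported numbers:

```python
# Back-of-envelope valuation sketch using the article's figures.
# The growth rate and net margin below are illustrative assumptions,
# NOT reported or guided numbers.

revenue_run_rate = 130e9   # $130B+ annualized (cited above)
growth = 0.50              # assumed forward revenue growth (illustrative)
net_margin = 0.55          # assumed net margin (illustrative; gross margin is >75%)
forward_pe = 37.5          # midpoint of the 35-40x forward range cited

forward_revenue = revenue_run_rate * (1 + growth)
forward_earnings = forward_revenue * net_margin
implied_market_cap = forward_earnings * forward_pe

print(f"Forward revenue:    ${forward_revenue / 1e9:,.0f}B")
print(f"Forward earnings:   ${forward_earnings / 1e9:,.0f}B")
print(f"Implied market cap: ${implied_market_cap / 1e12:,.2f}T")
```

Swap in your own growth and margin assumptions and the implied valuation moves accordingly; the point is simply that at these margins, a 35-40x forward multiple maps to earnings power most companies never reach.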
Broadcom, trading closer to 30x forward, actually looks cheaper on some metrics despite its own monster growth.
Where We Go From Here
Short term? Normal pullbacks will happen. Guidance beats followed by “sell the news” dips are part of the game.
Long term? If AI capex keeps growing at current rates — and every hyperscaler budget I’ve seen for 2026 says it will — then today’s prices might look quaint in 24 months.
I’ve learned not to fight trends this powerful. When the biggest, smartest buyers on earth are scrambling to lock in supply years in advance, that’s usually a sign to pay attention.
The AI build-out is still in early innings. The leaders today are likely to be the leaders tomorrow. And right now, Wall Street’s best are telling us those leaders have plenty of runway left.
Sometimes the simplest trades are the best ones.