Google’s AI Comeback: How Alphabet Reclaimed the Lead

Nov 27, 2025

Three years ago everyone wrote Google off in AI. Today its stock is up 70%, it just leapfrogged Microsoft in market cap, and even Marc Benioff says he’s ditching ChatGPT for Gemini 3. Here’s the inside story of how they pulled it off – and why the race is still anyone’s game…


Remember when the entire tech world declared Google dead in the AI race?

It feels almost comical now. Late 2022, ChatGPT drops, and suddenly every headline screams that the search giant got caught napping. People were writing Sundar Pichai’s obituary as an innovative leader. Fast-forward to November 2025, and something wild is happening: Alphabet just became the most valuable tech company on earth again, its stock is up nearly 70% year-to-date, and even die-hard OpenAI fans are quietly admitting Gemini 3 feels… different.

How did we get here? That’s the question keeping a lot of investors and tech watchers up at night. And honestly, having followed this space closer than is probably healthy, the answer is both simpler and more impressive than most people realize.

The Quiet Reconstruction of a Giant

The truth is Google never actually fell as far behind as the narrative suggested. What looked like chaos from the outside was, in many ways, the messy but necessary process of turning fifteen years of quiet research into shipping products at warp speed.

Think about it. While everyone was busy dunking on the unfortunate “glue on pizza” moment with AI Overviews or the historically inaccurate images from Gemini’s early image generation, something else was happening inside the company that barely made headlines.

They were putting all the pieces together.

The Hardware Advantage Nobody Saw Coming

Let’s start with the part that actually shocked Wall Street: Ironwood.

The seventh-generation TPU isn’t just another chip. It’s nearly 30 times more power-efficient than the first Cloud TPU from 2018. That’s not marketing fluff – that’s the kind of leap that changes the economics of training and running massive models.

I’ve spoken to engineers who’ve seen the internal benchmarks, and the phrase that keeps coming up is “brutal efficiency.” Where Nvidia’s latest Blackwell GPUs are absolute beasts in raw performance, Ironwood wins on cost-per-token at scale. And when you’re Google, serving billions of queries daily, that difference is measured in hundreds of millions of dollars.

“The advantage of owning the whole stack is you stop fighting physics with money. You just redesign the physics.”

– A senior Google Cloud engineer, off record

This is why you’re suddenly seeing billion-dollar TPU deals that used to go to Nvidia by default. Companies aren’t switching because they love Google more – they’re switching because the math finally works.
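To make “cost-per-token at scale” concrete, here’s a toy back-of-envelope sketch in Python. Every number in it is hypothetical – these are not real specs or prices for Ironwood, Blackwell, or anything else – but the arithmetic shows why a power-efficiency edge compounds at the query volumes Google operates at.

```python
# Toy cost-per-token comparison. All figures below are made up
# for illustration; they are not real chip specs or prices.

def cost_per_million_tokens(tokens_per_sec, power_watts, price_per_kwh):
    """Energy cost to serve one million tokens on a single accelerator."""
    seconds = 1_000_000 / tokens_per_sec
    kwh = power_watts * seconds / 3_600_000  # watt-seconds -> kWh
    return kwh * price_per_kwh

# Hypothetical chip A: higher raw throughput, higher power draw.
a = cost_per_million_tokens(tokens_per_sec=20_000, power_watts=1_000,
                            price_per_kwh=0.08)
# Hypothetical chip B: lower throughput, much lower power draw.
b = cost_per_million_tokens(tokens_per_sec=15_000, power_watts=400,
                            price_per_kwh=0.08)

print(f"A: ${a:.6f} per million tokens")
print(f"B: ${b:.6f} per million tokens")

# At a hypothetical 5 trillion tokens served per day, a sub-cent gap
# per million tokens still adds up day after day.
daily_million_tokens = 5_000_000
print(f"Daily energy savings with B: ${(a - b) * daily_million_tokens:,.0f}")
```

With these invented figures, the slower-but-frugal chip wins on serving cost despite losing on raw throughput – which is the shape of the argument the engineers quoted above are making, independent of the exact numbers.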

The Data Moat Everyone Forgets About

Here’s something that doesn’t get nearly enough attention: Google owns YouTube.

That’s not just the world’s second-largest search engine. It’s the largest repository of video data in human history, updated continuously, with transcripts, engagement signals, and nearly every kind of human behavior captured on camera.

When OpenAI has to scrape the open web and hope for the best, Google has petabytes of high-quality, current, diverse video flowing in every single day. The impact on multimodal models – especially video understanding and generation – is hard to overstate.

This is why Nano Banana (yes, that’s really what they called the image generator) went from embarrassing historical inaccuracies to producing some of the most startlingly realistic images I’ve seen from any public model, practically overnight.

It’s not magic. It’s data plus custom silicon plus a willingness to ship fast and fix later.

  • Perfect training data for human faces and movement
  • Real-time feedback loop from billions of viewers
  • Direct integration with the world’s most used video platform
  • Custom chips optimized specifically for these workloads

Add it all up, and you get results that feel borderline unfair.

The Enterprise Flywheel Nobody Talks About

While the consumer AI drama plays out in public, something more important has been happening in enterprise.

Alphabet just posted its first $100 billion quarter, with Google Cloud as its fastest-growing segment. That’s not a typo. And the cloud backlog – committed future revenue – sits at $155 billion and growing.

This isn’t just companies using Gemini because it’s shiny. It’s companies realizing that when you combine TPUs + Vertex AI + the broadest enterprise tool integration in the industry, you get something that actually reduces costs while increasing capability.

In my conversations with CIOs over the past six months, the shift in tone has been dramatic. Six months ago: “We’re experimenting with Gemini.” Today: “We’re standardizing on Google Cloud for AI workloads.” That’s not hype – that’s procurement reality.

Gemini 3: When “Good Enough” Became “Holy Crap”

Then came Gemini 3.

I’ve been using frontier models daily for years, and I’m as jaded as they come. But spending a weekend with Gemini 3 felt… different. The context window is massive, the reasoning feels more human, and perhaps most importantly, it needs dramatically less hand-holding.

When the CEO of Salesforce – a company that literally has partnerships with everyone – posts that he’s abandoning ChatGPT after two hours with Gemini 3, calling the leap “insane”? That’s the kind of moment that moves markets.

“Everything is sharper and faster. It feels like the world just changed, again.”

– Marc Benioff, November 2025

And he’s not alone. The jump from Gemini 2.5 to 3 happened so fast that it caught even close observers off guard. This is what happens when you have the data, the chips, and the distribution all working in concert.

But Let’s Not Get Carried Away

Here’s where I push back on the current narrative: this isn’t over.

The frontier model race remains absurdly close. Anthropic dropped Opus 4.5 within a week of Gemini 3’s launch. OpenAI has been shipping updates to GPT-5 at a blistering pace. The leaderboards change weekly.

What Google has built isn’t an unassailable lead – it’s the most complete stack. And completeness matters more than any single benchmark when you’re talking about real-world deployment at scale.

Nvidia still owns 90%+ of the accelerator market. OpenAI still has more consumer mindshare with ChatGPT. The capex race is reaching levels that would have been considered insane two years ago – over $380 billion combined from the big four this year alone.

This is still very much a knife fight.

The Real Question Moving Forward

The most interesting part, to me, isn’t who has the best model this week.

It’s whether owning the full vertical stack – from chips to cloud to consumer products to the world’s largest video platform – creates a sustainable advantage in an industry that’s burning hundreds of billions of dollars per year.

History says integration wins eventually. Apple proved it with the iPhone. Tesla is proving it with electric vehicles. Google might be in the process of proving it with AI.

Three years ago, the conventional wisdom was that Google had missed the boat. Today, the more interesting question might be whether everyone else is trying to compete with a company that was never actually behind – just quiet.

The phoenix imagery you see everywhere right now? It’s not entirely wrong.

Only this time, the bird was never really dead. It was just molting.

In investing, what is comfortable is rarely profitable.
— Robert Arnott
Author

Steven Soarez passionately shares his financial expertise to help everyone better understand and master investing.
