Why Arm Holdings Is Poised for Massive Growth in AI Agents Era

11 min read
Apr 21, 2026

We've just pulled the trigger on a new position in a key player in the semiconductor world that's perfectly positioned as AI moves into its next big phase. The numbers coming out of their latest developments are eye-opening, and the potential upside has us genuinely excited. But what makes this move so timely right now?

Financial market analysis from 21/04/2026. Market conditions may have changed since publication.

Have you ever wondered what really powers the next wave of artificial intelligence? Not just the flashy training of massive models, but the everyday workhorses that will keep intelligent agents running smoothly in data centers around the world. I’ve been following the semiconductor space for years, and something clicked for me recently when a particular chip designer made a bold move that could reshape how we think about AI infrastructure.

After careful consideration, we’re starting a position in this innovative company, buying shares at around the current market levels. It’s not every day that a firm with such deep roots in efficient processor design decides to step directly into manufacturing its own advanced chips for the agentic AI era. The potential here feels substantial, especially as the industry grapples with surging demand for more balanced computing resources.

The Shift Toward Agentic AI and Why CPUs Matter Again

For the longest time, the AI conversation centered almost entirely on graphics processing units. Those powerful GPUs handled the heavy lifting of training large language models and running complex inference tasks. But as we’ve moved into what many are calling the agentic phase of AI, something interesting has happened. The focus is broadening, and central processing units are suddenly back in the spotlight.

Agentic AI refers to systems where intelligent agents don’t just respond to queries but actively reason, plan, and take actions autonomously. These agents generate far more tokens and require constant coordination, data movement, and decision-making. That shift creates a completely different workload profile compared to traditional model training.

In my experience watching tech cycles, these kinds of transitions often catch investors off guard. The market had grown accustomed to pouring resources into GPUs, sometimes overlooking the foundational role that efficient CPUs play in scaling real-world applications. Now, the numbers suggest we might need significantly more CPU capacity than previously anticipated.

Industry estimates point to today’s AI data centers requiring around 30 million CPU cores for each gigawatt of power capacity. With the rise of agentic systems, that figure could multiply by a factor of four for the same amount of infrastructure. It’s no wonder there’s talk of potential CPU shortages on the horizon if demand accelerates as expected.
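The arithmetic behind that estimate is simple enough to check. A minimal sketch, treating the article's industry figures as assumed inputs:

```python
# Back-of-envelope check of the CPU-demand figures cited above.
# Both inputs are the article's industry estimates, not measured data.
cores_per_gw_today = 30_000_000   # ~30M CPU cores per gigawatt today
agentic_multiplier = 4            # projected 4x increase for agentic workloads

cores_per_gw_agentic = cores_per_gw_today * agentic_multiplier
print(f"{cores_per_gw_agentic:,} cores per GW")  # 120,000,000 cores per GW
```

At roughly 120 million cores per gigawatt, even modest data center expansion would put real strain on CPU supply, which is where the shortage talk comes from.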

Understanding the Unique Position of This Chip Architect

The company in question has built its reputation on designing high-performance, low-power, and cost-effective processor architectures. Rather than building chips itself for most of its history, it focused on licensing its intellectual property to major players across the industry. This approach allowed it to influence everything from mobile devices to servers without the capital-intensive burden of fabrication.

Its designs are known for exceptional energy efficiency, which has made them a favorite for companies looking to optimize power consumption in everything from smartphones to cloud infrastructure. Major technology firms have long relied on these architectures to power their custom silicon solutions.

The transition to agentic AI is driving a fundamental rethink of data center architecture, with efficiency becoming the new battleground.

What makes this moment particularly compelling is the company’s decision to evolve beyond pure licensing. They’ve recently introduced their first in-house data center CPU specifically engineered for these emerging agentic workloads. This represents a significant strategic pivot that could open up entirely new revenue streams while building on their established strengths.

How Their Business Model Has Traditionally Worked

At its core, the traditional model involved two main sources of income. First, customers pay upfront licensing fees to access the processor designs and related technology. Then, as those customers produce and sell chips based on the architecture, the company earns ongoing royalty payments – typically a percentage of each chip’s selling price.

This royalty stream has proven remarkably resilient because it continues for as long as the chips remain in production. It’s a bit like earning recurring revenue from intellectual property that keeps working long after the initial deal is signed. Over time, this has created a stable and growing income base.
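As a rough illustration of how that royalty stream accrues, here is a sketch; the unit volume, average selling price, and royalty rate below are hypothetical placeholders, not disclosed figures:

```python
# Illustrative sketch of the licensing-plus-royalty model described above.
# All numeric inputs are hypothetical, chosen only to show the mechanics.
def annual_royalty(chips_shipped: int, avg_selling_price: float,
                   royalty_rate: float) -> float:
    """Royalty revenue = units shipped x ASP x royalty percentage."""
    return chips_shipped * avg_selling_price * royalty_rate

# e.g. 1 billion chips at a $10 ASP and a 2% royalty rate -> ~$200M/year
print(annual_royalty(1_000_000_000, 10.0, 0.02))
```

The key property is that the revenue recurs for as long as the chips ship, with no incremental fabrication cost to the designer.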

The efficiency advantage has been a key selling point. Processors based on this architecture often deliver better performance per watt compared to traditional x86 designs. In an era where data centers are consuming enormous amounts of electricity, that difference can translate into meaningful cost savings for operators.

The Bold Move Into Building Their Own Chips

The introduction of their new agentic-AI-focused CPU marks more than just a product launch – it’s a strategic evolution. Instead of only licensing designs, they’re now preparing to sell complete processor solutions tailored for the demands of intelligent agents. This could position them more directly in the value chain for AI infrastructure.

The timing seems particularly astute. As hyperscale operators look for ways to handle the exploding volume of agentic workloads without exponentially increasing their capital expenditures, solutions that promise better performance per rack become incredibly attractive.

According to their projections, their architecture could deliver more than double the performance per rack compared to conventional x86-based CPUs. If those claims hold up in real-world deployments, the capital expenditure savings could run into the billions of dollars per gigawatt of data center capacity.

Real-World Efficiency Gains and Customer Interest

One of the most promising aspects is the early traction they’re seeing with major technology companies. Having already secured interest from leading social media and AI research organizations speaks volumes about the practical value of their approach. These aren’t small pilot projects – they’re serious engagements that could scale rapidly.

The sales pitch is straightforward yet powerful: do more computing with less hardware and lower power consumption. In an industry where every percentage point of efficiency can translate into millions in savings, this message resonates strongly with decision-makers responsible for massive infrastructure builds.

Perhaps what impresses me most is how this new CPU complements rather than replaces their existing licensing business. The royalty stream isn’t going away – in fact, management anticipates continued strong growth there, with expectations of around 20 percent compound annual growth over the coming years.

Projecting the Financial Trajectory Ahead

Looking further out, the company has laid out an ambitious but seemingly achievable path to substantial revenue growth. They’re targeting around 25 billion dollars in annual revenue by fiscal year 2031. What’s particularly notable is the split – with a significant portion expected to come from their new in-house chip sales.

This would represent a dramatic increase from current levels, where analysts anticipate roughly 4.9 billion in revenue for the upcoming fiscal year. The earnings power could also scale meaningfully, with projections suggesting earnings per share could exceed nine dollars by 2031 compared to around 1.75 dollars expected in the nearer term.
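Those headline figures imply steep compound growth. A quick sketch, assuming roughly a five-year span between the near-term estimates and the FY2031 targets (the exact interval isn't spelled out):

```python
# Implied compound annual growth rates behind the projections above:
# revenue ~$4.9B (near term) to ~$25B by FY2031, EPS ~$1.75 to ~$9.
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over a span of years."""
    return (end / start) ** (1 / years) - 1

# Assumed five-year span between the near-term figures and FY2031
rev_cagr = cagr(4.9, 25.0, 5)
eps_cagr = cagr(1.75, 9.0, 5)
print(f"revenue CAGR ~{rev_cagr:.1%}, EPS CAGR ~{eps_cagr:.1%}")
```

Under that assumption both figures work out to compound growth in the high-30s percent per year, which frames just how ambitious the FY2031 targets are.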

Of course, these are forward-looking estimates and come with the usual caveats about execution risks and market conditions. But the underlying drivers – the shift to agentic AI and the need for more efficient computing – feel fundamentally sound to me.

Addressing Potential Challenges in Scaling Production

No major technological shift happens without hurdles, and this one is no exception. The company has pointed to memory constraints as a current bottleneck limiting how quickly customers can deploy these new processors. It’s a reminder that the entire semiconductor ecosystem remains interconnected, with shortages in one area potentially affecting adoption rates elsewhere.

Shipping is expected to begin toward the end of this year, which gives time for both the company and its partners to work through supply chain details. In my view, successfully navigating these initial deployment challenges could be a key catalyst for investor confidence.

Another consideration is the competitive landscape. While their efficiency advantages are compelling, established players in the CPU space have deep relationships and significant resources of their own. The battle for market share in data center processors will likely intensify as AI infrastructure spending continues to climb.

Why This Fits Into a Broader Investment Thesis

When evaluating opportunities in the tech sector, I often look for companies that sit at the intersection of multiple powerful trends. Here, we have the continued expansion of artificial intelligence, the growing importance of energy efficiency in computing, and a strategic evolution in business model that could unlock new growth avenues.

The relationship with major GPU designers remains strong, suggesting that this isn’t about competing in every segment but rather complementing the existing AI hardware stack. That collaborative history could prove valuable as the industry seeks more balanced systems for agentic workloads.

From a portfolio perspective, adding exposure to this name at current levels feels like a measured way to participate in the next phase of AI development. We’re starting with a modest position size, representing a small percentage of the overall holdings, which allows room to add on dips or as milestones are achieved.

Setting Realistic Expectations and Price Targets

We’ve established an initial price target that implies roughly 16 percent upside from recent trading levels. This isn’t meant to suggest we’ll see that move immediately, but rather reflects our assessment of fair value based on the growth prospects over the coming years.
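For concreteness, the upside math is straightforward; the $100 share price below is purely hypothetical, since we're not quoting the actual trading level here:

```python
# What a ~16% upside target implies, given a recent share price.
# The $100 input is a hypothetical placeholder, not the actual quote.
def implied_target(recent_price: float, upside: float) -> float:
    """Price target implied by a fractional upside over the recent price."""
    return recent_price * (1 + upside)

print(implied_target(100.0, 0.16))  # ~116.0
```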

Tech investments, especially in semiconductors, can be volatile. News about supply chains, customer wins, or broader market sentiment can cause significant swings. That’s why a disciplined approach to position sizing and ongoing monitoring feels essential.

The Bigger Picture for AI Infrastructure

Stepping back, what’s happening here is part of a larger transformation in how computing resources are allocated for artificial intelligence. The early days focused heavily on raw training power, but sustainable scaling requires attention to efficiency across the entire stack – from training to inference to agent coordination.

Companies that can deliver meaningful improvements in performance per dollar or per watt will likely capture substantial value. The architecture we’re discussing has a proven track record in mobile and edge computing, and extending those advantages to the data center could be transformative.

Efficiency isn’t just a nice-to-have in AI anymore – it’s becoming the primary competitive advantage.

I’ve spoken with fellow investors who remain skeptical about any single company’s ability to disrupt entrenched players in server CPUs. That’s a fair point. Yet the combination of architectural efficiency, established ecosystem relationships, and timely product development creates a compelling case worth watching closely.

Potential Risks Worth Considering

Like any investment in the fast-moving tech sector, there are risks to acknowledge. Regulatory considerations have impacted similar deals in the past, though the current focus is on organic growth rather than large acquisitions. Geopolitical tensions around semiconductor supply chains could also introduce volatility.

Execution risk around their first major in-house chip launch remains real. Delays in shipping or performance shortfalls compared to projections could affect market sentiment. Additionally, broader economic conditions that impact technology spending could slow adoption rates.

That said, the long-term royalty business provides a cushion that many pure-play chip manufacturers lack. Even if the new CPU ramp takes longer than hoped, the core licensing model continues to generate cash flow from a wide range of customers.

Comparing Architectural Approaches in Modern AI

To better understand the opportunity, it helps to contrast different processor architectures. Traditional x86 designs have dominated data centers for decades, offering broad software compatibility and strong single-threaded performance. However, they often come with higher power consumption profiles that become increasingly costly at scale.

In contrast, the architecture we’re examining prioritizes energy efficiency and scalability across many cores. This approach aligns particularly well with the parallel nature of many AI workloads, where distributing tasks efficiently can yield better overall system performance.

The custom silicon efforts by major cloud providers demonstrate the appetite for alternatives to legacy designs. Several have already adopted this architecture for their internal infrastructure, suggesting that the ecosystem support is already in place for broader adoption.

| Architecture Type | Key Strength | Typical Use Case | Power Efficiency |
| --- | --- | --- | --- |
| x86 Traditional | Software compatibility | General computing | Baseline |
| Arm-based Designs | Energy efficiency | AI agents, mobile | Superior |
| GPU Accelerators | Parallel processing | Model training | High for specific tasks |

This simplified comparison highlights why many see potential for greater balance in future AI systems. Rather than an either-or situation, the most effective data centers will likely combine different architectures optimized for their respective strengths.

What Success Might Look Like in the Coming Years

If things play out as management envisions, we could see this company transition from primarily an IP provider to a more diversified player with significant product revenue. The royalty business would continue growing alongside new chip sales, creating multiple growth engines.

Customer expansion beyond current partners could accelerate as more organizations recognize the efficiency benefits for their agentic AI deployments. Success might also involve deeper integration with the broader ecosystem, including software optimizations that make adoption even smoother.

From an investor standpoint, consistent execution on product roadmaps combined with steady royalty growth could support multiple expansion over time. Of course, the market will ultimately decide how to price these prospects, but the fundamental setup appears attractive.

Lessons From Past Technology Transitions

Looking back at previous shifts in computing – from mainframes to personal computers, or from desktops to mobile – the companies that adapted their business models while leveraging core strengths often emerged stronger. This feels like a similar inflection point for processor architectures in the AI age.

The move toward more specialized hardware for different workload types mirrors what happened in graphics and machine learning acceleration. Those who positioned themselves early in those trends captured significant value as adoption scaled.

I’ve found that staying patient through the initial phases of these transitions often rewards thoughtful investors. The hype cycles come and go, but the underlying technological merits tend to win out over longer periods.

Portfolio Integration and Position Management

We’re approaching this addition with a measured mindset. Starting with a position that represents about one percent of the overall portfolio allows us to participate without overcommitting capital upfront. This size provides flexibility to adjust based on how the story develops.

Ongoing monitoring will focus on several key metrics: progress on CPU shipments, royalty growth trends, customer announcements, and any updates regarding supply chain constraints. Positive developments in these areas could warrant adding to the position over time.

Conversely, if challenges emerge that fundamentally alter the growth thesis, we’ll reassess accordingly. Discipline in both entry and potential exits remains crucial in the volatile world of technology investing.

The Human Element Behind the Technology

Beyond the charts and projections, it’s worth remembering that these advancements ultimately serve human needs. Whether it’s more responsive AI assistants, efficient autonomous systems, or tools that augment human creativity, the underlying infrastructure matters because of its real-world impact.

Companies that can deliver both technological innovation and practical efficiency contribute to broader progress. In that sense, backing firms that push the boundaries responsibly feels aligned with longer-term societal benefits as well as investment goals.


As we watch this story unfold, the combination of established strengths and forward-looking innovation makes for an intriguing opportunity. The agentic AI wave is just beginning to build momentum, and having exposure to key enablers could prove rewarding.

Of course, past performance doesn’t guarantee future results, and all investments carry risk. This discussion reflects our current thinking but shouldn’t be taken as personalized advice. Each investor needs to consider their own situation and risk tolerance.

We’ll continue tracking developments closely and sharing updates as meaningful progress occurs. For now, the decision to initiate this position stems from a belief that this particular chip designer has the right ingredients to thrive as artificial intelligence enters its next evolutionary stage.

The coming years should reveal whether their ambitious plans translate into the kind of sustained growth that justifies investor enthusiasm. In the meantime, the fundamental drivers around efficiency and expanding AI workloads provide a solid foundation for optimism.

I’ve always believed that the best investment opportunities often emerge when a company leverages its core competencies in new and creative ways. This evolution from design licensing to offering complete solutions for cutting-edge workloads feels like exactly that kind of strategic expansion.

Whether you’re an investor following AI trends or simply curious about the technology shaping our future, keeping an eye on developments in efficient processor architectures seems worthwhile. The agentic era promises to be transformative, and the companies enabling it efficiently could play starring roles.

Time will tell how this particular story plays out, but the early signals certainly have our attention. The intersection of proven design expertise and bold product ambitions creates a narrative worth following closely in the months and years ahead.

A journey of a thousand miles must begin with a single step.
— Lao Tzu
Author

Steven Soarez passionately shares his financial expertise to help everyone better understand and master investing. Contact us for collaboration opportunities or sponsored article inquiries.
