Nvidia Shares vs. Custom AI Processors: Should Investors Be Concerned?

Nvidia stock had an impressive 2024, nearly tripling to around $135 per share. Business has flourished, driven mainly by surging demand for graphics processing units (GPUs), which have become foundational to the era of generative artificial intelligence. Over the past month or so, however, investors appear to have shifted some of their attention toward another corner of the semiconductor market – ASICs (application-specific integrated circuits) – which could play a larger role in AI computing.

This shift follows reports from two major ASIC players, Broadcom and Marvell Technology, of a substantial increase in demand for their custom chips from prominent cloud customers in recent quarters. (See also: Is Investing in Custom AI Chips Worthwhile for Marvell Stock?) For example, Broadcom's sales of custom AI chips and networking processors surged 220% to $12.2 billion in 2024, up from $3.8 billion in AI silicon revenue in FY'23. To be sure, Nvidia's sales remain far larger, estimated at around $129 billion this fiscal year, but its growth rates are starting to slow. This raises the question of whether ASICs could challenge Nvidia's leadership in AI as the market matures.
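
For readers who want to see how that growth rate lines up with the dollar figures, here is a minimal Python sketch using only the approximate revenues quoted above; the variable names are purely illustrative.

```python
# Illustrative sanity check of the growth figures cited above.
# Revenue values are the approximate figures quoted in this article.
ai_revenue_fy23 = 3.8   # Broadcom AI silicon revenue, FY'23 ($ billions)
ai_revenue_fy24 = 12.2  # Broadcom AI chip and networking revenue, 2024 ($ billions)

growth = (ai_revenue_fy24 - ai_revenue_fy23) / ai_revenue_fy23
print(f"Year-over-year growth: {growth:.0%}")  # roughly 220%
```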

Separately, if you are looking for above-average returns with less volatility, you might want to consider the *Top-notch Portfolio*, which has outperformed the S&P 500 by more than 91% since its inception.

ASICs vs. GPUs

ASICs have existed for more than five decades, but they are seeing renewed interest in the AI era. While GPUs from companies like Nvidia are flexible and can be programmed for AI as well as a range of other tasks, ASICs are purpose-built semiconductors designed for a specific workload. By focusing on targeted functionality, they can offer several advantages over general-purpose GPUs for AI.

For instance, ASICs tend to be less expensive than GPUs, which are engineered for a broader range of applications. They also consume less power, a meaningful advantage for data centers looking to cut electricity costs, which are a substantial expense for large AI systems. For the dedicated tasks they are built for, ASICs can also deliver higher performance than general-purpose GPUs from Nvidia or AMD. They can be a smart choice for large cloud computing providers, which have the scale to justify the design and development costs of custom silicon.

Change in the AI Landscape Benefits ASICs

Major companies have dedicated significant resources to building AI models over the past two years. Training these vast models requires substantial computing power, and Nvidia's GPUs are considered the fastest and most efficient for the job. However, the AI landscape might be shifting. As models grow larger, incremental performance gains are likely to diminish. Moreover, the availability of high-quality training data could become a bottleneck, as much of the Internet's high-quality data has already been fed through large language models.

Against this backdrop, the heavily front-loaded AI training phase could come to an end sooner than expected. The economics of the GPU market and the broader AI ecosystem also remain uncertain, and many of Nvidia's customers may not yet be generating substantial returns on their AI investments. See How Nvidia Stock could Drop to $65.

As investors eventually demand better returns, more customers could turn to ASICs to reduce both upfront costs and ongoing expenses. The focus of AI is also likely to shift toward inference, where trained models are put to work in practical applications. Inference is less computationally intensive than training, opening the door to less powerful processors, and ASICs customized for inference could be well suited to these workloads.

What This Means for Nvidia

While Nvidia stock has delivered impressive gains lately, the *Top-notch Portfolio*, which consists of 30 stocks, has outperformed the S&P 500 index with lower risk over the past four years, providing a smoother ride. So, what's in store for Nvidia stock?

To be clear, it is unlikely that Nvidia's business will be entirely displaced by these emerging processors, given the company's head start in the AI market and its deeply entrenched CUDA software stack and development tools, which create high switching costs for customers. If the market does shift meaningfully toward custom silicon, Nvidia could also expand its own presence in that segment. However, Nvidia's premium valuation may not fully account for these risks and slowing growth. We estimate Nvidia stock to be worth around $93 per share, about 32% below its current market value. See our analysis of Nvidia's Valuation: Expensive or Cheap.

Invest with Trefis *Elite Portfolios*.

See all Trefis *Price Estimates*.

In light of shifting investor interest toward ASICs for AI computing, the key question is whether custom AI chips could meaningfully impact Nvidia's earnings and NVDA stock. As Nvidia's growth rates slow, the potential challenge from ASICs becomes more pertinent as the AI market matures.

Given the rising demand for ASICs from cloud providers and the potential cost savings they offer, adoption of custom chips for inference tasks could accelerate, posing a potential threat to Nvidia's GPUs in the future AI landscape.
