Wall Street is broadly confident that Nvidia Corp.'s upcoming earnings report will deliver a "beat-and-raise" quarter. The real question now is just how much upside the chip giant can deliver.
While many companies are optimistic about their ability to capitalize on artificial-intelligence (AI) demand down the road, Nvidia (NVDA) faces a different problem: demand for its AI hardware is so strong that the company is running up against supply constraints.
Nvidia stunned Wall Street in May when it forecast fiscal second-quarter revenue of $10.78 billion to $11.22 billion, well above its previous quarterly record of $8.29 billion and far ahead of the $7.17 billion average analyst estimate at the time.
Analysts surveyed by FactSet now expect $11.19 billion in revenue and adjusted earnings of $2.08 per share.
Morgan Stanley's Joseph Moore noted that the outlook "marked the most substantial single increase in one quarter in semiconductor history." Even so, he estimated that current supply covers less than half of current demand, which suggests an even larger data-center ramp once supply constraints ease. Moore rates Nvidia's stock overweight with a $500 price target.
Evercore ISI's C.J. Muse shares that view: "It is widely assumed at this point that NVDA is posting a beat and raise." The company has capitalized on generative-AI demand, he said, and the focus now is on quantifying how large that growth will be.
Muse noted, "The ongoing AI infrastructure build and continued Hopper ramp will be the primary drivers, with demand across both training and inference applications resulting in Data Center GPU supply falling well short of demand."
A key topic of discussion will be the company's outlook for its business in China, given the potential impact of additional export restrictions. Citi Research analyst Atif Malik flagged the long-term risk of sales restrictions there, noting that China has historically accounted for 20% to 25% of Nvidia's data-center sales.
While Nvidia's upcoming report is expected to show record revenue, Jefferies analyst Mark Lipacis sees another milestone on the horizon: he predicts Nvidia's data-center GPU revenue will surpass the combined data-center CPU revenue of Intel and AMD for the first time. Lipacis views that crossover as confirmation of the fourth tectonic shift in computing, toward a Parallel Processing Era, a thesis he has argued since 2017.
Lipacis expects Nvidia to eventually capture roughly 80% of the data-center market as AI applications proliferate. His projection rests on the premise that each computing era grows to about 10 times the size of its predecessor, with a single, vertically integrated ecosystem typically capturing about 80% of the value generated, a pattern he illustrates with examples from prior eras.
Nvidia's recent introduction of the DGX GH200, built on its Grace Hopper Superchip and designed for large-memory generative-AI models, reflects the company's response to the evolving AI landscape. The system is slated to be available in the second quarter of 2024.