Nvidia’s Q3 Earnings and CEO's Defense of AI Strategy Amid Changes
Image Source: ChatGPT-4o
Nvidia's third-quarter earnings showcased the company’s dominance in the AI chip market, but investor reactions were mixed. The company reported an impressive 94% year-on-year revenue increase, reaching $35.08 billion, exceeding analysts' predictions of $33.16 billion. Adjusted earnings per share stood at $0.81, also surpassing expectations.
Despite the strong results, Nvidia shares fluctuated throughout Thursday, falling 1.5% in late-morning trading. Analysts attributed the volatility to sky-high expectations. William de Gale, lead portfolio manager at BlueBox Asset Management, explained: “Insane GPU demand has become the ‘bare minimum’ expected of the company.” “There is a risk here … that Nvidia’s current overearning will begin to come to an end,” de Gale said. “There’s considerable risk in this name at the moment. But it’s exciting.”
While Nvidia’s revenue nearly doubled year-over-year, its growth rate has slowed compared to previous quarters, which saw increases of 262% and 122% in Q1 and Q2, respectively. Analysts are now focused on Nvidia’s next-generation Blackwell chip, for which CEO Jensen Huang said demand is far outstripping supply.
Other semiconductor companies reacted to Nvidia’s results, with AMD shares slipping 1%, while Qualcomm and Intel saw modest gains of 1% and 1.2%, respectively.
CEO Jensen Huang Defends Nvidia’s Competitive Edge
During Nvidia’s earnings call, CEO Jensen Huang addressed concerns about the company’s positioning as AI labs adopt new methods like “test-time scaling.” This technique, exemplified by OpenAI’s o1 model, allows AI systems to allocate extra computing resources during the inference phase, potentially reducing reliance on Nvidia’s high-powered chips for pretraining.
Huang called test-time scaling “one of the most exciting developments” and emphasized Nvidia’s readiness to adapt. He reassured investors that Nvidia’s scale and reliability remain unmatched, particularly in AI inference workloads.
He also argued that scaling laws—adding more data and compute during pretraining—are still driving model improvements, while acknowledging their limitations: “As you know, this is an empirical law, not a fundamental physical law, but the evidence is that it continues to scale. What we’re learning, however, is that it’s not enough.”
Huang emphasized that Nvidia remains well-positioned for future developments, highlighting the company's dominance in both training and inference workloads.
Rising Competition in AI Chips
While Nvidia dominates the market for training AI models, its competitors are targeting the growing AI inference space. Startups like Groq and Cerebras are developing specialized chips to challenge Nvidia in this segment. Huang downplayed the threat, pointing to Nvidia’s standing as the world’s largest inference platform.
“Our hopes and dreams are that someday, the world does a ton of inference, and that’s when AI has really succeeded,” Huang said. He highlighted Nvidia’s CUDA ecosystem and architecture as key advantages, enabling faster innovation for developers.
What This Means
Nvidia’s strong Q3 results reaffirm its leadership in the AI chip market, but the fluctuating stock prices and slowing growth rates suggest that investor expectations may need recalibrating. CEO Jensen Huang’s defense of Nvidia’s strategy reflects the company’s efforts to stay ahead as the AI landscape evolves, particularly with the rise of test-time scaling and increasing competition in inference chips.
As Nvidia prepares for the launch of its Blackwell chip and adapts to shifts in AI development, its ability to maintain its market edge will be critical in determining its long-term success.
Editor’s Note: This article was created by Alicia Shapiro, CMO of AiNews.com, with writing, image, and idea-generation support from ChatGPT, an AI assistant. However, the final perspective and editorial choices are solely Alicia Shapiro’s. Special thanks to ChatGPT for assistance with research and editorial support in crafting this article.