
Nvidia CEO: Reasoning AI Will Depend on Lower Computing Costs

Illustration: Nvidia CEO Jensen Huang discussing the future of AI with advanced reasoning capabilities. (Image Source: ChatGPT-4o)


Nvidia CEO Jensen Huang said the future of artificial intelligence lies in services that can "reason." For AI systems to reach that advanced stage, however, the cost of computing must come down.

Huang made the remarks on a podcast hosted by Arm Holdings CEO Rene Haas, explaining that future AI tools will be able to process queries through hundreds or even thousands of steps and then reflect on their conclusions. This type of reasoning would set those future systems apart from current models such as OpenAI's ChatGPT, which Huang said he uses every day.

Boosting Chip Performance to Enable Reasoning AI

To prepare for these advancements, Nvidia is working to increase its chip performance by two to three times annually while maintaining the same cost and energy consumption levels. According to Huang, these improvements will revolutionize how AI systems handle inference—the process of recognizing patterns and drawing conclusions.

“We’re able to drive incredible cost reduction for intelligence,” Huang said. “We all realize the value of this. If we can drive down the cost tremendously, we could do things at inference time like reasoning.”

Nvidia's Dominance in the AI Chip Market

Nvidia currently controls over 90% of the market for accelerator chips, which are crucial for speeding up AI processing. The company has also expanded into selling complete computing solutions, including AI models, software, networking, and other services. This is part of a broader push to encourage more companies to adopt AI technologies.

Competition on the Horizon

However, Nvidia is facing increasing competition. Data center operators like Amazon’s AWS and Microsoft are developing their own in-house alternatives to Nvidia’s chips. Meanwhile, Advanced Micro Devices (AMD), Nvidia’s long-standing rival in gaming chips, is also emerging as a key player in the AI space. AMD is expected to reveal more about its AI products at an upcoming event on October 10.

What This Means Moving Forward

Huang's vision of "reasoning" AI represents a significant leap forward, with future systems able to work through problems more like humans do. Achieving that milestone, however, hinges on reducing the cost of computing, and Nvidia's ongoing gains in chip performance will be central to making reasoning AI a reality. As competitors such as AMD and cloud giants like AWS and Microsoft continue to develop their own AI hardware, the race to lower computing costs while expanding AI capabilities will only intensify. The next era of AI development will be defined not just by performance, but by affordability and accessibility for broader adoption.