Jensen Huang claims Nvidia's AI chips are outpacing Moore's Law

Skye Jacobs

Forward-looking: Challenging conventional wisdom, Nvidia CEO Jensen Huang said that his company's AI chips are outpacing the historical performance gains set by Moore's Law. This claim, made during his keynote address at CES in Las Vegas and repeated in an interview, signals a potential paradigm shift in the world of computing and artificial intelligence.

For decades, Moore's Law, formulated by Intel co-founder Gordon Moore in 1965, has been the driving force behind computing progress. It predicted that the number of transistors on computer chips would roughly double every year (a cadence Moore later revised to every two years), leading to exponential growth in performance and plummeting costs. However, the law has shown signs of slowing in recent years.

Huang, however, painted a different picture of Nvidia's AI chips. "Our systems are progressing way faster than Moore's Law," he told TechCrunch, pointing to the company's latest data center superchip, which is claimed to be more than 30 times faster for AI inference workloads than its predecessor.

Huang attributed this accelerated progress to Nvidia's comprehensive approach to chip development. "We can build the architecture, the chip, the system, the libraries, and the algorithms all at the same time," he explained. "If you do that, then you can move faster than Moore's Law, because you can innovate across the entire stack."

This strategy has apparently yielded impressive results. Huang claimed that Nvidia's AI chips today are 1,000 times more advanced than what the company produced a decade ago, far outstripping the pace set by Moore's Law.
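For rough context, here is a back-of-the-envelope sketch in Python (not from the article, and purely illustrative) of what each doubling cadence compounds to over a decade, set beside the 1,000x figure Huang cites:

```python
# Back-of-the-envelope comparison (illustrative only): the improvement factor
# each doubling cadence compounds to over a decade, next to Huang's claimed 1,000x.

def growth_factor(doubling_period_years: float, span_years: float) -> float:
    """Total improvement factor if capability doubles every `doubling_period_years`."""
    return 2.0 ** (span_years / doubling_period_years)

decade = 10

# Moore's original 1965 observation: transistor counts doubling every year.
print(f"Doubling every year for a decade:      ~{growth_factor(1, decade):,.0f}x")  # ~1,024x
# The later, commonly cited revision: doubling every two years.
print(f"Doubling every two years for a decade: ~{growth_factor(2, decade):,.0f}x")  # ~32x
# Huang's claim for Nvidia's AI chips over the same span.
print("Claimed AI-chip improvement:            ~1,000x")
```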

Rejecting the notion that AI progress is stalling, Huang outlined three active AI scaling laws: pre-training, post-training, and test-time compute. He stressed the importance of test-time compute, which applies during the inference phase and gives AI models more time to "think" through each question before answering.

During his CES keynote, Huang showcased Nvidia's latest data center superchip, the GB200 NVL72, touting its 30 to 40 times faster performance in AI inference workloads compared to its predecessor, the H100. This leap in performance, Huang argued, will make expensive AI reasoning models like OpenAI's o3 more affordable over time.

"The direct and immediate solution for test-time compute, both in performance and cost affordability, is to increase our computing capability," Huang said. He added that in the long term, AI reasoning models could be used to create better data for the pre-training and post-training of AI models.

Nvidia's claims come at a crucial time for the AI industry, with companies such as Google, OpenAI, and Anthropic relying on its chips and on continued gains in their performance. Moreover, as the industry's focus shifts from training to inference, questions have arisen about whether Nvidia's expensive products will maintain their dominance. Huang's claims suggest that Team Green is not only keeping pace but setting new standards in inference performance and cost-effectiveness.

While the first versions of AI reasoning models like OpenAI's o3 have been expensive to run, Huang expects the trend of plummeting AI model costs to continue, driven by computing breakthroughs from hardware companies like Nvidia.


 
Hah, I love this guy. He could even sell refrigerators to the Eskimos and space heaters to Egyptians.

I dunno why, but Huang reminds me more and more of Saul of "Better Call Saul" fame.
 
It's a neat idea to train an AI model on your game's graphics to have it predict frames. It won't help unless developers support it in their game, though, so old games won't benefit from it. In my opinion, using software tricks doesn't really count when you're talking about Moore's Law. Moore's Law is about transistor count, not about software, and AI and DLSS are software. Though it is cool they figured out a way to get better visuals out of less computational power, it's not technically a factor in Moore's Law. I do get his point though. It is cool stuff.
 
I think the people at NVIDIA are happier not because of the advances in AI or the economic success, but simply because they have achieved real-time ray-tracing graphics. Their DNA as a company is still about graphics. That's why they did DLSS and FG: not to increase fps, but to achieve real-time ray-traced lighting, which was the holy grail of graphics.
 
Going against the grain here and gonna say I actually do believe him. Normally I would point to a major process shrink as your indicator of ML progress, but that's just not the case anymore. At the end of the day it's about creating advanced chips for the masses, meaning affordable. Any company can make a $10,000 chip, but this is not game changing (no pun intended).
 
He's got the art of wooing investors with mumbo jumbo down to a T.
You're about three decades out of date. Today, Nvidia's success has nothing to do with Huang's public statements, and everything to do with the fact that Nvidia makes the most valuable and sought-after chips on the planet.
 
"At the end of the day its about creating advanced chips for the mass's, meaning affordable."

I don't think the guy gives a flying f%$k about affordability for the masses. It's all about hubris and $$$.
 
What a guy. Less than a year and a half ago he declared Moore's Law dead when it was a convenient way to say "shut up gamers, we're not making our stuff cheaper any more." But now, when it's going to sell more AI garbage to investors, all of a sudden not only is Moore's Law alive and well but they're actually flying past it. This is why I've jumped off the nVidia train entirely.
 
"What a guy. Less than a year and a half ago he declared Moore's Law dead when it was a convenient way to say "shut up gamers, we're not making our stuff cheaper any more." But now, when it's going to sell more AI garbage to investors, all of a sudden not only is Moore's Law alive and well but they're actually flying past it. This is why I've jumped off the nVidia train entirely."
And he can't decide if a singularity will produce AGI from one year to the next. Must be reading the wrong comic books.
 