Speaking at the Communacopia + Technology conference in San Francisco, Huang criticised traditional “giant” data centres as highly inefficient.
“These giant data centres are super inefficient because [they are] filled with air, and air is a lousy conductor of electricity,” Huang said.
He advocated for compact, energy-efficient data centres that use advanced technologies like liquid cooling, which is challenging to implement in sprawling facilities.
“What we want to do is take that few, you know, call it 50-, 100-, 200-megawatt data centre, which is sprawling, and you densify it into a really, really small data centre,” he said, highlighting that modernisation efforts could take the next decade to complete.
Despite the high upfront cost of Nvidia’s server racks, which can run “a couple million per rack,” Huang argued that their efficiency justifies the investment.
“The amazing thing is just the cables of connecting old general purpose computing systems cost more than replacing all of those and densifying into one rack,” he said.
Generative AI and accelerated computing
Huang also celebrated the potential of generative AI to redefine industries, describing it as more than just a tool.
“This generative AI is not just a tool, it’s a skill,” he said, predicting that AI will expand beyond traditional data centres and IT into creating digital skills, such as virtual chauffeurs.
“For the very first time, we’re going to create skills that augment people. And so that’s why people think that AI is going to expand beyond the trillion dollars of data centres and IT and into the world of skills. So what’s a skill? A digital chauffeur is a skill,” he said.
Huang said Nvidia’s focus on accelerated computing delivers an immediate return on investment (ROI) by improving utilisation across virtualised data centres.
Virtualisation allows workloads to move seamlessly across the data centre rather than being tied to specific machines, slashing costs. Cloud hosting further optimises resources by enabling multiple companies to share infrastructure.
“The return on that is fantastic because the demand is so great,” Huang said. “For every dollar that they spend with us translates to $5 worth of rentals ... And so, the demand for this is just incredible.”
Huang emphasised that the future of computing is about systems, not standalone chips.
“The way that computers are built today ... we designed seven different types of chips to create the system. Blackwell is one of them.
“So, the amazing thing is, when you want to build this AI computer, people say words like ‘supercluster’, ‘infrastructure’, ‘supercomputer’ for good reason because it’s not a chip, it’s not a computer per se. And so, we’re building entire data centres,” he said.
A vision of digital engineers
Huang also forecast a future where software engineers collaborate with “digital companion engineers” powered by AI, effectively multiplying productivity.
The days of software engineers writing every line of code “are completely over”, he said.
In Huang’s vision, Nvidia’s 32,000 employees will eventually be supported by 100 times as many digital engineers.
“And so, the way I look at Nvidia, we have 32,000 employees, but those 32,000 employees are surrounded by hopefully 100x more digital engineers,” Huang said.
This week, Nvidia is set to report its third-quarter results, prompting analysts to question whether the company will set the stage for a strong year-end for tech giants, or whether the final weeks of 2024 will see the market surrender its impressive year-to-date gains.