Microsoft CEO Satya Nadella recently appeared alongside OpenAI CEO Sam Altman in an exclusive interview on the Bg2 Pod. The two argued that the problem facing the AI industry is not an oversupply of computing resources, but a shortage of electricity to run GPUs. Nadella went so far as to say that some of the GPUs Microsoft currently has in inventory cannot be plugged in because there is not enough power available for them.
"I think in this case, the cycle of demand and supply is difficult to predict. The point is: What is the long-term trend? The long-term trend is what Sam said. Frankly speaking, our biggest problem now is not excess computing power, but electricity, more precisely, the ability to complete construction close to the power supply quickly." Nadella answered the host of the Bg2 Pod program and is also the founder of the venture capital company Altimeter Capital Brad Gesner Gerstner said this when asked, "So if you can't do that, you might actually have a bunch of chips stacked in inventory that can't be plugged in. In fact, that's the problem I'm facing today. The problem is not the chip supply, it's that I don't have an environment to plug in."
The power consumption of AI computing has been a topic of discussion among experts since last year. Once NVIDIA resolved the GPU shortage, the issue came into sharper focus, and many technology companies are now investing in small modular reactors (SMRs) as a power source for expanding large data centers.
The buildout has also driven up electricity bills for ordinary households, showing that the AI infrastructure under construction is having a negative impact on average Americans. OpenAI has even called on the U.S. government to add 100 GW of new power generation capacity each year, calling it a strategic asset as the U.S. competes with China for AI leadership. That appeal came after experts pointed out that China is ahead in electricity supply because of its heavy investment in hydropower and nuclear power generation.
Beyond power shortages, the two also discussed the possibility of advanced consumer hardware. "One day, we will have an amazing consumer device that can fully run GPT-5 or GPT-6 level models at very low power consumption, which is currently unimaginable," Altman said. "That would be incredible, and it's something that scares people who are building large centralized computing clusters," Gerstner said.
This highlights the risk large companies take when betting billions of dollars on huge AI data centers: although training new models requires that infrastructure, if semiconductor technology advances to the point where models can run locally on consumer devices, the data center demand that many expect from widespread AI adoption may never materialize.