DeepSeek-R1 was reportedly trained with far less energy than was previously thought possible. That may be good news for the US energy sector.
DeepSeek’s latest breakthrough has caused US experts to take a closer look at the actual energy demands of AI models.
The Chinese company says it trained its model with far less electricity and computational power than comparable efforts, suggesting that the power demands of AI development may have been overestimated.
DeepSeek claims its R1 model was developed using only about 2,000 Nvidia H800 GPUs, a stark contrast to OpenAI’s o1, which is estimated to have required up to 25,000 of the more advanced H100 GPUs.
OpenAI’s approach, until now widely treated as the standard, also requires billions of dollars of investment in data centers that guzzle electricity. DeepSeek’s more efficient approach, on the other hand, would require significantly less investment in data centers, and less electricity to run them.
Data centers consumed nearly 5% of total US electricity in 2023, and that share could rise to 12% by 2028. Meta is planning a massive data center complex in Louisiana, and Microsoft is investing up to $80 billion to expand Azure. Stargate, which President Trump announced last week, involves up to $500 billion of investment in AI infrastructure, including data centers; the first of those data centers is already under construction in Abilene, Texas.
While these plans include increases in electricity generation, concerns remain over the insatiable appetite of data centers.
According to reports last week, tech giants are considering bypassing electricity grids and plugging directly into power plants. Amazon has already put forward a bid for an arrangement that could send up to 40% of the electricity generated at a nuclear power plant directly to its data center in Berwick, Pennsylvania. Experts worry that deals of this nature could carry significant repercussions, including higher energy costs for the public.
But the problem is that massive data centers have so far been regarded as vital to the country’s effort to outpace China in the AI arms race. So Big Tech wanted them up and running, and it wanted them fast. Trump’s executive orders in his first week showed that his administration was prioritizing the development of data centers and AI technology. An energy crisis seemed unavoidable.
However, DeepSeek appears to have shown that AI models can be built much more efficiently. If so, it may be unnecessary to invest in sprawling data centers with an enormous energy diet. Total energy consumption for building future AI models could in fact be significantly lower, according to analysts at Rystad.
Experts still expect energy consumption to rise even if more efficient AI models like DeepSeek-R1 become the standard, but not with the intensity and haste that seemed necessary only a week ago. When DeepSeek’s claims about its efficient AI model made the news on Monday, they seemed to knock the US off its lofty perch of AI supremacy. Even so, the breakthrough may also help settle some looming issues in the country, at least in the energy sector.





