The Massive Power Draw of Generative AI: Overtaxing Our Grid
The rapid advancement of generative AI has driven a sharp increase in energy consumption, placing unprecedented strain on power grids. The surge in electricity demand from sprawling data centers not only challenges existing infrastructure but also raises concerns about the environmental footprint of AI development.
More than 8,000 data centers operate worldwide, yet even that capacity cannot keep up with the power demands of generative AI. A single ChatGPT query consumes roughly 10 times as much energy as a typical Google search, and training one large language model can produce as much CO2 as five gas-powered cars over their entire lifetimes while using as much water as a small country. Even when enough power is generated, the aging grid is increasingly unable to transmit it to where it is needed. That's why data center companies like Vantage are building closer to where power is generated, while the industry invests in alternative energy sources and creative ways to harden the grid.