NVIDIA AI Podcast

Powering the AI Inference Wave with EPRI's Ben Sooter - Ep. 292

32 min

This episode explores the critical intersection of AI data centers and energy grids, focusing on the immense energy demands of AI inference. Ben Sooter of EPRI explains how micro data centers, strategically located near underutilized substations, can efficiently serve inference workloads, which account for roughly **80% of a model's lifetime energy consumption**. This distributed approach leverages existing infrastructure, enables faster deployment, and provides crucial flexibility for the energy grid.

Summarized by Podsumo

