Primary sources, related work, and datasets powering this project.
Official IEA data product covering AI energy demand, data centre growth, and electricity projections by region and scenario.
Live interactive portal tracking real-time AI energy metrics, data centre capacity, and country-level consumption breakdowns.
Lawrence Berkeley National Lab report covering PUE ranges, total U.S. data centre electricity use, and hardware efficiency trends. Source for our PUE assumption of 1.1–2.4.
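How the PUE range feeds into our estimates: PUE multiplies IT-equipment energy up to total facility energy. A minimal sketch, assuming an illustrative 1,000 kWh IT load (the function name and sample numbers are ours, not the report's):

```python
def facility_energy_kwh(it_energy_kwh: float, pue: float) -> float:
    """Total facility energy = IT energy * PUE (PUE >= 1.0 by definition)."""
    if pue < 1.0:
        raise ValueError("PUE cannot be below 1.0")
    return it_energy_kwh * pue

# Bound a workload's footprint using the report's 1.1-2.4 PUE range.
it_load = 1000.0  # kWh drawn by servers, storage, and networking (illustrative)
low = facility_energy_kwh(it_load, 1.1)
high = facility_energy_kwh(it_load, 2.4)
print(f"{low:.0f}-{high:.0f} kWh")  # 1100-2400 kWh
```

The low end corresponds to best-in-class hyperscale facilities, the high end to older enterprise rooms, so reporting results as a range avoids overstating precision.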
Five static charts arguing AI energy demand is growing fast but from a small base — currently a minor share of global emissions. Highlights geographic concentration (Ireland, Virginia) and uncertainty in projections depending on AI adoption assumptions.
Stacked bar chart (Highcharts) showing how power splits across servers, storage, networking, cooling, and infrastructure for enterprise, colocation, and hyperscale facilities. Hover tooltips enabled.
Introduces a performance-vs-energy tradeoff metric across six architectures (AlexNet → Swin Transformer) on two NVIDIA GPUs. 14 figures, 11 data tables. Measures energy via OpenZmeter, CodeCarbon, and Carbontracker. Code & data open source.
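The paper defines its own tradeoff metric, which we do not reproduce here; a generic accuracy-per-joule ratio sketches the idea. The model names and numbers below are placeholders, not the paper's measurements:

```python
def accuracy_per_joule(top1_accuracy: float, energy_joules: float) -> float:
    """Higher is better: accuracy points delivered per joule consumed."""
    return top1_accuracy / energy_joules

# Hypothetical (accuracy, joules-per-1k-inferences) pairs for two models.
runs = {
    "model_a": (0.75, 120.0),
    "model_b": (0.81, 310.0),
}
for name, (acc, joules) in runs.items():
    print(name, round(accuracy_per_joule(acc, joules), 5))
```

Under this kind of metric, a slightly less accurate but much cheaper model can score higher, which is exactly the tradeoff the paper measures across its six architectures.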
Benchmark scores across GPU generations — used to quantify gen-over-gen efficiency improvements in our hardware analysis.
Comprehensive GPU spec sheet including TDP, memory bandwidth, compute units, and release dates across NVIDIA and AMD product lines.
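The two sources above combine naturally: benchmark scores over TDP give a perf-per-watt figure, and the ratio between generations quantifies the efficiency gain. A minimal sketch with placeholder GPU names and numbers (real values come from the benchmark and spec-sheet sources):

```python
def perf_per_watt(score: float, tdp_watts: float) -> float:
    """Efficiency proxy: benchmark score per watt of TDP."""
    return score / tdp_watts

# (name, benchmark score, TDP in watts) -- illustrative values only.
gpus = [
    ("gen_n",   10_000.0, 300.0),
    ("gen_n+1", 18_000.0, 350.0),
]

prev_eff = None
for name, score, tdp in gpus:
    eff = perf_per_watt(score, tdp)
    if prev_eff is not None:
        print(f"{name}: {eff / prev_eff:.2f}x efficiency vs previous gen")
    prev_eff = eff
```

Note that TDP is a design envelope, not measured draw, so this ratio is an upper-bound-style proxy rather than a measured efficiency figure.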
IEA report chapter on projected AI electricity demand through 2030, broken out by training vs. inference workloads and scenario assumptions.