Grid-Aware Websites Won't Save the Planet — But They Might Distract You From What's Broken
TL;DR
Grid-aware websites change what they show based on how clean the electricity grid is at that moment. It sounds smart, but the actual carbon savings are tiny — fractions of a gram per page view. Fixing page weight and third-party bloat delivers far bigger results with far less effort.
The Green Web Foundation recently published work on "grid-aware websites" — the idea that your website should detect the real-time carbon intensity of the electricity grid and adapt its behaviour. When the grid is dirty, serve lighter pages. When it's clean, serve the full experience.
It's a lovely idea. It's also almost entirely pointless. And if you look at why they're building it, the answer is more interesting than the technology itself.
What grid-aware actually means
The concept is straightforward. Your server (or client) queries a grid intensity API — typically Electricity Maps or the UK Carbon Intensity API — and gets a number in grams of CO₂ per kilowatt-hour. If the number is high (fossil-heavy generation), you degrade the experience: smaller images, fewer fonts, reduced animations, lighter layouts. If the number is low (lots of wind or solar), you serve the full site.
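In code, the pattern is trivial, which is part of its appeal. Here is a minimal sketch, not anyone's actual implementation: the 300 g/kWh threshold is a made-up cut-off, and the commented-out fetch shows the documented response shape of the UK Carbon Intensity API.

```python
GRID_API = "https://api.carbonintensity.org.uk/intensity"  # UK national data


def choose_variant(intensity_g_per_kwh: float, threshold: float = 300) -> str:
    """Return which page variant to serve for a given grid intensity.

    The 300 g/kWh threshold is a hypothetical example, not a standard.
    """
    return "lite" if intensity_g_per_kwh >= threshold else "full"


# Fetching the live number (requires network access):
#
#   import json, urllib.request
#   with urllib.request.urlopen(GRID_API) as resp:
#       data = json.load(resp)
#   intensity = data["data"][0]["intensity"]["actual"]
#   variant = choose_variant(intensity)

print(choose_variant(480))  # fossil-heavy grid -> "lite"
print(choose_variant(90))   # windy afternoon  -> "full"
```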
On paper, this is demand-responsive computing applied to the web. In practice, it's a conference talk masquerading as climate action.
The numbers don't work
Let's do the maths that the grid-aware advocates consistently avoid.
A typical web page transfers around 2.5 MB. A "degraded" version might transfer 800 KB. That's a saving of roughly 1.7 MB per page view.
Using the IEA's energy intensity figures for data transfer — around 0.06 kWh per GB for the full delivery chain — that 1.7 MB saving equates to approximately 0.0001 kWh per page view.
On a dirty grid (say 500 gCO₂/kWh), that's 0.05 grams of CO₂ saved per page view. Five hundredths of a gram.
A site with 10,000 monthly page views, degrading its experience half the time, would save roughly 250 grams of CO₂ per month. That's the equivalent of driving a typical petrol car about a mile.
But here's the part nobody mentions: the API call to check grid intensity in the first place has its own energy cost. The DNS lookup, the TLS handshake, the response parsing, the conditional logic. For low-traffic sites, the overhead of checking whether to optimise may well exceed the savings from optimising.
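Spelled out, using the article's own figures (1.7 MB saved, 0.06 kWh/GB, 500 gCO₂/kWh, 10,000 views with the grid dirty half the time):

```python
# Reproducing the back-of-envelope numbers above.
SAVED_BYTES = 1.7e6   # 2.5 MB full page minus 800 KB degraded page
KWH_PER_GB = 0.06     # IEA-style intensity for the full delivery chain
DIRTY_GRID = 500      # gCO2/kWh on a fossil-heavy grid

kwh_per_view = SAVED_BYTES / 1e9 * KWH_PER_GB   # ~0.0001 kWh
grams_per_view = kwh_per_view * DIRTY_GRID      # ~0.05 g

views_per_month = 10_000
degraded_fraction = 0.5   # grid dirty half the time
grams_per_month = grams_per_view * views_per_month * degraded_fraction

print(round(grams_per_view, 3))   # ~0.05 g per view
print(round(grams_per_month))     # ~250 g per month
```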
It solves the wrong problem
Demand-shifting works brilliantly for flexible, energy-intensive workloads. Running your CI pipeline at 2am when the wind is blowing. Scheduling batch processing for low-carbon windows. Training ML models when solar generation peaks. These are genuine, measurable interventions.
A website request is not a flexible workload. When someone clicks a link, they want the page now. You can't say "come back in four hours when the wind picks up." The demand is immediate and non-negotiable.
Grid-aware advocates would argue that you're not shifting demand — you're reducing it during peak carbon periods. But the reduction is so microscopically small that it exists only as a rounding error in any real emissions accounting. Meanwhile, you've given your visitor a degraded experience they didn't ask for, punishing them for the state of the national grid.
So why build it?
This is where it gets interesting. The Green Web Foundation maintains the Sustainable Web Design Model (SWD), which powers CO2.js and, by extension, virtually every website carbon calculator you've ever seen — including Website Carbon Calculator, Ecograder, and dozens of others.
The SWD model has a fundamental problem. It estimates website carbon emissions using a single global average grid intensity: 494 gCO₂/kWh. That's it. One number for the entire planet.
A website hosted in France (56 gCO₂/kWh nuclear grid) gets the same grid intensity applied as one hosted in Poland (680 gCO₂/kWh coal grid). The model is wrong by a factor of 12 before it even starts calculating.
It gets worse. The SWD model treats the entire delivery chain — data centre, network, end-user device — as a single energy calculation scaled by page weight. It doesn't measure actual device energy consumption. It doesn't account for execution time, JavaScript parse cost, or rendering complexity. A 2 MB page of compressed images and a 2 MB page of unoptimised JavaScript get identical scores, despite wildly different real-world energy consumption.
The model also applies a static 75/25 split between first-time and returning visitors, assumes all caching behaves identically, and makes no distinction between a static HTML page and a single-page application that executes megabytes of client-side code on every navigation.
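The structural flaw is easy to demonstrate. This sketch of a bytes-in, carbon-out calculation uses illustrative constants, not the SWD model's actual coefficients, but the shape of the problem is the same: hosting location never enters the formula, so any two sites with the same page weight get the same score.

```python
GLOBAL_AVG = 494   # a single worldwide gCO2/kWh, as the article describes
KWH_PER_GB = 0.06  # illustrative delivery-chain energy intensity


def single_average_score(page_bytes: int) -> float:
    """Bytes-in, carbon-out: grid location never enters the calculation."""
    return page_bytes / 1e9 * KWH_PER_GB * GLOBAL_AVG


# A 2 MB site on France's ~56 g/kWh grid and a 2 MB site on Poland's
# ~680 g/kWh grid receive identical scores:
print(single_average_score(2_000_000))  # same number for both
```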
In short: the Sustainable Web Design Model is catastrophically wrong. Not slightly imprecise. Not "good enough for a rough estimate." Fundamentally, structurally wrong in ways that produce results divorced from physical reality.
Grid-aware as misdirection
When your core model is broken, you have two options. Fix the model, or build something new and exciting that draws attention away from the model.
Grid-aware websites are option two.
Rather than addressing the fact that their carbon calculator uses a single global average for every website on earth, the Green Web Foundation is investing in a system that dynamically responds to real-time grid data — but only at the presentation layer. The underlying calculation still uses the same flawed energy intensity constants. The same bytes-in, carbon-out formula. The same wilful ignorance of where servers actually are and what devices actually consume.
It's like putting a real-time weather display on a car with a broken speedometer. Yes, the weather data is accurate. No, it doesn't help you know how fast you're going.
What actually reduces digital emissions
If you genuinely want to reduce the carbon footprint of a website, the interventions are well-understood and none of them require checking the grid:
- Ship less JavaScript. Parse and execution costs dominate device energy consumption. A 500 KB JavaScript bundle forces the CPU to work for hundreds of milliseconds on every page load. Removing it saves real watts on real devices, on every single visit, regardless of grid mix.
- Optimise images properly. Serve modern formats (AVIF, WebP), use responsive sizing, and lazy-load below-the-fold content. This saves bytes on every request — not just when the grid is dirty.
- Choose your hosting location. Moving from a US-average grid (370 gCO₂/kWh) to France (56 gCO₂/kWh) or Sweden (8 gCO₂/kWh) reduces your data centre emissions by 80–98%. This is a one-time infrastructure decision with permanent impact.
- Use green-verified hosting. The Green Web Foundation's own hosting directory — ironically, their best contribution — lets you verify that your provider uses documented renewable energy procurement.
- Reduce third-party requests. Every analytics script, chat widget, font service, and tracking pixel adds network requests, DNS lookups, and device processing. Most of them add more carbon than your entire first-party codebase.
And here's the kicker: if your site works as an 800 KB degraded version when the grid is dirty, it works as an 800 KB version all the time. Serving the lighter page on every view saves roughly double what the toggle does — because it works 100% of the time, not just during high-carbon periods — with no API calls and no added complexity.
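Using the same illustrative figures as the earlier back-of-envelope calculation (1.7 MB saved per lightened view, 0.06 kWh/GB, 10,000 monthly views, the grid dirty half the time at 500 g/kWh and averaging roughly 494 g/kWh overall), the comparison falls out directly:

```python
KWH = 1.7e6 / 1e9 * 0.06   # energy saved per lightened page view
VIEWS = 10_000             # monthly page views

toggle = KWH * 500 * VIEWS * 0.5   # lite only when the grid is dirty
always = KWH * 494 * VIEWS         # lite on every view, at the average mix

print(round(toggle))  # ~250 g/month
print(round(always))  # ~500 g/month: roughly double, with no intensity lookups
```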
The deeper issue
The digital sustainability space has a credibility problem. Tools that produce precise-looking numbers from imprecise models give organisations false confidence that they've measured something meaningful. A website carbon badge showing "0.24g per page view" implies a level of accuracy that the underlying methodology cannot support.
Grid-aware websites compound this by adding another layer of apparent sophistication — real-time API calls, dynamic adaptation, responsive design based on energy data — on top of a foundation that doesn't hold up to scrutiny.
The question isn't whether grid-aware websites are technically interesting. They are. The question is whether they represent a meaningful climate intervention or an elaborate distraction from the fact that the industry's most widely used carbon model needs rebuilding from scratch.
We think the answer is obvious.
What we do differently
Our EcoPigs methodology uses location-specific grid intensity — live data from the UK Carbon Intensity API, Ember's country-level figures for 24 nations, EPA data for the US — applied to the actual server location, not a global average. We instrument real device energy consumption via the Chrome DevTools Protocol. We split the delivery chain into four independently calculated segments. We measure JavaScript execution cost, not just transfer size.
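To illustrate the segment-split idea (the segment names, energy shares, and intensities below are invented for the example, and are not EcoPigs' actual coefficients), the difference from a single-average model is that each part of the delivery chain gets the grid intensity of where its energy is actually drawn:

```python
# Illustrative only: per-segment energy for one page view, each paired
# with the grid intensity at the location where that energy is consumed.
segments = {
    # segment: (kWh for this page view, gCO2/kWh at that location)
    "data_centre":    (0.00002, 56),   # e.g. server hosted in France
    "core_network":   (0.00002, 494),  # crosses many grids: use an average
    "access_network": (0.00001, 230),  # visitor's local grid, e.g. UK
    "user_device":    (0.00005, 230),  # measured, not inferred from bytes
}

grams = sum(kwh * intensity for kwh, intensity in segments.values())
print(round(grams, 4))  # location-aware total for this page view
```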
None of that requires degrading your visitors' experience based on the weather. It just requires doing the measurement properly in the first place.
EcoPigs is a website carbon measurement engine built by OYNK. It powers the emissions analysis behind our PEER Audit reports, giving businesses accurate, location-aware carbon data for their digital presence.
Ready to reduce your digital waste?
Book a free consultation to discuss how OYNK can help your organisation achieve its sustainability goals.
Book a Consultation