When Google announced a 33-fold reduction in energy consumption per Gemini query in 2025, the tech world celebrated a sustainability milestone. Each text prompt now uses just 0.24 watt-hours, equivalent to watching television for nine seconds. The achievement represents genuine technical progress: optimised hardware, refined software, and improved data centre operations working in concert to slash both energy use and carbon emissions.
Yet this efficiency breakthrough may herald a larger challenge. As AI becomes cheaper and more accessible, total usage explodes, potentially overwhelming individual gains. This dynamic, where technological efficiency drives increased overall consumption rather than conservation, follows a pattern identified 160 years ago by British economist William Stanley Jevons. His observation about coal and steam engines now applies to the digital age, revealing fundamental tensions in AI's sustainability narrative.
The paradox in action
The principle operates across technologies with deceptive simplicity. "If a technology gets more efficient, Jevons paradox explains that instead of using less of the resource, you end up using more, although it gets more efficient," explains Eli Toftoy-Andersen, Digital Sustainability Officer at Sopra Steria. Her work helping organisations implement sustainable AI practices reveals the gap between technical optimisation and actual environmental impact.
The pattern manifests everywhere. When cloud storage became cheaper and more efficient, people stored more data rather than deleting files. More fuel-efficient cars enabled longer commutes and heavier vehicles. LED lighting did not reduce electricity consumption proportionally: people simply installed more lights. Now the same dynamic threatens to undermine AI's efficiency gains.
The numbers tell a stark story. Global data centres consumed approximately 415 terawatt-hours in 2024, about 1.5% of worldwide electricity. The International Energy Agency projects this will more than double to 945 TWh by 2030, equivalent to Japan's entire annual electricity demand. Data centre consumption is growing at 15% annually, more than four times faster than total electricity consumption from all other sectors. In the United States, data centres could account for 12% of national electricity demand by 2028, up from 4.4% in 2023.
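The cited figures are internally consistent: compounding the reported 15% annual growth rate over the six years from 2024 to 2030 lands close to the IEA's 945 TWh projection. A minimal sketch, using only the numbers quoted above:

```python
# Sanity check on the IEA trajectory cited in the article.
# All inputs are the article's reported figures; nothing here is new data.
BASE_TWH = 415        # global data centre consumption, 2024
GROWTH_RATE = 0.15    # ~15% annual growth
YEARS = 6             # 2024 -> 2030

projected = BASE_TWH * (1 + GROWTH_RATE) ** YEARS
print(f"Projected 2030 consumption: {projected:.0f} TWh")   # ~960 TWh
print(f"Multiple of 2024 level: {projected / BASE_TWH:.2f}x")
```

Compound growth at 15% more than doubles the base over six years, matching the "more than double to 945 TWh" projection within rounding.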
Despite Google's per-query improvements, the company's overall carbon footprint has surged 48% since 2019, driven largely by AI infrastructure expansion. This disconnect illustrates the core challenge: making AI more efficient doesn't necessarily reduce its environmental impact when deployment scales exponentially.
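The rebound mechanism is simple arithmetic: total energy is per-query energy times query volume, so a 33-fold efficiency gain is erased whenever volume grows more than 33-fold. A hypothetical sketch, where the 33x factor and 0.24 Wh figure come from the article but the query volumes are invented purely for illustration:

```python
# Hypothetical rebound illustration. The 33x efficiency gain and 0.24 Wh
# per query are the article's figures; the query volumes are assumptions
# chosen only to show the dynamic, not real traffic data.
energy_per_query_old = 0.24 * 33   # Wh, back-calculated pre-optimisation
energy_per_query_new = 0.24        # Wh, the reported 2025 figure

queries_old = 1e9                  # assumed daily volume before the gain
queries_new = 50e9                 # assumed volume after costs fell

total_old = energy_per_query_old * queries_old / 1e6   # MWh per day
total_new = energy_per_query_new * queries_new / 1e6

print(f"Before: {total_old:,.0f} MWh/day; after: {total_new:,.0f} MWh/day")
# Volume grew 50x against a 33x efficiency gain, so total energy rises.
```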
The complacency risk
Efficiency announcements carry hidden dangers. "When announcements like this are made, that might lead to people thinking, 'All right, it's more efficient now, so I don't have to do anything,'" Toftoy-Andersen cautions. The risk lies in complacency: assuming technical progress alone resolves sustainability concerns. While per-query metrics improve, aggregate impacts escalate.
The architectural shift toward AI-specific infrastructure amplifies this effect. Accelerated servers designed for AI workloads account for nearly half the projected increase in global data centre electricity consumption through 2030. These specialised systems enable more sophisticated models and applications, driving demand growth that outpaces efficiency improvements. Geographic concentration intensifies local impacts: nearly half of U.S. data centre capacity clusters in five regions, straining electrical grids. In Ireland, data centres now consume 22% of total electricity.
Toftoy-Andersen emphasises that "these kinds of announcements are not enough on their own to make the emissions from the IT sector, and from AI in particular, go down." The fundamental question isn't just how efficiently AI operates, but whether and when deployment serves genuine needs versus expanding into marginal applications enabled by lower costs.
Governance beyond optimisation
Breaking the paradox requires shifting conversations from technical capability to purpose. "It always comes down to what you use the technology for, and we need to start talking more about that: what are you using the technology for and when is it necessary to use this technology," Toftoy-Andersen argues. This is not a purely technical analysis; it demands ethical frameworks.
Sopra Steria recommends clients "establish an ethical committee for AI projects, so that it's not up to the single department or a single decision maker" to greenlight deployments. These committees need diverse perspectives to evaluate sustainability alongside other ethical dimensions, from content moderation's impact on workers to employment displacement concerns.
The sustainability mandate operates on dual tracks: "Using AI for things that are beneficial for society and humanity and does not harm it. It's about developing AI models in an environmentally friendly and energy efficient way." Neither dimension alone suffices. Efficient technology serving harmful purposes doesn't constitute progress, nor does deploying inefficient systems for beneficial applications.
Measurement complicates accountability. "One thing is the energy consumption, which you can get numbers for, but you have to learn how to do the calculations and the industry has to agree on what to include in those calculations so that you can compare apples to apples," Toftoy-Andersen explains. Without standardised methodologies, organisations cannot meaningfully evaluate competing environmental performance claims.
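The boundary problem is concrete: the same query yields different per-query figures depending on what a reporting organisation chooses to count. A minimal sketch with assumed values, using PUE (power usage effectiveness, a real industry metric) to show how the reported number shifts with the chosen boundary:

```python
# Illustrative only: all values below are assumptions, chosen to show how
# the system boundary changes a reported per-query energy figure.
it_energy_wh = 0.18        # assumed: chip and server energy per query
PUE = 1.5                  # assumed power usage effectiveness (cooling etc.)
idle_overhead = 1.10       # assumed uplift for idle/redundant capacity

narrow = it_energy_wh                        # chips and servers only
with_facility = it_energy_wh * PUE           # add cooling and power delivery
full = it_energy_wh * PUE * idle_overhead    # add idle capacity

for label, val in [("chips only", narrow),
                   ("with facility overhead", with_facility),
                   ("full operational boundary", full)]:
    print(f"{label}: {val:.3f} Wh/query")
```

Three defensible methodologies, three different headline numbers from identical operations: exactly the apples-to-oranges problem standardisation would resolve.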
The full environmental, social, and governance (ESG) framework extends beyond energy metrics. "You can talk about the impact of people who have to sit and tag harmful content. You can talk about the people losing jobs. There are so many impacts that AI can have on human beings that these things have to be accounted for." Supply chain transparency matters: examining manufacturing conditions, rare earth extraction, and electronic waste disposal alongside operational energy consumption.
Unexpected constraints and opportunities
Paradoxically, infrastructure limitations may support sustainability goals. Geopolitical disruptions extending delivery times and increasing component costs could make repair and longevity more economical than constant replacement. "When these happen together, it will make it more profitable to repair or to keep your old servers," Toftoy-Andersen notes. "Often sustainability is just one of several factors that go into decisions, where time and money and security and sustainability play different parts."
Digital sovereignty concerns might accelerate sustainable practices, though not primarily for environmental reasons. Organisations seeking resilient, secure systems may adopt design principles (redundancy, low-bandwidth operation, efficiency) that align with sustainability objectives. "I think it can be used as a lever to also talk more about things like digital sobriety and the techniques you can use to make AI solutions and IT solutions in general more efficient."
The complexity doesn't justify inaction. "You just need to get started. Use the numbers that you have and work from there," Toftoy-Andersen advises. Her assessment of the sector's prospects balances hope with urgency: "I'm optimistic and a bit impatient. There are a lot of smart people, a lot of people that are good at numbers, that are good at making things efficient in a lot of ways and good at making decisions. And as soon as they get their heads around how important this topic is, the sooner we can do something about it."