IT Brief New Zealand - Technology news for CIOs & IT decision-makers
Everywhen warns AI use is straining energy & water

Fri, 8th May 2026
Sean Mitchell, Publisher

Everywhen has raised concerns about the environmental sustainability of artificial intelligence, warning that current levels of use risk placing growing pressure on energy and water resources.

The concern centres on the energy demands of AI queries and the water consumption tied to the data centres that run large language models. As AI tools become a routine part of search, administration and content generation, their spread across business and personal life is increasing that burden.

Neil D'Mello, client director, south division, at Everywhen, said many users still do not understand the environmental cost of AI. He pointed to the scale of daily interactions with generative AI systems, arguing that each request carries a resource cost that mounts rapidly across billions of prompts.

"We need to be asking ourselves an important question: can we keep using AI the way we currently are without harming the planet?

"Many people aren't aware of the environmental impact that comes with a single AI query, but the truth is that if we continue to use AI at the current rate, it will put a significant strain on our natural resources," D'Mello said.

Research cited by Everywhen estimates that ChatGPT receives 2.5 billion prompts a day, with more than 600 million linked to information searches. A single AI query can use five to ten times the energy of a conventional search engine request, while more complex prompts draw more power.

Everywhen also highlighted figures suggesting one ChatGPT query uses about the same amount of electricity as leaving a lightbulb on for 20 minutes. On that basis, energy demand rises sharply when usage reaches billions of prompts a day.
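The scale implied by that comparison can be checked with a rough back-of-envelope calculation. The sketch below assumes a 10 W LED bulb (the article does not specify the bulb type) and uses the 2.5 billion daily prompts cited above; it illustrates the scaling, not a precise measurement.

```python
# Back-of-envelope check of the cited figures. The bulb wattage is an
# assumption (10 W LED); the prompt count comes from the article.
PROMPTS_PER_DAY = 2.5e9   # ChatGPT prompts per day, as cited
BULB_WATTS = 10           # assumed: a 10 W LED lightbulb
MINUTES_ON = 20           # article: one query ~ bulb on for 20 minutes

wh_per_query = BULB_WATTS * MINUTES_ON / 60        # watt-hours per query
daily_gwh = PROMPTS_PER_DAY * wh_per_query / 1e9   # gigawatt-hours per day

print(f"~{wh_per_query:.1f} Wh per query")
print(f"~{daily_gwh:.0f} GWh per day across all prompts")
```

On those assumptions, a per-query cost of a few watt-hours compounds to several gigawatt-hours a day once usage reaches billions of prompts, which is the point the comparison is making.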

Water demand

Beyond electricity use, the group drew attention to the water needed to cool hardware in data centres. Cooling systems are critical to keeping servers operating safely, but can consume large volumes of water, particularly where computing loads are intense and continuous.

According to the figures cited, generating 100 words with ChatGPT can use more than 500ml of water. At current levels of use, that would equate to almost one million baths a day.
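The article does not state how the "one million baths" figure was derived. One set of assumptions that roughly reproduces it, sketched below, is that only the roughly 600 million search-style prompts cited earlier each generate a 100-word answer, and that a full bath holds about 300 litres; both assumptions are illustrative, not from the article.

```python
# Rough reproduction of the "almost one million baths a day" figure.
# Assumptions (not stated in the article): only the ~600M search-style
# prompts generate ~100-word answers, and a full bath holds ~300 litres.
SEARCH_PROMPTS_PER_DAY = 600e6   # search-linked prompts, as cited
LITRES_PER_ANSWER = 0.5          # article: >500 ml per 100 words
LITRES_PER_BATH = 300            # assumed: a full bathtub

daily_litres = SEARCH_PROMPTS_PER_DAY * LITRES_PER_ANSWER
baths_per_day = daily_litres / LITRES_PER_BATH
print(f"~{baths_per_day / 1e6:.1f} million baths per day")
```

Different assumptions about bath size or the share of prompts producing long answers would shift the total, but the order of magnitude stays in the hundreds of millions of litres a day.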

The debate comes as AI use spreads deeper into everyday operations across the economy. Everywhen said most organisations now use AI in at least one business function, while studies suggest that by 2025, 78% of companies had adopted AI and about 89% of small businesses were using it for day-to-day tasks.

That growth has made AI a fixed part of corporate workflows, from drafting text and analysing data to automating routine processes. The wider the adoption, the harder it becomes for businesses and policymakers to separate the productivity gains from the underlying resource demands.

Regulatory push

Everywhen said AI is not sustainable in its current form, but noted that governments and organisations are working to improve energy efficiency. It pointed to measures in the European Union and the United States aimed at cutting AI's environmental impact by reducing the power and resources needed to produce comparable results.

The company also referred to the growing discussion around so-called sovereign AI, in which countries or organisations design, build, deploy and govern their own systems. That approach could give operators more control over security, data handling and energy sourcing, including choices that lower the environmental impact of computing infrastructure.

The issue has become more pressing as AI shifts from a specialist technology to a form of infrastructure that shapes how companies operate and how people interact with digital services. As a result, questions about efficiency, emissions and water use are moving from the margins of the technology debate into broader discussions about business responsibility and public policy.

D'Mello said users and organisations should think more carefully about when AI is necessary and when simpler tools may be enough. Even small changes in behaviour, he said, could reduce avoidable demand.

"Looking ahead, we can only hope that sustainability will be a priority, with more conscious choices about how and when AI is being used. Even something as simple as turning off our Google AI Overviews when carrying out traditional searches can help reduce unnecessary demand and help protect our planet," D'Mello said.