The latest AI panic is wrapped in plastic and posted as a selfie.
Ever since OpenAI’s new imaging tool dropped, people have been turning themselves into boxed dolls and action figures, and the backlash landed almost as fast as the trend.
The claim now is that these images are not just silly, but also environmentally reckless.
Then came the BBC warning that AI’s energy use is climbing fast, and the argument quickly narrowed into a familiar question: are these tools worth the cost?
The energy use is real. The problem is that the debate often stops there.
The harder question is what that energy is starting to produce, and what gets missed when the conversation collapses into memes, guilt, and power bills.
At every major event I’ve covered this spring, from Sustainability Week and the Energy Transition Summit to the Business Innovation Summit and Fusion Fest, the message from engineers, researchers, and policy leads was strikingly similar.
AI is energy-hungry, yes. The bigger story is what it may help unlock if it is pointed at the right problems.
It’s already being used to optimise grids, predict failures, cut research time, and speed up parts of scientific discovery that used to move far more slowly.
So the real question is not whether AI uses power. It does. The real question is whether we are using that power for outcomes that actually matter.
That is where this debate gets more serious. 👇🏻
Burning power, building progress
A lot of the criticism is coming from people working on the same energy and climate problems AI may end up helping solve.
From the outside, it looks like a contradiction: we’re trying to build a greener world, yet we’re fuelling it with tech that chews through electricity and water.
And the numbers aren’t small. Microsoft’s reported emissions have risen by roughly a third since 2020; Google’s are up by nearly half since 2019. And around some data centres, water demand is already straining local supplies.
At the Business Innovation Summit, Tikiri Wanduragala from Lenovo didn’t sugar-coat it: “This power story… we’ve known this problem for a long, long time. But what AI has done is put it on steroids.”
Data centres are scaling 3x, 5x, even 10x, and infrastructure is struggling to keep up. Cooling systems, grid access, power redundancy… suddenly, “very boring” problems are board-level crises.
What gets missed is that the same technology adding pressure to the system may also help solve some of it.
Lenovo’s already using AI to cut waste in packaging and optimise logistics chains. Microsoft is running physics-informed AI to simulate fusion materials, replacing months of lab time with models that learn how atoms behave.
And beyond energy, DeepMind’s AlphaFold saved “a billion years of alternative research” by cracking protein folding.
Yes, AI is energy-hungry. It is also already being used in places where efficiency, waste reduction, and system optimisation matter.
AI is being judged as if it’s final, when it’s still evolving. This is still the messy, energy-intensive beginning of something far bigger.

No meltdown, just upgrades
AI is putting serious pressure on the grid. And the more we scale it, the harder that strain hits.
Philip Meier from L.E.K. Consulting laid it out at the Energy Transition Summit: data centre energy use could grow “three to six times over the next 10 years.” That’s not a rounding error, but a full-blown energy crisis in the making.
And yet, nobody in the room seemed panicked. Why? Because this is a problem with solutions.
Max Beverton-Palmer from NVIDIA reminded us that semiconductors have improved “100,000 times in energy efficiency over the last 10 years.” Not 10%. Not 2x. One hundred thousand times.
We’ve seen versions of this before. Demand jumps, then the engineering catches up.
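For a rough sense of scale, the two figures quoted above can be turned into annual rates. This is my own back-of-envelope arithmetic, assuming smooth compound growth over ten-year spans:

```python
# Back-of-envelope check on the two growth figures quoted above.
# Assumption (mine): both are 10-year spans with smooth compound growth.

def annual_rate(total_factor: float, years: int) -> float:
    """Implied compound annual growth rate for a total multiple over `years`."""
    return total_factor ** (1 / years) - 1

# Data-centre energy demand: "three to six times over the next 10 years"
low, high = annual_rate(3, 10), annual_rate(6, 10)
print(f"Demand growth: {low:.1%} to {high:.1%} per year")   # ~11.6% to ~19.6%

# Chip energy efficiency: "100,000 times ... over the last 10 years"
eff = annual_rate(100_000, 10)
print(f"Efficiency gain: {eff:.0%} per year")               # ~216%, i.e. ~3.2x per year
```

In other words, even the pessimistic demand curve grows far more slowly per year than the historical efficiency curve, which is why the room wasn’t panicking.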
Lucy Yu, CEO of the Centre for Net Zero, pointed to the real bottleneck: bureaucracy. “The challenge isn’t about the supply to power these data centres,” she said. “It’s around regulation, permitting, and speed of grid connections.”
Fix the system, and AI can actually help the grid. Smart data centres can soak up surplus solar, stabilise frequency, and even boost resilience in blackout-prone regions. Put them next to wind or nuclear and the fit starts to make a lot more sense.
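The “soak up surplus solar” idea is easy to sketch. Here is a deliberately toy illustration (hypothetical numbers and a greedy placement rule of my own; real schedulers also weigh prices, carbon intensity, and service guarantees): shift a day’s deferrable AI workload into the sunniest hours.

```python
# Toy sketch (hypothetical data, not any real scheduler): place a day's
# deferrable AI training workload into the hours with the most surplus solar.

surplus_solar_mw = [0, 0, 0, 0, 0, 2, 8, 15, 22, 28, 30, 29,
                    27, 24, 18, 10, 4, 0, 0, 0, 0, 0, 0, 0]  # hourly, made up

flexible_jobs_mwh = 60  # deferrable workload to place today (also made up)

# Greedy rule: fill the sunniest hours first, capped by the surplus available.
schedule = [0.0] * 24
for hour in sorted(range(24), key=lambda h: -surplus_solar_mw[h]):
    if flexible_jobs_mwh <= 0:
        break
    take = min(surplus_solar_mw[hour], flexible_jobs_mwh)
    schedule[hour] = take
    flexible_jobs_mwh -= take

print(sum(schedule))  # 60.0 -> all flexible load absorbed by surplus hours
```

The point of the sketch is the shape of the answer, not the numbers: flexible compute is one of the few large loads that can chase generation rather than demand it on a fixed schedule.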
Advanced nuclear players like Newcleo are already moving in. “There’s a perfect synergy between data centres and nuclear,” said Andrew Murdoch. They both want low-cost, reliable power. They both need tight grid connections. And in a world where AI workloads are only going up, pairing them makes economic and engineering sense.
As Yu put it: “Realistically, we’re unlikely to see new nuclear sooner than a 10-year time horizon.” In the meantime, renewables need to bridge the gap. Dynamic pricing, demand-side flexibility, and smarter grids could help carry the load.
We already know where the bottlenecks are. Now the question is whether we’re willing to fix them.
From memes to megawatts
The irony is that the tech criticised for guzzling power is now helping us figure out how to generate far more of it.
That’s the view from Fusion Fest, where researchers, founders and futurists made one thing clear: AI isn’t just powering memes, it’s powering breakthroughs.
Kenji Takeda from Microsoft Research didn’t hold back. “We’re really excited about the possibilities for artificial intelligence to accelerate fusion to the grid.”
From simulating plasma behaviour to spotting materials nobody’s thought of before, AI is tackling the kind of physics that used to take supercomputers and full research teams.
This is where the argument gets harder to dismiss. Shruti Rajurkar described how Microsoft’s AI systems take a list of desired material properties (strength, temperature resistance, conductivity) and generate candidate atomic structures to match, so researchers start from the properties they need rather than the materials they already know.
The jump from image generation to this is huge. We’re now asking AI to propose materials that could survive inside a fusion reactor.
ITER, the world’s largest fusion project, is already deploying AI to speed up everything from code safety to workflow fixes, even building a digital twin of the entire reactor system.
The logic is simple. Fusion requires extraordinary precision. AI spots the failure points, runs the simulations, learns from noise, and uncovers options human engineers would miss. And it does it faster than any team alive.
This matters, not only because of what fusion could power, but because of how much time it could save. Every delay pushes clean power further away. Every bottleneck in material science, in engineering, in modelling, keeps us stuck.
With AI in the loop, we’re speeding up simulations and accelerating science itself.
That does not make the energy cost irrelevant. It does show why the conversation cannot stop at the cost alone.

Missing the real AI payoff
Yes, AI is using more electricity, and yes, its footprint is growing. That part is real.
The mistake is treating that as the whole story. AI is already being used on the same energy, logistics, materials, and grid problems it is accused of making worse.
That doesn’t give it a free pass. It does change the standard we should use to judge it. The better question is whether that energy use is tied to outcomes that actually matter.
The real challenge now is coordination: better data centres, faster grid connections, smarter infrastructure, and clearer incentives around where this technology gets deployed.
If that part is handled properly, the payoff is not theoretical. We’re already starting to see it.
