Tensions Between AI and Climate Accountability

Artificial intelligence is often described as an invisible technology, but its environmental footprint is anything but.

Behind the scenes of every AI-powered product are servers that run day and night, consuming electricity and cooling water. As AI accelerates innovation across every sector, it is also accelerating energy demand at a staggering scale. Just this month, the Guardian reported that OpenAI’s new partnership with AMD will deploy hundreds of thousands of AI chips, equivalent to six gigawatts of power, or the energy needs of roughly five million US households, beginning in 2026. That tension between AI’s potential for public good and its physical cost to the planet is one we can’t ignore.

At Portable, we see this as one of the defining sustainability questions of our time. How do we use AI responsibly, when its power comes with a measurable environmental price?

A simple rule: Impact up, emissions down

In August 2025, Portable ratified its first AI and Environmental Impact Statement, setting out a clear rule for how we build and deploy AI: “The impact goes up while the emissions go down.”

That principle guides every decision we make, from choosing models to selecting cloud regions. We commit to lowering the energy and water footprint of the AI solutions under our control, or, where that’s not yet possible, selecting the lowest-impact option. Our goal is to halve our Scope 2 and 3 emissions by 2025 and reach net zero by 2030.

As a certified B Corp, we believe business should be a force for good. The same must hold true for AI. It’s a powerful tool with enormous public-benefit potential, but it also carries a physical footprint, from the megawatts that train large language models to the litres of water that cool data centres. Stewardship, for us, is non-negotiable.

We draw inspiration from Microsoft’s pledge to be carbon-negative, Google’s push for 24/7 carbon-free energy, and Hugging Face’s model-card “nutrition labels” that disclose the CO₂ cost of training. In that spirit, we commit to transparency, measurement, and ongoing learning.

The balance

Critical engagement and practical use

Internally, our Sustainability Committee and Senior Leadership Team have spent the past year shaping what it means to be a critical friend of AI. This phrase, coined during a Slack thread that kicked off the conversation in late 2024, captures the balance we aim for: to engage critically and transparently with AI’s impacts without shying away from its potential to do good.

We use AI where it improves experiences and increases access to services, especially in justice, health, and government. But we also challenge its default use, asking hard questions about necessity, scale, and sustainability. Every AI deployment is treated as a design decision, not a default.

What responsible AI looks like in practice

To translate principles into practice, Portable commits to:

  • Match the model to the task. Use the smallest, most energy-efficient model that achieves the outcome, consulting AI energy leaderboards such as the AI Energy Score from Hugging Face.
  • Run on the cleanest power possible. Prioritise cloud regions with a ≥90% renewable energy mix while keeping data in Australia.
  • Disclose the numbers. Publish annual energy, carbon, and water-use summaries for production AI systems.
  • Push for transparency. Write to vendors like OpenAI requesting per-tenant energy and water data, and encourage clients to do the same.
  • Support remediation. Help clients estimate AI-related emissions and identify carbon- or water-positive offset options.
  • Stay ISO 42001-ready. Embed environmental factors into every AI system deployment by 2027.
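As a concrete illustration of the remediation step above, here is a minimal sketch of how an AI workload’s operational footprint might be estimated. All coefficients (PUE, grid carbon intensity, water intensity) are placeholder assumptions for illustration, not Portable’s published methodology; real figures vary by data centre and region.

```python
# Illustrative only: a back-of-the-envelope estimate of the energy,
# carbon, and water footprint of an AI workload. Every coefficient
# below is an assumed placeholder, not a published figure.

def estimate_footprint(gpu_power_kw, hours, pue=1.2,
                       grid_kgco2_per_kwh=0.7, water_l_per_kwh=1.8):
    """Estimate energy (kWh), emissions (kg CO2e) and cooling water (L).

    pue                -- data-centre Power Usage Effectiveness (assumed)
    grid_kgco2_per_kwh -- grid carbon intensity (assumed; region-specific)
    water_l_per_kwh    -- data-centre water intensity (assumed)
    """
    energy_kwh = gpu_power_kw * hours * pue  # IT load scaled by facility overheads
    return {
        "energy_kwh": energy_kwh,
        "co2e_kg": energy_kwh * grid_kgco2_per_kwh,
        "water_l": energy_kwh * water_l_per_kwh,
    }

# e.g. one 0.4 kW GPU serving inference for 720 hours (about a month)
print(estimate_footprint(0.4, 720))
```

Even a rough model like this makes the trade-offs visible: halving model size or moving to a lower-carbon region changes the inputs directly, which is the point of treating every deployment as a design decision.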

These steps are imperfect but deliberate. The science and accounting of AI’s footprint are still emerging, and provider data remains opaque. By publishing our assumptions and sharing what we learn, we aim to help set expectations for transparency across the sector.

This approach is aligned with our broader Sustainability Strategy (2023–2026), which focuses on three layers of change: individual responsibility, sustainable project practice, and operational transformation. All of this is backed by commitments to open learning, measurement, and advocacy.


Co-evolving AI and climate accountability

AI and sustainability aren’t opposing forces; they are intertwined.

The same innovation that enables better forecasting, smarter infrastructure, and faster research can also drive accountability for its own footprint.

Our goal is to help that accountability mature. We believe ethical AI and environmental responsibility must co-evolve, not as a trade-off but as a shared responsibility between creators, users, and the planet that sustains both.

Portable will continue to share what we learn publicly, from impact reports to client case studies and open-source tools, and we invite partners and peers to do the same.

The future of AI can be both useful and sustainable, but only if we make climate accountability part of its code.
