
Sustainable AI in practice: the energy trade-off nobody wants to talk about

Most organisations building AI aren't measuring its energy costs, and those that are find it harder than expected. Here's what the gap between sustainable AI in theory and real-world deployment actually looks like.

From conference room to carbon reality

Within sustainable AI, there is a growing emphasis on using artificial intelligence itself to advance sustainability goals.

I recently attended a conference hosted by Responsible AI UK (RAi UK) and UK Research and Innovation that explored how AI can help the UK reach its net-zero goals. The discussions showcased promising research initiatives: using AI to restore ecosystems by identifying the best tree species for a site based on environmental factors such as soil quality and climate, applying AI to carbon capture and storage, and improving the energy efficiency of vehicles.

While innovative ideas abound, a significant hurdle lies in bridging the gap between theoretical models and practical implementation in real-world scenarios. The paragraphs that follow offer a personal perspective on these challenges: the complexities of energy efficiency in AI development, and the trade-offs between accuracy and sustainability.

The gap between AI models and the real world

Many AI projects struggle to move from development into market or public deployment because models inherently fail to capture the full complexity of the environments they operate in.

The energy cost of complex AI, and why it’s hard to measure

From a sustainability perspective, complex models raise significant energy concerns: generating training data where none is readily available, training, and day-to-day operation all demand substantial compute. Some argue that researchers should quantify the energy costs of developing, deploying, and operating AI models – a consideration that extends to every organisation building or deploying AI systems.

Accurately measuring energy consumption, however, remains difficult, not least because most AI workloads run on cloud infrastructure. Cloud providers could, in principle, report energy consumption at the instance or virtual CPU level, but the feasibility and business case for doing so remain uncertain. Many organisations developing AI also lack any real incentive to prioritise energy efficiency in their operations.
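In the absence of provider-level reporting, teams often fall back on rough estimates: measured runtime multiplied by an assumed average power draw. The sketch below illustrates that back-of-envelope approach; the wattage figures and the example run are purely hypothetical, and real values vary widely with hardware and utilisation (tools such as CodeCarbon attempt to measure this more directly).

```python
# Rough energy estimate for a compute job: assumed power draw x runtime.
# The wattage constants below are illustrative assumptions, not measurements.
ASSUMED_GPU_WATTS = 300   # hypothetical average draw per accelerator
ASSUMED_CPU_WATTS = 65    # hypothetical host CPU draw

def estimate_energy_kwh(runtime_seconds: float, n_gpus: int = 1) -> float:
    """Estimate energy use in kWh for a run of the given duration.

    Deliberately crude: it ignores data-centre overheads such as cooling
    and networking, which a fuller estimate would fold in via a PUE
    (power usage effectiveness) multiplier.
    """
    watts = n_gpus * ASSUMED_GPU_WATTS + ASSUMED_CPU_WATTS
    return watts * runtime_seconds / 3_600_000  # watt-seconds -> kWh

# Hypothetical example: an 8-hour training run on 4 GPUs.
energy = estimate_energy_kwh(8 * 3600, n_gpus=4)  # ~10.1 kWh under these assumptions
```

Even an estimate this crude is often enough to start a conversation about whether a given training run, or a retraining schedule, is worth its energy cost.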

Efficiency vs accuracy: a trade-off that can’t be ignored

Proposals to adopt energy-efficient algorithms or less power-hungry models often overlook the profit-driven nature of most organisations and the inherent trade-off between energy efficiency and model accuracy.

Balancing energy efficiency with accuracy requires a case-by-case evaluation aligned with each business or adopter’s specific needs and goals.
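One way to make that case-by-case evaluation concrete is to treat the business requirement as an accuracy floor, then pick the least energy-hungry model that clears it. The sketch below shows this selection rule; the model names and numbers are hypothetical, standing in for measured validation accuracy and measured (or estimated) energy per thousand inferences.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    accuracy: float     # validation accuracy in [0, 1], assumed measured
    kwh_per_1k: float   # energy per 1,000 inferences, measured or estimated

def pick_model(candidates: list[Candidate], min_accuracy: float) -> Candidate:
    """Among candidates that meet the accuracy floor, choose the one
    with the lowest energy cost per 1,000 inferences."""
    viable = [c for c in candidates if c.accuracy >= min_accuracy]
    if not viable:
        raise ValueError("no candidate meets the accuracy requirement")
    return min(viable, key=lambda c: c.kwh_per_1k)

# Hypothetical candidates: all figures are illustrative only.
models = [
    Candidate("large",  accuracy=0.94, kwh_per_1k=2.1),
    Candidate("medium", accuracy=0.92, kwh_per_1k=0.8),
    Candidate("small",  accuracy=0.87, kwh_per_1k=0.2),
]
choice = pick_model(models, min_accuracy=0.90)  # selects "medium"
```

The interesting part is setting the floor: a lower accuracy requirement can cut energy use several-fold, which is exactly the trade-off each adopter has to weigh against its own needs.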

What sustainable AI actually requires

Ultimately, sustainable AI initiatives succeed when they align with organisations' operational standards and business practices. An AI system and its associated services are valuable insofar as they integrate seamlessly with an organisation's existing frameworks and operational norms.


If this has got you thinking about how your organisation handles AI, data or technology, we’d enjoy the conversation – talk to us.
