Our always-on culture of internet access, SaaS products, online gaming, social media and streaming services takes a lot of energy to power.
As data creation, access and storage expand exponentially year on year, the tech world must deliver services that are environmentally friendly yet faster, and cheaper yet more innovative, especially in energy-intensive sectors like AI and machine learning.
We generate a lot of data. In 2020, each person created an average of 1.7 MB of data every second; by 2025, people are projected to generate 463 exabytes of data each day; and in 2024 the number of emails sent is expected to stand at about 361 billion a day. As digital services move to the computational edge and the Internet of Things becomes ubiquitous, developers must contend with the vast energy requirements of developing and powering these innovations.
We also use a lot of energy consuming it all. Running the internet requires around 70 billion kilowatt-hours a year, amounting to “1.8% of total American electricity consumption…to generate 70 billion kwh you’d need power plants with a baseload capacity of 8,000 megawatts — equivalent to about 8 big nuclear reactors, or twice the output of all the nation’s solar panels”.
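That baseload figure can be checked with back-of-the-envelope arithmetic, assuming “baseload capacity” means a constant output running for the full year:

```python
# Back-of-the-envelope check of the quoted 8,000 MW baseload figure.
# Assumption: baseload capacity = constant output over a full year.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours

annual_consumption_kwh = 70e9  # 70 billion kWh per year
baseload_kw = annual_consumption_kwh / HOURS_PER_YEAR
baseload_mw = baseload_kw / 1_000

print(f"{baseload_mw:,.0f} MW")  # ≈ 7,991 MW, i.e. roughly 8,000 MW
```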
All of these services must be powered, stored and made accessible somewhere, and as more computing power is required for complex tasks such as machine learning, more energy will be needed.
As our digital world expands into machine learning territory, energy consumption skyrockets: the algorithmic data crunching needed to perform complex tasks, learn from the results and improve on them takes power plants’ worth of energy.
Consider this story about an AI-powered robotic hand tasked with solving a Rubik’s Cube: powering the machine learning behind it took 2.8 gigawatt-hours of electricity, equivalent to the power output of three nuclear power plants running for an hour.
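The comparison works because energy divided by time gives average power. A minimal sketch, assuming a typical large reactor outputs on the order of 1 GW:

```python
# Sanity check on the "three nuclear plants for an hour" comparison.
# The ~1 GW reactor output is an illustrative assumption.

energy_gwh = 2.8   # electricity used for the machine learning run
duration_h = 1.0   # comparison window: one hour
reactor_gw = 1.0   # assumed output of one large reactor

average_power_gw = energy_gwh / duration_h
reactors = average_power_gw / reactor_gw
print(f"{average_power_gw} GW, i.e. about {reactors:.0f} reactors for an hour")
```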
Natural language processing is also incredibly energy intensive, with researchers finding that “training a single large NLP model may consume as much energy as a car over its entire lifetime—including the energy needed to build it”.
The vast amount of energy needed to power AI GPUs, processors and machine learning workloads, and the scope, time and resources needed to crunch the data, must be sustainable, affordable and scalable. In many cases it is the training of neural language networks, being energy and time intensive, that brings sustainability into question, rather than the distribution of their learning.
Websites such as ML CO2 Impact help companies measure their machine learning carbon footprint, but what else can people, companies and governments do to mitigate the damage done by unsustainable practices, and can consumers leverage their buying power to improve energy-draining systems?
- Green AI training: the ability of developers within AI to create training systems and learning modules on one chip, dramatically reducing chip-to-chip communication, therefore lowering energy use.
- Sustainable hardware: the data centres powering AI learning can be part of the fix by switching to renewable power sources. Better integration between data centre and grid, with bespoke energy access plans for peak and fallow periods, would also make the supply and demand of energy more efficient.
- Hybrid AI systems: deep learning systems that combine neural networks with rules-based learning use significantly less energy.
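The kind of estimate a footprint calculator like ML CO2 Impact produces can be sketched in a few lines. Every parameter value below (GPU count, power draw, PUE, grid carbon intensity) is an illustrative assumption, not measured data:

```python
# A minimal sketch of a training-run carbon estimate, in the spirit of
# tools like ML CO2 Impact. All inputs below are hypothetical.

def training_emissions_kg(gpu_count: int,
                          gpu_power_w: float,
                          hours: float,
                          pue: float,
                          grid_gco2_per_kwh: float) -> float:
    """Estimate CO2-equivalent emissions (kg) for one training run.

    energy (kWh) = GPUs * per-GPU draw (kW) * hours * data-centre overhead (PUE)
    emissions    = energy * carbon intensity of the local grid
    """
    energy_kwh = gpu_count * (gpu_power_w / 1_000) * hours * pue
    return energy_kwh * grid_gco2_per_kwh / 1_000

# Hypothetical run: 8 GPUs at 300 W for 72 h, PUE 1.5, 400 gCO2/kWh grid.
print(f"{training_emissions_kg(8, 300, 72, 1.5, 400):.0f} kg CO2e")
```

The same arithmetic shows why the mitigations above work: moving the run to a low-carbon grid shrinks the last factor, and more efficient training shrinks the hours.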
The future is inevitably digital. The systems that power these digital networks – the cloud services, data centres and number crunchers – need to exist, and to keep innovating, on a secure, renewable and sustainable footing.
We cannot afford to approach the 5th industrial revolution on a basis of environmental credit.