First, let’s discuss why AI is so energy intensive. AI systems demand a huge amount of computing power. Creating AI involves designing and calibrating models and algorithms, then training them on large volumes of data, all of which demands computing power. Once trained, the model must draw conclusions from the new data it is fed, which is another energy-intensive process in itself.
The need for computing power has risen sharply as AI has become more sophisticated. Computing power is becoming scarce as a result and is now a major bottleneck for the further development and use of AI. Indeed, the UK’s national AI strategy, published in 2021, recognised that computing capacity must be increased if the potential of AI is to be realised.
Typically, the more sophisticated the AI, the more energy intensive it is. This has significant implications for the UK.
How much energy does the AI rollout need?
Data centres (facilities that store, process and distribute data) are a significant and growing consumer of electricity. From training complex AI models, which requires immense computational power and data storage, to running data through trained AI models to make predictions or solve tasks, data centres are central to every stage of AI’s use and development.
According to estimates by the International Energy Agency, data centres globally account for approximately 1%-1.3% of total electricity consumption. One recent observation suggests that developing the most sophisticated AI systems currently requires the amount of computing power to quadruple every year. The total amount of data required for AI training is also growing by 2.5 times a year, increasing reliance on data centres.
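To put those growth rates in perspective, here is a back-of-envelope illustration that simply assumes both rates hold steady. Compounding them over three years gives

$$4^{3} = 64 \qquad \text{and} \qquad 2.5^{3} \approx 15.6,$$

that is, roughly 64 times today’s computing power and around 16 times today’s training data.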
In the UK, AI and related infrastructure consumed around 3.6 terawatt-hours (TWh) of electricity in 2020. If this consumption increases twentyfold, as per the government’s target, it could reach 72 TWh by 2030. This would represent over one-quarter of the UK’s total electricity consumption in 2021, which was approximately 261 TWh.
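As a rough check on that claim, using the figures quoted above:

$$3.6\ \text{TWh} \times 20 = 72\ \text{TWh}, \qquad \frac{72\ \text{TWh}}{261\ \text{TWh}} \approx 27.6\%,$$

which is indeed just over one-quarter of the UK’s 2021 electricity consumption.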
The rapid growth in AI computing requires careful planning. However, data centres are only part of the equation. The devices and services that use AI, such as sensors in smart homes, gas and electricity meters, routers, wifi hubs, streaming devices and social media platforms, could add significant energy demand that is difficult to estimate.
These additional components of AI’s total energy consumption are often overlooked.
Continues at…
For the full article by Professors Tom Jackson and Ian Hodgkinson, visit The Conversation.