COMPUTING
THE ENERGY IMPACT IN THE AI ERA
When someone says cryptocurrency, people pay attention. It sparks curiosity. With all the groundbreaking crypto products emerging, there’s little doubt that cryptocurrency is making a splash in many ways. But did you know that the process of mining crypto, which takes place in data centers, accounts for nearly half a percent of annual global electricity demand?
Maybe that still doesn’t sound daunting, so let’s unpack it further. We’re talking about the energy consumed in mining (not merely transacting) Bitcoin, which by some accounts amounts to 0.4% to 0.55% of global electricity consumption. That happens to be about half of all global data center energy consumption, equivalent to roughly 110 terawatt-hours (TWh) per year.
Intrigued yet? That may sound like a trivially small percentage, but based on our back-of-the-napkin calculations, we estimate that 110 TWh is roughly the energy required to power the 2022 Beijing Winter Olympic venues 690 times over. And even though the Beijing Olympics are considered the first “green” Olympic Games, who’d want the mining hardware behind Bitcoin to continue having that scale of energy impact, right?
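For the curious, here is a minimal Python sketch of that back-of-the-napkin arithmetic. The ~160 GWh figure assumed for the Beijing venues’ electricity use is an illustrative placeholder chosen so the math reproduces the “690 times over” estimate, not a published number, and the implied global totals follow directly from the 0.4%-0.55% range above.

# Back-of-the-napkin sketch; every input below is a rough assumption.
bitcoin_mining_twh = 110          # estimated annual Bitcoin mining electricity use (TWh)

# If 110 TWh is 0.4%-0.55% of global electricity demand, the implied global total is:
implied_global_low_twh = bitcoin_mining_twh / 0.0055
implied_global_high_twh = bitcoin_mining_twh / 0.004

# Hypothetical electricity use of the 2022 Beijing Winter Olympic venues (~160 GWh assumed).
olympic_venues_twh = 0.16

times_over = bitcoin_mining_twh / olympic_venues_twh

print(f"Implied global demand: {implied_global_low_twh:,.0f}-{implied_global_high_twh:,.0f} TWh/year")
print(f"110 TWh = powering the Olympic venues roughly {times_over:,.0f} times over")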
But the energy it takes to mine Bitcoin - while still relevant - isn’t the point we’re trying to get to here. Frankly, crypto energy consumption is somewhat yesterday’s news.
Enter AI into the mix and, put simply, computing in this new AI era is using more energy than ever before.
Considering the scale and unprecedented speed at which AI is being adopted, its impact, combined with crypto’s, will undoubtedly push energy use beyond desired and optimal levels.
We at OwlVoices are simply left pondering: are people talking sufficiently about the implications of AI-era computing for energy efficiency? Are the decision-makers assessing these technological advances and deploying ever-greater AI workflows also considering the energy-consumption implications at length?
This is a worthy topic. As a matter of fact, according to Bloomberg, training a single AI model consumes enough electricity to power 100 US homes for a year.
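As a quick sanity check on that comparison, here is a minimal sketch. The ~1,300 MWh figure for one large training run and the ~10.5 MWh-per-year figure for an average US household are ballpark assumptions on our part, not Bloomberg’s exact inputs.

# Sanity-checking the "100 US homes for a year" comparison (assumed ballpark inputs).
training_run_mwh = 1300        # assumed electricity for one large AI training run (MWh)
us_home_mwh_per_year = 10.5    # assumed average annual US household electricity use (MWh)

homes_for_a_year = training_run_mwh / us_home_mwh_per_year
print(f"One training run = roughly {homes_for_a_year:.0f} US homes powered for a year")

That lands at roughly 120 homes, in the same ballpark as the figure cited above.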
Thus, regardless of who the purveyor of AI is, and whether or not they are dutifully conscious of the energy implications, know that the resulting data center energy use is highly likely to increase in the immediate future - and that this may affect your bottom line too.
With data centers already consuming about 1.0-1.3% of total global electricity use (based on 2022 estimates per the IEA), there’s no doubt that computing in this AI era will have a significant impact on the collective demand for electricity.
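To put that share into absolute terms, here is a small sketch; the ~26,000 TWh figure we use for 2022 global electricity consumption is an approximate assumption.

# Converting the data center share of global electricity into absolute terms.
global_electricity_twh_2022 = 26_000                         # assumed approximate global electricity use in 2022 (TWh)
datacenter_share_low, datacenter_share_high = 0.010, 0.013   # the 1.0%-1.3% range cited above

low_twh = datacenter_share_low * global_electricity_twh_2022
high_twh = datacenter_share_high * global_electricity_twh_2022
print(f"Data centers: roughly {low_twh:.0f}-{high_twh:.0f} TWh per year")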
And this just in ... we’ve learned that a new generation of AI devices is emerging to enable more efficient smart homes, empowering customers to optimize their homes in ways that deliver greater energy efficiency. In the same vein, data center operators running advanced central processing units, graphics processing units (and perhaps even quantum computers in the foreseeable future) need similar know-how, along with the means to properly monitor and optimize their data center energy consumption, which is expected to increase by 12 percent by 2030. With energy already a significant driver of cost in data center operations, energy efficiency - and the cost savings it can yield - will remain a top priority for many, as the rough sketch below illustrates.
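Here is an illustrative look at what that 12 percent rise could mean for a single operator’s power bill. The facility size, the electricity price, and the application of the fleet-wide growth figure to one site are all assumptions for illustration only.

# Illustrative cost impact of a 12% rise in data center energy consumption.
annual_consumption_mwh = 50_000   # assumed annual energy use of a mid-sized facility (MWh)
price_per_mwh_usd = 100           # assumed average electricity price (USD per MWh)
growth_by_2030 = 0.12             # the 12 percent increase cited above

cost_today_usd = annual_consumption_mwh * price_per_mwh_usd
cost_2030_usd = cost_today_usd * (1 + growth_by_2030)
print(f"Energy cost today: ${cost_today_usd:,.0f} per year")
print(f"Energy cost 2030:  ${cost_2030_usd:,.0f} per year (+${cost_2030_usd - cost_today_usd:,.0f})")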
We at OwlVoices seek to engage with you about just this, because exploring the energy implications of all computing - from the classical to the quantum, to the exact technologies that enable data centers to run AI - matters. Becoming more cost-, operationally, and energy-efficient in this AI era is a conversation worth having. So let’s make it a point to continue the dialogue and to explore ways in which we can collectively employ AI while also addressing the need for sustainable and environmentally friendly technology.