COMPUTING
AN ABUNDANCE OF ENERGY SURROUNDING AI
AI is here to stay, and yes, to generate; new use cases for generative AI are emerging every day. Still, it's the AI we can't quite see - the AI deeply ingrained in our technology, working behind the scenes - whose impact hasn't yet been fully realized and is, quite frankly, hard to fathom.
Nevertheless, it's safe to say there's an energizing abundance surrounding the application of AI these days. And it's the energy impact of AI, in general, that we might all want to keep a closer eye on.
Have you considered the application of AI in energy management? As you might have imagined, AI is already capable of governing energy applications - and likely the energy consumed by those applications too. In our recent post, we invited you to explore the ways we can collectively employ AI, and we discussed the need for sustainable and environmentally friendly technology.
We’re looking through a set of binoculars to better gauge this one very important aspect of AI - investigating the extent to which people are talking about it, and the ways AI’s energy impact can be better managed.
In a post about coupling AI and energy, the IEA describes potential use cases for AI across power systems, which may be summarized as follows:
Improved forecasting of energy supply and demand - leading to improved energy reliability (see the sketch just after this list)
Predictive maintenance - including continuous monitoring of utility-scale energy assets
Managing and controlling grids - strengthening the flow of power at the distribution level
Facilitating demand response - forecasting electricity pricing, and improving dynamic pricing mechanisms
Expanded consumer services - which may offer improved visibility into the cost of electricity for the end-use customer.
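To make the first item on that list a little more concrete, here is a minimal, purely illustrative sketch of short-term demand forecasting. It is not drawn from the IEA post: the data is synthetic, and the linear model and the demand/temperature features are assumptions chosen for readability; a real utility would rely on metered load, weather feeds, and far richer models.

```python
# Toy demand-forecasting sketch (illustrative only; all data is synthetic).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)

# Synthetic hourly history: 60 days of demand (MW) driven by time of day and temperature (C)
hours = np.arange(24 * 60)
temp = 20 + 8 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)
demand = 500 + 12 * temp + 80 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 20, hours.size)

# Features for hour t: demand at t-1, demand at t-24 (same hour yesterday), temperature at t
X = np.column_stack([demand[23:-1], demand[:-24], temp[24:]])
y = demand[24:]

model = LinearRegression().fit(X[:-24], y[:-24])   # train on everything but the last day
forecast = model.predict(X[-24:])                  # forecast the held-out day

mae = np.abs(forecast - y[-24:]).mean()
print(f"Mean absolute error over the held-out day: {mae:.1f} MW")
```

Even a toy model like this hints at why utilities care about better forecasts: every megawatt-hour predicted accurately is one less that has to be covered at short notice by expensive, often carbon-intensive peaking generation.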
Certainly, many of these applications of AI in energy and utilities hold promise. But beyond consumer services, we believe utilities employing AI today can use those same capabilities to help us better understand - and stay tethered to - the applications that make AI’s energy impact a little more tangible. We might also benefit from further exploring how end-user energy value streams are accessed, and how they contribute to the collective energy efficiency effort.
One perspective holds that a single query to a large language model (a type of AI program) may consume as much energy as leaving an LED light bulb running for an hour. That might sound nominal, but when you consider just the number of generative AI-enabled technologies that have surfaced this past year, the presumed scale of the energy impact (or the equivalent aggregate LED usage) would be immense.
We know this to be so because a recent IEEE post indicates that AI programs are “on track to annually consume as much electricity as the entire country of Ireland (29.3 terawatt-hours per year)” in aggregate. Said more simply: between the millions of servers running these large language models and the hours they run each year, powering AI models takes a lot of electricity, as also noted in Scientific American.
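For a sense of scale, here is some back-of-the-envelope arithmetic tying those two comparisons together. The 10-watt LED and the one-bulb-hour-per-query equivalence are our own illustrative assumptions; only the 29.3 TWh-per-year figure comes from the IEEE post quoted above.

```python
# Back-of-the-envelope scaling of the comparisons above.
# The LED wattage and the one-bulb-hour-per-query equivalence are assumptions;
# only the 29.3 TWh/year aggregate comes from the cited IEEE post.
LED_WATTS = 10                          # a typical household LED bulb (assumption)
WH_PER_QUERY = LED_WATTS * 1            # ~one bulb-hour per query (assumption)
IRELAND_TWH_PER_YEAR = 29.3             # aggregate estimate cited by IEEE

ireland_wh = IRELAND_TWH_PER_YEAR * 1e12            # TWh -> Wh
queries_per_year = ireland_wh / WH_PER_QUERY
print(f"~{queries_per_year:.2e} bulb-hour-sized queries per year")
print(f"~{queries_per_year / 365 / 86400:.0f} such queries every second")
```

Under those assumptions, the aggregate works out to roughly three trillion bulb-hour-sized queries a year - on the order of ninety thousand every second.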
While it’s apparent that AI will have a significant energy impact, it’s also true that AI can be applied to address the energy challenges we’ll eventually face. Fortunately, some are already developing solutions that deliver immediate benefits - made possible through practical and meaningful applications of AI-enabled technologies.
Take, for instance, emerging AI tools that let utility customers get quick, personalized recommendations from their utilities, including suggestions on how to become more energy efficient. We’ve also learned about AI-enabled energy demand forecasting applications that will surely help owners and operators of commercial buildings anticipate and reduce their energy costs. There are even efforts to leverage AI to improve renewable energy integration and deliver carbon-free energy supply portfolios.
Others are turning to nano-electronics and devices that deliver incremental reductions in energy consumption, as evidenced by the work of a nanotechnology expert at Northwestern’s McCormick School of Engineering who has created an AI-enabled device “so energy efficient that it can be deployed directly in wearable electronics for real-time detection and data processing, enabling more rapid intervention for health emergencies.”
As you reflect on these insights, we urge you to think beyond the energy impact of these specific technologies; together we can explore nanotechnology, and much more. In subsequent posts we might dive into where the nano meets the quantum, or glance at the consumption of the world’s supercomputers. And while we keep our eye on the kilowatts consumed by “energy-hungry servers,” as some say, we think it’s worth a trip to where others are seeking new ways to improve the computational efficiency of quantum calculations - because this is an area that will undoubtedly continue to evolve during the AI era.
We at OwlVoices are simply using our talons to help peel this AI and energy onion for you. As we march ahead on this journey exploring the energy implications of AI, we remind you once more that becoming more cost-, operationally, and energy-efficient in this AI era is always worthy of conversation.
We’ll have to make one exception, of course ... we won’t stop expending our own energy keeping you engaged with this topic. The promise of more energy-efficient, sustainable, and environmentally friendly AI-enabled technology matters - and so does the chatter - but for us, what matters most is addressing your need for a heightened level of awareness about what’s essential in energy.
For related articles in our Essential Energy series see here.