AI’s Energy Usage: The Cost of Convenience

AI is all around

We are currently living in the time of an artificial intelligence (AI) boom. From asking ChatGPT everyday questions and using Grammarly to polish essays, to fighting loneliness on C.AI by chatting with fictional characters you wish were real, AI has seemingly grown to cover every aspect of our lives. We are sure you have noticed how it has become near impossible to avoid the “try our AI chatbot!” pop-ups on every other website you visit, or to scroll through TikTok or Instagram without stumbling across one of those AI-generated videos of Biden streaming Minecraft gameplay while making friendly banter with Trump. And yet, despite how integral AI is in our daily lives, most of us barely understand what it is, let alone what this spike in AI usage entails.

Why the sudden boom? 

In recent years, AI—particularly generative AI—has surged in popularity among the masses, partly due to its convenience and sheer accessibility. Just type “AI” into Google’s search bar and you will be met with at least a handful of sponsored AI websites before you even scroll. The human race has always been innovative in finding ways to make daily life easier and to lessen our loads, both physical and mental. That, paired with the capitalist interests of various industries, has led to the development and rise of AI.

Generative AI first hit the mainstream in the form of news reports: headlines about students using tools such as ChatGPT to produce paragraph upon paragraph of text from a single essay prompt. This was seen as detrimental to the organic development of students’ abilities as well as to academic integrity, and learning institutions worldwide enacted swift bans on AI-generated homework. It was only natural, then, that industries and companies hoping to reduce their reliance on manpower began using AI in roles once filled by human workers. For example, Malaysia’s Ministry of Science, Technology and Innovation has faced public criticism for using AI-generated images in its posters; netizens argue that relying on AI-generated art deprives graphic designers of work and could eventually render them irrelevant.

The energy usage of AI

Issues such as over-reliance and job replacement are among the widely discussed concerns over our increasing AI usage. One of the lesser-discussed yet highly impactful costs of AI is its energy usage. Many sources have examined the electricity consumption of AI, some estimating that a simple ChatGPT query requires around 10 times as much electricity as a typical Google search, and that generating a single AI image can require as much energy as fully charging a smartphone.
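
As a rough back-of-envelope sketch, here is what that difference means at scale. The per-query figures and the query volume below are commonly cited assumptions consistent with the “10 times” claim above, not measurements from this article:

    # Back-of-envelope only: per-query energy figures are assumptions consistent
    # with the "10 times as much electricity" claim above, not measured values.
    GOOGLE_SEARCH_WH = 0.3        # assumed energy per traditional web search (Wh)
    CHATGPT_QUERY_WH = 3.0        # assumed energy per generative AI query (Wh)
    queries_per_day = 10_000_000  # hypothetical daily query volume

    extra_wh = (CHATGPT_QUERY_WH - GOOGLE_SEARCH_WH) * queries_per_day
    print(f"Extra energy per day: {extra_wh / 1_000_000:.1f} MWh")
    # With these assumptions: 27.0 MWh of extra electricity every single day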

This raises the question: why does AI require so much energy? Generative AI, in particular, requires a large amount of energy for its training, inference (applying what the model has learnt to produce answers to queries), and deployment (running the model at scale for users). According to Forbes, “AI is one of the most energy-intensive modern IT undertakings.” Training AI uses a huge amount of electricity. GPT-4, a generative AI model created by OpenAI, required over 50 gigawatt hours (GWh) of electricity to train. This is approximately 0.02% of the electricity California generates in a year. To put this into perspective, let us draw a comparison by doing some calculations.
According to a study published in the Journal of Engineering, Science and Technology (JESTEC) by Taylor’s University’s School of Engineering, the average electricity consumption across 384 households was about 710.1 kilowatt hours (kWh), or roughly 0.00071 GWh, per month.

There are two caveats to this study:

  1. It was conducted in 2017, so household electricity consumption has likely risen since then as homes adopt more, and more power-hungry, appliances.
  2. Only households in Kajang and Putrajaya were surveyed; households in higher-density areas such as Kuala Lumpur would likely record higher consumption.

Let’s do the maths. 50 GWh ÷ 0.00071 GWh per month ≈ 70,422 months. That means the electricity used to train GPT-4 could have powered the average Kajang/Putrajaya household for roughly 5,868 years. Alternatively, it could have powered 70,422 such households for an entire month.
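
For those who want to check the numbers, the comparison can be reproduced with a few lines of Python (a minimal sketch using only the figures quoted in this article):

    # Reproducing the household comparison, using only figures quoted in the article.
    gpt4_training_gwh = 50             # estimated electricity to train GPT-4 (GWh)
    household_gwh_per_month = 0.00071  # average Kajang/Putrajaya household (JESTEC, 2017)

    household_months = gpt4_training_gwh / household_gwh_per_month
    print(f"{household_months:,.1f} household-months")              # ~70,422.5
    print(f"{household_months / 12:,.1f} years for one household")  # ~5,868.5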

But it is not the training process alone that requires so much energy. Storage also forms a key part of this process: storing data on the internet (e.g. cloud storage, Google Drive) consumes significantly more energy than storing it locally (e.g. on a thumb drive). To hold such vast amounts of data, cloud data centres are needed. Most major cloud service providers, such as Google and Microsoft, run their own data centres, each filled with row upon row of servers.

Surely you have noticed how your phone heats up while charging, or had to lift your laptop off your lap because the bottom was getting a little too hot. Like any other electrical appliance, these servers generate heat. The difference between these servers and our computers is that the servers typically run 24/7 to support global demand for cloud services, which causes them to heat up immensely and thus require constant cooling. The most common cooling method is water circulation. As more data centres are built and existing ones grow denser, more cooling will be needed, which in turn requires more water.
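
One common industry yardstick for this overhead, not mentioned above but useful for illustration, is power usage effectiveness (PUE): the ratio of a facility’s total energy draw to the energy used by the computing hardware alone. The sketch below uses assumed PUE values purely to show how cooling inflates the total bill:

    # Illustrative only: PUE values here are assumptions, not figures from the article.
    # PUE = total facility energy / IT equipment energy (1.0 would mean zero overhead).
    it_load_mwh = 1_000  # hypothetical annual energy drawn by the servers alone (MWh)

    for label, pue in [("typical air-cooled site", 1.5), ("efficient water-cooled site", 1.1)]:
        total = it_load_mwh * pue
        overhead = total - it_load_mwh
        print(f"{label}: {total:.0f} MWh in total, {overhead:.0f} MWh of cooling and other overhead")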

Despite such estimates, it is ultimately difficult to pin down the actual energy consumption of AI deployment. Unlike a household appliance, whose consumption can be roughly estimated by knowing which socket and energy grid it is connected to, it is hard to know where in the world a conversation with ChatGPT is actually being processed. The lack of transparency from the companies deploying AI only compounds the problem.

The detrimental effects

We have discussed the immense amount of electricity that AI requires. Electricity is produced from many energy sources, both renewable and non-renewable, such as oil, gas, coal, hydroelectric power, wind, solar, and biofuels. It is reported that roughly 35.5% of global power comes from coal, which is burned at high temperatures to heat water, produce steam, and rotate turbines that generate electricity. Natural gas accounts for a further 23%. Together, coal and gas make up roughly 58.5%, meaning well over half of all electricity generated in 2023 came from non-renewable energy sources.

Non-renewable energy sources have been used by humans for millennia. However, as the world population continues to boom, energy usage is expected to skyrocket in the coming years. The International Energy Agency states, “Global electricity demand is expected to rise at a faster rate over the next 3 years, growing by an average of 3.4% annually through 2026.” AI is specifically pinpointed as a driver of this demand; as the same report notes, “Electricity consumption from data centres, artificial intelligence (AI), and the cryptocurrency sector could double by 2026.”
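
To put that growth rate into perspective, compounding 3.4% a year over three years works out to just over a 10% rise overall; a quick sketch using only the IEA figure quoted above:

    # Compounding the IEA's projected 3.4% annual growth over three years.
    annual_growth = 0.034
    years = 3
    cumulative = (1 + annual_growth) ** years - 1
    print(f"Cumulative growth over {years} years: {cumulative:.1%}")  # ~10.6%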

Of course, such growth would lead to a host of problems, chief among them global warming as carbon emissions climb. For example, sea levels are expected to rise 0.25 to 0.30 metres along the US coastline over the next 30 years. That may not seem like much, but what happens in another 100 or 500 years? Sea levels could rise far further, shrinking the land available for housing and agriculture. Paired with more heatwaves, floods, and droughts, famine becomes a real possibility.

What can be done?

Businesses should consider what end users actually want and need, and how those needs can best be fulfilled. Is AI really the best solution? Many customers have been frustrated by AI-driven, automated customer service responses, showing that AI usage is not always necessary or welcome.
Companies should also keep Corporate Social Responsibility (CSR) in mind. In terms of environmental responsibility, that means striving to manage resource usage sustainably. Some companies are building data centres on mountain sites to take advantage of the cool air. Microsoft built underwater data centres back in 2015, using the surrounding seawater to aid cooling, though it has not released further updates on the technology despite positive early results. A more recent example is the Chinese company Highlander, which deployed a data centre module 35 metres underwater. The company states that by deploying 100 such modules at the site, up to “68,000 square metres of land, 122 million kilowatt-hours of electricity and 105,000 tons of freshwater” can be saved per year.
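
Taking Highlander’s claim at face value, the savings can be broken down per module with some simple division (a sketch based only on the figures quoted above):

    # Per-module breakdown of Highlander's claimed annual savings for 100 modules.
    modules = 100
    land_m2, electricity_kwh, freshwater_tons = 68_000, 122_000_000, 105_000

    print(f"Per module and year: {land_m2 / modules:,.0f} square metres of land, "
          f"{electricity_kwh / modules:,.0f} kWh of electricity, "
          f"{freshwater_tons / modules:,.0f} tons of freshwater")
    # With the quoted figures: 680 square metres, 1,220,000 kWh, 1,050 tons per module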

Energy usage is one thing; where that energy is sourced from, whether renewables or fossil fuels, matters just as much. AI developers should be pushed to be more transparent about their energy use, and regulators should start requiring energy-use disclosures from them.

What can we do?

The first and best thing that can be done is to spread awareness. Almost everyone uses AI, yet few know of its costs. Helping others become aware of this can encourage more conscious choices, such as choosing more energy-efficient models or opting out of using AI at all. Beyond that, defaulting to a traditional web search first, reaching for AI only when it is genuinely needed, and turning off automatic AI suggestions such as those Google and Bing now append to search results are all small steps that reduce AI usage and reliance.

Of course, it is also necessary to push toward greener and more sustainable energy sources. We may feel unable to make big changes and shape the landscape of our reality, but everyone can make a difference: lobbying governments to switch to greener energy is something we can do, as is adopting more sustainable alternatives in our daily lives, such as electric cars. In doing so, we can hope to build a world in which humans live sustainably, for the sake of a better future.

Written By: Sarah Tan & Sarah Wong

Edited by: Tisyha
