According to the International Energy Agency (IEA), a single ChatGPT request requires ten times more electricity than a Google search.1
While both draw only a tiny amount of power per request, the number of users, and how total consumption scales with it, matters just as much. A single Google search takes roughly 0.0003 kWh on average, whereas a single ChatGPT query was estimated at 2.9 Wh (0.0029 kWh) in 2024.2 However, since that figure comes from an older study and the number of ChatGPT users has grown considerably, total power consumption is almost certainly higher now.
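The scaling concern above can be made concrete with a back-of-envelope calculation. The per-query energies are the figures cited in the text; the daily query volume is a hypothetical placeholder, not a measured value.

```python
# Back-of-envelope scaling of the per-query figures cited above.
# Per-query energies come from the text; the daily query count below
# is a hypothetical placeholder, not a measured value.
GOOGLE_KWH_PER_QUERY = 0.0003   # ~0.3 Wh per search (cited estimate)
CHATGPT_KWH_PER_QUERY = 0.0029  # ~2.9 Wh per query (de Vries, 2023)

def daily_energy_kwh(kwh_per_query: float, queries_per_day: float) -> float:
    """Total energy for a given daily query volume."""
    return kwh_per_query * queries_per_day

ratio = CHATGPT_KWH_PER_QUERY / GOOGLE_KWH_PER_QUERY
print(f"Per-query ratio: {ratio:.1f}x")  # ~9.7x, i.e. roughly the IEA's "ten times"

# With a hypothetical 100 million queries per day each:
print(f"Google:  {daily_energy_kwh(GOOGLE_KWH_PER_QUERY, 1e8):,.0f} kWh/day")
print(f"ChatGPT: {daily_energy_kwh(CHATGPT_KWH_PER_QUERY, 1e8):,.0f} kWh/day")
```

Even at identical (hypothetical) query volumes, the tenfold per-query difference compounds into a large absolute gap at fleet scale.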

1. (United Nations, 2025)
2. (de Vries, 2023)

What are some positives?

While not much can be said in favour of the power LLMs themselves consume, the energy sector offers a clearer upside. Because AI is good at pattern recognition and fast data analysis, it is a strong fit for grid and facility management work. It does, however, share a problem with other industries: many of the possibilities remain just that - in most cases, the technology is not yet mature enough to be a viable choice. According to Edgecom Energy, these are some of the current uses of AI3:

Current plans fall into four areas: optimization, efficiency, forecasting, and sustainability.
Optimization refers to what can be adjusted under the watchful eye of an LLM. For example, if you have variables that all change with location, manually calculating the optimal configuration for each location would take a long time, especially when a single variable changes and the entire plan must be reworked around it. AI can automate that process using existing data, and can continually monitor and restructure the plan live.
Efficiency is an umbrella term, but here it mostly refers to shifting energy between clusters based on demand: in areas where energy use is lower, part of the allocated supply can be moved to higher-use areas.
Forecasting is something we already do, but AI would let us streamline the process further and run it automatically.
Finally, sustainability. Ironically, this does not refer to AI itself, but to how companies can use AI to analyse their facility data and build sustainability plans from it. It can also simplify and speed up reporting, since most of the data would be available far sooner than through typical means.
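The "efficiency" idea described above can be illustrated with a minimal sketch: pool the unused capacity of under-loaded zones and redistribute it to over-loaded ones. The zone names, allocations, and loads here are all hypothetical illustration data, and real grid dispatch is far more involved.

```python
# Minimal sketch of the load-shifting idea described above: reallocate
# capacity from zones using less than their allocation to zones using more.
# Zone names, allocations, and loads are hypothetical illustration data.

def rebalance(allocation: dict, load: dict) -> dict:
    """Shift surplus capacity from under-used zones to over-used ones."""
    surplus = {z: allocation[z] - load[z] for z in allocation if allocation[z] > load[z]}
    deficit = {z: load[z] - allocation[z] for z in allocation if load[z] > allocation[z]}
    pool = sum(surplus.values())
    total_deficit = sum(deficit.values())
    new_alloc = dict(allocation)
    for z, s in surplus.items():
        new_alloc[z] -= s  # release unused capacity into the pool
    for z, d in deficit.items():
        # distribute the pooled surplus in proportion to each zone's need
        new_alloc[z] += pool * d / total_deficit if total_deficit else 0
    return new_alloc

alloc = {"north": 100.0, "south": 100.0, "east": 100.0}  # MW, hypothetical
load  = {"north": 60.0,  "south": 130.0, "east": 110.0}
print(rebalance(alloc, load))
```

An AI-driven version of this would additionally forecast the loads and re-run the reallocation continuously, rather than working from a fixed snapshot.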

3. (Edgecom Energy, 2024)

What are some negatives?

Energy is, once again, one of the bigger issues around AI. It ties into water usage, but the more direct cost is the number of power plants that must run to keep AI supplied - given the aforementioned figure for how much energy each prompt requires, the push to capture every available bit of energy is hardly surprising.

For graphs, see (Towler, 2025) in References

While not directly negative, there is something to be said about how megacorporations have begun planning nuclear power sites - because of AI demand, there is quite literally not enough power for data centres. In Alberta, for example, CBC reports that the energy allocated for large-scale projects has gone to two data centres, with 37 more still on a waiting list. While this does not yet cut into regular supply, the fact that only two data centres share an allocation of 1,200 MW is a concern, particularly given how much more is on the way: “The remaining 37 data centre projects currently looking to connect to Alberta’s grid are requesting a cumulative 19.4 gigawatts of power, according to [the Alberta Energy System Operator] AESO. For comparison, that’s almost 14 times more electricity than it takes to power the city of Edmonton.”4
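The scale of those quoted figures is easier to grasp with the arithmetic written out. Only the 19.4 GW, the "almost 14 times" multiple, and the 1,200 MW allocation come from the source; everything else is derived from them.

```python
# Arithmetic on the quoted AESO figures: 37 pending projects requesting
# 19.4 GW, described as almost 14x the electricity needed to power Edmonton.
pending_gw = 19.4
edmonton_multiple = 14
implied_edmonton_gw = pending_gw / edmonton_multiple
print(f"Implied Edmonton demand: ~{implied_edmonton_gw:.2f} GW")  # ~1.39 GW

# The two approved data centres share 1,200 MW (1.2 GW) between them:
approved_gw = 1.2
print(f"Pending requests are ~{pending_gw / approved_gw:.0f}x the approved share")
```

In other words, the waiting list alone is asking for roughly sixteen times the capacity already granted.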

To switch topics, nuclear fission is not an inherently bad choice for powering LLMs: it leaves very little actual waste and is only becoming more efficient (see thorium-based reactors). However, since reactors can take decades to build, test, and finally bring into operation, AI is more likely to rely on pre-existing, cheap energy such as coal and other fossil fuels, which leads to further carbon and heat emissions.5

4. (Ali, 2025)
5. (Siddik et al., 2021)