ChatGPT's Power Usage Unveiled: Myths vs Facts | TechCrunch

Published on February 12, 2025


ChatGPT, OpenAI’s chatbot platform, may not be as power-hungry as once assumed. But its appetite largely depends on how ChatGPT is being used and the AI models that are answering the queries, according to a new study.


A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited stat is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search.

Energy Usage of ChatGPT

Epoch believes that’s an overestimate. Using OpenAI’s latest default model for ChatGPT, GPT-4o, as a reference, Epoch found the average ChatGPT query consumes around 0.3 watt-hours — less than many household appliances. “The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car,” Joshua You, the data analyst at Epoch who conducted the analysis, told TechCrunch.
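To put the two per-query figures in perspective, here is a quick back-of-the-envelope calculation using the numbers cited above (the older ~3 watt-hour estimate versus Epoch's ~0.3 watt-hour estimate). The 10-queries-per-day usage pattern is an illustrative assumption, not a figure from the study:

```python
# Compare annual energy for the two per-query estimates cited above.
OLD_ESTIMATE_WH = 3.0    # commonly cited watt-hours per query
EPOCH_ESTIMATE_WH = 0.3  # Epoch AI's estimate for GPT-4o

QUERIES_PER_DAY = 10     # hypothetical daily usage (assumption)
DAYS_PER_YEAR = 365

def annual_kwh(wh_per_query: float) -> float:
    """Annual energy in kilowatt-hours for the assumed usage pattern."""
    return wh_per_query * QUERIES_PER_DAY * DAYS_PER_YEAR / 1000

print(f"Old estimate:   {annual_kwh(OLD_ESTIMATE_WH):.1f} kWh/year")
print(f"Epoch estimate: {annual_kwh(EPOCH_ESTIMATE_WH):.1f} kWh/year")
```

Under these assumptions, the older estimate works out to roughly 11 kWh per year, while Epoch's figure is closer to 1 kWh per year, which is why You compares it favorably to ordinary household appliances.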


Debate on AI Energy Usage

AI’s energy usage — and its environmental impact, broadly speaking — is the subject of contentious debate as AI companies look to rapidly expand their infrastructure footprints. Just last week, a group of over 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don’t deplete natural resources and force utilities to rely on nonrenewable sources of energy.


You told TechCrunch his analysis was spurred by what he characterized as outdated previous research. You pointed out, for example, that the author of the report that arrived at the 3 watt-hours estimate assumed OpenAI used older, less-efficient chips to run its models.

Future Implications

You said he does expect baseline ChatGPT power consumption to rise, however. “[The] AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely — handling much more tasks, and more complex tasks, than how people use ChatGPT today,” You said.

AI Industry Challenges

ChatGPT alone reaches an enormous — and expanding — number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.


OpenAI’s attention — along with the rest of the AI industry’s — is also shifting to reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. As opposed to models like GPT-4o, which respond to queries nearly instantaneously, reasoning models “think” for seconds to minutes before answering, a process that sucks up more computing — and thus power.

Conclusion

You suggested that people worried about their AI energy footprint use apps such as ChatGPT infrequently, or select models that minimize the computing necessary — to the extent that’s realistic. “You could try using smaller AI models like [OpenAI’s] GPT-4o-mini,” You said, “and sparingly use them in a way that requires processing or generating a ton of data.”