
ChatGPT Power Consumption: Is It Overestimated?

February 11, 2025

ChatGPT's Energy Consumption: A New Assessment

Recent research suggests that ChatGPT, OpenAI’s chatbot, may not be as power-hungry as previously believed. Its actual energy usage, however, depends heavily on how the chatbot is used and on the underlying AI models answering the queries.

Analyzing ChatGPT's Power Needs

A new analysis by Epoch AI, a non-profit AI research organization, set out to determine how much energy a typical ChatGPT query consumes. A frequently cited figure holds that ChatGPT requires approximately 3 watt-hours of energy to answer a single question, roughly 10 times as much as a typical Google search.

Epoch AI contests this estimation.

Using OpenAI’s most recent default model, GPT-4o, as a benchmark, Epoch found that the average ChatGPT query consumes around 0.3 watt-hours of energy, a small amount compared with what common household appliances use.
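To put the two estimates in perspective, here is a rough back-of-the-envelope calculation (the 15-queries-per-day figure is an illustrative assumption, not a number from Epoch’s analysis):

    3 Wh ÷ 10 ≈ 0.3 Wh            (the older estimate’s implied figure for one Google search)
    0.3 Wh/query × 15 queries/day = 4.5 Wh/day
    4.5 Wh/day × 365 days ≈ 1.6 kWh/year

By that arithmetic, Epoch’s estimate puts a single ChatGPT query roughly on par with the commonly cited figure for a Google search, and a day of moderate use at less energy than a 10-watt LED bulb consumes in half an hour.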

“The level of energy utilized is not substantial when compared to the energy demands of everyday appliances, home heating or cooling, or operating a vehicle,” explained Joshua You, the data analyst at Epoch who led the analysis, in an interview with TechCrunch.

The Broader Debate on AI and Energy

The energy consumption of AI – and its overall environmental impact – remains a topic of considerable discussion as AI companies continue to expand their infrastructure. Recently, over 100 organizations issued an open letter urging the AI sector and regulatory bodies to ensure that new AI data centers do not strain natural resources or necessitate reliance on non-renewable energy sources.

You told TechCrunch that his analysis was prompted by what he viewed as outdated prior research. He noted that the report behind the 3 watt-hours-per-query estimate assumed OpenAI was running its models on older, less efficient chips.

“I’ve observed considerable public discussion acknowledging AI’s potential for significant energy consumption in the future, but lacking accuracy regarding current energy usage,” You stated. “Furthermore, the widely cited 3 watt-hour estimate was based on relatively old research and appeared, through preliminary calculations, to be inflated.”

Acknowledging Limitations and Future Trends

It’s important to recognize that Epoch’s 0.3 watt-hour figure is an approximation, as OpenAI has not released the detailed data required for a precise calculation.

The analysis also doesn’t account for the additional energy required by ChatGPT features like image creation, or the processing of input data. You conceded that ChatGPT queries involving large files are likely to consume more electricity than standard questions.

However, You anticipates that the baseline power consumption of ChatGPT will increase over time.

“[As] AI becomes more sophisticated, training these systems will likely demand considerably more energy, and future AI may be employed more intensely – handling a greater volume of tasks, and more complex tasks, than current ChatGPT usage,” You explained.

Infrastructure Expansion and Power Demands

Despite recent advancements in AI efficiency, the rapid deployment of AI is projected to drive substantial expansion of power-intensive infrastructure. A Rand report suggests that AI data centers could require nearly all of California’s 2022 power capacity (68 GW) within the next two years.

By 2030, training a leading-edge AI model could demand power equivalent to that of eight nuclear reactors (8 GW), the report predicts.
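For context, the report’s headline figures relate roughly as follows (the one-gigawatt-per-reactor value is the standard approximation for a large nuclear reactor’s output, not a number taken from the Rand report itself):

    8 GW ÷ 1 GW per large reactor ≈ 8 reactors
    8 GW ÷ 68 GW ≈ 12% of California’s entire 2022 power capacity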

ChatGPT’s widespread – and growing – user base creates significant demands on its servers. OpenAI, in collaboration with investment partners, intends to invest billions of dollars in new AI data center projects in the coming years.

The Shift to Reasoning Models

OpenAI’s focus, along with that of the broader AI industry, is also shifting towards reasoning models. These models are generally more capable but require more computation to run. Unlike GPT-4o, which responds almost instantly, reasoning models “think” for seconds or minutes before answering, consuming more computing power and, therefore, more energy.

“Reasoning models will increasingly handle tasks beyond the capabilities of older models, generating more data in the process, and both of these factors necessitate more data centers,” You said.

OpenAI has begun releasing more energy-efficient reasoning models, such as o3-mini. However, it appears unlikely that these efficiency gains will fully offset the increased power demands resulting from the “thinking” process of reasoning models and the global expansion of AI usage.

Mitigating Your AI Energy Footprint

You recommends that individuals concerned about their AI energy footprint use applications like ChatGPT less frequently or, where feasible, choose models that require less computation.

“Consider utilizing smaller AI models like [OpenAI’s] GPT-4o-mini,” You suggested, “and use them sparingly in ways that avoid extensive data processing or generation.”

#ChatGPT #AI #ArtificialIntelligence #PowerConsumption #EnergyUsage #LargeLanguageModels