The Hidden Burden of AI: GPT-5’s Energy Demand Exposed

OpenAI’s GPT-5 has been celebrated for its enhanced capabilities, but it’s also shining a spotlight on a less glamorous side of AI: its massive energy consumption. With the company remaining silent on the issue, independent researchers are providing a crucial public service by benchmarking the model’s resource use. Their findings suggest that the new model’s impressive performance comes at a steep environmental cost, challenging the industry to be more transparent about its impact.
The numbers are a wake-up call for the AI community. Researchers at the University of Rhode Island’s AI lab have found that a medium-length response from GPT-5 consumes an average of 18 watt-hours. This is a substantial increase over previous models and places GPT-5 among the most energy-intensive AI models ever benchmarked. To put that in perspective, 18 watt-hours is enough to power a 60-watt incandescent light bulb for 18 minutes. Given that ChatGPT handles billions of requests daily, the total energy consumption could be enough to power 1.5 million US homes, highlighting a sustainability issue of national scale.
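The household comparison can be sanity-checked with a quick back-of-envelope calculation. The daily request volume and the average household consumption used below are assumptions chosen for illustration, not figures disclosed by OpenAI:

```python
# Back-of-envelope check of the article's figures.
# All three inputs are estimates, not official disclosures.
WH_PER_RESPONSE = 18          # URI estimate for a medium-length GPT-5 reply
REQUESTS_PER_DAY = 2.5e9      # assumed daily ChatGPT prompt volume
US_HOME_KWH_PER_DAY = 29      # ~10,600 kWh/year average US household, assumed

daily_kwh = WH_PER_RESPONSE * REQUESTS_PER_DAY / 1000   # Wh -> kWh
homes_powered = daily_kwh / US_HOME_KWH_PER_DAY

print(f"Daily energy: {daily_kwh / 1e6:.1f} GWh")
print(f"Equivalent US homes: {homes_powered / 1e6:.2f} million")
```

Under these assumptions the math lands at roughly 45 GWh per day, or on the order of 1.5 million homes, consistent with the figure the researchers cite.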
This dramatic increase in power consumption is primarily due to the model’s size. While OpenAI has not disclosed the parameter count for GPT-5, experts believe it is “several times larger than GPT-4.” This aligns with a study by the French AI company Mistral, which found a “strong correlation” between a model’s size and its energy consumption. The study concluded that a model ten times bigger would have an impact one order of magnitude larger. This suggests that the trend of building ever-larger AI models, championed by many in the industry, will continue to drive up resource usage.
GPT-5’s new features also contribute to its high energy demands. Its “reasoning mode” and its ability to process video and images require far more computation than simple text generation. A professor studying the resource footprint of AI models noted that using the reasoning mode could increase resource usage by a factor of “five to 10.” So while a “mixture-of-experts” architecture, which activates only a fraction of the model’s parameters for each query, recovers some efficiency, these newer and more complex tasks are pushing the overall energy footprint to new heights. The urgent calls for transparency from AI developers are a direct response to this growing environmental concern.
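To see what the quoted “five to 10” multiplier implies per query, here is an illustrative sketch. Treating the 18 watt-hour average as a non-reasoning baseline is an assumption made purely for illustration, not a finding from the article’s sources:

```python
# Illustrative only: apply the quoted "five to 10" reasoning-mode multiplier
# to the 18 Wh average response. Using 18 Wh as the non-reasoning baseline
# is an assumption for this sketch.
baseline_wh = 18
low_estimate = baseline_wh * 5     # lower bound of the quoted factor
high_estimate = baseline_wh * 10   # upper bound of the quoted factor

print(f"Reasoning-mode query: roughly {low_estimate}-{high_estimate} Wh")
```

On that assumption, a single reasoning-mode request would land in the 90–180 Wh range, several times the energy of the light-bulb comparison above.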

