ChatGPT may not be as power-hungry as once assumed
ChatGPT uses less energy than previously thought, a study finds: about 0.3 watt-hours per query.
A study by Epoch AI suggests that ChatGPT's energy consumption may be lower than previously thought, at approximately 0.3 watt-hours per query. That is a tenth of the widely cited 3 watt-hours figure, a reduction attributed to improvements such as OpenAI's GPT-4o running on more efficient chips.
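To put the per-query figure in perspective, a quick back-of-the-envelope calculation shows what it implies for an individual user. The usage numbers below (15 queries per day) are illustrative assumptions, not from the study:

```python
# Per-query energy estimates (watt-hours)
NEW_WH_PER_QUERY = 0.3   # Epoch AI's estimate
OLD_WH_PER_QUERY = 3.0   # the widely cited earlier figure

# Hypothetical usage: a fairly heavy user making 15 queries a day
queries_per_day = 15

daily_wh = NEW_WH_PER_QUERY * queries_per_day
yearly_kwh = daily_wh * 365 / 1000  # convert Wh to kWh

print(f"Daily: {daily_wh:.1f} Wh")
print(f"Yearly: {yearly_kwh:.2f} kWh")
print(f"Old estimate was {OLD_WH_PER_QUERY / NEW_WH_PER_QUERY:.0f}x higher")
```

Under these assumptions, a year of heavy chatting works out to under 2 kWh, which is less electricity than many household appliances use in a single day.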
The analysis does not account for additional energy costs from features like image generation. Moreover, reasoning models, which spend more time computing their answers, demand more processing power, suggesting that per-query energy use may rise as AI capabilities evolve.
Despite advances in AI efficiency, demand for power-hungry infrastructure is expected to grow, with near-term forecasts suggesting it could approach a significant share of California's power capacity. Using smaller models like GPT-4o-mini where they suffice is recommended as a way to reduce the energy footprint.