This Tool Will Tell You How Much Energy Your AI Chatbot Is Using

Thanking your AI chatbot when it responds to a query may not require a lot of energy in itself, but the cost of your interactions will add up over time, and a new tool from Hugging Face can tell you roughly how much.

ChatUI’s energy interface estimates the energy consumption required to communicate with an AI model in real time, comparing it to common appliances such as LED light bulbs and phone chargers. You can enter any query or use one of the provided inputs to generate a response along with the corresponding energy requirement.

For example, it took the AI just over 25 seconds and 0.5 watt-hours to generate a “professional email,” equivalent to 2.67% of a phone charge. A 90-second scenario for testing transcription software required 1.4 watt-hours: 7.37% of a phone charge, 22 minutes of LED light, or 0.6 seconds of microwave use. (The response to my “thank you” cost the equivalent of 0.2% of a phone charge.)
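The appliance comparisons are straightforward unit conversions from watt-hours. A minimal sketch of that arithmetic, using assumed reference figures for the appliances (the tool's own internal values may differ, so the results won't exactly match the numbers above):

```python
# Assumed reference figures -- not the tool's actual constants:
PHONE_BATTERY_WH = 19.0   # assumed: ~5,000 mAh battery at 3.8 V
LED_BULB_W = 4.0          # assumed: small LED bulb
MICROWAVE_W = 1000.0      # assumed: typical microwave

def equivalents(wh: float) -> dict:
    """Express an energy amount in everyday-appliance terms."""
    return {
        "phone_charge_pct": 100 * wh / PHONE_BATTERY_WH,
        "led_minutes": 60 * wh / LED_BULB_W,
        "microwave_seconds": 3600 * wh / MICROWAVE_W,
    }

# The 1.4 Wh transcription-test scenario:
print(equivalents(1.4))
```

With these assumptions, 1.4 Wh works out to roughly 7.4% of a phone charge and 21 minutes of LED light, in line with the tool's output; the microwave figure depends heavily on the wattage assumed.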

Note that ChatUI provides estimates rather than exact measurements. The tool can run a variety of models, including Meta’s Llama 3.3 70B and Google’s Gemma 3.

Comparing AI Energy Usage to Google Search

The International Energy Agency (IEA) estimates that a single ChatGPT query requires almost 10 times more electricity than a standard Google search: 2.9 watt-hours versus 0.2 watt-hours. If ChatGPT handled all 9 billion daily searches, it would require almost 10 terawatt-hours of additional electricity per year, equivalent to the annual electricity consumption of 1.5 million European Union residents.
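The "almost 10 terawatt-hours" figure follows directly from the per-query numbers. A quick back-of-the-envelope check, using the IEA's per-query estimates as given above:

```python
# IEA per-query estimates (watt-hours), as cited in the article:
GOOGLE_WH = 0.2
CHATGPT_WH = 2.9
SEARCHES_PER_DAY = 9e9  # 9 billion daily searches

# Extra energy if every search cost ChatGPT-level electricity:
extra_wh_per_year = SEARCHES_PER_DAY * (CHATGPT_WH - GOOGLE_WH) * 365
extra_twh = extra_wh_per_year / 1e12  # 1 TWh = 1e12 Wh

print(f"{extra_twh:.1f} TWh/year")  # ≈ 8.9 TWh, i.e. "almost 10"
```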

The environmental impact of artificial intelligence is largely driven by the power and water needs of data centers. The IEA expects global AI electricity consumption in 2026 to be ten times higher than 2023 levels, and by 2027 AI’s water demand could exceed Denmark’s entire annual water consumption.
