AI Energy & Cost Calculator

How much electricity does your AI usage actually consume? Find out the real cost.

⚡ Configure Your Usage

[Interactive calculator inputs — available on the live page; the figures below update as you adjust them.]

📈 Your AI Energy Footprint

Sample output for one configuration:

  • 0.300 kWh per session (~$0.05 per session)
  • 1.50 kWh per week
  • $0.96 per month, $12.48 per year
  • 2.60 kg CO₂ per month (US average grid: 0.4 kg CO₂/kWh)

💡 Your Monthly AI Usage Is Equivalent To...

On the live page, these cards convert your monthly usage into:

  • 📱 smartphone charges
  • 🎬 hours of Netflix
  • 🔍 Google searches
  • 🚗 miles driven (emissions eq.)
  • ☕ pots of coffee brewed
  • 💡 hours of a 60W bulb

📊 Putting It in Context

vs. Google Search

A single Google search uses about 0.0003 kWh. One AI chat query (~0.005 kWh) uses roughly 17x as much energy.

vs. Sending an Email

Sending one email uses about 0.000004 kWh. A single AI query is equivalent to sending ~1,250 emails.

Perspective

AI is more energy-intensive per query, but your monthly AI usage is still far less than running a refrigerator (~30 kWh/mo), a clothes dryer (~5 kWh/load), or driving to work.

How AI Energy Usage Is Calculated

Every time you send a prompt to an AI model, your request travels to a data center where powerful GPUs process your query. These GPUs — typically NVIDIA A100 or H100 chips — draw significant power during inference. The total energy per query depends on the model size (number of parameters), the length of your prompt and response, and the efficiency of the serving infrastructure.

For large language models like GPT-4o, Claude Sonnet, and Gemini, each query requires an estimated 0.004 to 0.006 kWh of electricity. That includes the GPU computation, memory access, cooling overhead, and networking. Image generation models like Midjourney and DALL-E consume more per request because they run many diffusion steps, each requiring full forward passes through large neural networks.

Our calculator uses a per-query energy model: we multiply the energy per query by the number of queries in your session, then scale to weekly, monthly, and yearly projections. CO₂ emissions are calculated using the US average grid carbon intensity of 0.4 kg CO₂ per kWh.
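The projection logic above can be sketched in a few lines. This is a minimal illustration, not the calculator's actual code; the function and constant names are ours, and the constants come from the article's estimates:

```python
# Minimal sketch of the per-query energy model described above.
PRICE_USD_PER_KWH = 0.16       # US average residential rate (EIA, 2024)
GRID_CO2_KG_PER_KWH = 0.4      # US average grid carbon intensity

def footprint(kwh_per_query, queries_per_session, sessions_per_week):
    """Scale one session's energy up to weekly, monthly, and yearly figures."""
    session_kwh = kwh_per_query * queries_per_session
    weekly_kwh = session_kwh * sessions_per_week
    monthly_kwh = weekly_kwh * 52 / 12   # average weeks per month
    yearly_kwh = weekly_kwh * 52
    return {
        "session_kwh": session_kwh,
        "weekly_kwh": weekly_kwh,
        "monthly_cost_usd": monthly_kwh * PRICE_USD_PER_KWH,
        "yearly_cost_usd": yearly_kwh * PRICE_USD_PER_KWH,
        "monthly_co2_kg": monthly_kwh * GRID_CO2_KG_PER_KWH,
    }

# Example: 30 GPT-4o queries per session, 5 sessions per week
result = footprint(0.005, 30, 5)
```

With those example inputs the model yields 0.15 kWh per session and roughly 1.3 kg CO₂ per month.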

Energy Estimates by Model

Model | Energy per Query/Image | Comparison to Google Search
ChatGPT (GPT-4o) | ~0.005 kWh | ~17x a Google search
Claude (Sonnet) | ~0.004 kWh | ~13x a Google search
Gemini 2.0 | ~0.005 kWh | ~17x a Google search
Grok 3 | ~0.006 kWh | ~20x a Google search
Llama (local GPU) | ~0.002 kWh | ~7x a Google search
Midjourney (per image) | ~0.025 kWh | ~83x a Google search
DALL-E (per image) | ~0.020 kWh | ~67x a Google search
Stable Diffusion (local, per image) | ~0.008 kWh | ~27x a Google search
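The comparison column follows directly from the per-query estimates; here is a quick check, with the table values collected into a dict (names are ours):

```python
# Derive the "x a Google search" multiples from the table's estimates.
GOOGLE_SEARCH_KWH = 0.0003  # ~0.3 Wh per search

MODEL_KWH = {
    "ChatGPT (GPT-4o)": 0.005,
    "Claude (Sonnet)": 0.004,
    "Gemini 2.0": 0.005,
    "Grok 3": 0.006,
    "Llama (local GPU)": 0.002,
    "Midjourney (per image)": 0.025,
    "DALL-E (per image)": 0.020,
    "Stable Diffusion (local, per image)": 0.008,
}

# Round to the nearest whole multiple, as the table does
multiples = {m: round(kwh / GOOGLE_SEARCH_KWH) for m, kwh in MODEL_KWH.items()}
```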

Sources for These Estimates

  • IEA (International Energy Agency) — 2024 report on data center energy consumption and AI workloads.
  • Goldman Sachs Research — Estimated a single ChatGPT query uses roughly 10x the energy of a Google search (~0.0003 kWh, or 0.3 Wh, for a search vs. ~2.9 Wh per AI query).
  • The Electric Power Research Institute (EPRI) — 2024 analysis of generative AI energy demands.
  • Anthropic, OpenAI, and Google — Published sustainability reports and inference efficiency disclosures.
  • Luccioni et al. (2023) — "Power Hungry Processing: Watts Driving the Cost of AI Deployment?" Published research measuring energy per inference for various model sizes.
  • US EIA — Average US residential electricity rate of ~$0.16/kWh (2024 data). Grid carbon intensity of ~0.4 kg CO₂/kWh.

AI vs. Other Technology Energy Usage

While individual AI queries consume more energy than simple web searches or emails, it is important to keep perspective. Here is how AI compares to other everyday technology:

  • A Google search: ~0.0003 kWh (0.3 Wh)
  • Sending an email: ~0.000004 kWh (tiny, mostly server overhead)
  • Streaming one hour of Netflix: ~0.08 kWh
  • Charging a smartphone: ~0.012 kWh per full charge
  • Running a refrigerator: ~1 kWh per day (~30 kWh/month)
  • One load of laundry (dryer): ~5 kWh
  • Driving one mile: ~0.24 kWh equivalent in emissions

Even heavy AI users — say, 50 coding sessions per week — might only add 15–30 kWh to their monthly electricity usage. That is roughly the same as leaving a few extra lights on. The real energy concern with AI is aggregate: billions of queries across millions of users add up to significant data center demand.
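That heavy-user figure checks out under the per-query model; the queries-per-session value here is our assumption:

```python
# 50 sessions/week at an assumed ~20 queries/session, ~0.005 kWh/query
weekly_kwh = 50 * 20 * 0.005          # 5.0 kWh per week
monthly_kwh = weekly_kwh * 52 / 12    # about 21.7 kWh per month
```

That lands at roughly 21.7 kWh per month, inside the 15–30 kWh range quoted above.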

Tips for Reducing Your AI Energy Footprint

  • Use smaller models when possible. GPT-4o Mini, Claude Haiku, and Gemini Flash use a fraction of the energy of their larger siblings. If your task does not need the most powerful model, downsize.
  • Run local models. Tools like Llama running on your own hardware can be more efficient for repetitive tasks since you avoid network overhead and can optimize for your specific GPU.
  • Be specific in your prompts. Fewer back-and-forth exchanges mean fewer queries. A well-crafted prompt that gets the right answer on the first try uses half the energy of two attempts.
  • Batch your requests. Instead of sending 10 small queries, combine them into one larger prompt when the model supports it.
  • Choose efficient image generation. Stable Diffusion running locally on an efficient GPU uses less than a third of the energy of cloud-based Midjourney per image.
  • Consider your electricity source. If you have the option, choosing renewable energy for your home or workplace offsets the carbon impact of local AI inference.
  • Turn off always-on AI assistants you are not actively using. Background AI features that constantly listen or process add up over time.