You Don’t Turn On a Microwave for an Hour Because It Wastes Energy. Don’t Do This Either.


The Hidden Energy Cost of AI-Generated Content

Artificial intelligence (AI) generated videos are rapidly gaining popularity as an alternative to self-recorded footage or hired professionals. However, much like generating text and graphics, creating short video clips with AI consumes a substantial amount of energy. Thanks to research reported by MIT, we can put these energy figures into real-world perspective.

Table of Contents

  • How Much Energy Do Mass-Generated AI Videos Cost Us?
  • The Dramatic Surge in Energy Consumption by AI Data Centers

How Much Energy Do Mass-Generated AI Videos Cost Us?

According to an analysis published by MIT Technology Review, generating a single 5-second video using one of the newer AI models consumes approximately 3.4 megajoules (MJ) of energy. To put this into perspective, the researchers compare this energy expenditure to running a microwave oven for an entire hour.

The researchers further explored the energy implications of a typical workday involving artificial intelligence. They sought to answer: what is the total energy cost of handling 15 text queries, generating 10 graphics, and creating three 5-second AI videos?

The findings reveal that this combined activity consumes over 10.4 MJ, which translates to approximately 2.9 kilowatt-hours (kWh). In simpler terms, this is equivalent to keeping a microwave oven running for 3.5 hours.
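The conversions behind these comparisons are straightforward to check. The sketch below reproduces the article's arithmetic; the microwave wattage (~0.83 kW) is an assumption, since the article does not state the power rating used for its comparison.

```python
# Back-of-the-envelope check of the article's energy figures.
MJ_PER_KWH = 3.6  # 1 kWh = 3.6 MJ by definition

def mj_to_kwh(mj: float) -> float:
    """Convert megajoules to kilowatt-hours."""
    return mj / MJ_PER_KWH

video_mj = 3.4    # one 5-second AI video, per the reported figures
workday_mj = 10.4 # 15 text queries + 10 graphics + 3 videos

print(f"One video: {mj_to_kwh(video_mj):.2f} kWh")   # ~0.94 kWh
print(f"Workday:   {mj_to_kwh(workday_mj):.2f} kWh") # ~2.89 kWh

# Assumed microwave draw of 0.83 kW gives the article's 3.5-hour figure:
microwave_kw = 0.83
print(f"Microwave hours: {mj_to_kwh(workday_mj) / microwave_kw:.1f}")
```

At ~0.94 kWh per clip, the "microwave for an hour" comparison implies a roughly 0.9 to 1 kW oven, which is typical for household models.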

It’s crucial to keep these figures in mind the next time you consider generating AI videos without a clear necessity. Conscious usage can help mitigate the environmental impact.

The Dramatic Surge in Energy Consumption by AI Data Centers

The demand for generative AI and cryptocurrency mining is contributing significantly to a dramatic increase in energy consumption. According to the U.S. Energy Information Administration (EIA), the USA is forecast to break its national energy consumption record in 2026, with a repeat expected in 2027. This surge is largely driven by the growing demand for the data centers that power these AI technologies.

Meanwhile, as reported by CNBC, electricity bills in the U.S. rose by 6.9% in 2025, outpacing the inflation rate of 2.9% at the time. Current projections indicate that American household electricity expenses will increase by an additional 6% by 2027.

Furthermore, the consulting firm Arthur D. Little warns that by 2030, a significant 3% of the world’s total energy demand will be directly attributed to the operation of artificial intelligence technologies. This highlights a growing global concern regarding the sustainability of unchecked AI expansion.

Frequently Asked Questions (FAQ)

How much energy does one AI-generated video use?

A single 5-second AI-generated video can consume approximately 3.4 megajoules (MJ) of energy, which is comparable to running a microwave for an hour.

What is the total energy cost of a typical AI workday?

Handling 15 text queries, 10 graphics, and three 5-second AI videos can consume over 10.4 MJ (2.9 kWh) of energy, equivalent to running a microwave for 3.5 hours.

How is AI impacting global energy consumption?

The demand for AI data centers is driving a dramatic increase in global electricity use, with projections suggesting that AI operations could account for 3% of the world’s total energy demand by 2030.

Source: Mashable / Reuters / Arthur D. Little / CNBC. Opening photo: Generated by Gemini.
