Large language models (LLMs) such as ChatGPT are becoming increasingly common, raising concerns about their environmental impact. While a simple Google search uses about 0.3 Wh of energy, complex AI queries can consume 10 to 100 times more.
This translates to 3 to 30 Wh per LLM interaction, roughly equivalent to running a 10-watt LED light bulb for 18 minutes to 3 hours.
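The arithmetic behind that comparison can be sketched in a few lines. This is an illustrative calculation using the figures cited above (0.3 Wh per search, a 10 to 100x multiplier, a 10-watt bulb); the function name is hypothetical.

```python
# Figures from the text above (assumptions, not measurements):
GOOGLE_SEARCH_WH = 0.3   # energy of a simple Google search, in watt-hours
LED_BULB_WATTS = 10      # power draw of the reference LED bulb

def led_runtime_minutes(energy_wh, bulb_watts=LED_BULB_WATTS):
    """Minutes a bulb of the given wattage could run on energy_wh watt-hours."""
    return energy_wh / bulb_watts * 60

low_query_wh = GOOGLE_SEARCH_WH * 10    # 3 Wh per LLM query (low estimate)
high_query_wh = GOOGLE_SEARCH_WH * 100  # 30 Wh per LLM query (high estimate)

print(led_runtime_minutes(low_query_wh))   # 18.0 minutes
print(led_runtime_minutes(high_query_wh))  # 180.0 minutes (3 hours)
```

The 18-minute and 3-hour figures in the text fall directly out of dividing the per-query energy by the bulb's wattage.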
Experts worry about cumulative energy consumption as LLMs become integrated into more applications. Because there are no clear projections of future energy use, some researchers are calling for more work on energy-efficient AI development.