Altman defends AI power use, says training people takes energy too

Sam Altman, CEO of OpenAI, has responded to claims that ChatGPT and similar tools consume excessive amounts of electricity, arguing that the discussion frequently overlooks the resources needed to support and educate people.

Speaking on February 20 at an event hosted by The Indian Express in New Delhi, on the sidelines of the India AI Impact Summit 2026, he said that comparisons between the energy required to train large AI models and the cost of a human answering a question were frequently framed unfairly.

“Training a human also requires a lot of energy,” Altman said during the Q&A.

Altman maintained that a more meaningful comparison is between the energy a human needs to provide an answer and the energy an already-trained AI model uses to respond to a prompt.

He disputed assertions, raised during the conversation, that a single ChatGPT query can consume the equivalent of several smartphone battery charges, saying actual usage is far below that amount.

He also criticized widely shared online posts about water use, claiming that public perceptions had been distorted by figures based on outdated cooling techniques.

In one account of his comments, Altman characterized viral “gallons per query” claims as false and said OpenAI no longer uses evaporative cooling in its data centers.

His remarks come as governments and communities take a keen interest in the pace at which AI infrastructure is being built and the strain it is placing on power grids.

According to the International Energy Agency, data centers consumed some 415 terawatt-hours of electricity in 2024, about 1.5% of total consumption, a figure expected to more than double by 2030 in the agency's base-case scenario.

Altman acknowledged the concerns that have been raised about overall consumption as AI adoption increases.