Brother Grok, is it true? A casual chat for you, but this simple message costs Elon Musk and the planet dearly
Did you know that even a simple “thanks” sent to an AI system such as Grok uses a measurable amount of energy? It is small for a single query, but scale changes everything, and that means real costs for tech companies and the planet.

In short
- Chatting with AI bots like Grok or ChatGPT consumes energy
- Millions of unnecessary queries add up to significant energy use
- The cost of running all this affects both tech companies and the planet
Most of us do not think twice before typing something into an AI chatbot. A random question, a casual greeting, or even a humble “thanks” may feel harmless. For example, if you look at X, where Grok 4, the chatbot made by Elon Musk’s xAI, lives, you will see thousands of people tagging the AI chatbot in everything from light-hearted to serious posts. “Grok brother, check this” is a message you will often come across on X.
But behind the curtain, every message we send to AI tools such as Grok, ChatGPT, DeepSeek, or any other chatbot uses electricity, server space, and other resources. The very real pressure this puts on energy systems is felt not only by tech companies but also by policymakers, activists, and everyone else trying to keep the planet cool amid global warming.
You see, these chatbots run in large-scale data centers that require huge amounts of energy to operate. That means even a simple, unnecessary query uses resources. And when you multiply that by millions of users doing the same thing every day, the cost starts adding up for companies and, in the grand scheme of things, for the planet.
You may be wondering what we are getting at here. Let us explain. One day in April, an X user who goes by the name Tommy asked a simple question: “I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thanks’ to their models.” It was meant as a light-hearted post, but OpenAI CEO Sam Altman replied, “Tens of millions of dollars well spent, you never know.” That answer caught people’s attention. It got them thinking: does being polite to AI actually cost millions? And if so, what does that mean for energy use and the environment?
Generative AI, whether Grok 4, ChatGPT, Gemini, or the like, uses especially large amounts of energy during a model’s training phase. But even after training, every conversation, no matter how small, requires computing power. Those polite phrases, while sweet, still count as queries, whether they are serious or not. And queries take processing power, which in turn consumes electricity. See the pattern? It is all connected.
Energy use, but how much?
AI systems are still relatively new, so accurate, concrete details about how much energy they use are still emerging. But there are some estimates.
For example, the AI tool DeepSeek estimates that a small AI response to something like “thank you” can use about 0.001 to 0.01 kWh of power. That sounds small for a single query, but scale changes everything. If one million people send such a message every day, energy use can reach 1,000 to 10,000 kWh per day. Over a year, that becomes hundreds to thousands of megawatt-hours, enough to power many homes for months. The back-of-envelope maths is sketched below.
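To make the scaling concrete, here is a minimal back-of-envelope sketch in Python. It uses DeepSeek’s estimate of 0.001 to 0.01 kWh per short reply and assumes, as the article does, one million such messages a day; the figures are illustrative, not measured.

```python
# Back-of-envelope estimate of the energy cost of "thank you" messages.
# Assumptions (illustrative only): 0.001-0.01 kWh per short AI reply,
# one million such messages per day.

KWH_PER_REPLY_LOW = 0.001   # lower bound per short reply (kWh)
KWH_PER_REPLY_HIGH = 0.01   # upper bound per short reply (kWh)
MESSAGES_PER_DAY = 1_000_000

daily_low = KWH_PER_REPLY_LOW * MESSAGES_PER_DAY    # 1,000 kWh per day
daily_high = KWH_PER_REPLY_HIGH * MESSAGES_PER_DAY  # 10,000 kWh per day

yearly_low_mwh = daily_low * 365 / 1_000    # roughly 365 MWh per year
yearly_high_mwh = daily_high * 365 / 1_000  # roughly 3,650 MWh per year

print(f"Daily: {daily_low:,.0f} to {daily_high:,.0f} kWh")
print(f"Yearly: {yearly_low_mwh:,.0f} to {yearly_high_mwh:,.0f} MWh")
```

Running it gives roughly 365 to 3,650 MWh a year, which is where the “hundreds to thousands of megawatt-hours” figure comes from.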
AI systems show similarly high energy use elsewhere. MIT Technology Review published a study in May with some figures of its own, estimating how much energy one person who actively uses AI would cause these systems to consume in a day. “You’d use about 2.9 kilowatt-hours of electricity, enough to ride more than 100 miles on an e-bike (or an average electric vehicle) or to run the microwave for more than three and a half hours,” the study said.
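As a quick sanity check on that microwave comparison, here is a small sketch assuming a typical microwave draws roughly 0.8 kW while running; the wattage is an assumption for illustration, not a figure from the study.

```python
# Sanity check: how long could 2.9 kWh run a typical microwave?
# Assumption (not from the MIT Technology Review study): a microwave
# draws roughly 0.8 kW while running.

DAILY_AI_USE_KWH = 2.9     # estimated daily energy use of a heavy AI user
MICROWAVE_POWER_KW = 0.8   # assumed microwave power draw

hours = DAILY_AI_USE_KWH / MICROWAVE_POWER_KW
print(f"About {hours:.1f} hours of microwave time")  # roughly 3.6 hours
```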
Such high energy use by AI systems has pushed tech companies to hunt for new energy sources. From Google to Microsoft to Meta, they are all trying to get into nuclear power or tie up with energy-generating nuclear plants. But some companies, unable to secure 100 percent clean energy, are turning to more traditional ways of producing electricity. xAI, which now runs one of the largest clusters of computing power to operate Grok 4, was recently in the news because it began using methane gas generators in Memphis. The move prompted protests over pollution from a local environmental group in the Memphis community. “Our local leaders are entrusted with protecting us from corporations violating our right to clean air, but we are witnessing their failure to do so,” the group said.
But are a ‘please’ and a ‘thanks’ still worth it?
Of course, not everyone agrees on the environmental impact of AI’s energy use. Some feel the issue is being blown out of proportion.
Kurtis Beavers, a director at Microsoft Copilot, argues that there is merit in polite messages. In a Microsoft WorkLab memo, he said that using basic etiquette with AI leads to more respectful and collaborative output. Essentially, in his view, being polite to an AI chatbot improves the tone and quality of its responses, which can justify the additional energy use.
Similarly, Elon Musk’s AI chatbot Grok sees things a little differently. Responding to the debate above, Grok said that the extra energy used by polite words is negligible in the bigger picture. Even across millions of queries, says Grok 4, the total energy use would be similar to running a light bulb for a few hours. In the chatbot’s words, “If you are concerned about the environmental footprint of AI, the bigger culprits are model training (which can use thousands of kWh) and data center cooling. Your polite words? They are just a friendly whisper in the digital void.”