this post was submitted on 31 Oct 2025
87 points (98.9% liked)
askchapo
23174 readers
48 users here now
Ask Hexbear is the place to ask and answer ~~thought-provoking~~ questions.
Rules:
- Posts must ask a question.
- If the question asked is serious, answer seriously.
- Questions where you want to learn more about socialism are allowed, but questions in bad faith are not.
- Try !feedback@hexbear.net if you have questions about moderation, site policy, the site itself, development, volunteering, or the mod team.
founded 5 years ago
You're actually wrong about the specifics of energy usage. Over a model's lifetime, the vast majority of its energy consumption comes from inference, not from training. But you are right that the energy cost of an individual prompt is relatively small, roughly comparable to 15 or so Google searches.
The problem is when you process billions of prompts every day.
Had to google it to check, but you are right. The sum of all the energy used to prompt a model over its lifetime is usually greater than what's needed to train it in the first place.
I didn't know that, but it makes sense. I meant more that prompting The Thing Once isn't that big of an energy drain, whereas the initial training is. The average is about 0.34 watt-hours per prompt, but training GPT-3 cost around 1.3 gigawatt-hours (GWh), and GPT-4 required an estimated 62.3 GWh. I see all these memes about how prompting an LLM once is super wasteful, and that's the misconception I was addressing.
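To see how those two points fit together, here's a quick back-of-envelope sketch using the figures from this thread (0.34 Wh per prompt, 1.3 GWh for GPT-3 training, an estimated 62.3 GWh for GPT-4). The 1 billion prompts/day volume is purely an illustrative assumption, not a figure from the thread:

```python
# Back-of-envelope: how fast does aggregate inference energy
# overtake one-time training energy?

WH_PER_PROMPT = 0.34          # average energy per prompt (from thread)
PROMPTS_PER_DAY = 1_000_000_000  # ASSUMED daily volume, for illustration

# Convert daily inference energy from Wh to GWh (1 GWh = 1e9 Wh)
daily_inference_gwh = WH_PER_PROMPT * PROMPTS_PER_DAY / 1e9

GPT3_TRAINING_GWH = 1.3       # rough training estimate (from thread)
GPT4_TRAINING_GWH = 62.3      # rough training estimate (from thread)

days_to_match_gpt3 = GPT3_TRAINING_GWH / daily_inference_gwh
days_to_match_gpt4 = GPT4_TRAINING_GWH / daily_inference_gwh

print(f"Daily inference energy: {daily_inference_gwh:.2f} GWh")
print(f"Days to exceed GPT-3 training energy: {days_to_match_gpt3:.1f}")
print(f"Days to exceed GPT-4 training energy: {days_to_match_gpt4:.1f}")
```

Under that assumed volume, inference passes GPT-3's entire training budget in under a week, which is why a single prompt being cheap and the aggregate being expensive are both true at once.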