this post was submitted on 28 Jan 2025
1008 points (97.8% liked)

Microblog Memes

6320 readers

A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerilla marketing.
  4. Posters are encouraged to link to the toot or tweet etc in the description of posts.

[–] daniskarma@lemmy.dbzer0.com 4 points 1 day ago* (last edited 1 day ago) (2 children)

2000 times the usage of a household, assuming your approximations are correct, isn't bad at all for something that's used by millions, or potentially billions, of people.

Probably comparable to 3D movies or many other industrial computing uses, like search indexers.
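A rough back-of-the-envelope sketch of that claim, where the household consumption and the user count are illustrative assumptions rather than figures from the post:

```python
# Back-of-the-envelope: "2000x a household" spread over many users.
# All inputs below are illustrative assumptions, not figures from the thread.
HOUSEHOLD_KWH_PER_YEAR = 10_500   # rough average annual US household electricity use
MULTIPLIER = 2_000                # "2000 times" a household, from the comment above
ASSUMED_USERS = 300_000_000       # hypothetical user count ("millions, potentially billions")

total_kwh = MULTIPLIER * HOUSEHOLD_KWH_PER_YEAR
per_user_kwh = total_kwh / ASSUMED_USERS
print(f"Total: {total_kwh:,.0f} kWh/year")              # 21,000,000 kWh/year
print(f"Per user: {per_user_kwh * 1000:.0f} Wh/year")   # ~70 Wh/year per user
```

Under those assumed numbers the per-user share comes out to tens of watt-hours per year, which is the point being made about amortizing the cost over the user base.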

[–] Tartas1995@discuss.tchncs.de 4 points 1 day ago* (last edited 1 day ago)

Yeah, but then they start "gaming"...

I just edited my comment, so no wonder you missed it.

In 2024, ChatGPT was projected to use 226.8 GWh. You see, if people are "gaming" 24/7, it is quite wasteful.

Edit: just in case it isn't obvious: the hardware needs to be produced, the data collected, and they are scaling up. So my point was that even if you occasionally run a bit of LLM locally, more energy is consumed than just the energy used for that one prompt.
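For scale, a quick sketch comparing the cited 226.8 GWh figure to the "gaming 24/7" comparison above; the PC wattage is an assumption for illustration:

```python
# Put 226.8 GWh into perspective. The PC wattage is an assumed value.
CHATGPT_GWH_2024 = 226.8          # projected figure cited above
GAMING_PC_WATTS = 400             # assumed draw of a gaming PC under load
HOURS_PER_YEAR = 24 * 365

pc_kwh_per_year = GAMING_PC_WATTS * HOURS_PER_YEAR / 1000   # ~3,504 kWh if "gaming" 24/7
chatgpt_kwh = CHATGPT_GWH_2024 * 1_000_000                  # GWh -> kWh

equivalent_pcs = chatgpt_kwh / pc_kwh_per_year
print(f"One PC gaming 24/7: {pc_kwh_per_year:,.0f} kWh/year")
print(f"226.8 GWh is roughly {equivalent_pcs:,.0f} such PCs running all year")
```

Under these assumptions the projected annual usage equals on the order of 65,000 gaming PCs left running around the clock, before counting hardware production or data collection.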

[–] Zos_Kia@lemmynsfw.com 1 points 1 day ago

Yeah, it's ridiculous. GPT-4 serves billions of tokens every day, so if you take that into account, the per-token cost is very, very low.
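As an illustration of why the per-token figure comes out tiny, a sketch dividing the annual energy figure cited earlier by an assumed token volume; the tokens-per-day number is a placeholder, not a real statistic:

```python
# Rough energy-per-token estimate. The daily token volume is an assumed placeholder.
ANNUAL_GWH = 226.8               # projected 2024 figure cited earlier in the thread
ASSUMED_TOKENS_PER_DAY = 100e9   # "billions of tokens every day" -- illustrative value

annual_wh = ANNUAL_GWH * 1e9     # GWh -> Wh
annual_tokens = ASSUMED_TOKENS_PER_DAY * 365

wh_per_token = annual_wh / annual_tokens
print(f"~{wh_per_token * 1000:.1f} mWh per token")   # ~6.2 mWh/token under these assumptions
```

The exact result depends entirely on the assumed token volume, but any figure in the billions-per-day range puts the per-token energy in the milliwatt-hour range.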