this post was submitted on 01 Jul 2025
2120 points (98.4% liked)

Microblog Memes


A place to share screenshots of microblog posts, whether from Mastodon, Tumblr, ~~Twitter~~ X, KBin, Threads, or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion, or guerrilla marketing.
  4. Posters are encouraged to link to the toot, tweet, etc. in the description of posts.

founded 2 years ago
[–] jsomae@lemmy.ml 12 points 1 week ago* (last edited 1 week ago) (62 children)

I know she's exaggerating, but this post yet again underscores that nobody understands that it is *training* an AI model which is computationally expensive. Deploying a trained model draws power comparable to running a high-end video game. How can people hope to fight back against things they don't understand?

[–] FooBarrington@lemmy.world 20 points 1 week ago (29 children)

It's closer to running 8 high-end video games at once. Sure, on that scale it's still far below the cost of training, but it's still fairly expensive.

[–] jsomae@lemmy.ml 2 points 1 week ago (6 children)

Really depends. You can locally host an LLM on a typical gaming computer.

[–] CheeseNoodle@lemmy.world 3 points 1 week ago

Yeah, but those local models are usually pretty underpowered compared to the ones run as online services, and they're still more demanding than any game.
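
The power-draw claims in this exchange can be sketched with rough arithmetic. This is a back-of-envelope illustration only; every figure below (GPU wattage, accelerator count, training duration) is an assumed round number, not a measurement of any real model or datacenter:

```python
# Back-of-envelope energy comparison (all numbers are illustrative assumptions).

GAMING_GPU_W = 400              # assumed draw of one high-end gaming GPU
INFERENCE_SERVER_W = 8 * 700    # assumed inference server: 8 accelerators at ~700 W each

# One hour of each activity, in kilowatt-hours:
gaming_kwh = GAMING_GPU_W / 1000
inference_kwh = INFERENCE_SERVER_W / 1000
print(f"1 h gaming: {gaming_kwh:.1f} kWh, 1 h inference server: {inference_kwh:.1f} kWh")

# A hypothetical large training run: 10,000 accelerators for 30 days.
training_kwh = 10_000 * 700 / 1000 * 24 * 30
print(f"Training run: {training_kwh:,.0f} kWh")

# How many hours of one inference server equal one training run?
print(f"Training ~= {training_kwh / inference_kwh:,.0f} hours of one inference server")
```

Under these assumptions the inference server draws about 14 gaming GPUs' worth of power (roughly the "8 games at once" ballpark above), while the training run consumes around six orders of magnitude more energy than an hour of serving, which is the asymmetry the first comment is pointing at.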
