[–] jsomae@lemmy.ml 2 points 7 months ago (4 children)

It really depends. You can host an LLM locally on a typical gaming computer.
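For a concrete sense of what that looks like, here is a minimal sketch using the llama-cpp-python bindings. The model file name is a placeholder; any quantized GGUF checkpoint that fits your GPU or RAM will do.

```python
# Minimal local inference sketch with llama-cpp-python
# (pip install llama-cpp-python). The model path is hypothetical:
# substitute any quantized GGUF file small enough for your hardware.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder file name
    n_gpu_layers=-1,  # offload every layer to the GPU if VRAM allows
    n_ctx=4096,       # context window size
)

out = llm("Why can a gaming PC run a small LLM?", max_tokens=128)
print(out["choices"][0]["text"])
```

A 7B model quantized to 4 bits needs roughly 4 GB of VRAM, which is why a typical gaming card can handle it.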

[–] FooBarrington@lemmy.world 9 points 7 months ago

You can, but that's not the kind of LLM the meme is talking about. It's about the big LLMs hosted by large companies.

[–] floquant@lemmy.dbzer0.com 5 points 7 months ago* (last edited 7 months ago) (1 children)

True, and that's how everyone who is able should use AI. But OpenAI's models are in the trillion-parameter range, which is 2-3 orders of magnitude more than what you can reasonably run yourself.
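The gap is easy to put in numbers. A rough sketch of the memory needed just to hold the weights (the trillion-parameter figure below mirrors the comment's claim; OpenAI hasn't published exact counts, so treat it as an assumption):

```python
# Back-of-envelope: bytes of weight memory ≈ parameters × bytes per parameter.
def weight_memory_gb(params: float, bytes_per_param: float) -> float:
    return params * bytes_per_param / 1e9

# A local 7B model, 4-bit quantized (~0.5 bytes/param):
print(weight_memory_gb(7e9, 0.5))   # ~3.5 GB: fits a gaming GPU

# An assumed 1-trillion-parameter model at fp16 (2 bytes/param):
print(weight_memory_gb(1e12, 2.0))  # ~2000 GB: needs dozens of datacenter GPUs
```

That ratio, roughly 500x before you even count activation memory, is where the 2-3 orders of magnitude come from.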

[–] jsomae@lemmy.ml 0 points 7 months ago* (last edited 7 months ago) (1 children)

This is still orders of magnitude less than what it takes to run an EV, which is an eco-friendly form of carbrained transportation, especially if you live in an area where the power source is renewable. On that note, it looks to me like AI is finally going to be the impetus for the U.S. to invest in and switch to nuclear power. Isn't that altogether a good thing for the environment?
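Whether that comparison holds depends entirely on which numbers you plug in, and none of the good ones are public. A heavily hedged sketch, using often-cited third-party estimates rather than measured values:

```python
# All figures are assumed ballpark estimates from public reporting,
# NOT measurements; per-query energy in particular is heavily debated.
WH_PER_QUERY = 3.0      # one often-cited estimate for a large hosted-LLM query
EV_WH_PER_MILE = 250.0  # typical EV consumption, ~0.25 kWh per mile

queries_per_mile = EV_WH_PER_MILE / WH_PER_QUERY
print(f"~{queries_per_mile:.0f} queries ≈ 1 mile of EV driving")  # ~83
```

On those assumptions, one mile of EV driving costs about as much energy as ~80 queries, which is the rough shape of the claim, though it says nothing about aggregate usage.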

[–] Thorry84@feddit.nl 5 points 7 months ago* (last edited 7 months ago)

Well, that's sort of half right. Yes, you can run the smaller models locally, but usually it's the bigger models that we want to use. Those would also be very slow on a typical gaming computer, and even on a high-end one. The speed in a datacenter comes from hardware that is both more optimised for the task and simply more plentiful: each unit is faster than a gaming GPU, and far more units are used than you would ever find in a gaming PC.
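One common rule of thumb makes the per-unit gap concrete: single-stream LLM generation is mostly memory-bandwidth-bound, because every new token has to stream (roughly) all the weights through the chip once, so peak tokens per second is bounded by bandwidth divided by model size. The bandwidth figures below are from public spec sheets; the model size is an assumed example:

```python
# Rough ceiling for single-stream decoding: bandwidth / bytes of weights.
def tokens_per_sec_ceiling(bandwidth_gb_s: float, model_gb: float) -> float:
    return bandwidth_gb_s / model_gb

MODEL_GB = 8.0  # assumed: a 13B model, 4-bit quantized, fits both cards

print(tokens_per_sec_ceiling(1008, MODEL_GB))  # RTX 4090: ~126 tok/s ceiling
print(tokens_per_sec_ceiling(3350, MODEL_GB))  # H100 SXM: ~419 tok/s ceiling
```

And that's per chip; a datacenter node stacks eight or more of them and batches many users' requests together, which a gaming PC can't do.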

Now, these things aren't magic; the basic technology is the same. So where does the speed come from? The answer is raw power: these systems have insane amounts of power run through them, with specialised cooling systems to keep them in check. This comes at the cost of efficiency.

So whilst running a model is much cheaper than training one, it is far from free. And whilst you can run a smaller model on your home PC, that isn't directly comparable to how models are run in the datacenter. So the use of AI is still very power-hungry, even when not counting the training.
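To put "far from free" in numbers, energy per response is roughly sustained board power divided by token throughput, times tokens generated. The figures below are assumed ballpark values for illustration, not measurements:

```python
# Assumed illustrative figures: a ~700 W accelerator whose effective
# per-user share of throughput is ~100 tokens/sec, generating a
# 500-token answer. Joules = watts / (tokens/sec) × tokens; Wh = J / 3600.
def wh_per_response(watts: float, tokens_per_sec: float, tokens: int) -> float:
    return watts / tokens_per_sec * tokens / 3600.0

print(wh_per_response(700.0, 100.0, 500))  # ~0.97 Wh per answer
```

A watt-hour per answer sounds tiny, but multiplied across millions of daily queries it adds up to the megawatt-hours this comment is pointing at.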

[–] CheeseNoodle@lemmy.world 3 points 7 months ago

Yeah, but those local models are usually pretty underpowered compared to the ones that run via online services, and they're still more demanding than any game.