this post was submitted on 05 May 2026
53 points (96.5% liked)

technology

[–] KnilAdlez@hexbear.net 6 points 1 week ago (1 children)

I'm not sure I understand you; it sounds like you think running a small LLM locally on your computer will suddenly make it use something like 10x more power. That's not how it works. It's the servers used to run the full-sized models that use that much power, since each one has tens of thousands of processors running at once. And local LLMs do have uses, especially for accessibility. I use a local LLM with my Home Assistant instance so I can use voice commands, which is very helpful as a disabled person.


[–] chgxvjh@hexbear.net 3 points 1 week ago (1 children)

You seem to have no idea how good modern computers are at idling

[–] KnilAdlez@hexbear.net 7 points 1 week ago (1 children)

What does that have to do with anything?

[–] NewOldGuard@lemmy.ml 1 points 1 week ago (1 children)

I think their point is that regular web browsing will use less power than web browsing with local LLM calls. Your PC running an LLM is likely gonna hit its TDP limit, while browsing will use a fraction of that. Yes, it's less power than a trillion-parameter model uses, but I think their point is that it's vastly more than your standard non-LLM browsing would be.

[–] KnilAdlez@hexbear.net 3 points 1 week ago

Your PC running an LLM is likely gonna hit its TDP limits

Debatable for a 4GB model, depending on the hardware. It's also (most likely) not running constantly, so while yes, it will use more power than not having it, whether that's a significant change in the long run depends on many factors.
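
The duty-cycle point can be sketched with some back-of-envelope arithmetic. All of the numbers here are illustrative assumptions, not measurements: 15 W average package power for plain browsing, 65 W during an LLM burst, and the model generating 5% of the time.

```python
def hourly_energy_wh(idle_w: float, burst_w: float, duty_cycle: float) -> float:
    """Average energy per hour (Wh) when high-power bursts at burst_w
    occupy duty_cycle of the time and the rest is spent at idle_w."""
    return burst_w * duty_cycle + idle_w * (1 - duty_cycle)

# Assumed figures, purely for illustration:
browsing_only = hourly_energy_wh(15.0, 15.0, 0.0)   # no LLM at all
with_llm = hourly_energy_wh(15.0, 65.0, 0.05)       # occasional small-model bursts

print(f"browsing only: {browsing_only:.1f} Wh/h")   # 15.0 Wh/h
print(f"with local LLM: {with_llm:.1f} Wh/h")       # 17.5 Wh/h
```

Under these made-up numbers the local model adds a few watt-hours per hour, not a 10x jump; whether that matters in practice depends entirely on the real burst power and how often the model actually runs.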