this post was submitted on 13 Feb 2026
Technology
you are viewing a single comment's thread
Local AI models you run on your own machine are better for your privacy, even if they're nowhere near as intelligent.
I'm pretty sure they're exactly as "intelligent": not at all
Technically correct, but any suggestions on how to differentiate between models of different quality?
Local LLMs have barely reached the point where the output is comprehensible; in the meantime, ChatGPT can effectively scam elderly people.
Not true. 500 MB models suck ass and are just here for fun. A lot of local models in the 2.5 GB range can run on a phone and produce very coherent output, on par with free-to-use LLMs, without actually destroying the planet (while using them, I mean; training is still a nightmare).
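To make that concrete, here's a minimal sketch of running a small quantized model locally with llama-cpp-python. The GGUF path and the thread/context settings are placeholders, not a recommendation of any specific model:

```python
# Minimal local-inference sketch using llama-cpp-python.
# The model path is a placeholder: point it at any small quantized
# GGUF model (roughly 2-3 GB) you have downloaded yourself.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/small-model-q4.gguf",  # placeholder filename
    n_ctx=2048,    # context window size
    n_threads=4,   # CPU threads; tune for your phone or laptop
)

result = llm(
    "Explain in one sentence why local models help privacy.",
    max_tokens=64,
    stop=["\n\n"],
)
print(result["choices"][0]["text"])
```

Everything in that loop stays on the device; no prompt or output is sent to a remote API, which is the privacy point being made above.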
"Fun" fact, political bias is baked in the local models too, don't ask Qwen3 about what happened in Tiananmen Square in 1989...
I haven't been following the new developments in self-hostable AI much. Can you actually self-host anything decent?
I have played around with these smaller models in the past, and honestly anything smaller than Llama 4 Scout was just not very useful. Now, Llama 4 Scout is "17B active parameters, 16 experts and 109B total parameters", so not sure what that even means.
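For what it's worth, those numbers describe a mixture-of-experts (MoE) model: the 109B "total" parameters include all 16 experts, but the router only sends each token through a small subset of them, so only about 17B parameters actually do work per token. A toy sketch of that accounting follows; the per-component sizes are made-up illustration values chosen to land near the quoted figures, not Llama 4 Scout's real layout:

```python
# Toy mixture-of-experts parameter accounting.
# All component sizes below are illustrative assumptions, NOT the real
# Llama 4 Scout breakdown; only the total/active split idea is the point.

shared_params = 10e9       # embeddings, attention, shared layers: always active
expert_params = 6.2e9      # parameters in ONE feed-forward expert
num_experts = 16           # experts available per MoE layer
experts_per_token = 1      # how many experts the router picks for each token

total_params = shared_params + num_experts * expert_params
active_params = shared_params + experts_per_token * expert_params

print(f"total:  {total_params / 1e9:.0f}B parameters stored in memory")
print(f"active: {active_params / 1e9:.1f}B parameters used per token")
```

So the model still has to fit all ~109B parameters in memory, but the per-token compute is closer to a ~17B dense model.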
Depends on what you mean by decent and on your hardware. A simple YouTube search will get you going.
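For a rough sense of what "your hardware" needs, a common back-of-envelope estimate is parameter count times bytes per weight at your chosen quantization, plus some overhead for the context cache and runtime buffers. The 1.2 overhead factor below is an assumption, not a measured value:

```python
# Back-of-envelope check: will a quantized model fit in my RAM/VRAM?
# The overhead factor is a rough assumption for KV cache and runtime buffers.

def estimated_size_gb(params_billion: float, bits_per_weight: float,
                      overhead: float = 1.2) -> float:
    """Approximate memory needed to run a model at a given quantization."""
    bytes_total = params_billion * 1e9 * (bits_per_weight / 8)
    return bytes_total * overhead / 1e9

# Example: a 7B-parameter model quantized to ~4 bits per weight
print(f"{estimated_size_gb(7, 4):.1f} GB")  # roughly 4 GB
```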