this post was submitted on 23 Feb 2026
130 points (100.0% liked)

Slop.


For posting all the anonymous reactionary bullshit that you can't post anywhere else.

Rule 1: All posts must include links to the subject matter, and no identifying information should be redacted.

Rule 2: If your source is a reactionary website, please use archive.is instead of linking directly.

Rule 3: No sectarianism.

Rule 4: TERF/SWERFs not welcome.

Rule 5: No bigotry of any kind, including ironic bigotry.

Rule 6: Do not post fellow hexbears.

Rule 7: Do not individually target federated instances' admins or moderators.

founded 1 year ago

Thanks capitalism for doing the stupidest implementation of this technology possible

[–] LaughingLion@hexbear.net 2 points 9 hours ago (1 children)

You're right. Running a 24B model locally isn't going to be as powerful as a 600B model, for sure. You're also right that they don't think. They absolutely don't.

But the local ones are pretty powerful and can do a lot more than most people think. Even some simple vibe coding can be done with local AI. For the average gamer type who just wants to mess with it, local is more than enough tbh.
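To give a concrete idea of what "vibe coding with a local model" looks like: most local runners (llama.cpp's server, Ollama, etc.) expose an OpenAI-compatible chat endpoint, so a request is just a JSON payload POSTed to localhost. The endpoint URL and model name below are made-up placeholders for whatever your own setup registers — this sketches the payload only, it doesn't send anything:

```python
import json

# Hypothetical local endpoint -- llama.cpp / Ollama style servers expose
# something OpenAI-compatible like this; adjust host/port for your setup.
LOCAL_ENDPOINT = "http://localhost:8080/v1/chat/completions"

def build_request(prompt: str, model: str = "my-local-24b") -> dict:
    """Build the chat payload. Nothing leaves your machine until you POST it."""
    return {
        "model": model,  # whatever name your local server registered
        "messages": [
            {"role": "system", "content": "You are a coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.2,  # low temperature -> more deterministic code
    }

payload = build_request("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

The point being: the whole loop runs against 127.0.0.1, which is exactly the privacy win over cloud APIs.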

To be fair, local is better than cloud solutions in many ways: it keeps your data private and doesn't lock you into a vendor (which they desperately want). Also, LoRA is an option for fine-tuning, but that's way advanced for an average user.
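For anyone curious what LoRA actually is under the hood, here's a toy numpy sketch (dims and values are made up, and in reality the update is trained inside each attention/MLP layer, not random): instead of updating a full weight matrix W, you learn a small low-rank update B @ A and add it on top.

```python
import numpy as np

d_out, d_in, r = 8, 8, 2          # tiny dims; real layers are thousands wide
rng = np.random.default_rng(0)

W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable low-rank down-projection
B = np.zeros((d_out, r))               # up-projection, initialised to zero

def lora_forward(x, scale=1.0):
    # Base path plus the low-rank correction. With B = 0 this is exactly
    # the original model, which is why LoRA starts out as a no-op.
    return W @ x + scale * (B @ (A @ x))

x = rng.normal(size=d_in)
assert np.allclose(lora_forward(x), W @ x)  # B is zero, so no change yet
```

The appeal for local setups is that you only store and train the tiny A and B matrices instead of a full copy of the model's weights.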