this post was submitted on 23 Feb 2026
131 points (100.0% liked)

 

Thanks capitalism for doing the stupidest implementation of this technology possible

[–] NominatedNemesis@reddthat.com 3 points 14 hours ago (1 children)

I would like to agree with you, but in my experience I cannot. I usually use local models on my work computer, and I also have access to pro models paid for by the company. There is a big difference between them.

I have to use AI. It is in my KPIs and my salary raise depends on it... so stupid. I just got an email saying we cannot replace our computers for the foreseeable future because of the RAM and SSD shortage... Meanwhile I fight with "developers" who are generating code... which does not even work!

So I use local AI because I am forced to use AI and I am in the terminal anyway. Also f×ck the big companies pushing their bullsh×t; they have already demonstrated that they will use the data they get even from paying companies.

AI has a use case, but LLMs are not the sentient sh×t they want us to believe. I want to go back to before the hype...

I actually programmed an AI and trained it to do repetitive but not well-definable tasks for me in C++ with OpenCV and some tensor library; after a week it worked better than any human. I also helped a research group optimize an image-recognition AI to help doctors identify cancerous cells.
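The commenter's actual project was in C++ with OpenCV and an unnamed tensor library, and no details are given. As a rough illustrative stand-in (not their code), here is about the simplest trainable image classifier there is, a nearest-centroid classifier in Python/NumPy: average the training images of each class into a "template", then assign new images to the class with the closest template.

```python
import numpy as np

def train_centroids(images, labels):
    """Compute one mean 'template' per class from flattened training images."""
    X = np.asarray([img.ravel().astype(float) for img in images])
    y = np.asarray(labels)
    return {c: X[y == c].mean(axis=0) for c in sorted(set(labels))}

def classify(img, centroids):
    """Assign the class whose template is nearest in Euclidean distance."""
    v = img.ravel().astype(float)
    return min(centroids, key=lambda c: np.linalg.norm(v - centroids[c]))

# Toy data: 4x4 "images" of two trivially separable classes.
rng = np.random.default_rng(0)
dark = [rng.random((4, 4)) * 0.1 for _ in range(10)]
light = [1.0 - rng.random((4, 4)) * 0.1 for _ in range(10)]
centroids = train_centroids(dark + light, ["dark"] * 10 + ["light"] * 10)

print(classify(np.zeros((4, 4)), centroids))  # → dark
print(classify(np.ones((4, 4)), centroids))   # → light
```

A real OpenCV pipeline would add preprocessing (thresholding, contour extraction) and a proper model, but the train-on-examples, predict-on-new-inputs structure is the same.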

[–] LaughingLion@hexbear.net 2 points 13 hours ago (1 children)

You're right. A 24B model running locally isn't going to be as powerful as a 600B model, for sure. You're also right that they don't think. They absolutely don't.

But the local ones are pretty powerful and can do a lot more than most people think. Even some simple vibe coding can be done with local AI. For the average gamer type who wants to mess with it, local is more than enough tbh.

[–] NominatedNemesis@reddthat.com 3 points 13 hours ago

To be fair, local is better than cloud solutions in many ways: it keeps your data private and does not lock you into a vendor (which they desperately want). Also, LoRA is an option for fine-tuning, but that's way too advanced for an average user.
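For anyone curious why LoRA makes fine-tuning feasible on local hardware: instead of updating a full weight matrix W, you freeze it and train two small low-rank factors B and A, so the adapted layer computes Wx + (α/r)·BAx. A minimal NumPy sketch of the idea (sizes and scaling are illustrative, not from any particular model):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 512, 8                            # hidden size, LoRA rank (r << d)

W = rng.standard_normal((d, d))          # frozen pretrained weight, never updated
A = rng.standard_normal((r, d)) * 0.01   # trainable low-rank factor
B = np.zeros((d, r))                     # B starts at zero, so training begins at W
alpha = 16                               # LoRA scaling hyperparameter

def adapted_forward(x):
    # Full layer output plus the low-rank update; only A and B get gradients.
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

full = W.size              # parameters a full fine-tune would touch
lora = A.size + B.size     # parameters LoRA actually trains
print(full, lora)          # → 262144 8192, a ~32x reduction
```

That parameter reduction is the whole trick: the optimizer state and gradients only cover the small factors, which is what lets fine-tuning fit on a single consumer GPU.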