this post was submitted on 04 Apr 2026
531 points (99.1% liked)

Fuck AI

6651 readers
1162 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago
[–] matlag@sh.itjust.works 40 points 3 days ago (4 children)

That's the endgame, isn't it? Not only is it going to be forced on us with promises that it will be our assistant we can delegate things to, but we will also be held accountable for its calamities for "not having monitored it well enough".

[–] wonderingwanderer@sopuli.xyz 2 points 2 days ago

It's like the Tesla "self-driving" cars, which were rushed into production without thorough testing. Elon made all sorts of promises about how safe they were, but then they start every session with a disclaimer that says "driver must remain alert at all times; if the vehicle crashes, it's your responsibility."

What a fucking joke.

[–] SacralPlexus@lemmy.world 30 points 3 days ago (4 children)

I work in radiology and I’ve been saying this for years. AI tools probably won’t replace us because of liability. We will have all of the liability while AI tools push us to work faster and faster for less money. I suspect this will happen with a lot of jobs.

[–] takeda@lemmy.dbzer0.com 2 points 2 days ago (1 children)

I don't know how it works in radiology, but my experience in software engineering is that reviewing slop code takes more time than writing it, especially when it's crap and you have to send it back again and again and review it each time.

At this point I either have to review it honestly, which is extremely slow and frustrating for both sides, or accept the slop without review and deal with the tech debt later.

Both options are bad.

[–] SacralPlexus@lemmy.world 4 points 2 days ago

Well, in radiology we are searching images for specific findings, so the generative slop problem isn't the issue for us; the issue will be being overwhelmed with false positives and false negatives under time pressure to go faster. I've been trying to follow the impact of these models on the coding professions, and I do not envy you at all. It really does seem like you're between a rock and a hard place right now.

[–] verdi@tarte.nuage-libre.fr 4 points 3 days ago

It takes less effort to eliminate the man whipping you than it takes to perform the work at the unreasonable rate your current whip demands. Delay work, deny work, defend your fellow workers. Turn the weapons of oppression against it. UNITE, UNIONISE!

[–] KittyCat@lemmy.world 2 points 3 days ago

And that's one of the few fields where it makes sense too. A system that circles potential areas of interest on medical scans is a useful thing.

[–] Tollana1234567@lemmy.today 3 points 3 days ago

Like some posts on the subs say, they probably want to staff hospitals with fewer people and put all the work on the few MDs they hire, to squeeze more work out of them without hiring more radiologists.

[–] sp3ctr4l@lemmy.dbzer0.com 4 points 2 days ago* (last edited 2 days ago)

The problem with this is that the corpos will fight each other over IP/copyright/licensing laws.

Yeah, if they can work out some kind of framework, then blam, that is neo/technofeudalism formalized.

But the problem is that they are, in addition to ravenously insane, greedy as fuck.

Our legal system is slowly building precedent for how this kind of shit will shake out in court... but there are no broadly understood, clear guidelines here; there's no framework for this.

But!

They're doing "move fast and break things" with trillions of dollars.

And when the spinning plates start flying apart, they will eat each other. It will be complete fucking chaos, because they moved way faster than their apparent ability to even consider or estimate what the rules for this will look like.

They did not think any of this shit through, at all.

[–] CileTheSane@lemmy.ca 5 points 3 days ago

The trick is that end users will be held accountable for the things their AI does, while corporations and governments will say "an AI did it" and wash their hands of it.