this post was submitted on 22 Feb 2026
265 points (92.9% liked)

[–] Goatboy@lemmy.today 113 points 1 day ago (3 children)

What worries me is that companies are using "the AI fucked up" as an excuse and just... not fixing the problem. They're using it as an accountability shield.

[–] Bakkoda@lemmy.world 12 points 17 hours ago (1 children)

IMO this was always the reason for it. It's the ultimate scapegoat, and from the second you saw a headline that said "AI is responsible for..." or "AI did..." instead of "Humans used AI to...", it was all over.

Humans are using AI to justify wage suppression, mass layoffs, and janky everything, and we're just gonna blame software and data centers. It's humans, it always was, and at least for the foreseeable future it's always gonna be.

[–] Gormadt@lemmy.blahaj.zone 2 points 11 hours ago

It's like all those articles that read "The vehicle struck..." instead of "The driver struck...", "A shooting then took place..." instead of "The officer then shot...", etc, etc.

It's a deflection of blame and whenever I see it it makes my blood boil.

[–] Aceticon@lemmy.dbzer0.com 6 points 19 hours ago

"Computer says" is a pretty standard excuse for doing fucked up shit as it adds a complex form of indirection and obfuscation between the will of a human and the actual actions that result from that will.

It doesn't work as an excuse with the people who actually make the software that makes the computer "say" something (the complexity of what is used is far less mysterious to them, so they know what's behind it and that the software is just an agent of somebody's will), but it seems to work even with non-expert techies (technology fans), and more so with non-techies.

With AI, the people using the computer as an excuse have doubled down on this, because in this case the software wasn't even explicitly crafted to do what it does: it was trained (though in practice you can sort of guide it in one direction or another by choosing what you train it with). That further obscures the link between the will of the human who decided what it does (or at least decided which of the things it ended up doing after training are acceptable and which require changes to the training) and the output of the computer system.

Considering that just about the entirety of the justice system, legislative system, and regulatory system is technically ignorant, using "computer says" as an excuse often results in profit-enhancing outcomes, incentivising "greed above all" people to use it to confuse, block, or manipulate those systems.

[–] Endymion_Mallorn@kbin.melroy.org 47 points 1 day ago (1 children)

That's what companies always do.

[–] WhatAmLemmy@lemmy.world 31 points 1 day ago

The very purpose of creating most companies is to limit the liability of shareholders and staff.

It's significantly easier to commit crimes with the knowledge that the system can't come after your liberty or wealth for those crimes.