this post was submitted on 23 Dec 2025
Fuck AI
Well, their excuse here is essentially that it's better to have a false positive than a false negative.
And that's actually a pretty standard way of thinking in any industry that deals in automated detection systems. I work with a product that fills a somewhat similar role - automatically detecting a hazard - and what it comes down to is this: a false negative comes back on you, the company. Someone died because your product was supposed to do a job and didn't. A false positive, on the other hand, you can always counter with "But what if there had been a danger and we hadn't alerted you?"
When pressed between those two options, the customer (that is, the execs at the top) will always prefer the false positive. Now, those false positives bring a whole host of problems with them, just like the article describes. Staff get fatigued by constant false alerts and often start to hate the entire system. The thing is, though, the people who pay for the system never have to deal directly with those negative effects. A death that shouldn't have happened, on the other hand, is absolutely something the people at the top get it in the neck for. So they'll happily keep paying for the system, and forcing everyone to use it, even as it burns out their staff. "Better safe than sorry."
I'm not remotely arguing that schools should be using this product. I'd need to see a LOT more data on its actual detection rate vs its false positive rate to form any kind of opinion on that. Just saying that if you're going to make a product like this in the first place, well, yeah, that's how you'd do it.
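The reason you'd need both numbers is the base rate: when real incidents are rare, even a detector with decent-sounding rates produces mostly false alarms. Here's a minimal sketch with made-up, hypothetical numbers (not from the article) showing the arithmetic:

```python
def alert_precision(sensitivity, false_positive_rate, prevalence):
    """Fraction of alerts that correspond to a real incident (Bayes' rule).

    sensitivity: probability the system alerts given a real incident
    false_positive_rate: probability it alerts given no incident
    prevalence: fraction of cases that are real incidents
    """
    true_alerts = sensitivity * prevalence
    false_alerts = false_positive_rate * (1 - prevalence)
    return true_alerts / (true_alerts + false_alerts)

# Hypothetical detector: catches 95% of real incidents, false-alarms
# on 5% of everything else, applied where 1 in 1000 cases is real.
p = alert_precision(0.95, 0.05, 0.001)
print(f"{p:.1%}")  # -> 1.9%: roughly 49 of every 50 alerts are false
```

That's why "it caught the real case" on its own tells you almost nothing; staff alert fatigue is baked into the math whenever the thing you're detecting is rare.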