this post was submitted on 08 Mar 2026
517 points (94.3% liked)

Off My Chest


I’ve been working with so many students who turn to AI as a first resort for everything. The second a problem stumps them, it’s AI. The first source for research is AI.

It’s not even about the tech; there’s just something about not wanting to learn that deeply upsets me. It’s not really something I can understand. There is no reason to avoid getting better at writing.

[–] BranBucket@lemmy.world 1 points 2 months ago* (last edited 2 months ago) (1 children)

As I alluded to in another comment in this thread, the worst I've personally seen were procedures developed that would have had people entering areas that were not just hazardous but incompatible with human life, and performing maintenance on fully energized industrial systems without safety constraints in place. Either case would have caused fatalities if someone blindly followed the checklist as written. An internal review caught these mistakes, but they should never have made it that far.

The people designing the procedure checklists missed them, possibly because, as you said, AI lies beautifully, but I think it was also because many people seem to have an inclination to trust it over their own judgment and knowledge. These were supervisors with years of direct experience; the red flags should have been instantly obvious. If they'd written it out by hand, the proper order of events would have been almost muscle memory. What made them so careless?

They claimed they just used AI to format and grammar-check their work, and I don't have logs to prove or disprove that. But this is more than just a hallucination; it's a lack of reasoning similar to the car wash problem, but with much more severe consequences. TBH I'm not sure even adding specific knowledge of our equipment and facilities would fix it, let alone just a reduction in hallucinations.

On top of that, I've seen a long, long-running trend of people who just will not take the time to read and understand the sum total of information needed to safely and correctly perform our work. It's a lot, but we do complicated and dangerous things. They've replaced knowing things with Googling them or searching through documents for a possibly out-of-context quote. Failed safety and regulatory compliance inspections are far more common because people just don't know what they need to know, despite having all that information at their fingertips. Nothing seems to be processed or retained; it's just sort of gawked at and repeated.

They aren't dumb. I work with them. I know them. It's not just stupidity and it's not just hallucinations. Our tools are using us, and it should always be the other way around. A tool that can't be used, in both the philosophical and literal sense, should be discarded.

I'm not trusting AI anytime soon, and I remain suspicious of everyone until they prove themselves to actually understand what's going on.

I'm willing to reconsider things as technology improves, but I wouldn't bet my 401k on LLMs being worth a shit anytime before I retire.