this post was submitted on 24 Feb 2026
Fuck AI
My mind is blown every time I read that a major company has unleashed an "AI Agent" on production systems and code. Like, did everyone in the IT industry suddenly grow stupid and forget the most basic rule? Always sandbox and test. You never ever fuck around on production systems!
I guess setting up a test environment would take too much time and these chucklefucks must move faster and break more stuff.
The issue is that IT folk are brought in either too late or not at all. It's literally just salespeople pitching to execs, and IT is left for the birds.
The problem with so much of the IT industry - and it's been like this ever since I started - is that it's so prone to fads and bandwagoning for fear of being called "legacy" or what have you.
So the same lessons have to be learned over and over through the hype and delusion cycle. If someone considered "old" (over 25) points out that this new shiny object looks like something from the past, only with a minor twist and some new branding, they're ignored.
The other problem is that the industry worships whatever is (perceived as) new and shiny. Pair that with what I think Uncle Bob pointed out: we keep adding new people to the field at a rapid-fire rate and push out the more experienced, because they're considered "old". You see this kind of culture skewered fairly well in the series Silicon Valley, where they act like someone who is 25 is over the hill.
There is a real craft to these things, and learning things at the feet of people with real, applied experience is useful. Unfortunately, a whole lot of the industry is not really set up that way at all. For a set of people that call themselves "engineers", I fail to see anything approaching that level of rigor when it comes to properly setting up a pipeline that goes from apprenticeship to master and proper stewardship of systems that run so much of our lives now.
Instead, what you often see is job screening that strongly selects for recent grads and people grinding on leetcode and learning algorithms that most people in programming likely won't be using at all in the day-to-day. People in their early 20s making all the hiring decisions. People on both sides of the hiring equation trying to use LLMs to game the hiring process.
The whole thing seems to be set up like a fucking casino where the goal is to go to a prestigious school, grind incessantly on useless trivia for the interview, land a job either at some BigTech place or some startup and work towards burnout there, then retire or switch to another field entirely before you leave your 20s. I'm not sure how a serious industry can be expected to sustain itself in a stable way on such thinking. There is not much expectation that you work until a normal retirement age - not unless you get into some non-technical aspect like management. And that management is almost never looked to for any direction, LOL. Even the most recent hire can often convince entire teams to do shit like rewriting working systems in the language du jour (Rust, anyone?).
And the thing is, it's not all that new - if you read something like Microserfs you'll see it skewered back then (1995). I would not be surprised to learn there are even early precedents.
LLM output is not deterministic. WTF good is testing gonna do for you when it could just do something randomly different the next time anyway?
Letting the thing execute commands on its own, without having the human read and confirm them first, is just fundamentally idiotic and insane.
I think this needs to be called out much more. IT, by its very nature, is meant to consist of repeatable, verifiable processes and outputs. That is how a lot of the trust around the process is built.
Now you’re basically trying to tell people: Trust a system that can only reproduce the same results 98-99% of the time. For some that may be fine, but it’s going to become more of a problem as time goes on.
LLM outputs are 100% deterministic.
If you enter the same prompt with the same seed you will get the same vector outputs.
Chatbots take those vector outputs and treat them as a distribution and select a random token. This isn't a property of the LLMs, it's a property of chatbots.
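The distinction this comment draws can be sketched in a few lines of numpy - toy logits stand in for a real model's forward pass, which is an assumption for illustration, not an actual LLM:

```python
import numpy as np

def softmax(logits):
    """Turn raw scores into a probability distribution over tokens."""
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

# Toy "model output": raw scores over a 5-token vocabulary.
# A real LLM forward pass on a fixed prompt produces the same
# kind of vector, and (numerics aside) the same one every time.
logits = np.array([2.0, 1.0, 0.5, 0.1, -1.0])
probs = softmax(logits)

# Greedy decoding: always take the highest-probability token.
# Deterministic by construction.
greedy = int(np.argmax(probs))

# Sampled decoding (what chatbots do): draw a random token from
# the distribution. Reproducible only if the RNG seed is fixed.
rng_a = np.random.default_rng(seed=42)
rng_b = np.random.default_rng(seed=42)
sample_a = int(rng_a.choice(len(probs), p=probs))
sample_b = int(rng_b.choice(len(probs), p=probs))

assert sample_a == sample_b  # same seed, same token
print(greedy, sample_a)
```

The randomness lives entirely in the `rng.choice` step layered on top of the model's output, which is the commenter's point: the sampling policy, not the model itself, is what makes chatbot output vary run to run.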
If they aren’t selling non-chatbot LLMs then that’s irrelevant.
You know this saying in ICT: Everyone has a development environment, a lucky few also have a separate production environment.
I witnessed it first hand at IBM: three in the morning, troubleshooting a database problem for a big client. An engineer writes up a script to try to solve the issue; I was the systems operator. He tells me to just run it on the mainframe.
“Wait, was this tested at all?”
“Client authorized it, they just want the downtime gone. Send it.”
So I just ran an untested script that fundamentally changed everything on the production database, written by a sleep-deprived engineer who just wanted to go back to sleep. Granted, it worked - that one engineer was an old rockstar who had been with the client for over a decade. But the next three weeks were dedicated to tiptoeing around the changes from that one script and testing everything, in production, to make sure the solution was viable long term and didn't break anything unseen. We all knew better, but everyone agreed and did it anyway.
IT loses the battle when the C suite says "do it now".
IT Ops Manager here. I was told by C Suites I was becoming “difficult to work with” in my attempt to slow and control the constant deployment of AI into every aspect of the business.
You ever get tempted to just let them shove it into everything so they can see what 'difficult to work with' actually means?
Thing is, in any industry, you need a combination of new blood and old wisdom in order to successfully pass the torch to the next generation. Old wisdom is expensive to keep around, but the cheap new blood doesn't know what they need to in order to succeed.
When you get rid of all your old wisdom and hire all new blood to cut costs, they're going to come in with a series of footguns that old wisdom knows how to avoid. If you're lucky, the new blood is going to learn about those footguns primarily by shooting themselves with them and then scrambling to fix the big problem that follows. If you aren't lucky, said footgun blows the entire leg off your corporation and you implode, do not pass Go, do not collect $200.
All this to say, no, they probably don't know. A million companies elected to excise all of their knowledge and replace it with fresh-faced, eager, noticeably cheaper juniors.
Now there's nothing wrong with hiring juniors, but you can't just put 30 of them in a room and say "alright, monkeys, get to writing Shakespeare" - they lack 30+ years of practical knowledge, and as mentioned, juniors all ship with footguns pre-installed. You need someone who is able to steer the ship properly. A good senior dev is worth his weight in gold. However, most companies don't want to pay a senior dev his weight in gold. Observe the consequences.
It's especially egregious in IT. I think Uncle Bob had some statistics about how many new people are introduced to the field all the time, and how that effectively dilutes wisdom and experience, making it a real challenge to train those people in useful ways.
The problem, in practice, is that the clock starts at about age 25, when people start thinking you're "too old" for IT, and it ratchets up every year to push you out.
The net effect is as you say - lots of eager beavers wanting to try something new and shiny but not too much wisdom and experience left (or if they are, they are mostly sidelined).
So then you get lots of stupid rookie shit going on, nearly constantly. People who have probably never even heard of The Mythical Man-Month (and if they have, probably think it's not worth reading - because what would someone who came before even know about this industry?) chomping at the bit to inflict a second-system effect on working systems (rewrite it in Rust), for example.
We all got to see this at the national level, when fElon's boys made proclamations about rewriting COBOL systems refined over the course of decades into Java in mere months. Because "AI". That's sheer fucking lunacy, of course, but the industry is known for giving adulation to bold bullshit proclamations like this.