I feel like people overestimate how effective LLMs actually are. A lot of the slop bros are unreliable narrators. A lot of their shit isn't finished or never really worked in the first place.
LLMs just steal large swaths of code from their training data with light composition on top. That's why they fail if you try to do anything remotely novel or specific.
They're just casinos for Stack Overflow posts.
The models get bigger = they can store more data.
I'll give them this: LLMs have some intelligence, but it's minor.
People equate speed with quality all the time. Management thinks that just because they get code faster, it's better for long-term company profits.
What they don't realize is the code they're getting is completely unmaintainable, and they'll hit a wall in a year or two once the LLM has churned the codebase into a spaghettified mess.
I just hope people in charge of critical infrastructure know better.
I can't imagine dying because some medical equipment I'm hooked up to fails due to LLM slop.