this post was submitted on 23 Feb 2026
22 points (80.6% liked)

Programming


I feel like people overestimate how effective LLMs actually are. A lot of the slop bros are unreliable narrators. A lot of their shit isn't finished or never really worked in the first place.

LLMs just steal large swaths of code from their training data with light recomposition. That's why they fail if you try to do anything remotely novel or specific.

They are just casinos for stack overflow posts.

The models get bigger = they can store more data.

I'll grant that LLMs have some intelligence, but it's minor.

People equate speed with quality all the time. Management thinks that just because they get code faster, it's better for long-term company profits.

What they don't realize is that the code they're getting is completely unmaintainable, and they'll hit a wall in a year or two once the LLM has churned the codebase into a spaghettied mess.

I just hope people in charge of critical infrastructure know better.

I can't imagine dying because some medical equipment I'm using fails because of LLM slop.