this post was submitted on 02 Mar 2026
44 points (97.8% liked)

Programming

top 9 comments
[–] Solumbran@lemmy.world 6 points 1 week ago

I mean, when you see most programmers, you want to believe it :)

[–] BlameThePeacock@lemmy.ca 6 points 1 week ago

Eliminating programmers will be possible when we figure out how to eliminate engineers in designing buildings.

Only a true AGI will be able to do that, and while LLMs feel like a step towards AGI, they are still missing the critical ongoing learning component that needs to happen for an AGI to exist. The way the current systems are trained simply doesn't allow for accepting and adopting new information continuously.

[–] BrilliantantTurd4361@sh.itjust.works 2 points 1 week ago

They are already learning, but they cannot ideate. That is the distinction.

[–] entwine@programming.dev 6 points 1 week ago
  • Take a human and have him study every single repo on GitHub

  • Take an AI and train it on every single repo on GitHub

Which of those two will continue to make novice mistakes like SQL injection and XSS vulnerabilities?

These AI "coding agents" aren't learning or thinking. They're just natural language statistical search engines, and as such it's easy to anthropomorphize them. Future generations will laugh at us, kinda like how we laugh at old products that contain cocaine, asbestos, lead, uranium, etc.
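The SQL-injection point above can be made concrete. Here is a minimal sketch in Python's sqlite3, with a hypothetical one-table schema, contrasting the string-interpolated query a careless coder (or model) emits with the parameterized form:

```python
import sqlite3

# Throwaway in-memory database; the schema and rows are purely illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")
conn.commit()

user_input = "alice' OR '1'='1"

# Vulnerable: string interpolation lets the input rewrite the WHERE clause.
unsafe_query = f"SELECT name FROM users WHERE name = '{user_input}'"
leaked = conn.execute(unsafe_query).fetchall()

# Safe: a parameterized query treats the input as data, never as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(leaked)  # [('alice',)] -- the injected OR '1'='1' matched every row
print(safe)    # [] -- no user is literally named "alice' OR '1'='1"
```

The fix has been documented for decades, which is the commenter's point: a system that had actually learned from every repo would not keep reproducing it.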

[–] BrilliantantTurd4361@sh.itjust.works 1 point 1 week ago

I never said they were intelligent. But by definition they are learning and it is not conceptually different from how we learn.

[–] entwine@programming.dev 2 points 1 week ago

But by definition they are learning and it is not conceptually different from how we learn.

(citation needed)

"Machine learning" is neither mechanically nor conceptually similar to how humans learn, unless you take a uselessly broad view and define it as "thing goes in, thing comes out". The same could be applied to a simple CRUD app.

[–] TehPers@beehaw.org 0 points 1 week ago

The way the current systems are trained simply doesn't allow for accepting and adopting new information continuously.

As further evidence of this, RAG was supposed to enable it. Instead, we've found that RAG was little more than an overused buzzword with limited applications, and it often results in hallucinations anyway.

[–] BlameThePeacock@lemmy.ca 1 point 1 week ago

RAG was never supposed to be about learning over time; it was supposed to provide better context at inference. It could never scale to handle new learning beyond narrowly focused concepts.
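The "context at inference, not learning" distinction can be sketched in a few lines of Python. This toy RAG loop uses a crude word-overlap scorer and an invented corpus for illustration (real systems use vector embeddings); the key observation is that retrieval only changes the prompt, never the model:

```python
def overlap_score(query, doc):
    """Crude relevance: count words shared between query and document."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

# Hypothetical knowledge base the model was never trained on.
corpus = [
    "The deploy script lives in scripts/deploy.sh",
    "Unit tests run with pytest in the tests directory",
    "Releases are tagged with semantic versioning",
]

def build_prompt(question, corpus, k=1):
    # Retrieve the k most relevant documents...
    ranked = sorted(corpus, key=lambda d: overlap_score(question, d),
                    reverse=True)
    context = "\n".join(ranked[:k])
    # ...and paste them into the prompt. The model's weights are untouched,
    # so nothing is "learned"; the same retrieval happens on every call.
    return f"Context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("How do I run the unit tests?", corpus)
print(prompt)
```

Because the retrieved text only lives for one inference call, scaling this to genuinely new knowledge means scaling the context window and the retriever, not the model's understanding.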

[–] Horrabin@programming.dev 3 points 1 week ago

"there is no substitute for understanding" No further questions, Your Honor