this post was submitted on 13 Dec 2025
61 points (85.1% liked)
Programming
you are viewing a single comment's thread
But that's the point of my post: how can they take junior devs' jobs if they're all constantly hallucinating? And let me tell you, we're hiring juniors.
I think your question is covered by the original commenter. They do hallucinate often, and the job does become using the tool more effectively, which includes catching and correcting those errors.
Naturally, greater efficiency is an element of job reduction. They can be both hallucinating often and creating additional efficiency that reduces jobs.
But they're not hallucinating when I use them. Are you just repeating talking points? It's not like the code I write is somehow connected to an AI; I just bounce my code off of an LLM. And by the time I'm done reviewing each line, adding stuff, checking design docs, etc., no one could tell that an LLM was ever used to create that piece of code in the first place. To date I've never failed a code review with "that's AI slop, please remove."
I'd argue that greater efficiency sometimes gives me more free time, hue hue
And that’s fantastic! That’s what technology is supposed to do IMHO - Give you more free time because of that efficiency. That’s technology making life better for humans. I’m glad that you’re experiencing that.
If they’re not hallucinating as you use them, then I’m afraid we just have different experiences. Perhaps you’re using better models, or you’re using your tools more effectively than I am. In that case, I must respect that you are having a different and equally legitimate experience.