this post was submitted on 13 Mar 2026
-23 points (17.1% liked)

Programming

top 10 comments
[–] AbelianGrape@beehaw.org 2 points 8 hours ago

People with this view seem to forget that writing code by hand is fun. Most of the experienced programmers I know only use AI to skip the actual boring parts (some boilerplate, the occasional "duplicate this module and change some details", stuff like that) and some of them don't use AI at all. One of them is required to use AI by his job and he says most of his co-workers don't like it.

The opposing viewpoints are there and I'm sure NYT knows it.

[–] jubilationtcornpone@sh.itjust.works 12 points 22 hours ago (1 children)

“We’re talking 10 to 20 — to even 100 — times as productive as I’ve ever been in my career,” Steve Yegge, a veteran coder who built his own tool for running swarms of coding agents, told me. “It’s like we’ve been walking our whole lives,” he says, but now they have been given a ride, “and it’s fast as [expletive].” Like many of his peers, though, Yegge can’t quite figure out what it means for the future of his profession. For decades, being a software developer meant mastering coding languages, but now a language technology itself is upending the very nature of the job.

Hate to tell them this, but if the LLMs available today are really somehow making you 10× as productive, then you suck at your job. I suppose the opinion tracks, though. I have worked with way too many devs who can pump out bug-filled, poor-performing code at a rapid pace while seeming to have no idea how it works or how to fix it. These are the same people who are now gleefully hacking together a bunch of LLM-generated code that they still don't know how to read.

You still have to understand the complexities and nuances of the tools you're using, because the LLM generating your code does not, and that will come back to bite you in the ass.

[–] zieg989@awful.systems 1 points 14 hours ago

Of course they interviewed this schizo. He is deeply mentally unwell and should be treated in an institution instead of being treated as an expert on all things "AI".

[–] itkovian@lemmy.world 9 points 21 hours ago (1 children)

How about celebrating the end of coders when LLMs are actually good at software development?

I mean… it’s NYT

[–] sacredfire@programming.dev 6 points 22 hours ago (3 children)

That was an interesting read… The company I currently work for doesn’t allow AI tools to be fully integrated into our code base. I tinker around with them on my own time, but I’m left wondering what the profession is turning into for other people.

Here on Lemmy, we are definitely in the naysayers' camp, but this article tries to paint the picture that almost everyone in tech is on board and convinced these tools are the way; that writing code by hand is a thing of the past. The author certainly went to great lengths to recount interviews with people who seem to share this opinion, many of whom, I will note, have a vested interest in AI. Yet they didn't really ask anyone who specifically held the opposing viewpoint, only tangentially mentioning that opponents exist and dismissing them as perhaps deluded.

I did appreciate that they touched on the difference between greenfield projects and brownfield projects and reported that Google only saw about a 10% increase in productivity with this kind of AI workflow.

Still, I wonder what the future holds, and I suppose it's too early to know how this will all turn out. I will admit that I'm more in the naysayers' camp, but perhaps that's from a fear of losing my livelihood? Am I predisposed to see how these tools are lacking? Have I not given them a fair chance?

[–] Kissaki@programming.dev 1 points 4 hours ago* (last edited 3 hours ago)

It's a tool that adds yet more complexity to our profession. More choice, more cost-benefit analysis, more risk assessment, more shitty stuff to inherit and fix, more ability for shitty code producers to hide incompetence, more product and data policy analysis, more publisher trustworthiness and safety analysis, more concerns about what tooling sends into the cloud and what it can access and do locally, a significant "cheap and fast solution" you will be compared against, requiring more communication, explanation, and justification, new attack vectors to protect against, …

My team and some others [can] use Copilot at my workplace. I haven't had or seen significant gains. Only very selectively. Some other senior devs I trust are also skeptical/selective. We see the potential opportunities, but they're not new solutions. Some other colleagues are more enthusiastic.

What it does is make me question bad code from review-request authors: whether they missed it, are that unobservant, are incapable, or used AI. Quality, trustworthiness, and diligence are concerns anyway, but now I can't really assess how much care and understanding they actually apply, whether they're misled or taking shortcuts, and how that changes over time, other than by asking, of course.

I'm not scared for my job. AI has already changed the field and industry, but not with a net gain in quality or productivity, and it will continue to change them one way or another. There are many parts of, and surrounding, software development that it can't do well.

[–] TehPers@beehaw.org 1 points 11 hours ago* (last edited 11 hours ago)

I’m left wondering what the profession is turning into for other people.

All the code I review looks good at first glance and makes shit up as it goes once you read into it more. We use two different HTTP libraries - one sync, one async - in our asynchronous codebase. There's a directory full of unreadable, obsolete markdown files that are essentially used as state. Most of my coworkers don't know what their own code does. The project barely works. There's tons of dead code, including dead broken code. There are barely any tests. Some tests assert true with extra steps. Documentation is full of obsolete implementation details and pointers to files that no longer exist. The README has a list of all the files in the repo at the top of it for some reason.
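To illustrate the "tests assert true with extra steps" pattern the commenter mentions, here is a made-up Python sketch (the function and test names are invented for illustration, not taken from their codebase). The test exercises the code, but its assertion is a tautology, so the deliberate bug is never caught:

```python
import unittest

def parse_port(value):
    # Deliberately buggy: swallows bad input instead of raising
    try:
        return int(value)
    except ValueError:
        return None

class TestParsePort(unittest.TestCase):
    def test_parse_port(self):
        # "Assert true with extra steps": the condition below is a
        # tautology, so this test passes no matter what parse_port does
        result = parse_port("not-a-number")
        self.assertTrue(result is None or result is not None)

if __name__ == "__main__":
    unittest.main()
```

A test like this inflates coverage numbers while verifying nothing; a real test would assert the specific expected behavior (e.g. that bad input raises `ValueError`).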

I will admit that I’m more in the naysayers camp, but perhaps that’s from a fear of losing my livelihood?

People are being laid off because of poor management and a shitty economy. No software devs are losing their jobs because AI replaced them; CEOs are just lying about that because it's convenient. If software devs truly were more effective with these tools, companies would be hiring more of them.

Am I predisposed to see how these tools are lacking? Have I not given them a fair chance?

That's up to you to decide. Try using them if you want, but don't force yourself to become obsessed with them. If you find yourself more productive, then that's that. If not, then you aren't. It's just a tool, albeit a fallible one.

Still I wonder what the future holds and suppose it’s still too early to know how this will all turn out. I will admit that I’m more in the naysayers camp, but perhaps that’s from a fear of losing my livelihood?

It's all just conjecture at this point. I vividly remember how "the cloud" was allegedly going to help organizations eliminate the IT department, dramatically lower operating costs, and basically put every system admin out of a job.

It succeeded at none of those things. It did help some organizations shift costs from CapEx to OpEx, and it effectively made data centers available to organizations (and individuals) that didn't have access to that kind of technology before. It didn't live up to the hype, but it has had a major impact.

Personally, I figure a lot of these "AI" companies are going to fold. There's just not any value in cramming LLMs into every product. Not to mention we've spent the better part of 30+ years trying to get away from users having to type when they want the computer to do something. Moving back away from a point-and-click interface, which itself has hardly reached its best possible state, could be a steep uphill battle.

Again, all conjecture.

[–] luciole@beehaw.org 2 points 22 hours ago

Nice advertisement. Classic "single unavoidable path of progress" bit. I hope NYT at least charged Anthropic for it.