this post was submitted on 10 May 2026
63 points (94.4% liked)

Programming

[–] ell1e@leminal.space 58 points 4 days ago (53 children)

We need the equivalent investment now. If average code is cheap, then the scarce resource is no longer the ability to produce it. The scarce resource is the ability to read it, to navigate it

You know what would help a lot with understanding the code you're working on? Writing it yourself instead of turning your brain off via AI.

But that's an insight the article somehow seems to be missing.

[–] luciole@beehaw.org 15 points 4 days ago (1 children)

AI tools can generate functional, adequate, perfectly average code at a speed and cost that would have been unimaginable even five years ago. And like the outsourcing wave of the early 2000s, the economics are real and rational. Nobody is wrong for using these tools. The code they produce is often fine. It works. It passes tests. It might ship as-is.

Not the first time I've read this kind of statement, and I always struggle to reconcile it with my personal experience. I seriously doubt it's just that I'm not a "good enough prompter". I know how to convey context from domain to tech and vice versa; that's, like, a good 20% of my job. I'd say AI tools are good at producing code that already exists.

LLMs are an interface to a corpus of written material. They've never had a thought, a chat around the coffee machine, or any experience in the largest sense of the word. This is a hard barrier on any induction they may emulate.

[–] BlameThePeacock@lemmy.ca 3 points 4 days ago (1 children)

You're both correct, and also wrong.

A lot of code already exists. Or at least in a close enough form that it can be easily adjusted to address a new situation.

When someone comes up with an idea for a new app at this point, it's almost never an entirely new branch of computing. It's very likely just CRUD with a visual design, plus a small, more complex algorithm to mix the data around behind the scenes.

What's the difference between a dating app and an automatic meal plan builder? The matching algorithm doesn't care whether or not the recipe swiped right back when it pairs you up.
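
To make that concrete, here's a minimal sketch (Python; `Item`, `score`, and `top_matches` are invented for illustration): the same generic matcher serves either app once you swap what the items are.

```python
# A minimal sketch, not a real app: the same matcher works whether
# the items are dating profiles or recipes. All names are invented.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    tags: set[str]  # interests for a person, diet/ingredient labels for a recipe

def score(user: Item, candidate: Item) -> float:
    # Jaccard overlap of tags: crude, but the shape of most matchers.
    union = user.tags | candidate.tags
    return len(user.tags & candidate.tags) / len(union) if union else 0.0

def top_matches(user: Item, pool: list[Item], k: int = 3) -> list[Item]:
    return sorted(pool, key=lambda c: score(user, c), reverse=True)[:k]

me = Item("me", {"vegetarian", "spicy", "quick"})
recipes = [Item("chili", {"spicy", "slow"}),
           Item("stir-fry", {"vegetarian", "spicy", "quick"})]
print([r.name for r in top_matches(me, recipes)])
# ['stir-fry', 'chili'] -- swap recipes for profiles and nothing changes
```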

You're right that they're not going to invent entirely new things, but most of the time that's just not what's needed of them.

[–] luciole@beehaw.org 3 points 4 days ago (2 children)

Fortunately, software is much more than app ideas fishing for VC investment. A lot of us are building actual tools for nurses, teachers, technicians, artists, students, etc. We have to analyze these human beings' roles in society, their needs, their situations, which is different from merely preying on their attention spans. Programming languages are still the most reliable way to specify how software must behave. And once the software is done, it is merely born. It then lives through a steady flow of continuous adaptation until one day it dies, as all things do. Downplaying the human condition is a mistake.

[–] iglou@programming.dev 1 points 3 days ago* (last edited 3 days ago)

You missed the point. The point is that almost all software today follows the same general ideas, patterns, etc.

The quality of AI output is not tied to what those patterns are used for. Even if, say, your tool uses a completely new network protocol, an LLM will still "understand" that it is a network protocol, that it serializes and deserializes following the rules you give it, and then it will write that down in a memory file and be able to work with it.

A new file format? Same. A very specialized new kind of NoSQL database that fits your very specific tool better? It will also write down in a file how it works and be able to use that.

It's only as good as the documentation you give it. For basic things, such as setting up a simple REST API, that documentation is already in its training data. If it isn't, it's up to you to provide it, and the model will be perfectly able to use it.

Even if you build some weird, unique assembly language, it will be able to use it if you give it the instruction set and its documentation.
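
For illustration, a minimal sketch of that kind of spec-driven code, using a toy wire format invented here (2-byte big-endian length prefix plus UTF-8 payload). Once the rules are written down, implementing them is mechanical:

```python
# Toy wire format (invented for this example): a 2-byte big-endian
# length prefix followed by a UTF-8 payload.
import struct

def encode(msg: str) -> bytes:
    payload = msg.encode("utf-8")
    return struct.pack(">H", len(payload)) + payload

def decode(frame: bytes) -> str:
    (length,) = struct.unpack(">H", frame[:2])
    return frame[2:2 + length].decode("utf-8")

assert decode(encode("hello")) == "hello"
```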

[–] BlameThePeacock@lemmy.ca 1 points 4 days ago

A medicine dispenser application for a nurse is still just CRUD operations for the most part. There's nothing innovative about how the code would be written in an application like that.
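
A hypothetical skeleton of what "just CRUD" means here (names invented, in-memory store for brevity):

```python
# Hypothetical skeleton: the unglamorous core of a dispensing app is
# create/read/update/delete over prescription records. Names invented;
# a real app would use a database instead of a dict.
prescriptions: dict[int, dict] = {}
_next_id = 0

def create(record: dict) -> int:
    global _next_id
    _next_id += 1
    prescriptions[_next_id] = record
    return _next_id

def read(rid: int) -> dict:
    return prescriptions[rid]

def update(rid: int, **changes) -> None:
    prescriptions[rid].update(changes)

def delete(rid: int) -> None:
    del prescriptions[rid]

rid = create({"patient": "A. Smith", "drug": "ibuprofen", "dose_mg": 200})
update(rid, dose_mg=400)
assert read(rid)["dose_mg"] == 400
```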

[–] tatterdemalion@programming.dev 2 points 3 days ago* (last edited 3 days ago)

As if needing to be able to understand code quickly wasn't already a problem before LLMs.

[–] sobchak@programming.dev 8 points 4 days ago* (last edited 4 days ago) (3 children)

Meh, disagree with a lot of this.

AI tools can generate functional, adequate, perfectly average code

Not in my experience.

The outsourcing era taught us that the expensive part of software was never writing it. It was understanding it well enough to change it safely, to debug it under pressure, to explain to the next person why a particular decision was made at 2 a.m. on a Tuesday.

Since AI is adequate, just have AI change, debug, and explain it. You don't even need devs running the AI. Have AI generate intent. Just have AI scrape Twitter for people complaining about applications they wish existed, and have the AI make them. Let AI do market research. It's supposedly perfectly adequate.

Just have AI scrape Twitter for people complaining about applications they wish existed, and have the AI make them.

I mean... in 2026, this is probably a viable business strategy tbh.

[–] mindbleach@sh.itjust.works 2 points 4 days ago

'Well if it's merely okay just have it be perfect.'

Unserious.

[–] fruitycoder@sh.itjust.works 2 points 4 days ago

It's perfectly adequate at generating simple scripts, if you know what it's doing, or complex programs IF you have a "harness", which is to say tons of well-defined scopes, design docs, coding guidelines, and a dev and test environment with written and automated unit and integration tests.

Basically every dev's wish list. With that, you get adequate results on complex coding.
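
As a minimal, hypothetical example of one piece of that harness (the `discount` function is invented, not from any real codebase): an automated test that pins down expected behavior before any generated code is accepted.

```python
# Hypothetical fragment of such a harness: tests that fix the expected
# behavior up front. discount() is an invented example function.
import pytest

def discount(price: float, pct: float) -> float:
    if not 0 <= pct <= 100:
        raise ValueError("pct must be between 0 and 100")
    return price * (1 - pct / 100)

@pytest.mark.parametrize("price,pct,expected", [
    (100.0, 0, 100.0),
    (100.0, 25, 75.0),
    (80.0, 10, 72.0),
])
def test_discount(price, pct, expected):
    assert discount(price, pct) == pytest.approx(expected)

def test_discount_rejects_bad_pct():
    with pytest.raises(ValueError):
        discount(100.0, 150)
```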
