this post was submitted on 10 Feb 2026
78 points (95.3% liked)

top 38 comments
[–] TheGiantKorean@lemmy.today 5 points 4 hours ago (1 children)

Man. I so feel this. I'm 51 and started programming when I was 10. It's not anything like it used to be. I miss those days.

[–] sping@lemmy.sdf.org 2 points 16 minutes ago

It's so much better! Tooling is many orders of magnitude better, and so many libraries give you deep power from an easy API. What used to take a team 18 months is now a library install and a day, so you're free to do much bigger things.

Christ even version control. The shit I put up with over the years.

[–] codeinabox@programming.dev 52 points 9 hours ago (4 children)

This quote on the abstraction tower really stood out for me:

I saw someone on LinkedIn recently — early twenties, a few years into their career — lamenting that with AI they “didn’t really know what was going on anymore.” And I thought: mate, you were already so far up the abstraction chain you didn’t even realise you were teetering on top of a wobbly Jenga tower.

They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.

But sure. AI is the moment they lost track of what’s happening.

The abstraction ship sailed decades ago. We just didn’t notice because each layer arrived gradually enough that we could pretend we still understood the whole stack. AI is just the layer that made the pretence impossible to maintain.

[–] idunnololz@lemmy.world 2 points 39 minutes ago* (last edited 36 minutes ago)

I've had this problem with abstractions for the longest time. Of course, whenever I say anything negative about abstractions I just get dogpiled, so I don't usually like to discuss the topic.

I think abstractions as a tool are fine. My problem is that most developers I meet seem to only talk about the upsides of abstractions and never seriously consider the downsides.

More often than not, people treat abstractions as this magical tool you can't overuse. In reality, overuse of abstractions can increase complexity and reduce readability. It can also greatly reduce the number of assumptions you can make about the code, which has many additional downsides.

Of course I'm not saying we shouldn't use abstractions. Not having any abstractions can be just as bad as having too many: you end up with similar issues, such as increased complexity and reduced readability.

The hard part is finding the balance, the sweet spot where complexity is minimized and readability is maximized while using the fewest abstractions possible.

I think too often developers err on the side of caution, add more abstractions than necessary, and call it good enough. Developers really need to question whether every abstraction is absolutely necessary. Is it really worth adding an extra layer of abstraction just because a problem might arise in the future, versus keeping fewer abstractions and waiting for it to actually become a problem before adding more? I don't think we do the latter enough. Often you can get away with slightly fewer abstractions than you think you need, because you will never touch the code again.
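To make that concrete, here's a minimal TypeScript sketch of the kind of speculative layering I mean (hypothetical names, not from any real codebase), next to the direct version:

```typescript
// Over-abstracted: an interface, a concrete strategy, and a factory
// for a "problem" that doesn't exist yet. Reading the call site now
// takes three jumps, and you can assume almost nothing about what
// greet() will actually do.
interface GreetingStrategy {
  greet(name: string): string;
}

class EnglishGreeting implements GreetingStrategy {
  greet(name: string): string {
    return `Hello, ${name}!`;
  }
}

class GreeterFactory {
  static create(): GreetingStrategy {
    return new EnglishGreeting();
  }
}

console.log(GreeterFactory.create().greet("Ada"));

// Direct: same behaviour, one definition, and the reader knows
// exactly what happens. Introduce the strategy layer if and when a
// second greeting actually shows up.
function greet(name: string): string {
  return `Hello, ${name}!`;
}

console.log(greet("Ada"));
```

Neither version is wrong; the point is that the first one spends complexity on a future that may never arrive.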

[–] queerlilhayseed@piefed.blahaj.zone 5 points 4 hours ago (2 children)

I feel like they kind of lost the thread here, even though I agree that vibe coding is fundamentally different from just another layer of abstraction. There's no need to pick on the web developers. We've all known, for the last several decades at least, that we don't have to understand the entire mechanism completely. No one is out there doping their own silicon and writing JS apps to run on it. The whole point of layered abstractions is that you can build on a set of assumptions without having to know all the implementation details of the layers below. When an abstraction leaks, it can be useful to have some facility with the lower levels, but no person alive is a master of the full stack. The beautiful thing about abstractions is that you don't have to be.

That's still true with vibe coding, just with the extra complexity of a ticker tape spitting out semi-coherent code faster than any human could type it. That moves the dev from a creative role to more of an administrative one, as the author mentions earlier in the piece, which 1) is not nearly as fun, and crucially 2) doesn't help you build the muscles that make one good at code administration.

[–] pinball_wizard@lemmy.zip 2 points 2 hours ago* (last edited 2 hours ago) (1 children)

No one is out there doping their own silicon and writing JS apps to run on it.

Ahem. Right. That would be silly. No one would do that.

(Quick, I'll change the subject!)

I'll...uh... So Rust sure looks nice. Nothing silly going on there.

(Joking aside, I've really never done that. I can't claim I've never done anything similarly silly and wasteful. But I haven't done that, anyway.)

(Edit: yet.)

As I typed it, I felt in my bones that someone was going to come along with an example of someone doing exactly that. I kinda hope someone does; I've looked into homegrown silicon and it looks... very difficult and expensive.

[–] Ajen@sh.itjust.works 2 points 3 hours ago (1 children)

You think people writing C(++) for bare-metal systems don't understand how their hardware works?

[–] queerlilhayseed@piefed.blahaj.zone 6 points 2 hours ago (1 children)

I don't think it's a binary switch between "understanding" and "not understanding". I have the basic gist of how semiconductors and logic gates work, I know a little about how CPUs and assembly work, and I can work with assembly if I have to, but those aren't my areas of expertise. I know enough about floating point arithmetic that I can identify floating point errors as floating point errors, but I don't claim to have anything close to the fluency in those systems that I do for higher-level languages.

The ability to specialize makes it possible to make fantastic machines like the global Internet even though no one person on earth understands all the sub-components to the degree that a specialist in a particular sub-component does. I'm not saying that there aren't some computing systems that are fully comprehended by a single person, but the ability to specialize increases the scope of what is collectively possible.
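As a concrete example of what I mean by identifying floating point errors, here's a minimal TypeScript sketch (the approxEqual helper is just an illustration, not any particular library's API):

```typescript
// Classic IEEE 754 behaviour: neither 0.1 nor 0.2 has an exact
// binary representation, so the sum picks up rounding error.
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false

// The usual workaround: compare within a small tolerance
// instead of testing for exact equality.
const approxEqual = (a: number, b: number, eps: number = 1e-9): boolean =>
  Math.abs(a - b) < eps;

console.log(approxEqual(0.1 + 0.2, 0.3)); // true
```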

[–] Ajen@sh.itjust.works 2 points 1 hour ago (2 children)

OK, but that doesn't really answer my question, and I'm getting the sense you don't know how deeply some engineers understand the hardware they work on. Plenty of embedded programmers have EE degrees and can write VHDL just as well (or just as badly) as they can write C and ASM.

[–] obbeel@lemmy.eco.br 1 points 8 minutes ago

A "EE" degree won't get you into poking the right things into memory using BASIC. How about your "EE" programmers try that understanding of hardware?

To answer your question: no, I don't think that. I know there are some areas of computing where a deep understanding of the entire system is critical, like embedded systems. But that need for deep understanding doesn't have to apply to every domain of computing, and creating abstractions is a useful way of dividing the work so we can build more complicated systems than one person could if they needed to understand every part.

[–] Feyd@programming.dev 28 points 9 hours ago (2 children)

LLMs don't add an abstraction layer. You can't competently produce software without understanding what they're outputting.

[–] chicken@lemmy.dbzer0.com 7 points 5 hours ago (1 children)

The author's point is that people already don't understand what the programs they write do, because of all the layered abstraction. That's still true whether or not you want to object to the semantics of calling the use of LLMs an abstraction layer.

[–] Feyd@programming.dev 6 points 4 hours ago (2 children)

Not knowing what CPU instructions your code compiles to and not understanding the code you are compiling are completely different things. This is yet another article talking up the (not real) capabilities of LLM coding assistants, though in a more roundabout way. In fact, this garbage blogspam should go in the AI coding community that was made specifically because the subscribers of the programming community didn't want it here, yet we keep getting these posts trying to skirt the line.

[–] codeinabox@programming.dev 2 points 1 hour ago (1 children)

In fact, this garbage blogspam should go on the AI coding community that was made specifically because the subscribers of the programming community didn't want it here.

This article may mention AI coding, but I made a very considered decision to post it here because its primary focus is the author's relationship to programming, and hence it's worth sharing with the wider programming community.

Considering how many people have voted this up, I would take that as a sign I posted it in the appropriate community. If you don't feel this post is appropriate in this community, I'm happy to discuss that.

[–] Feyd@programming.dev 1 points 1 hour ago (2 children)

You made a very considered decision that you could argue it's not technically AI booster bullshit, you mean.

[–] codeinabox@programming.dev 2 points 1 hour ago

What I'm saying is the post is broadly about programming, and how that has changed over the decades, so I posted it in the community I thought was most appropriate.

If you're arguing that articles posted in this community can't discuss AI and its impact on programming, then that's something you'll need to take up with the moderators.

I think there's room for people to try to grapple with the fact that, for good or ill, the industry is being impacted by LLM code assistants right now in a significant way. That doesn't mean this isn't a tech craze, or a flash in the pan, or a hype bubble that has gotten huge. And whether or not the bubble pops, I don't think it's unreasonable to think that code-writing tools comparable to what we have now will be around for a while, again for good or ill. This seems like a dev grappling, not sneaky AI booster bullshit.

[–] chicken@lemmy.dbzer0.com 2 points 3 hours ago

Talking about low-level compilers seems like moving the goalposts, since they are far more well-defined and vetted than the mass of software libraries and copy-pasted StackOverflow functions that a large segment of programming has been done with.

[–] mesamunefire@piefed.social 2 points 8 hours ago (1 children)

I mean, you can... but it's gonna be slop.

[–] MNByChoice@midwest.social 4 points 6 hours ago (1 children)

One can get paid and advance through a career producing slop.

Good engineering is hard, and a lot of it no longer happens.

[–] pinball_wizard@lemmy.zip 2 points 2 hours ago (1 children)

One can get paid and advance through a career producing slop.

And thank goodness for that! LoL.

I wonder how many people who write code for a living genuinely think, deep down, that they're truly awful coders because they've only ever coded in an environment that demands the fastest delivery at the absolute barest minimum quality. It's so rare to get to write code exactly how one would prefer it.

[–] rimu@piefed.social 2 points 7 hours ago (1 children)

Notice the heavy use of the em-dash throughout that post?

[–] codeinabox@programming.dev 3 points 4 hours ago (4 children)

There is much debate about whether use of the em dash is a reliable signal of AI-generated content.

It would be more effective to compare this post with the author's posts from before gen AI and see if there has been a change in writing style.

[–] Mac@mander.xyz 3 points 2 hours ago (2 children)

I toned down my em-dash usage because I don't want people to think it's AI :(

[–] MagicShel@lemmy.zip 1 points 8 minutes ago

I tuned my usage up once I realized it is universal punctuation. I used to be unfamiliar with it and agonize over which punctuation was best for a given sentence. Can't decide between a comma, semicolon, comma clause, parenthetical, or whatnot — just use an em-dash and don't fucking worry about it.

I'm pretty sure I haven't been accused of being an LLM. Despite my lazy command of the em-dash and my comfort with multisyllabic and archaic words, I think LLMs come across as insufferable bores, and I don't think I do that — not to that degree, anyway.

[–] queerlilhayseed@piefed.blahaj.zone 1 points 57 minutes ago

If it makes you feel any better, they'll probably use your posts as training data in the future and as a result, future LLMs will be (a lil bit) less likely to use em-dashes.

[–] paraphrand@lemmy.world 1 points 3 hours ago (1 children)

Aww you’re no fun. Stop with the nuance.

[–] codeinabox@programming.dev 1 points 2 hours ago

My nuanced reply was in response to the nuances of the parent comment. I thought we shared articles to discuss their content, not their grammar.

[–] rimu@piefed.social 0 points 2 hours ago

That is not the only sign in that blog post, just the most obvious one.

[–] ThomasWilliams@lemmy.world -4 points 3 hours ago (3 children)

There's no debate; no real person uses the em dash. Where is the em dash key on the keyboard?

[–] sping@lemmy.sdf.org 1 points 21 minutes ago

Compose, minus, minus for me. I use it frequently.

It turns out that modern software supports something called "Copy and Paste" that makes it easy to insert an em-dash whenever—and wherever—you want.

[–] codeinabox@programming.dev 1 points 3 hours ago

There are plenty of humans using the em dash; how do you think large language models learnt to use it in the first place? NPR even did an episode on it called Inside the unofficial movement to save the em dash — from A.I.

[–] Feyd@programming.dev 9 points 9 hours ago (1 children)

I say that knowing how often those words have been wrong throughout history.

Yup

Previous technology shifts were “learn the new thing, apply existing skills.” AI isn’t that. It’s not a new platform or a new language or a new paradigm. It’s a shift in what it means to be good at this.

A swing and a miss

[–] OpenStars@piefed.social 5 points 8 hours ago

Technically it would have been true; it's just that A"I" doesn't deliver on that promise.