this post was submitted on 11 Mar 2026
27 points (96.6% liked)

Games

[–] towhee@hexbear.net 11 points 16 hours ago* (last edited 16 hours ago) (3 children)

People have written so much shit about this that writing more just feels like pissing into an ocean of piss, but in brief, AI code:

  • Generally looks correct but is not (the most dangerous kind of correct!)
  • Reduces the share of the codebase that is held in the minds of those who maintain it, meaning further work on the codebase is harder, which incentivizes giving up even more control to an LLM (aka a hosted proprietary software blob)
  • Takes the hard work put into open source software and launders it into proprietary or permissively-licensed software through the usual LLM plagiarism process
  • Has a corrosive effect on the nicest part of open source, which is people voluntarily choosing to work together for a shared common good
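The first bullet is worth making concrete. A minimal hypothetical sketch (the function names and the interval convention are invented for illustration, not taken from any real codebase): an overlap check for half-open intervals that reads plausibly at a glance but uses the wrong comparison at the boundary, next to the correct version.

```python
# Hypothetical illustration of "looks correct but isn't":
# checking whether two half-open intervals [start, end) overlap.

def overlaps_plausible(a_start, a_end, b_start, b_end):
    # Reads fine at a glance, but <= treats the intervals as closed,
    # so [1, 3) and [3, 5) are wrongly reported as overlapping.
    return a_start <= b_end and b_start <= a_end

def overlaps_correct(a_start, a_end, b_start, b_end):
    # Strict < is right for half-open intervals:
    # [1, 3) and [3, 5) merely touch, they do not overlap.
    return a_start < b_end and b_start < a_end
```

Both versions agree on most inputs, which is exactly what makes the wrong one dangerous: it passes casual review and only fails on the boundary case.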
[–] PorkrollPosadist@hexbear.net 3 points 9 hours ago

In the spirit of the GNU project re-defining well-known acronyms and abbreviations, I've noticed developers on the Guix mailing lists referring to LLMs as "License Laundering Machines."

[–] neo@hexbear.net 13 points 15 hours ago

Your points are all correct except for the first one.

The dangerous thing about LLM-generated code is not that it generally looks correct but isn’t. The danger is that it often is correct and often isn’t.

The fact that it can actually be correct is dangerous. It lulls actual programmers into a false sense of security. It makes them cognitively lazy. And then, when it produces something wrong, the error slips by.

And even worse, what it assuredly does is convince bosses and non-programmers that THEY are correct and know even better than people who actually studied programming and learned the craft!

I never believed “anyone can code” was a worthwhile goal or objective, though it was aggressively pursued and promoted in the 2010s. Perhaps anyone can. Maybe anyone can be a mathematician. Maybe anyone can be an electrician. But I always saw it for what it was: a naked attempt to devalue the skill of programming and make the labor for it cheap.

Now anyone can be tricked into thinking they can code. Good or bad, it doesn’t matter. The software is about to get a lot worse.

[–] BimboChristmas@hexbear.net 4 points 16 hours ago (1 children)

Yeah, sorry, I just never thought about this. But it sounds like it would be shitty even if someone self-hosted their own LLM for it.

[–] spectre@hexbear.net 1 point 15 hours ago

A lot of it is about how it's used. I think the second point is the most important. A lot of [software] engineering is familiarity with the topic and tools used. The mental map of the architecture of how everything fits together is powerful, and giving that all up to an LLM is a huge loss if you are using it to write anything more than a basic function.

In my practice I use it in a couple of spots:

  • rewrite this section in a more readable, standard manner (when I've laid down some real slop of my own). For me this is as much a learning opportunity as copying something out of Stack Overflow: I take a moment to understand what change was made so I can use the same pattern in the future where appropriate.
  • read this file and add docstrings and comments (which will be like 75% correct and at least saves me the time of formatting everything). I obviously need to make corrections and add context about how the functions are used that the LLM doesn't have access to.
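To sketch what that second workflow looks like in practice (the function, its behavior, and the "fraction vs. percentage" convention are all invented for illustration): a generated docstring formats fine, but the detail about how callers actually use the parameter has to be corrected by hand.

```python
# Hypothetical sketch of the "add docstrings, then correct them" workflow.
# An LLM can produce the formatting, but the usage context -- here, that
# `scale` is a fraction, not a percentage -- has to come from the reviewer.

def apply_discount(price: float, scale: float) -> float:
    """Return `price` reduced by `scale`.

    Args:
        price: Original price in the store's base currency.
        scale: Discount as a fraction in [0, 1], NOT a percentage.
            (Correction added by hand; a generated draft could
            plausibly have said "percent" here.)

    Returns:
        The discounted price, clamped so it never goes below zero.
    """
    return max(price * (1.0 - scale), 0.0)
```

The formatting is the cheap 75%; the one line about units is the part that actually prevents a bug downstream.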

Using it more than that feels like a heavy risk of brain drain to me.