this post was submitted on 13 Mar 2026
-19 points (25.6% liked)

Technology

Scary article

[–] Yaky@slrpnk.net 14 points 18 hours ago (2 children)

We had "coding without coders" in the late 90s (maybe early 2000s) with VB and Access databases. Some of my coworkers maintained such "software" previously written by a not-a-dev.

And then there was the "low-code" fad about ten years ago? There was "coding" with diagrams and such, like Scratch but for serious people.

And what will regular developers do? Probably the same old shit: digging through decades-old, hastily written, and now LLM-generated code, making it all work, and adding functionality. Meanwhile, "architects" and management will draw diagrams (with AI now!) and try to abstract everything into the cloud (and now, somehow, into AI too).

[–] MountingSuspicion@reddthat.com 2 points 17 hours ago (1 children)

While I personally don't like AI, I do think it is changing things. I don't think it's ever safe to run code without oversight from an actual programmer, but AI will likely affect the number of programmers being hired in a non-negligible way.

[–] jbloggs777@discuss.tchncs.de 2 points 13 hours ago (1 children)

If it's run in a good sandbox, it'll be safer than most of the code you run.

Then you add in controlled interfaces/gateways to give it "just enough" power to do something interesting... and you audit the hell out of those.

Risk is something that has to be managed, because it usually can't be eliminated.
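The "controlled gateway" idea above could be sketched roughly like this: sandboxed (or LLM-generated) code never touches the system directly, only a small dispatcher that whitelists operations and logs every attempt for audit. The operation names and stand-in implementations here are hypothetical, purely for illustration:

```python
# Hypothetical sketch: a "controlled gateway" that exposes only a
# whitelisted set of operations to untrusted code and audits every call.
# The operation names and their stand-in bodies are made up for the example.
import datetime

AUDIT_LOG = []

ALLOWED_OPS = {
    "read_config": lambda name: f"config:{name}",  # stand-in for a real read
    "list_items": lambda: ["a", "b"],              # stand-in for a real query
}

def gateway(op, *args):
    """Dispatch a named operation; log every attempt, allowed or not."""
    AUDIT_LOG.append((datetime.datetime.now().isoformat(), op, args))
    if op not in ALLOWED_OPS:
        raise PermissionError(f"operation {op!r} is not permitted")
    return ALLOWED_OPS[op](*args)
```

A denied call like `gateway("delete_everything")` raises `PermissionError`, but still lands in `AUDIT_LOG`, which is the "audit the hell out of it" part.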

[–] MountingSuspicion@reddthat.com 1 points 13 hours ago

If you sandbox anything, it'll be safer than otherwise. I'm not really sure what you're suggesting. I would still want the code reviewed regardless of the safety measures in place.

I wrote a program that basically auto-organizes my files for me. Even if an AI were sandboxed, only had access to the relevant files, and had no delete privileges, I would still want the code reviewed. Otherwise it could move a file into a nonsensical location and I would have to go through every possible folder to find it. Someone would have to build the interfaces/gateways and also review the code. There's no way to know how it's working, and so no way to know IF it's working, until the code is reviewed. Regardless of how detailed your prompt is, AI will generate something that possibly (currently, very likely) needs to be adjusted. I'm not going to take an AI's raw output and run it assuming the AI did it properly, regardless of the safety measures.
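The commenter doesn't show their organizer, but a minimal move-only version of that kind of tool might look like the sketch below: files are routed into subfolders by extension from a fixed mapping, nothing is ever deleted, and anything unmatched stays where it is. The folder names and extension mapping are assumptions, not the commenter's actual program:

```python
# Hypothetical sketch of a move-only file organizer. The extension-to-folder
# mapping is illustrative; unmatched files are left untouched, and nothing
# is ever deleted.
import shutil
from pathlib import Path

DESTINATIONS = {
    ".pdf": "documents",
    ".jpg": "images",
    ".png": "images",
}

def organize(folder):
    """Move each mapped file into its subfolder; return {name: destination}."""
    folder = Path(folder)
    moved = {}
    for item in list(folder.iterdir()):  # snapshot, since we mutate the dir
        dest_name = DESTINATIONS.get(item.suffix.lower())
        if item.is_file() and dest_name:
            dest = folder / dest_name
            dest.mkdir(exist_ok=True)
            shutil.move(str(item), str(dest / item.name))
            moved[item.name] = dest_name
    return moved
```

Even for something this small, the point of the comment stands: a reviewer can see from the mapping exactly where each file will end up, which no amount of prompting guarantees on its own.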

[–] vacuumflower@lemmy.sdf.org 2 points 18 hours ago

There was “coding” with diagrams and such, like Scratch but for serious people.

Yep. Genesys IRD is kinda good (well-tested), Genesys Composer I hate, and the recent cloud Genesys workflows are not usable without pain for anything but demos.