this post was submitted on 05 Apr 2026
202 points (100.0% liked)

LinkedinLunatics

6673 readers

A place to post ridiculous posts from linkedIn.com

(Full transparency: a mod for this sub happens to work there, but that doesn't influence his moderation, or his laughter at a lot of posts.)

founded 2 years ago
[–] HarkMahlberg@kbin.earth 4 points 4 hours ago (1 children)

That's an... interesting correlation they're making, more code = more money. I know it's not you personally making that comparison, but man is it strange. That's a very business school way of thinking.

What good is "more code" from the LLMs if I have to scrutinize it for bugs and vulnerabilities? More code only means more surface area, more points of failure. And of the AIs I've tried, every single one writes far, far too much code. All that time in code review, QA, and user acceptance testing absolutely does not make the company more money - it costs them more money in labor. And it doesn't get the product to the end user faster anyway.

I'm just ranting and this is a minor point, but speed is also not the only metric I would care about. I'd also care about making sure the user doesn't experience many bugs - preferably no bugs at all. The classic engineer's triangle still holds: "Fast, Cheap, and Good: choose 2." And AI seems to pick "Fast" twice. XD

[–] OwOarchist@pawb.social 2 points 1 hour ago

> More code only means more surface area, more points of failure. And of the AI I've tried, every single one writes far far far too much code. And all that time in code review, QA, user acceptance testing, that absolutely does not make the company more money - it costs them more money, in paying for labor. And it doesn't get the product to the end user faster anyway.

Duh, just have the LLM do code review, QA, and testing for you! And then blindly ship it to production once that's done.