[–] spicehoarder@lemmy.zip 7 points 23 hours ago (2 children)

While I wholeheartedly agree with your statement, I think the reason it spits out thousands of lines of code instead of the obvious answer is poor training data and unqualified people assuming more code means better code.

[–] Hawk@lemmy.dbzer0.com 5 points 23 hours ago (2 children)

Doesn't surprise me. The internet is filled with outdated code and personal blogs detailing stuff you should never put in production code.

AI is just going to regurgitate those instead of the actual optimal answer, so unless you steer it in the right direction, it spits out whatever it sees most in its training data.
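
To make that concrete, here is a hedged, self-contained sketch (the library and query are my own illustration, not something from the thread): the string-formatted query is the kind of pattern countless old tutorials still show, while the parameterized version is what actually belongs in production.

```python
import sqlite3

# Tiny in-memory database just so the example runs end to end.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (name TEXT, role TEXT)")
cur.execute("INSERT INTO users VALUES ('alice', 'admin')")

name = "alice' OR '1'='1"  # attacker-controlled input

# Blog/tutorial pattern that keeps getting regurgitated: string formatting
# straight into SQL, wide open to injection.
unsafe_rows = cur.execute(
    f"SELECT role FROM users WHERE name = '{name}'"
).fetchall()

# The boring, correct pattern: let the driver bind the parameter.
safe_rows = cur.execute(
    "SELECT role FROM users WHERE name = ?", (name,)
).fetchall()

print(unsafe_rows)  # [('admin',)] -- the injected condition matched every row
print(safe_rows)    # []           -- no user is literally named "alice' OR '1'='1"
```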

[–] HeyThisIsntTheYMCA@lemmy.world 2 points 12 hours ago

One of these days an LLM is going to sudo rm -rf /* itself, and I'll need to buy an appropriate alcohol.

[–] spicehoarder@lemmy.zip 1 points 21 hours ago (1 children)

No, deadass. All its training data becomes useless the moment you make a breaking change to an API.
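
A concrete example of that (mine, not the commenter's): pandas deprecated and then removed DataFrame.append in 2.0, so the idiom that dominates older tutorials and Q&A answers simply errors out today.

```python
# Sketch assuming pandas >= 2.0 is installed.
import pandas as pd

df = pd.DataFrame({"a": [1, 2]})
row = pd.DataFrame({"a": [3]})

# Pre-2.0 idiom, everywhere in old training data:
#   df = df.append(row)          # AttributeError on pandas >= 2.0
# Current idiom:
df = pd.concat([df, row], ignore_index=True)
print(df)
```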

[–] Frenchgeek@lemmy.ml 3 points 20 hours ago

Didn't the training data become useless the moment AI code ended up in it?

[–] village604@adultswim.fan 1 points 23 hours ago* (last edited 23 hours ago)

Bad prompt formation is also an issue. You can't just ask a one-sentence question and expect good results. For best results you need to be very specific and get it to prompt you for more details if needed.
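
A sketch of that difference (the prompts are my own illustration, not from the comment; any chat model could consume them): the first prompt invites a guess, while the second pins down inputs, outputs, and constraints and explicitly asks the model to request missing details rather than improvise.

```python
# Illustrative prompt strings only; no particular API or SDK is assumed.

VAGUE_PROMPT = "Write a script to clean up my data."

SPECIFIC_PROMPT = """\
Write a Python 3.11 script that:
  - reads orders.csv (columns: id, date, amount) using only the csv module,
  - drops rows where amount is empty or negative,
  - writes the result to orders_clean.csv, keeping the header row.
If any requirement is ambiguous or missing, ask me a clarifying question
before writing any code instead of guessing.
"""
```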

But it's very much "garbage in, garbage out".