this post was submitted on 29 Dec 2025

Funny

top 21 comments
[–] ZombiFrancis@sh.itjust.works 5 points 8 hours ago

Plug and play, except you have to provide instructions and chastisement first.

[–] Skyrmir@lemmy.world 10 points 10 hours ago (1 children)

It's the difference between an algorithm and actually thinking about a problem. An LLM isn't trying to solve the problem, it's giving the most likely words in response. An LLM will never be able to make that leap to actually thinking.

In the meantime, it can save a lot of typing by spitting out boilerplate code blocks. Not that typing is in any way the biggest part of the job.

[–] spicehoarder@lemmy.zip 7 points 8 hours ago (2 children)

While I wholeheartedly agree with your statement, I think the reason it spits out thousands of lines of code instead of the obvious answer is poor training data and unqualified individuals assuming more code means better code.

[–] Hawk@lemmy.dbzer0.com 5 points 8 hours ago (1 children)

Doesn't surprise me. The internet is filled with outdated code and personal blogs detailing stuff you should never put in production code.

AI is just going to regurgitate those instead of the actual optimal answer, so unless you steer it in the right direction, it spits out what it sees most in its training data.

[–] spicehoarder@lemmy.zip 1 points 6 hours ago (1 children)

No, dead ass. All its training data becomes useless the moment you make a breaking change to an API.

[–] Frenchgeek@lemmy.ml 1 points 5 hours ago

Didn't the training data become useless the moment AI code ended up in it?

[–] village604@adultswim.fan 1 points 8 hours ago* (last edited 8 hours ago)

Bad prompt formation is also an issue. You can't just ask a one-sentence question and get good results. For best results you need to be very specific and get it to prompt you for more details if needed.

But it's very much 'garbage in, garbage out'.

[–] driving_crooner@lemmy.eco.br 36 points 21 hours ago (2 children)

I was code reviewing some coworker's code and it had a really complicated function that took two dates, did a loop, and returned a list with all the months between the two. I asked him why he was using that instead of pd.date_range, and he was like, 'that was the solution the AI proposed to me and it worked fine' (except that it was causing a bottleneck in his program).
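(For reference, a minimal sketch of the pd.date_range one-liner in question; the function name and the month-start frequency are my assumptions, not the coworker's actual code.)

```python
import pandas as pd

# Hypothetical sketch: list every month between two dates with one
# pandas call instead of a hand-rolled loop. "MS" = month-start frequency.
def months_between(start, end):
    return list(pd.date_range(start=start, end=end, freq="MS"))

print(months_between("2024-01-01", "2024-06-01"))
# [Timestamp('2024-01-01 00:00:00'), ..., Timestamp('2024-06-01 00:00:00')]
```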

[–] deadbeef79000@lemmy.nz 31 points 20 hours ago (2 children)

worked fine

Translation: didn't understand it.

causing a bottleneck

Then it didn't work fine.

AI tools don't make up for lack of skill or experience, but in the hands of a skilled, experienced developer they can be formidable.

[–] zogrewaste_@sh.itjust.works 5 points 11 hours ago

The issue being that one is unlikely to gain the experience necessary to fully leverage the power of the tool if it's the primary way one codes, because it does too much, too readily...

How many CNC guys have the intuition of an old-school master machinist? Some do, most don't. Plus, one of those masters can viably run many machines, with an unskilled observer monitoring to catch catastrophic fails. Fewer good jobs because of that. When automation takes the learning out of the curve, very few people will put in the extra effort to grow beyond what's needed for minimum viability, with all the knock-on consequences that brings.

LLM coding may not kill programming as we know it right now, but I think it's just a matter of time, just like with US machining/manufacturing. Once the learning track to mastery becomes unrewarded, very few will walk it.

[–] village604@adultswim.fan 1 points 8 hours ago

AI is just a tool. It's a shame it's being billed as a solution.

[–] venusaur@lemmy.world 2 points 20 hours ago (1 children)

I use it to write macros in Excel. It’s awesome because it does the job and saves a ton of time, since I don’t know VBA well. "Worked fine" is fortunately good enough for the things I’m using AI to code.

[–] driving_crooner@lemmy.eco.br 3 points 20 hours ago (1 children)

I use it to translate old VBA code into Python.

[–] spicehoarder@lemmy.zip 3 points 19 hours ago

And the cycle continues.

[–] Ephera@lemmy.ml 15 points 19 hours ago (1 children)

I really hate how it will gladly generate dozens of lines of complex algorithms when it doesn't find the obvious solution right away. Particularly because you will readily find colleagues who just do not care.

They probably stop reading the code in detail once it's sufficiently long. And when you tell them that what they've checked in is terrible and absolutely unreadable, they don't feel responsible for it either, because the AI generated it.

[–] scrubbles@poptalk.scrubbles.tech 8 points 10 hours ago

Lazy engineers who don't review AI generated code so far are keeping me employed. I ship so many bugfixes

[–] DontRedditMyLemmy@lemmy.world 13 points 20 hours ago (2 children)

AI says "oh my god" now? What could that mean?

[–] kadu@scribe.disroot.org 3 points 10 hours ago (1 children)

Chatbots get a system prompt describing them as a helpful assistant, and then a blank they need to predict how to fill. Then they get the exact same system prompt, the word they just filled in, and a new blank. Repeat until the blank becomes an end token.

This automatically means the AI is likely to answer in a way a human would find natural, not necessarily optimal or correct.

Which is why the "Oh my god, you're right! I missed this obvious feature!" remarks appear even in coding agents.
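(A minimal sketch of that fill-in-the-blank loop, as described above; the pick_next_token stand-in and the token names are illustrative assumptions, not any real model's API.)

```python
# Toy autoregressive generation loop, as described above.
SYSTEM_PROMPT = "You are a helpful assistant."
END_TOKEN = "<end>"

def pick_next_token(context: str) -> str:
    # A real model would return the most likely next token given the
    # context so far; this hypothetical stub just stops immediately.
    return END_TOKEN

def generate(user_message: str) -> str:
    context = SYSTEM_PROMPT + "\n" + user_message
    output = []
    while True:
        token = pick_next_token(context)   # predict how to fill the blank
        if token == END_TOKEN:             # the blank became the end token
            break
        output.append(token)
        context += token                   # same prompt plus what was filled so far
    return "".join(output)
```

Nothing in that loop checks whether the filled-in words are correct, only that they are likely, which is where the reflexive "you're right!" replies come from.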

[–] DontRedditMyLemmy@lemmy.world 1 points 5 hours ago

I meant philosophically, not technically

[–] Zachariah@lemmy.world 8 points 20 hours ago (1 children)
[–] elvith@feddit.org 1 points 19 hours ago