this post was submitted on 27 Apr 2026
1014 points (98.4% liked)

droopy4096@lemmy.ca 1 points 2 weeks ago
sturmblast@lemmy.world 2 points 2 weeks ago* (last edited 2 weeks ago)

I find they fail often and, more often than not, spit out code riddled with problems.

Studies also show that humans produce higher-quality output and can apply reasoning and logic where LLMs cannot. LLMs just hallucinate to fill in the blanks... it's not even close to prime-time output.

Not to mention the code they output is often insecure as well.

It's not "AI"; it's hype around text prediction.
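To make the "text prediction" point concrete, here's a toy sketch (my own illustration, not how any real LLM is implemented): a bigram model that picks the statistically likeliest next word. An LLM is a vastly scaled-up and smarter version of the same core idea — predict the next token from what came before — with no built-in model of truth:

```python
from collections import Counter, defaultdict

# Tiny training "corpus" — purely illustrative.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word (a bigram frequency table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict(word):
    """Return the most frequent continuation seen in training."""
    return following[word].most_common(1)[0][0]

# Chosen purely by frequency, not by understanding:
print(predict("the"))
```

The model "confidently" answers even when it has seen a word only once, which is the same failure mode as hallucination: plausible continuation, no grounding.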

droopy4096@lemmy.ca 1 points 2 weeks ago

This is irrelevant as CEOs and shareholders keep screaming "moar AI!". Engineers know the limits of the tool but are forced to ignore them. So yes, the LLM is coming for your job, not because it can do it, but because it is perceived to be cheaper.

sturmblast@lemmy.world 1 points 2 weeks ago
droopy4096@lemmy.ca 2 points 2 weeks ago

Not "if" but "when", but that is of no concern to CxOs and "market analysts" 😉