this post was submitted on 27 Apr 2026
1014 points (98.4% liked)
Technology
I find they fail a lot, and more often than not they spit out code riddled with problems.
Studies also show that humans produce higher-quality output and can apply reasoning and logic where LLMs cannot. LLMs will just hallucinate to fill in the blanks... It's not even close to prime-time output.
Not to mention that the code they output is often very insecure as well.
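For illustration (my own hypothetical snippet, not from any study cited here), the kind of insecure pattern reviewers regularly flag in generated code is SQL built by string interpolation instead of a parameterized query. A minimal sqlite3 sketch of why it matters:

```python
import sqlite3

def find_user_unsafe(conn, name):
    # Insecure: `name` is spliced directly into the SQL text,
    # so attacker-controlled input becomes part of the query.
    return conn.execute(
        f"SELECT id FROM users WHERE name = '{name}'"
    ).fetchall()

def find_user_safe(conn, name):
    # Parameterized query: the driver treats `name` as data, not SQL.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "alice"), (2, "bob")])

payload = "' OR '1'='1"                  # classic injection string
print(find_user_unsafe(conn, payload))   # leaks every row
print(find_user_safe(conn, payload))     # matches nothing
```

The unsafe version turns the query into `... WHERE name = '' OR '1'='1'`, which matches every row; the parameterized version correctly finds no user with that literal name.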
It's not "AI", it's hype around text prediction.
This is irrelevant as long as CEOs and shareholders keep screaming "moar AI!". Engineers know the limits of the tool but are forced to ignore them. So yes, the LLM is coming for your job, but not because it can do it; it's because it is perceived to be cheaper.
It won't be cheap if it fails.
Not "if" but "when", though that is of no concern to the CxOs and "market analysts" 😉