this post was submitted on 07 Mar 2026
610 points (98.6% liked)
Technology
you are viewing a single comment's thread
Of course someone who doesn't believe "truth" exists thinks LLMs are just fine. You'd have to not believe anything can be true to find their output acceptable.
Every week I see a new post about an LLM being blatantly wrong. An LLM told people to add glue to pizza to make the cheese stick.
"They have improved the models since then..." Last week the American military used "AI" and it targeted a school as a military structure. The models are full of shit; they just manually remove the blatantly incorrect shit whenever it makes the rounds, and there's always more blatantly incorrect shit to be found.