I'm curious, are these hallucinations very prevalent? I'm outside the US so I haven't seen the feature yet. But I have noticed that practically every article references the same glue incident.
So I'm not sure if the hallucinations are happening all the time, or everyone is just jumping on a handful of mistakes the AI made. If the latter, the situation reminds me of how every single accident involving a Tesla was reported on back in the day.
It will confidently report inaccurate information. It's usually not so hilariously wrong, but it's still wrong.
For example, I was talking with someone about what constitutes a "fruit" botanically, and I searched "are beans fruit". It confidently told me that beans are not a fruit, botanically speaking, because they're a legume. It seems to have been corrected since, but that's a good example of a "small wrong" that's not uncommon at all.