I plugged my local AI into offline Wikipedia, expecting a source of truth to make it way, way better.
It's better, but I also can't tell when it's making up citations now, because it uses Wikipedia to support its own worldview from pretraining instead of reality.
So it’s not really much better.
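A minimal sketch of the "local model + offline Wikipedia" setup described above, with a toy in-memory stand-in for both the article index and the model so it runs as-is. The function names, the placeholder prompt wording, and the keyword search are assumptions for illustration, not the commenter's actual setup; a real version would sit on a proper local dump and a local LLM runtime.

```python
# Toy stand-in for a local Wikipedia dump; a real setup would index the full dump.
TOY_WIKI = {
    "python (programming language)": "Python is a high-level programming language...",
    "wikipedia": "Wikipedia is a free online encyclopedia...",
}

def search_offline_wikipedia(query: str, k: int = 2) -> list[str]:
    # Naive keyword match against article titles; a real setup would use a search index.
    words = [w.strip("?.!,") for w in query.lower().split()]
    hits = [text for title, text in TOY_WIKI.items() if any(w and w in title for w in words)]
    return hits[:k]

def build_prompt(question: str, passages: list[str]) -> str:
    # Ask the model to ground its answer in the retrieved extracts and cite them,
    # rather than leaning on whatever it absorbed during pretraining.
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer using ONLY the numbered extracts below. "
        "Cite the extract number for every claim; if the extracts do not cover it, say you don't know.\n\n"
        f"{context}\n\nQuestion: {question}"
    )

if __name__ == "__main__":
    question = "What is Wikipedia?"
    prompt = build_prompt(question, search_offline_wikipedia(question))
    print(prompt)  # feed this to whatever local model you run
```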
Hallucinations become a bigger problem the more info they have (that you now have to double-check).
At my work, we don't allow it to make citations. We instruct it to add in placeholders for citations instead, which allows us to hunt down the info, ensure it's good info, and then add it in ourselves.
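A hedged sketch of that placeholder workflow: the model is told to mark claims with a placeholder instead of producing a reference, and a small helper then lists those markers so a human can track down and verify real sources. The placeholder format and instruction wording here are assumptions, not the commenter's actual setup.

```python
import re

# Instruction prepended to the model's prompt: never emit a citation, only a marker.
CITATION_INSTRUCTION = (
    "Do not produce citations or reference titles yourself. "
    "Wherever a claim needs a source, insert a placeholder of the form "
    "[CITATION NEEDED: short description of the claim] and nothing else."
)

PLACEHOLDER_RE = re.compile(r"\[CITATION NEEDED:\s*(.+?)\]")

def extract_citation_todos(model_output: str) -> list[str]:
    """Return the claims the model flagged, for a human to source and verify."""
    return PLACEHOLDER_RE.findall(model_output)

if __name__ == "__main__":
    draft = (
        "The protocol was standardised in 1997 [CITATION NEEDED: year the protocol "
        "was standardised] and is still widely deployed [CITATION NEEDED: current "
        "deployment figures]."
    )
    for i, todo in enumerate(extract_citation_todos(draft), 1):
        print(f"{i}. find a source for: {todo}")
```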
That's still looking for sources that fit a predetermined conclusion, not real research.
Yup.
In some instances that's sufficient though, depending on how much precision you need for what you do. Regardless, you have to review it no matter what it produces.
That probably makes sense.
I haven't played around with it since the initial shell shock of "oh god, it's worse now."