this post was submitted on 08 Nov 2025
51 points (96.4% liked)
Asklemmy
"That's a great question!"
The truth is, we don't need AI to have misinformation, and AI is not the biggest problem in our current post-truth society. A global war on truth has been waged for a long time. The old saying "the first casualty of war is truth" no longer applies, because truth is no longer even relevant: lies are weaponised like never before in history. People don't want to verify anything; their first reaction to news is a deep, emotional one, and the science of misinformation has become highly refined at provoking exactly that reaction in most people. It takes effort and training not to react that way, and most of us can't manage it.
Journalists have been warning us about this for decades, but integrity costs money, and that funding has been under attack too. It's pretty depressing whichever way you look at it.