this post was submitted on 08 Nov 2025
51 points (96.4% liked)

Clarification:

Given that AI progress will not stop and will eventually be able to pass fiction off as truth, it will become impossible to trust any article, or at least not all of them, and we won't be able to tell exactly what is true and what is fiction.

So, what if people from different countries and regions exchanged contacts here and talked about what is really happening in their countries, what laws are being passed, and so on, and also shared their well-reasoned theories and thoughts?

If my idea works, why not sober up as many people as possible to the fact that only methods like this will be able to distinguish reality from falsehood in the future?

I'm also interested in your ideas, as I'm not much of an expert.

[โ€“] Old_Dread_Knight@lemmy.world 1 points 4 days ago (1 children)

One could say that the falsification and deception of the population have reached a new level.

[โ€“] SkyNTP@lemmy.ca 9 points 4 days ago* (last edited 4 days ago) (1 children)

The tools to manufacture content are more accessible, sure. But again, information has always been easy to manufacture. Consider a simple headline:

[Group A] kills 5 [Group B] people in terrorist plot.

I used no AI tools to generate it, yet I was able to create it with minimal effort nonetheless. You would rightly be skeptical of its veracity unless you recognized my authority.

The content is not important. The person speaking it, and your relationship of trust with them, is. The evidence is only as good as the chain of custody leading back to the origin of that piece of evidence.

Not only that, but a lot of people already avoid hard truths and seek to affirm their own belief system. It is soothing to believe the headline if you identify as a member of Group B, and painful if you identify as a member of Group A. That phenomenon does not change with AI.

Our relationship with the truth is already extremely flawed. It has always been a giant mistake to treat information as the truth just because it looks a certain way.

Maybe a saturation of misinformation is the inoculation we need to finally break that habit and force ourselves to peg information to a verifiable origin (the reality we can experience personally, as we do with simple critical thinking skills). Or maybe nothing will change, because people don't actually want the truth; they just want to soothe themselves.

I guess my point is that we are already in a very bad place with the truth, and it seems like there isn't much room for it to get any worse.
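To make the "verifiable origin" point concrete: here's a toy sketch (just my rough illustration, using the Python `cryptography` package and a made-up publisher key pair, not any existing news infrastructure) of what pegging a piece of text to its origin could look like. The publisher signs the exact bytes of what they publish, and a reader who already trusts the publisher's public key can check that the text reached them unaltered. Note that the trust relationship itself still has to come from somewhere else; the math only tells you the bytes weren't changed in transit.

```python
# Toy illustration of "pegging information to a verifiable origin":
# the publisher signs the exact bytes of a headline, and anyone holding
# the publisher's public key can check the text was not altered.
# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Hypothetical publisher key pair (in practice the private key never leaves the publisher).
publisher_key = Ed25519PrivateKey.generate()
publisher_pubkey = publisher_key.public_key()

article = b"[Group A] kills 5 [Group B] people in terrorist plot."
signature = publisher_key.sign(article)

# A reader who trusts the publisher's public key can verify provenance.
try:
    publisher_pubkey.verify(signature, article)
    print("Signature valid: the bytes match what the publisher signed.")
except InvalidSignature:
    print("Signature invalid: altered text or a different author.")

# Changing even one character breaks the check.
try:
    publisher_pubkey.verify(signature, article.replace(b"5", b"50"))
    print("Tampered text unexpectedly verified.")
except InvalidSignature:
    print("Tampered text rejected, as expected.")
```

Again, all this proves is "these bytes came from whoever holds that key"; whether you trust that key holder is exactly the human problem described above.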

[โ€“] stonkage@aussie.zone 1 points 4 days ago

Brilliant post mate