
[–] moosetwin@lemmy.dbzer0.com 13 points 2 months ago* (last edited 2 months ago) (3 children)

(Just to make sure we're on the same page, the first article describes deception as 'the systematic inducement of false beliefs in the pursuit of some outcome other than the truth'.)

Are you saying that AI bots do not engage in this behavior? Why is that?

(P.S. I am not saying this story is necessarily real; I just want to know your reasoning.)

[–] ech@lemmy.ca 10 points 2 months ago

Correct. Because there is no "pursuit of untruth". There is no pursuit, period. It's putting words together that statistically match up based on the input it receives. The output can be wrong, but it's not ever "lying", even if the words it puts together resemble that.
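
To illustrate that point mechanically (a toy sketch only, using a hand-made probability table rather than anything from a real model), next-token generation is just weighted sampling from a conditional distribution:

```python
import random

# Hypothetical, hand-made conditional probabilities -- purely illustrative,
# not taken from any real model or dataset.
next_token_probs = {
    "the": {"cat": 0.4, "moon": 0.35, "truth": 0.25},
    "cat": {"sat": 0.6, "slept": 0.4},
}

def sample_next(token: str) -> str:
    """Pick the next token weighted by its conditional probability.
    No beliefs, no goals -- just 'what is statistically likely to come next'."""
    dist = next_token_probs[token]
    tokens, weights = zip(*dist.items())
    return random.choices(tokens, weights=weights, k=1)[0]

print(sample_next("the"))  # e.g. "moon" -- fluent output, but nothing is being "pursued"
```

The output can sound confident and still be wrong, because correctness never enters the sampling step.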

[–] f314@lemmy.world 9 points 2 months ago (1 children)

I’m not the guy you’re replying to, but I wanted to post this passage from the article about their definition:

> It is difficult to talk about deception in AI systems without psychologizing them. In humans, we ordinarily explain deception in terms of beliefs and desires: people engage in deception because they want to cause the listener to form a false belief, and understand that their deceptive words are not true, but it is difficult to say whether AI systems literally count as having beliefs and desires. For this reason, our definition does not require this.

[–] ech@lemmy.ca 6 points 2 months ago

Their "definition" is wrong. They don't get to redefine words to support their vague (and also wrong) suggestion that llms "might" have consciousness. It's not "difficult to say" - they don't, plain and simple.