this post was submitted on 12 Jan 2026
544 points (96.9% liked)

Games

22748 readers
523 users here now

Video game news oriented community. No NanoUFO is not a bot :)

Posts.

  1. News oriented content (general reviews, previews or retrospectives allowed).
  2. Broad discussion posts (preferably not only about a specific game).
  3. No humor/memes etc..
  4. No affiliate links
  5. No advertising.
  6. No clickbait, editorialized, sensational titles. State the game in question in the title. No all caps.
  7. No self promotion.
  8. No duplicate posts, newer post will be deleted unless there is more discussion in one of the posts.
  9. No politics.

Comments.

  1. No personal attacks.
  2. Obey instance rules.
  3. No low effort comments(one or two words, emoji etc..)
  4. Please use spoiler tags for spoilers.

My goal is just to have a community where people can go and see what new game news is out for the day and comment on it.

Other communities:

Beehaw.org gaming

Lemmy.ml gaming

lemmy.ca pcgaming

founded 2 years ago
MODERATORS
[–] lvxferre@mander.xyz 26 points 1 day ago* (last edited 14 hours ago) (3 children)

IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it's fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

Non-consensual porn victimises the person being depicted, because it violates the person's rights over their own body — including its image. Plus it's ripe material for harassment.

This is still true if the porn in question is machine-generated, and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

And it applies to children and adults alike. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus always victimising the children in question.

Now, someone else mentioned that Bart's dick appears in The Simpsons Movie. The key difference is that Bart is not a child; he is not even a person to begin with, but a fictional character. There's no victim.


EDIT: I'm going to abridge what I said above, in a way that even my dog would understand:

What Grok is doing is harmful, there are victims of that, regardless of some "ackshyually this is not CSAM lol lmao". And yet you guys keep babbling about definitions?

Everything else I said here was contextualising and detailing the above.

Is this clear now? Or will I get yet another lying piece of shit (like @Atomic@sh.itjust.works) going out of their way to misinterpret what I said?

(I don't even have a dog.)

[–] Atomic@sh.itjust.works 3 points 14 hours ago

What exactly have I lied about?

I've never once tried to insinuate that what Grok is doing is ok, nor that it should be. What I've said is that it doesn't matter whether an actual real person is being victimized or not. It's still illegal. No matter how you look at it, it's illegal, fictional or not.

Your example of Bart in the Simpsons movie is so far out of place I hardly know where to begin.

It's NOT because he's fictional, because fictional depictions of naked children in sexually compromising situations ARE illegal.

Though I am glad you don't have a dog. It would be real awkward for the dog to always be the smartest being in the house.

[–] WorldsDumbestMan@lemmy.today 1 points 1 day ago (1 children)

Supporting CSAM should be treated like making CSAM.

Down into the forgetting hole with them!

[–] lvxferre@mander.xyz 0 points 23 hours ago (1 children)

Nobody here is supporting CSAM. Learn to read, dammit.

[–] WorldsDumbestMan@lemmy.today 1 points 19 hours ago* (last edited 19 hours ago) (2 children)

He implicitly is.

EDIT: Wait, what is this about? Did I misphrase something?

[–] lvxferre@mander.xyz 2 points 14 hours ago* (last edited 14 hours ago)

Fuck! I misread you. Yes, you're right, Tim Sweeney is supporting CSAM.

Sorry for the misunderstanding, undeserved crankiness, and defensiveness; I thought you were claiming I was the one doing it. That was my bad. (In my own defence, someone already did it.)


Now, giving you a proper answer: yeah, Epic is better sent down the forgetting hole. And I hope Sweeney gets haunted by his own words for years and years to come.

[–] EldritchFeminity@lemmy.blahaj.zone 3 points 16 hours ago (1 children)

They mistook your comment as disagreeing with their take that there are real victims of Grok's porn and CSAM, and as accusing them of supporting CSAM, rather than as agreeing with them and saying that Sweeney is the one supporting it.

[–] WorldsDumbestMan@lemmy.today 2 points 12 hours ago (1 children)

Gasp "Lvxferre! You damn Diddy demon! How could youuuu!"

[–] lvxferre@mander.xyz 2 points 10 hours ago

At this rate I'm calling dibs on your nickname 🤣