this post was submitted on 20 Apr 2024
485 points (98.2% liked)
Gaming
Sub for any gaming related content!
Rules:
- 1: No spam or advertising. This basically means no linking to your own content on blogs, YouTube, Twitch, etc.
- 2: No bigotry or gatekeeping. This should be obvious, but neither of those things will be tolerated. This goes for linked content too; if the site has some heavy "anti-woke" energy, you probably shouldn't be posting it here.
- 3: No untagged game spoilers. If the game was recently released or not yet released at all, use the Spoiler tag (the little ⚠️ button) in the body text, and avoid typing spoilers in the title. Also avoid openly discussing major story spoilers, even in old games.
I kinda fail to see the problem. The GPU owner doesn't see what workload they're processing. The pr0n company is willing to pay for GPU power. The GPU owner wants to earn money with their hardware. There's demand, there's supply, and nobody is getting hurt (AI pr0n is not illegal, at least for now), so let people do what they want.
The problem is that they are clearly targeting minors who don't pay their own electricity bill, and who don't even necessarily realize they are paying for their Fortnite skins with their parents' money. Also: there is a good chance that the generated pictures are at some point present in the filesystem of the generating computer, and that alone is a giant can of worms that can even lead to legal trouble if the person lives in a country where some or all kinds of pornography are illegal.
This is a shitty grift, abusing people who don't understand the consequences of the software.
Agreed. Preying on children who don't understand what they're signing up for is shitty to begin with.
Then, add that deepfake AI porn is unethical and likely illegal (and who knows what other kinds of potentially-illegal images are being generated...)
And, as you point out, the files having existed in the computer could, alone, be illegal.
Then, as an extra fuck you, burning GPU cycles to make AI images causes CO2 emissions, GPU wear, waste heat that might trigger the AC, and other negative externalities too, I'm sure...
It's shit all around.
Because most AI pornography models are trained on actual nudes scraped off the internet, and not just from people who work in the corporate porn industry. This essentially falls under the same morality as nonconsensual/revenge porn, since it allows all and sundry to generate images from pictures whose original posters were never asked for consent.
But I forgot, this comm is plagued with treathounds that meatspace kink communities would throw out for a rule 3 breach, so I don't know why I'm inconveniencing the electrons to explain something that even the terminally pornbrained should be able to comprehend...