This is only going to get easier. The djinn is out of the bottle.
"Djinn", specifically, being the correct word choice. We're way past fun-loving blue cartoon Robin Williams genies granting wishes, doing impressions of Jack Nicholson and getting into madcap hijinks. We're back into fuckin'... shapeshifting cobras woven of fire and dust by the archdevil Iblis, hiding in caves and slithering out into the desert at night to tempt mortal men to sin. That mythologically-accurate shit.
Doesn't mean distribution should be legal.
People are going to do what they're going to do, and the existence of this isn't an argument to put spyware on everyone's computer to catch it or whatever crazy extreme you can take it to.
But distributing nudes of someone without their consent, real or fake, should be treated as the clear sexual harassment it is, and result in meaningful criminal prosecution.
Almost always it makes more sense to ban the action, not the tool. Especially for tools with such generalized use cases.
This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why "AI" being involved matters is beyond me. The result is the same: fake porn/nudes.
And all the hand wringing in the world about it being non consensual will not stop it. The cat has been out of the bag for a long time.
I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.
People have been Photoshopping this kind of thing since before there was Photoshop. Why "AI" being involved matters is beyond me
Because now it's faster, can be done in bulk, and requires no skill from the person doing it.
I blame electricity. Before computers, people had to learn to paint to do this. We should go back to living like medieval peasants.
Those were the days...
I hate this: "Just accept it women of the world, accept the abuse because it's the new normal" techbro logic so much. It's absolutely hateful towards women.
We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.
I don't know why you're being downvoted. Sure, it's unfortunately been happening for a while, but are we just supposed to keep quiet about it and let it go?
I'm sorry, putting my face on a naked body that's not mine is one thing, but I really do fear for the people whose likeness gets used in some degrading/depraved porn and it's actually believable because it's AI generated. That is SO much worse/psychologically damaging if they find out about it.
It’s unacceptable.
We have legal and justice systems to deal with this.
For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):
Ill adults and poor kids generate and sell CSAM. Common to advertise on IG, sell on TG. Huge problem as that Stanford report shows.
Telegram got right on it (not). Fuckers.
I suck at Photoshop and I've tried many times over the years to get good at it. Yet I was able to train a local Stable Diffusion model on my and my family's faces and create numerous images of us in all kinds of situations in two nights of work. You can get a snap of someone and have nudes of them tomorrow for super cheap.
I agree there is nothing to be done, but it's painfully obvious to me that the scale and ease of it make it much more concerning.
The same reason AR15 rifles are different than muskets
To people who aren't sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves. There's also extreme risk of depression, anger, anxiety, etc. The analogy given is that it's like watching video the next day of yourself undergoing sex without consent, as if you'd been drugged.
I'll admit I used to look at celeb deepfakes, but once I saw that video I stopped immediately and avoid it as much as I possibly can. I believe porn can be done correctly, with participant protection and respect. Regarding deepfakes/revenge porn, though, that statistic about suicidal ideation puts it outside of healthy or ethical. Obviously I can't make that decision for others or purge the internet, but the fact that there's such regular and extreme harm for the (what I now know are) victims of non-consensual porn makes it personally immoral. Not because of religion or society, but because I want my entertainment to be at minimum consensual and hopefully fun and exciting, not killing people or ruining their happiness.
I get that people say this is the new normal, but it's already resulted in trauma and will almost certainly continue to do so. Maybe even get worse as the deepfakes get more realistic.
once non-consensual pornography (which deepfakes are classified as) is made public over half of people involved will have the urge to kill themselves.
Not saying that they are justified or anything but wouldn't people stop caring about them when they reach a critical mass? I mean if everyone could make fakes like these, I think people would care less since they can just dismiss them as fakes.
The analogy given is it’s like watching video the next day of yourself undergoing sex without consent as if you’d been drugged.
You want a world where people just desensitise themselves to things that make them want to die through repeated exposure. I think you'll get a whole lot of complex PTSD instead.
People used to think their lives were over if they were caught alone with someone of the opposite sex they weren't married to. That is no longer the case in western countries due to normalisation.
The thing that makes them want to die is societal pressure, not the act itself. In this case, if societal pressure from having fake nudes of yourself spread is removed, most of the harm done to people should be neutralised.
I think this is realistically the only way forward: to delegitimize any kind of nudes that might show up of a person. Which could be good. But I have no doubt that high schools will be flooded with bullies sending porn around of innocent victims. As much as we delegitimize it as a society, it'll still have an effect. Like social media: even though it's normal for anyone to reach you at any time, it still makes cyberbullying more hurtful.
That's a ripoff. It costs them at most $0.10 to do a simple Stable Diffusion img2img pass. And most people could do it themselves; they're purposefully exploiting people who aren't tech savvy.
I have no sympathy for the people who are being scammed here, I hope they lose hundreds to it. Making fake porn of somebody else without their consent, particularly that which could be mistaken for real if it were to be seen by others, is awful.
I wish everyone involved in this use of AI a very awful day.
Imagine hiring a hit man and then realizing he hired another hit man at half the price. I think the government should compensate them.
The people being exploited are the ones who are the victims of this, not people who paid for it.
it's a "I don't know tech" tax
That's like 80% of the IT industry.
IDK, $10 seems pretty reasonable to run a script for someone who doesn't want to. A lot of people have that type of arrangement for a job...
That said, I would absolutely never do this for someone, I'm not making nudes of a real person.
Scam is another thing. Fuck these people selling.
But fuck, dude, they aren't taking advantage of anyone buying the service. That's not how the fucking world works. It turns out that even if you have money, you can post for people to do shit like clean your house or do an oil change.
NOBODY on that side of the equation is being exploited 🤣
I doubt tbh that this is the most severe harm of generative AI tools lol
Pretty sure we will see fake political candidates that actually garner votes soon here.
The Waldo Moment manifest.
Porn of Normal People
Why did they feel the need to add that "normal" to the headline?
To differentiate from celebrities.
Because it's different to somebody going online and finding a stock picture of Taylor Swift
You can get 300 tokens in pornx dot ai for $9.99. This guy is ripping people off.
I'd like to share my initial opinion here. Aren't "non-consensual AI-generated nudes" technically a freedom? Like, we can bastardize our presidents, paste people's photos on devils or other characters, so why are AI nudes where the line is drawn? The internet made photos of Trump and Putin kissing shirtless.
They're making pornography of women who are not consenting to it, which is an extremely invasive thing to do and has massive social consequences for women and girls. This could (and almost certainly will) be used on kids too, right? This can literally be a tool for the production of child pornography.
Even with regards to adults, do you think this will be used exclusively on public figures? Do you think people aren't taking pictures of their classmates, of their co-workers, of women and girls they personally know and having this done to pictures of them? It's fucking disgusting, and horrifying. Have you ever heard of the correlation between revenge porn and suicide? People literally end their lives when pornographic material of them is made and spread without their knowledge and consent. It's terrifyingly invasive and exploitative. It absolutely can and must be illegal to do this.
It's a far cry from making weird memes to making actual porn. Especially when it's not easily seen as fake.
Seems to fall under any other form of legal public humiliation to me, UNLESS it is purported to be true or genuine. I think if there's a clear AI watermark or artist's signature, that's free speech. If not, it falls under libel: false and defamatory statements or facts, published as truth. Any harmful deepfake released as truth should be prosecuted as libel or slander, whether it's sexual or not.
We are acting as if, throughout history, we managed to orient technology so as to keep only the benefits and eliminate the negative effects, while in reality most of the technology we use still comes with both aspects. And it is not gonna be different with AI.