submitted 1 year ago by tree@lemmy.zip to c/feminism@beehaw.org

20 years after Mark Zuckerberg’s infamous ‘hot-or-not’ website, developers have learned absolutely nothing.


Two decades after Mark Zuckerberg created FaceMash, the infamously sexist “hot-or-not” website that served as the precursor to Facebook, a developer has had the bright idea to do the exact same thing—this time with all the women generated by AI.

A new website, smashorpass.ai, feels like a sick parody of Zuckerberg’s shameful beginnings, but is apparently meant as an earnest experiment exploring the capabilities of AI image recommendation. Just like Zuck’s original site, “Smash or Pass” shows images of women and invites users to rate them with a positive or negative response. The only difference is that all the “women” are actually AI generated images, and exhibit many of the telltale signs of the sexist bias common to image-based machine learning systems.

For starters, nearly all of the imaginary women generated by the site have cartoonishly large breasts, and their faces have an unsettling airbrushed quality that is typical of AI generators. Their figures are also often heavily outlined and contrasted with backgrounds, another dead giveaway for AI generated images depicting people. Even more disturbing, some of the images omit faces altogether, depicting headless feminine figures with enormous breasts.

According to the site’s novice developer, Emmet Halm, the site is a “generative AI party game” that requires “no further explanation.”

“You know what to do, boys,” Halm tweeted while introducing the project, inviting men to objectify the female form in a fun and novel way. His tweet debuting the website garnered over 500 retweets and 1,500 likes. In a follow-up tweet, he claimed that the top 3 images on the site all had roughly 16,000 "smashes."

Understandably, AI experts find the project simultaneously horrifying and hilariously tone-deaf. "It's truly disheartening that in the 20 years since FaceMash was launched, technology is still seen as an acceptable way to objectify and gather clicks," Sasha Luccioni, an AI researcher at HuggingFace, told Motherboard after using the Smash or Pass website.

One developer, Rona Wang, responded by making a nearly identical parody website that rates men—not based on their looks, but how likely they are to be dangerous predators of women.

The sexist and racist biases exhibited by AI systems have been thoroughly documented, but that hasn’t stopped many AI developers from deploying apps that inherit those biases in new and often harmful ways. In some cases, developers espousing “anti-woke” beliefs have treated bias against women and marginalized people as a feature of AI, and not a bug. With virtually no evidence, some conservative outrage jockeys have claimed the opposite—that AI is “woke” because popular tools like ChatGPT won’t say racial slurs.

The developer’s initial claims about the site’s capabilities seem to be exaggerated. In a series of tweets, Halm claimed the project is a “recursively self-improving” image recommendation engine that uses the data collected from your clicks to determine your preference in AI-generated women. But the current version of the site doesn’t actually self-improve—using it long enough results in many of the images repeating, and Halm says the recursive capability will be added in a future version.

It also hasn't gone over well with everyone on social media. One blue-check user responded, "Bro wtf is this. The concept of finetuning your aesthetic GenAI image tool is cool but you definitely could have done it with literally any other category to prove the concept, like food, interior design, landscapes, etc."

Halm could not be reached for comment.

“I’m in the arena trying stuff,” Halm tweeted. “Some ideas just need to exist.”

Luccioni points out that no, they absolutely do not.

“There are huge amounts of nonhuman data that is available and this tool could have been used to generate images of cars, kittens, or plants—and yet we see machine-generated images of women with big breasts,” said Luccioni. “As a woman working in the male-dominated field of AI, this really saddens me.”


[-] potterman28wxcv@beehaw.org 3 points 1 year ago

If you have ever printed a photo of your SO, haven't you ever thought, "Yeah, they look pretty in that photo"?

How would it be any different for men looking at a picture of a woman, no matter the medium used (ink, pixels...)?

Yeah, these pictures do not come from real people. But they remain pictures. If you look at AI-generated images of beautiful landscapes, you will still find those landscapes beautiful even though they were generated by AI.

Looks are one of the possible drivers of sexuality, probably the most obvious and most accessible one. Now, there is a big gap between sexual desire and serious relationships - people can find someone sexually desirable (as in, they probably wouldn't say no to a sexual experience, assuming they are free to do so) and yet not want to get into a relationship with them.

I think we should not reject our sexual impulses. We have every right in the world to find people sexually beautiful or not. It's better to accept that than to say, "Stop it! It's bad to like a woman because of her curves!" However, we should be aware that our impulses are just that - impulses - and that they should never become obsessive; and we must always remain respectful of other people, including their privacy (it would be disrespectful to stare openly at someone just because we find them pretty).

[-] AnalogyAddict@beehaw.org 3 points 1 year ago* (last edited 1 year ago)

The difference is that a photo of my SO represents a real person in my life. I'm affected by their wants, imperfections, needs and humanity.

It's not bad to like a WOMAN. It's bad to equate a fantasy with a woman, and have a hard time differentiating between them.

[-] potterman28wxcv@beehaw.org 1 point 1 year ago

I see your point, but to me that's no different from finding movie stars pretty or finding a comic-book character hot. I've even found a character in a book hot - although there's no picture, just text. And, honestly, I don't see how that's bad.

No, what is bad about that app is not that men get sexual feelings from AI images. What is bad is that there's this big "smash" button that objectifies women, effectively treating the whole gender like sex dolls. It also doesn't help that these images are surreal, with features that women generally do not have. If you train your brain to pick up on these fake pics with big breasts, you may also become so selective in real life that you find nobody.

Those are the two biggest problems I see with that app. But I don't find anything wrong with liking an AI picture in itself.
