this post was submitted on 11 Nov 2025
389 points (96.4% liked)

Technology

[–] bismuthbob@sopuli.xyz 191 points 3 weeks ago* (last edited 3 weeks ago) (5 children)

Wow. If a black box analysis of arbitrary facial characteristics is more meritocratic than the status quo, that speaks volumes about the nightmare hellscape shitshow of policy, procedure and discretion that resides behind the current set of 'metrics' being used.

[–] UnderpantsWeevil@lemmy.world 45 points 3 weeks ago (1 children)

The gamification of hiring is largely a result of businesses de-institutionalizing Human Resources. If you were hired on at a company like Exxon or IBM in the 1980s, there was an enormous professionalized team dedicated to sourcing prospective hires, vetting them, and negotiating their employment.

Now, we've automated so much of the process and gutted so much of the actual professionalized vetting and onboarding that it's a total crapshoot as to whom you're getting. Applicants aren't trying to impress a recruiter; they're just aiming to win the keyword search lottery. Businesses aren't looking to cultivate talent long term, just to fill contract positions at below-contractor rates.

So we get an influx of pseudo-science to substitute for what had been a real sociological science of hiring. People promising quick and easy answers to complex and difficult questions, on the premise that they can accelerate the churn of staff without driving up cost of doing business.

[–] bismuthbob@sopuli.xyz 23 points 3 weeks ago

Gotcha. This is replacing one nonsense black box with a different one, then. That makes a depressing kind of sense. No evidence needed, either!

[–] bismuthbob@sopuli.xyz 32 points 3 weeks ago

All of that being typed, I'm aware that the 'If' in my initial response is doing the same amount of heavy lifting as the 'Some might argue' in the article. Barring the revelation of some truly extraordinary evidence, I don't accept the premise.

[–] AcidiclyBasicGlitch@sh.itjust.works 20 points 2 weeks ago (1 children)

Spoken like somebody with the sloping brow of a common criminal.

[–] scratchee@feddit.uk 15 points 2 weeks ago (3 children)

I really must commend you for overcoming your natural murderous inclinations and managing to become a useful member of society despite the depression in your frontal lobe. Keep resisting those dark temptations!

[–] technocrit@lemmy.dbzer0.com 16 points 3 weeks ago

A primary application of "AI" is providing blackboxes that enable the extremely privileged to wield arbitrary control with impunity.

[–] psycotica0@lemmy.ca 110 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

"Imagine appearing for a job interview and, without saying a single word, being told that you are not getting the role because your face didn’t fit. You would assume discrimination, and might even contemplate litigation. But what if bias was not the reason?

Uh... guys...

Discrimination: the act, practice, or an instance of unfairly treating a person or group differently from other people or groups on a class or categorical basis

Prejudice: an adverse opinion or leaning formed without just grounds or before sufficient knowledge

Bias: to give a settled and often prejudiced outlook to

Judging someone's ability without knowing them, based solely on their appearance, is, like, kinda the definition of bias, discrimination, and prejudice. I think their stupid angle is "it's not unfair because what if this time it really worked though!" 😅

I know this is the point, but there's no way this could possibly end up with anything other than a lazily written, comically clichéd, Sci Fi future where there's an underclass of like "class gammas" who have gamma face, and then the betas that blah blah. Whereas the alphas are the most perfect ughhhhh. It's not even a huge leap; it's fucking inevitable. That's the outcome of this.

I should watch Gattaca again...

[–] Tattorack@lemmy.world 32 points 3 weeks ago (1 children)

Like every corporate entity, they're trying to redefine what those words mean. See, it's not "insufficient knowledge" if they're using an AI powered facial recognition program to get an objective prediction, right? Right?

[–] morriscox@lemmy.world 13 points 3 weeks ago (2 children)

People see me in cargo pants, polo shirt, a smartphone in my shirt pocket, and sometimes tech stuff in my (cargo) pants pockets and they assume that I am good at computers. I have an IT background and have been on the Internet since March of 1993 so they are correct. I call it the tech support uniform. However, people could dress similarly to try to fool people.

People will find ways, maybe makeup and prosthetics or AI modifications, to try to fool this system. Maybe they will learn to fake emotions. This system is a tool, not a solution.

[–] MalReynolds@slrpnk.net 23 points 3 weeks ago

Goodhart's law: "When a measure becomes a target, it ceases to be a good measure"

TLDR as soon as you have a system like this people will game it...

[–] panda_abyss@lemmy.ca 73 points 3 weeks ago* (last edited 3 weeks ago) (5 children)

Racial profiling keeps getting reinvented.

Fuck that.

They then used data on these individuals’ labour-market outcomes to see whether the Photo Big Five had any predictive power. The answer, they conclude, is yes: facial analysis has useful things to say about a person’s post-MBA earnings and propensity to move jobs, among other things.

Correlation vs. causation. More attractive people will default to better negotiating positions. People from richer backgrounds will probably look healthier. People from high-stress environments will show signs of stress through skin wrinkles and resting muscle tension.
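That confounding story can be sketched in a few lines of Python (toy numbers, purely illustrative): a photo-visible trait with zero causal effect still "predicts" the outcome through a hidden confounder.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hidden confounder: family wealth drives both a "healthy look" and earnings.
wealth = rng.normal(size=n)
looks_healthy = wealth + rng.normal(size=n)  # what a photo model could see
earnings = wealth + rng.normal(size=n)       # looks have zero direct effect here

# The photo feature still "predicts" earnings via the confounder.
r = np.corrcoef(looks_healthy, earnings)[0, 1]
print(f"correlation(looks, earnings) = {r:.2f}")
```

The correlation comes out around 0.5 even though, by construction, appearance has no causal effect on earnings at all; a hiring model would happily pick it up anyway.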

This is going to do nothing but enforce systemic biases, but in a Kafkaesque, Gattaca way.

And then of course you have the garden of forking paths.

These models have zero constraint on their features, so we have an extremely large feature space, and we train the model to pick features predictive of the outcome. Even the process of training, evaluating, and then selecting the best model at this scale ends up being essentially p-hacking.
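That p-hacking point is easy to demonstrate: screen enough random "features" against an outcome and the best of them will look predictive in-sample. A toy sketch where everything is pure noise:

```python
import numpy as np

rng = np.random.default_rng(1)
n_people, n_features = 200, 1_000

# Pure noise: facial "features" with no relation whatsoever to the outcome.
X = rng.normal(size=(n_people, n_features))
outcome = rng.normal(size=n_people)

# Screen every feature and keep whichever correlates best in-sample.
corrs = [abs(np.corrcoef(X[:, j], outcome)[0, 1]) for j in range(n_features)]
best = max(corrs)
print(f"best of {n_features} spurious correlations: {best:.2f}")
```

With 1,000 candidate features the winner typically correlates around 0.2 to 0.3 in-sample, which looks "significant" under a naive test even though every single feature is noise. Model selection over a huge feature space is the same screening process at a larger scale.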

[–] blubfisch@discuss.tchncs.de 62 points 2 weeks ago (3 children)

Cool. Literal Nazi shit, but now with AI 😵‍💫

[–] BreadstickNinja@lemmy.world 20 points 2 weeks ago

Basically the slogan for the 2020s

[–] AmbitiousProcess@piefed.social 51 points 3 weeks ago (1 children)

The study claims that they analyzed participants' labor market outcomes, that being earnings and propensity to move jobs, "among other things."

Fun fact, did you know white men tend to get paid more than black men for the same job, with the same experience and education?

Following that logic, if we took a dataset of both black and white men, then used their labor market outcomes to judge which one would be a good fit over another, white men would have higher earnings and be recommended for a job more than black people.

Black workers are also more likely to switch jobs, one of the reasons likely being because you tend to experience higher salary growth when moving jobs every 2-3 years than when you stay with a given company, which is necessary if you're already being paid lower wages than your white counterparts.

By this study's methodology, that person could be deemed "unreliable" because they often switch jobs, and would then not be considered.

Essentially, this is a black box that gets to excuse management saying "fuck all black people, we only want to hire whites" while sounding all smart and fancy.

[–] shawn1122@sh.itjust.works 14 points 3 weeks ago

The goal here is to go back to a world where such racial hierarchies are accepted, but without human accountability. This way you are subjugated arbitrarily, but hey, the computer said so, so what can we do about it?

[–] AbidanYre@lemmy.world 42 points 3 weeks ago (1 children)

Not April fool's or the onion? What the fuck?

[–] ViatorOmnium@piefed.social 36 points 3 weeks ago (1 children)

Does it predict people that allegedly finished university not knowing the difference between correlation and causality?

This reminds me of a fraud risk classification model I once heard about, which ended up being an excellent income-by-postal-code classifier.

[–] UnderpantsWeevil@lemmy.world 15 points 3 weeks ago (1 children)

It predicts people with business school degrees getting six, seven, and eight figure salaries to blow smoke up the asses of the investor pool.

This reminds me of a fraud risk classification model I once heard about, which ended up being an excellent income-by-postal-code classifier.

The dark art of sociology is recognizing how poverty impacts human behaviors and then calibrating your business to profit off it.

[–] neidu3@sh.itjust.works 34 points 3 weeks ago (1 children)

This is just phrenology with extra steps

[–] knatschus@discuss.tchncs.de 30 points 3 weeks ago (2 children)

I remember when stuff like this was used to show how dystopian china is.

[–] skisnow@lemmy.ca 30 points 2 weeks ago (1 children)

But what if bias was not the reason? What if your face gave genuinely useful clues about your probable performance?

I hate this so much, because spouting statistics is the number one go-to of idiot racists and other bigots trying to justify their prejudices. The whole fucking point is that judging someone's value based on physical attributes outside their control is fucking evil, and increasing the accuracy of your algorithm only makes it all the more insidious.

The Economist has never been shy about posting questionable kneejerk shit, but this approaches a low even for them. Not only do they give the concept credibility, they're even going out of their way to dishonestly paint it as some sort of progressive boon for the poor.

[–] umbraroze@piefed.social 29 points 3 weeks ago (1 children)

Boeing CEO: "We're always innovating, and sometimes we need to boldly embrace the wisdom of the past if it can be re-examined in light of current technology. From now on, our airplane navigation systems will be based on the Flat Earth model. This makes navigation so much more computationally efficient, guys."

[–] Tattorack@lemmy.world 29 points 3 weeks ago (1 children)

Wow, we skipped right from diversity hiring to phrenology hiring without wasting a single beat. Boy, has the modern world become efficient.

[–] wilfim@sh.itjust.works 28 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

This is so absurd it almost feels like it isn't real. But indeed, the article appears when I look it up.

[–] 0x0@lemmy.zip 10 points 2 weeks ago (1 children)

It's very Nazi Germany real, actually.

[–] verdi@feddit.org 28 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

FYI, it's not a paper, it's a blog post from well-connected and presumably highly educated people benefiting from institutional prestige to see their poorly conducted study propagated ad infinitum without a modicum of relevant peer review.

edit: After a few more minutes, it's an unreliable psychopath detector.

[–] oce@jlai.lu 28 points 3 weeks ago (4 children)

I looked for the original article, abstract:

Human capital—encompassing cognitive skills and personality traits—is critical for labor market success, yet the personality component remains difficult to measure at scale. Leveraging advances in artificial intelligence and comprehensive LinkedIn data, we extract the Big 5 personality traits from facial images of 96,000 MBA graduates, and demonstrate that this novel "Photo Big 5" predicts school rank, compensation, job seniority, industry choice, job transitions, and career advancement. Using administrative records from top-tier MBA programs, we find that the Photo Big 5 exhibits only modest correlations with cognitive measures like GPA and standardized test scores, yet offers comparable incremental predictive power for labor outcomes. Unlike traditional survey-based personality measures, the Photo Big 5 is readily accessible and potentially less susceptible to manipulation, making it suitable for wide adoption in academic research and hiring processes. However, its use in labor market screening raises ethical concerns regarding statistical discrimination and individual autonomy.

The PDF is downloadable here: https://scholar.google.com/citations?view_op=view_citation&hl=en&user=2eia4X4AAAAJ&sortby=pubdate&citation_for_view=2eia4X4AAAAJ%3A_FxGoFyzp5QC

I don't have the time nor the expertise to read everything to understand how they take into account the bias that good looking white men with educated parents are way more likely to succeed at life.
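For what it's worth, the abstract's "incremental predictive power" claim is a nested-regression comparison. A minimal sketch with simulated data (variable names and effect sizes are invented here, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5_000

# Simulated data: a photo trait modestly correlated with GPA, both feeding earnings.
gpa = rng.normal(size=n)
photo_trait = 0.2 * gpa + rng.normal(size=n)
earnings = gpa + 0.5 * photo_trait + rng.normal(size=n)

def r2(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - (y - X @ beta).var() / y.var()

base = r2(gpa[:, None], earnings)                       # cognitive measure only
full = r2(np.column_stack([gpa, photo_trait]), earnings)  # plus photo trait
print(f"R2 GPA only: {base:.2f}; GPA + photo trait: {full:.2f}")
```

"Incremental predictive power" is just the gap between the two R² values. Note this measures prediction only; it says nothing about whether the trait causes the outcome, which is exactly the confounding objection raised in the comments.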

[–] cypherpunks@lemmy.ml 31 points 3 weeks ago (1 children)

one can also get the full paper directly from yale here without needing to solve a google captcha:

https://insights.som.yale.edu/sites/default/files/2025-01/AI%20Personality%20Extraction%20from%20Faces%20Labor%20Market%20Implications_0.pdf

I don’t have the time nor the expertise to read everything to understand how they take into account the bias that good looking white men with educated parents are way more likely to succeed at life.

i admittedly did not read the entire 61 pages but i read enough to answer this:

spoiler: they don't

[–] underisk@lemmy.ml 14 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

Lmao they source the photos from LinkedIn profiles. I’m sure that didn’t bias their training at all. Yes sir there’s no chance this thing is selecting for anything but facial features.

Edit: double lmao they’re all MBAs

Edit2: they didn't even train the AI!! This paper is them just feeding LinkedIn photos into a third-party black-box API and then nodding thoughtfully at the results. I can't tell you how stupid the AI is because I can't find any information about it, or even the API mentioned in the paper.

[–] gian@lemmy.grys.it 27 points 3 weeks ago

Last time, this did not end well for about 6 million people...

[–] entwine@programming.dev 27 points 3 weeks ago (3 children)

This fascist wave is really bringing out all the cockroaches in our society. It's a good thing you can't erase anything on the internet, as this type of evidence will probably be useful in the future.

You'd better get in on a crypto grift, Kelly Shue of the Yale School of Management. I suspect you'll have a hard time finding work within the next 1-3 years.

[–] _cnt0@sh.itjust.works 23 points 3 weeks ago

Race theory 2.0 AI edition just dropped.

[–] buttnugget@lemmy.world 23 points 2 weeks ago

Actually, what if slavery wasn’t such a bad idea after all? Lmao they never stop trying to resurrect class warfare and gatekeeping.

[–] technocrit@lemmy.dbzer0.com 22 points 3 weeks ago* (last edited 3 weeks ago)

It's completely normal for fascists to promote pseudo-science. Always has been.

Indeed their publication is named after one of the worst pseudo-sciences.

[–] Boppel@feddit.org 22 points 3 weeks ago

"Okay, okay, hear me out: what if Nazi methods, but for getting a job. We could even tattoo their number on their arms. It's only logical; we already divide by skin colour."

WTF

[–] uriel238@lemmy.blahaj.zone 20 points 3 weeks ago (1 children)

I thought phrenology was still a science at the time of the German Reich, only made defunct later. Now I have my doubts.

Social Darwinism was disproven in the 1900s and supply-side economics died in the 19th century, so it's not as if pseudoscience doesn't spring up like weeds whenever rich people want to sponsor it.

[–] cabbage@piefed.social 19 points 3 weeks ago (2 children)

Whatever it takes to keep hiring mediocre white men, I guess.

[–] pelespirit@sh.itjust.works 16 points 3 weeks ago (1 children)

Everyone is kind of focusing on the hiring part, which is incredibly Nazi already, but they're proposing this for lending too. Fucking yikes.

[–] JustJack23@slrpnk.net 13 points 3 weeks ago (1 children)

How long before they start measuring skulls at job interviews?

[–] Brutticus@midwest.social 13 points 3 weeks ago (1 children)

Why stop there? Why just banks and hiring firms? Why not give law enforcement access and use the phrenology robot to screen for pre-crime?

[–] 6nk06@sh.itjust.works 12 points 3 weeks ago (2 children)

What if being a nazi was meritocratic? How about no?

[–] AdamEatsAss@lemmy.world 12 points 3 weeks ago (2 children)

Plastic surgery would become more popular. $10k work done to my nose to double my salary? Yes please.

[–] cypherpunks@lemmy.ml 18 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

Plastic surgery would become more popular.

One of the paper's authors had the same thought:

“Suppose this type of technology gets used in labor market screening, or maybe dating markets,” Shue muses. “Going forward, you could imagine a reaction in which people then start modifying their pictures to look a certain way. Or they could modify their actual faces through cosmetic procedures.”

She also bizarrely says that:

"we are very much not advocating that this technology be used by firms as part of their hiring process."

and yet, for some reason:

The next step for Shue and her colleagues is to explore whether certain personality types are drawn to specific industries or whether those personality types are more likely to succeed within given industries.

[–] Passerby6497@lemmy.world 11 points 3 weeks ago

"Some might argue that the authors of this article have their head so far up their own ass that they haven't seen daylight in years"

[–] pyre@lemmy.world 11 points 2 weeks ago

this should be grounds for a prison sentence. open support for Nazism shouldn't be covered by free speech laws.

[–] humanspiral@lemmy.ca 10 points 2 weeks ago (1 children)

Dystopian neutrality in the article.

without discriminating on grounds of protected characteristics

AI classification is trained with supervised learning (the right answers are predetermined). A MechaHitler built for fascist nationalism's sake will rate Obama's face as a poor employee and Trump's as the bestest employee.

Open training datasets would be subject to (1) zero competitive advantage for any model and (2) massive complaints about any specific piece of training data.

For some jobs, psychopathy AND loyalty are desirable traits, even though they can be opposites. Honesty, integrity, and intelligence can be desirable traits, or obstacles to desperate loyalty. My point is that if there really are many traits determined by faces, much more training data is needed to detect them. And then the human hiring decision rests on matching 10 or 30 traits to an (impossibly unique) position, where the direct manager only cares about loyalty, without the candidate being too talented, while higher-level managers might prefer a candidate with the potential to replace that direct manager, and all of them care about race or pregnancy risk, followed by post-training on some "illegal characteristics."

A Gattaca situation, where everyone either has an easy time getting a great job and moving to a greater one, OR is shut out of all jobs entirely, creates a self-contradicting prediction for "loyalty/desperation" controllability traits. If job duties are changed to include blow job services, then surely the agreeable make better employees, despite any facial tics responding to the suggestion.

The silent human "illegal discrimination" is not eliminated or changed, but the new capability is why this will be a success: you can use a computer to do the interviewing and waste more interviewees' time at no human cost to the employer. A warehousing company recently looked at facial expressions to determine attention to safety, and this leads to "the AI punishments to your life will continue until you smile more." Elysium's automated parole interview scene is a good overview of the dystopia.

[–] ICastFist@programming.dev 10 points 2 weeks ago

Yeah, nothing says "this person will repay their loans" like looking at their face and nothing fucking else.

I love how you can just call it capetalismo in Portuguese; capeta = devil.
