this post was submitted on 03 Dec 2024
118 points (100.0% liked)

chapotraphouse

[–] context@hexbear.net 46 points 3 weeks ago (2 children)

this is from a couple years ago. everyone is here thinking this is like minority report but it's not about individual behavior, this is just about using predictive models to self-justify racist policing patterns:

Fears of “The Thought Police” are probably running through your head right now, but Chattopadhyay is keen to stress that the focus isn’t on individuals, “We do not focus on predicting individual behavior, and do not suggest that anyone be charged with a crime that they didn’t commit, or be incarcerated for that. Our model learns from, and then predicts, event patterns in the urban space, for example, predicting that within a two block radius around the intersection of 67th Street and SW Avenue, there is a high risk of a homicide a week from now. It does not indicate who is going to be the victim or the perpetrator,” she says.

https://archive.ph/zgUjs

[–] MolotovHalfEmpty@hexbear.net 33 points 3 weeks ago (1 children)

Which of course has a certain degree of success baked in, because if you focus policing in a particular place you will find crimes there because a) crimes happen everywhere and b) cops can just juke the stats / make shit up / make arrests without cause.

[–] context@hexbear.net 22 points 3 weeks ago (1 children)

exactly. it's amazing to me that these nerds can talk themselves into creating an ouroboros like this because they don't actually bother to understand how any of this shit works, but i guess whatever justifies their salary...

[–] Infamousblt@hexbear.net 4 points 3 weeks ago* (last edited 3 weeks ago)

It's the result of other scientists pretending sociology isn't a science. Sociology shows that shit like this is worthless, so instead of working together with sociologists, they just ignore them.

[–] CDommunist@hexbear.net 22 points 3 weeks ago (1 children)

very-smart "using advanced AI, we have determined there will be more crime in the high crime area"

[–] context@hexbear.net 18 points 3 weeks ago (1 children)

it's even worse than that! they're treating crimes like they're forces of nature or fucking dice rolls to begin with, and completely ignoring the role police play in defining and creating crime and in the construction of criminality!

[–] TheLastHero@hexbear.net 4 points 3 weeks ago (1 children)

AI bringing back miasma theory, past crimes are creating bad odors in the area that are just turning previously pure citizens into criminals. I hope the government gives the police more military equipment to purge these evil vapors

[–] context@hexbear.net 3 points 3 weeks ago

exactly but we call it "broken window theory" to jazz it up a bit

[–] fox@hexbear.net 37 points 3 weeks ago (2 children)

With 90% accuracy it will successfully identify 90 out of 100 criminals and falsely accuse 100 out of 1000 innocent people.
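
the base-rate arithmetic in that comment can be sanity-checked with a few lines (confusion is a made-up helper name, and the 100 criminals / 1000 innocents split is the comment's own hypothetical, not anything from the article):

```rust
// Hypothetical numbers: a 90%-accurate classifier applied to
// 100 actual criminals and 1000 innocent people.
fn confusion(accuracy: f64, criminals: u32, innocents: u32) -> (u32, u32) {
    let true_positives = (accuracy * criminals as f64).round() as u32;
    let false_positives = ((1.0 - accuracy) * innocents as f64).round() as u32;
    (true_positives, false_positives)
}

fn main() {
    let (tp, fp) = confusion(0.90, 100, 1000);
    // 90 criminals correctly flagged vs 100 innocents falsely accused:
    // the false accusations outnumber the correct hits.
    println!("correctly flagged: {tp}, falsely accused: {fp}");
}
```

since innocent people vastly outnumber criminals, even a small false-positive rate produces more false accusations than true ones.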

[–] AntiOutsideAktion@hexbear.net 14 points 3 weeks ago (1 children)

It's better for ten innocent people to be jailed than for one full-time wage to be paid

[–] underwire212@lemm.ee 1 points 3 weeks ago

10 innocent people may be jailed, but it’s a risk I’m willing to take

[–] Wolfman86@hexbear.net 2 points 3 weeks ago

I can see anywhere prisons are privatized being massively in favour of this.

[–] Lussy@hexbear.net 25 points 3 weeks ago* (last edited 3 weeks ago)

us-foreign-policy

Wow how innovative

[–] SorosFootSoldier@hexbear.net 22 points 3 weeks ago

The Torment Nexus is only when you do 1984, not Minority Report.

[–] Feline@hexbear.net 21 points 3 weeks ago

U Chicago continuing its proud reactionary legacy https://en.wikipedia.org/wiki/Chicago_Boys

[–] gay_king_prince_charles@hexbear.net 21 points 3 weeks ago (2 children)
pub fn predict_crime(suspect: Person) -> bool {
   if suspect.race() == Race::Black {
       return true;
   } else {
       return false;
   }
}
[–] huf@hexbear.net 17 points 3 weeks ago (2 children)

ew...

pub fn predict_crime(suspect: Person) -> bool {
   return suspect.race() == Race::Black
}
[–] TheDoctor@hexbear.net 11 points 3 weeks ago (1 children)

Good change but also why is race a getter method while Race::Black is a constant enum? Is race an impure function dependent on global state? Is it derived from some other internal immutable state?

race() is a getter method as it is dependent on which Eastern and Southern Europeans are considered white at the time

[–] ProletarianDictator@hexbear.net 1 points 3 weeks ago (1 children)

you dont need the return statement either

[–] huf@hexbear.net 2 points 3 weeks ago (1 children)

i dont even know what language this is :D i just thought it'd be a nice bit to silently pass over the racism aspect and nitpick the code

[–] ProletarianDictator@hexbear.net 2 points 3 weeks ago

It's Rust.

If you omit the semicolon on the last line of a function body, that final expression's value becomes the function's return value, so suspect.race() == Race::Black already evaluates to the bool the function returns.
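
a minimal neutral demo of that expression-return rule (my own example, not the thread's snippet):

```rust
// In Rust, the last expression of a function body, written without a
// trailing semicolon, is the function's return value.
fn is_even(n: i32) -> bool {
    n % 2 == 0 // no `return` keyword, no semicolon: this is the value
}

fn main() {
    assert!(is_even(4));
    assert!(!is_even(7));
    println!("implicit return works");
}
```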

[–] Barx@hexbear.net 10 points 3 weeks ago

Nerds with a rudimentary understanding of undergrad stats do this all the time with extra steps by just building a simplistic model based on (racist) "crime data". Sometimes literally just a basic Bayesian model.

And they get hired by Palantir to do versions of that for $300k/year.
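
the feedback loop being described can be sketched in a few lines (all numbers invented; a crude count-normalization model stands in for the "basic Bayesian model", but the failure mode is the same):

```rust
// Two areas with an identical true crime rate, but area B gets twice
// the patrols, so twice the recorded arrests. A model trained on the
// recorded counts dutifully "predicts" more crime in area B.
fn risk_scores(patrols: &[f64], true_rate: f64) -> Vec<f64> {
    // recorded arrests scale with patrol intensity, not true crime
    let recorded: Vec<f64> = patrols.iter().map(|p| p * true_rate * 100.0).collect();
    let total: f64 = recorded.iter().sum();
    recorded.iter().map(|r| r / total).collect()
}

fn main() {
    // same underlying rate (0.05), area B patrolled twice as heavily
    let scores = risk_scores(&[10.0, 20.0], 0.05);
    println!("risk scores: {:?}", scores); // B scores twice as "risky"
}
```

the model's "high risk" output then justifies sending even more patrols to area B, which generates more recorded arrests, which raises the score further: the ouroboros from upthread.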

[–] glans@hexbear.net 18 points 3 weeks ago

It predicts police behaviour, not crime. And who can't do that?

[–] JoeByeThen@hexbear.net 18 points 3 weeks ago

Insider trading is probably very predictable, with enough data.

[–] DragonBallZinn@hexbear.net 17 points 3 weeks ago (2 children)

Can’t wait until, in “freedomland”, I get arrested not because I committed any crimes, but because I look like someone who might.

“Red always sus” but in real life.

[–] GalaxyBrain@hexbear.net 7 points 3 weeks ago (1 children)

That's been happening for a really, really long time already. It's called racism; now they're teaching it to computers.

[–] Wolfman86@hexbear.net 2 points 3 weeks ago (1 children)

Didn’t it come out early on that AI was racist?

It didn't really “come out”. Garbage in, garbage out has always been understood: models reflect the biases of their training data. No serious researcher was surprised, because that's inherent to the design.

[–] Evilphd666@hexbear.net 13 points 3 weeks ago

Does it poop out cute little billiard balls too?

[–] TheDoctor@hexbear.net 10 points 3 weeks ago (1 children)

We do not focus on predicting individual behavior, and do not suggest that anyone be charged with a crime that they didn’t commit, or be incarcerated for that. Our model learns from, and then predicts, event patterns in the urban space, for example, predicting that within a two block radius around the intersection of 67th Street and SW Avenue, there is a high risk of a homicide a week from now. It does not indicate who is going to be the victim or the perpetrator.

We found that when stressed, the law enforcement response is seemingly different in high socio-economic-status (SES) areas compared to their more disadvantaged neighboring communities. It is suggested in the paper, that when crime rates spike, the higher SES neighborhoods tend to get more attention at the cost of resources drawn away from poorer neighborhoods.

link

[–] Philosoraptor@hexbear.net 14 points 3 weeks ago

We found that when stressed, the law enforcement response is seemingly different in high socio-economic-status (SES) areas compared to their more disadvantaged neighboring communities. shocked-pikachu