submitted 9 months ago by KarnaSubarna@lemmy.ml to c/privacy@lemmy.ml

The idea behind predictive policing is that by feeding historical crime data into a computer algorithm, it’s possible to determine where crime is most likely to occur, or who is most likely to offend. Law enforcement officials can then make proactive interventions, like conducting patrols in predicted crime locations, ideally stopping crime before it occurs.

“Predictive policing systems rely on historical data distorted by falsified crime reports and disproportionate arrests of people of color,” the letter continues. “As a result, they are prone to over-predicting crime rates in Black and Latino neighborhoods while under-predicting crime in white neighborhoods. The continued use of such systems creates a dangerous feedback loop: biased predictions are used to justify disproportionate stops and arrests in minority neighborhoods, which further biases statistics on where crimes are happening.”

Cameron was part of a joint effort between The Markup and Gizmodo that published an investigation in 2021 showing how a predictive policing algorithm developed by a company called Geolitica disproportionately directed officers to patrol marginalized communities almost everywhere it was used.

all 10 comments
[-] redknight942@sh.itjust.works 19 points 9 months ago

Approximately 0 of these people saw Minority Report and realized that PreCrime was a bad idea.

[-] AtmaJnana@lemmy.world 5 points 9 months ago

These are authoritarians. They saw it as a how-to.

[-] trolololol@lemmy.world 3 points 9 months ago

Haha I came here to write this

At least the precogs could actually see the future, instead of being a tool for confirmation bias like the one this news report describes.

Seriously, if you want to take AI seriously you need a feedback loop. First round, sure, use historical data because there's nothing else to use. Then keep training the model and feed it the actual outcomes of the successful and unsuccessful cases where it was used. It's not rocket science, folks. Reminder: the success rate here was 1%, so if you're not blind you'll pretty much see a giant red flag.

Only THEN is it even worth weeding out discrimination bias, which is likely not to happen because discrimination is effectively the success criterion. But I digress.
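The feedback loop the article and this comment describe can be sketched in a toy simulation (all district names and numbers here are made up for illustration, not taken from any real system): a naive model retrained only on arrest counts keeps amplifying its own patrol decisions, while one retrained on outcomes that don't depend on patrol placement (e.g. victim-reported crimes) does not.

```python
# Toy simulation of the predictive-policing feedback loop.
# Two districts with IDENTICAL true crime rates; historical arrest
# data is skewed toward district A.
true_rate = {"A": 0.10, "B": 0.10}   # actual (unobservable) crime rates
arrests = {"A": 30, "B": 10}         # skewed historical data

def predict(counts):
    """Naive model: predicted risk is proportional to past counts."""
    total = sum(counts.values())
    return {d: c / total for d, c in counts.items()}

# Biased loop: patrols follow predictions, arrests scale with patrols,
# and the model is retrained on the arrests it caused.
risk = predict(arrests)
for _ in range(5):
    for d in arrests:
        patrol_level = risk[d]                       # patrols follow the model
        arrests[d] += int(100 * patrol_level * true_rate[d])
    risk = predict(arrests)
print("arrest-trained risk:", risk)                  # A keeps pulling ahead

# Outcome feedback: retrain on verified crimes, which occur at the true
# rate regardless of where patrols were sent.
outcomes = {"A": 0, "B": 0}
for _ in range(5):
    for d in outcomes:
        outcomes[d] += int(100 * true_rate[d])
print("outcome-trained risk:", predict(outcomes))    # converges to 50/50
```

The point of the sketch is the commenter's: without a ground-truth outcome signal independent of the model's own interventions, retraining just launders the original bias back into the data.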

[-] possiblylinux127@lemmy.zip 1 points 9 months ago

The movie isn't great but it does get one thing right.

[-] MonsiuerPatEBrown@reddthat.com 14 points 9 months ago

Whatever you do, don't tweet at your local PD asking if they use these types of software, like someone did to my local police department years ago.

It will yield bad results.

[-] BobGnarley@lemm.ee 4 points 9 months ago
[-] trolololol@lemmy.world 1 points 9 months ago

We need to know

[-] possiblylinux127@lemmy.zip 4 points 9 months ago

I can't imagine how predictive policing could possibly go wrong. Also, how do they use it in court? You are innocent until proven guilty.

[-] nooneescapesthelaw@mander.xyz 1 points 9 months ago

It just tells cops where to patrol, they don't arrest you based on it...

this post was submitted on 30 Jan 2024
142 points (99.3% liked)
