this post was submitted on 23 Mar 2026
61 points (96.9% liked)

Technology

all 39 comments
[–] BlackLaZoR@lemmy.world 3 points 3 hours ago

Reminds me of the case when Google effectively swatted a dude who sent medical images of his infant son's intimate areas to a doctor (due to COVID lockdown, a direct visit wasn't possible).

Nice to be accused of pedophilia based on your perfectly legit medical documentation.

Fuck you Google.

This reminds me that there are 1000s of sysadmins who stumbled onto the Trumpstein emails. They were not hiding anything at all, that shit was setting off flags everywhere. The question is what happened next? Did the CIA show up at their door telling them they'll put you in the ground if they have to, and remind them that their Google employee NDA shuts them up? Maybe the honeypot was so ingrained in Google that the emails were auto-flagged as "part of an ongoing investigation" and IT just ignored them.

[–] IAmNorRealTakeYourMeds@lemmy.world 48 points 15 hours ago (4 children)

severely mixed feelings.

glad they caught him, but corporations casually snooping through your data and reporting whatever they want is definitely not a good thing

[–] Dojan@pawb.social 28 points 14 hours ago* (last edited 13 hours ago) (1 children)

There was a gay guy here in Sweden who got assaulted and kidnapped by masked police because some American company had found CSAM on his account while crawling through his Yahoo email.

Only it wasn't CSAM; the photos depicted the man's 30-year-old twinky boyfriend.

No restitution. No police were punished for assaulting a suspect proven innocent. The man and his boyfriend were both humiliated.

I’ve no mixed feelings about it. Spying through private data is entirely unforgivable. There are plenty of pedos out there who get caught and nothing happens anyway. They don’t need to violate innocent people’s privacy to do their job.

Like if the ends justify the means you can end all suffering in the world by just nuking everything. All problems solved.

Edit: pesos → pedos

[–] HeyThisIsntTheYMCA@lemmy.world 1 points 47 minutes ago (1 children)

Oh gods, now you have me worried. 20 years ago I was a hundred pounds lighter, just a bag of skin holding a skeleton. There are some photos of me on my Google account that skinny (also in your medical textbooks, but anyways), and I also have photos of me now. We look like completely different people.

[–] bluejayway@lemmy.zip 1 points 10 minutes ago (1 children)

never too late to use another service to back up your photos! ente is a good alternative, i personally use proton drive (it’s kind of a crappy interface and not nearly as good as google, but it works). if you’re at all curious about self hosting, immich is basically a 1:1 google photos replacement.

[–] HeyThisIsntTheYMCA@lemmy.world 1 points 1 minute ago* (last edited 1 minute ago)

google is our offsite backup. i've got a decent onsite already. i haven't the energy to de-google right now but like, in six months maybe

[–] tidderuuf@lemmy.world 8 points 15 hours ago (1 children)

Microsoft has been doing this for years. It was with OneDrive at first, but now that they've enabled "analytics" in every product that might connect to the internet, they can have it all searched.

Supposedly it is first filtered by algorithms but that shit is still being uploaded somewhere other than your hard drive.

[–] wizardbeard@lemmy.dbzer0.com 4 points 14 hours ago (1 children)

I believe it was in preview builds of Windows 7 or 10 that researchers found it was sending the generated thumbnails of images on your PC to Redmond (MS HQ). Can't remember if they said it was for CSAM detection or just a debugging feature in the preview builds.

[–] 0x0@lemmy.zip 2 points 10 hours ago

the generated thumbnails of images on your PC

So the precursor to Recall?

[–] obvs@lemmy.world 6 points 14 hours ago (1 children)

Unfortunately, the negative effects from companies like Google turning in completely ethical people for doing things that should be completely legal and uncontroversial will do drastically more damage than the positive effects from said companies turning in the poorest of the pedophiles.

[–] Zamboni_Driver@lemmy.ca -3 points 11 hours ago (1 children)
[–] obvs@lemmy.world 2 points 6 hours ago (1 children)

The company is literally building death camps, installing statues of genociders, is run by RICH pedophiles (who have ZERO interest in seeing pedophiles prosecuted), and is using Palantir and Flock cameras to monitor everything, meanwhile having secret police disappear people and just openly slaughter them.

The United States Government is well beyond deserving the benefit of the doubt.

[–] Zamboni_Driver@lemmy.ca 1 points 5 hours ago (1 children)

Great, do you have a single example of what you're claiming? lol. Google turning in a perfectly ethical person for doing something that should be legal and uncontroversial.

You're moving the goal posts and changing your argument.

[–] HeyThisIsntTheYMCA@lemmy.world 1 points 44 minutes ago (2 children)
[–] Zamboni_Driver@lemmy.ca 1 points 32 minutes ago

I am so confused. Did you read the article that you posted???? Are you just straight up defending pedophilia and rape?

The Toronto detective alleges that after the alerts were passed to the RCMP and then Toronto police, she looked at three of the images and found they depicted naked prepubescent girls. The images included an explicit sex act and exposed genitals.

depicted who I believe to be David Edward-Ooi Poon without a shirt, taking a selfie of himself while sticking out his tongue over an unconscious adult female," the search-warrant application states. The document goes on to describe the woman in the photo as naked below the waist and wearing a dark-coloured eye mask over her eyes. The detective alleges that that photograph and others she examined appeared to be stored in a folder on the iPhone titled "Girls I Drugged And Raped."

The images included adult females with breasts and genitals exposed "who appeared to be unconscious," the ITO says. "The body positioning of the females appeared to be limp and did not significantly change throughout the images taken." Police allege they found other files on the iPhone that appeared to be "upskirt" images or photographs focusing on the buttocks of females, in folders with names suggesting they were underage girls.

Detectives laid 41 more charges in December including making and possessing child pornography, sexual assault, voyeurism for a sexual purpose and drugging someone to facilitate sexual assault.

Either you can't read, or you are an incredibly disgusting person.

[–] CompactFlax@discuss.tchncs.de -3 points 15 hours ago (3 children)

They’re suggesting it was automated hash based recognition.

I don’t have a problem with CSAM hash matching.

[–] UnspecificGravity@piefed.social 14 points 14 hours ago* (last edited 14 hours ago) (1 children)

Sure, until it starts flagging normal pictures with its janky AI and you get your door kicked in based on a warrant built on Google's say-so.

[–] Dojan@pawb.social 8 points 14 hours ago

This literally already happened here in Sweden. A guy got assaulted by masked police in the middle of the night because an American company had gone through photos in his Yahoo mail and flagged his 30 year old boyfriend as possible CSAM.

Long article in Swedish.

People like to think that Sweden is progressive etc. and I’d rebut it with this. If it can happen here, it could happen anywhere.

[–] IAmNorRealTakeYourMeds@lemmy.world 8 points 14 hours ago (2 children)

my issue is that we have a framework for corporations to scan all your data and inform the state. it's used to stop CSAM today, but it's a matter of state policy whether said structure will be used to fight dissent.

[–] Nindelofocho@lemmy.world 1 points 5 hours ago

Eventually “sprinkle some crack on him” will turn into “put some CSAM in his google drive”

[–] CompactFlax@discuss.tchncs.de 6 points 14 hours ago

I agree. We’ve seen this happening in the USA “yes technically they can do that but they would never”. Now we know better.

[–] org@lemmy.org 2 points 14 hours ago (1 children)

"The first image in the 'Nudity' collection … depicted who I believe to be David Edward-Ooi Poon without a shirt, taking a selfie of himself while sticking out his tongue over an unconscious adult female," the search-warrant application states. The document goes on to describe the woman in the photo as naked below the waist and wearing a dark-coloured eye mask over her eyes.

The detective alleges that that photograph and others she examined appeared to be stored in a folder on the iPhone titled "Girls I Drugged And Raped."

Doesn’t sound like hashes to me.

[–] CompactFlax@discuss.tchncs.de 3 points 14 hours ago* (last edited 14 hours ago)

That is the result of the search warrant, not the trigger.

[–] Pika@sh.itjust.works 30 points 14 hours ago* (last edited 14 hours ago) (2 children)

In the US (where the company is located, last I knew), companies are legally mandated to report specific things such as CSAM if they come across them.

The issue shouldn't be the fact that they're reporting it; the issue should be that they have the capability to see it in the first place.

This isn't me defending CSAM or anything like that, but in a decent storage system Google shouldn't even be able to see what you have, let alone what the images actually are.
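A storage system where the provider can't see your files is what client-side (end-to-end) encryption gives you: data is encrypted before upload, so the server only ever holds ciphertext. A toy sketch in Python, using a SHA-256 counter-mode keystream purely to illustrate the idea (this is NOT a vetted cipher; a real system would use something like AES-GCM from an audited library):

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Toy SHA-256 counter-mode keystream. Illustration only, not a real cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream. Symmetric: applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The key never leaves the client; the provider stores only ciphertext.
key = secrets.token_bytes(32)
ciphertext = encrypt(key, b"vacation photo bytes")
assert encrypt(key, ciphertext) == b"vacation photo bytes"
```

The trade-off is exactly the one in this thread: a provider holding only ciphertext cannot scan it, for CSAM or anything else.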

[–] BrianTheeBiscuiteer@lemmy.world 2 points 5 hours ago (1 children)

Today it's for CSAM. Tomorrow it could be for saying anything negative about dear leader. Our Constitution clearly won't protect us.

[–] HeyThisIsntTheYMCA@lemmy.world 1 points 40 minutes ago* (last edited 39 minutes ago)

Not to get too pedantic, but dammit I just got off the phone with a lawyer. The constitution itself never did anything directly to the public. It outlines the powers given to and withheld from the main branches of the federal government of the US. Those branches empower the agencies that you expect to protect you. Yup, all the three letter agencies you hate.

[–] Blue_Morpho@lemmy.world 8 points 14 hours ago (1 children)

They not only look at your files but will decrypt any encrypted zip files to see what you have.

https://news.ycombinator.com/item?id=37086814

[–] Quetzalcutlass@lemmy.world 1 points 21 minutes ago* (last edited 21 minutes ago)

That seems less like them decrypting encrypted archives and more like the zip format not encrypting filenames so they're easily read from the zip's metadata.

Which is still a privacy violation, to be clear, but not nearly on the same scale as somehow obtaining and using your passwords to decrypt data you yourself encrypted.
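The distinction is easy to see with the standard library: in a classic ZipCrypto archive only the file contents are encrypted, while the central directory, including every filename, stays in the clear. A sketch using an unencrypted in-memory archive (Python's zipfile can read encrypted entries but can't write them; listing names works the same way on an encrypted archive, no password needed):

```python
import io
import zipfile

# Build an in-memory archive with two entries.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("holiday/IMG_0042.jpg", b"...")
    zf.writestr("holiday/IMG_0043.jpg", b"...")

# Filenames come from the central directory, which ZipCrypto leaves
# unencrypted, so namelist() never requires a password.
names = zipfile.ZipFile(io.BytesIO(buf.getvalue())).namelist()
print(names)  # ['holiday/IMG_0042.jpg', 'holiday/IMG_0043.jpg']
```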

[–] MountingSuspicion@reddthat.com 15 points 16 hours ago (2 children)

The detective alleges that that photograph and others she examined appeared to be stored in a folder on the iPhone titled "Girls I Drugged And Raped."

[–] herrvogel@lemmy.world 5 points 14 hours ago (1 children)

Also the guy's last name is Poon.

[–] db2@lemmy.world 4 points 13 hours ago

I thought you both were trying to be funny. How is that even real.

[–] org@lemmy.org 3 points 14 hours ago (2 children)
[–] Zetta@mander.xyz 1 points 13 hours ago

If you read the article, it looks like only 9 images were originally reported by Google. The images in the folder called "Girls I Drugged And Raped" were on his iPhone, which they broke into with Cellebrite.

[–] MountingSuspicion@reddthat.com 1 points 14 hours ago

The images I am referring to are likely distinct from the ones in the title, as they are from his iPhone and Google is who reported him. Regardless, the article says the detective looked at one of the Google-reported images. Whether they just referenced a known hash I don't know for sure, but I think it's pretty well known that FAANG companies scan basically all images for CSAM nowadays.

[–] Crt_static@lemmy.world -4 points 12 hours ago (1 children)

Kind of fine with this. It gives me the ick that they can do that, but so does CSAM, and I don't see a middle ground.

[–] BUGS@lemmy.org 5 points 11 hours ago (1 children)

I don't know why anyone would expect Google isn't sifting through what you upload to its cloud.

[–] Crt_static@lemmy.world 1 points 5 hours ago

I never expected any privacy from Google. Combing through data is their business model.