this post was submitted on 11 Feb 2026
128 points (97.8% liked)

Technology


In the days after the US Department of Justice (DOJ) published 3.5 million pages of documents related to the late sex offender Jeffrey Epstein, multiple users on X have asked Grok to “unblur” or remove the black boxes covering the faces of children and women in images that were meant to protect their privacy.

top 35 comments
[–] Paranoidfactoid@lemmy.world 9 points 55 minutes ago (1 children)

How do these AI models generate nude imagery of children without having been trained with data containing illegal images of nude children?

[–] RedGreenBlue@lemmy.zip 2 points 18 minutes ago

Can't ask them to sort that out. Are you anti-AI? That's a crime! /s

[–] ToTheGraveMyLove@sh.itjust.works 32 points 2 hours ago* (last edited 2 hours ago) (3 children)

Are these people fucking stupid? AI can't remove something hardcoded into the image. The only way for it to "remove" it is by placing a different image over it, but since it has no idea what's underneath, it would literally just be making up a new image that has nothing to do with the content of the original. Jfc, people are morons. I'm disappointed the article doesn't explicitly state that either.
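
A minimal sketch of why that's true, assuming only numpy and two hypothetical stand-in "faces": a black box overwrites the pixels, so every possible underlying face maps to the same output, and nothing can recover which one it was.

```python
import numpy as np

# Hypothetical stand-ins for two different underlying faces.
rng = np.random.default_rng(0)
face_a = rng.integers(0, 256, (64, 64), dtype=np.uint8)
face_b = rng.integers(0, 256, (64, 64), dtype=np.uint8)

def redact(img: np.ndarray) -> np.ndarray:
    return np.zeros_like(img)  # overwrite the region with solid black

# Identical outputs from different inputs: the information is gone, and
# any "unblurred" face a model paints in is invention, not recovery.
print(np.array_equal(redact(face_a), redact(face_b)))  # True
```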

[–] usualsuspect191@lemmy.ca 18 points 1 hour ago* (last edited 37 minutes ago) (2 children)

The black boxes would be impossible, but some types of blur keep enough of the original data that they can be undone. There was a pedophile who used a swirl to cover his face in pictures, and investigators were able to unswirl the images and identify him.

With how the rest of it has gone, it wouldn't surprise me if someone was incompetent enough to use a reversible one, although I have doubts Grok would do it properly.

Edit: this technique only works for video, but maybe if there are several pictures of the same person, all blurred, it could be used there too?

https://youtu.be/acKYYwcxpGk
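
For what it's worth, the swirl case checks out in principle: a swirl only relocates pixels rather than averaging them away, so applying the opposite rotation approximately restores the image. A minimal sketch, assuming scikit-image is installed and using its bundled sample photo:

```python
import numpy as np
from skimage import data
from skimage.transform import swirl

original = data.camera()                                # sample photo
swirled = swirl(original, strength=10, radius=200)      # "anonymize"
restored = swirl(swirled, strength=-10, radius=200)     # opposite swirl

# Interpolation costs a little sharpness each pass, but the content comes
# back; a black box or a heavy averaging blur has no inverse like this.
print(np.abs(original / 255 - restored).mean())         # small residual
```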

[–] Barracuda@lemmy.zip 3 points 1 hour ago

A swirl is a distortion that is non-destructive. An anonymity blur averages out pixels over a wide area in a repetitive manner, which destroys information. Would it be possible to reverse? Maybe a little bit. Maybe one pixel out of every hundred, but there wouldn't be any way to prove the accuracy of that pixel, and there would be massive gaps in information.
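
A minimal numpy sketch of that many-to-one point: an averaging blur can map different source blocks to the exact same output, so no solver can tell them apart afterwards.

```python
import numpy as np

block_a = np.array([[0.0, 200.0], [200.0, 0.0]])      # hypothetical detail
block_b = np.array([[100.0, 100.0], [100.0, 100.0]])  # hypothetical flat patch

def box_blur(block: np.ndarray) -> np.ndarray:
    return np.full_like(block, block.mean())  # average over the block

print(box_blur(block_a))  # [[100. 100.] [100. 100.]]
print(box_blur(block_b))  # identical output from a different input
```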

[–] BarneyPiccolo@lemmy.today 3 points 1 hour ago

Several years ago, authorities were searching the world for a guy who had been going around molesting children, photographing them, and distributing the photos on the Internet. He was often in the photos, but he had chosen to use some sort of swirl blur on his face to hide it. The authorities just "unswirled" it, and there was his face, in all those photos of abused children.

They caught him soon after.

[–] pkjqpg1h@lemmy.zip 2 points 1 hour ago

Actually, there is a short video on that page that explains this with examples.

[–] Pyr_Pressure@lemmy.ca 5 points 2 hours ago* (last edited 2 hours ago) (2 children)

There was someone who reported that, due to the incompetence of White House staffers, some of the Epstein files had simply been "redacted" in MS Word by highlighting the text black, so people were actually able to remove the redactions by converting the PDF back into Word and removing the black highlighting to reveal the text.

Who knows if some of the photos might have the same issue.

[–] unmagical@lemmy.ml 1 points 1 hour ago

It was simpler than that. You can just copy the black-highlighted text and paste it anywhere.
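
That failure mode is easy to demonstrate: if the text objects are still in the PDF's content stream, any extractor reads straight past a black highlight. A minimal sketch with pypdf, using a hypothetical file name:

```python
from pypdf import PdfReader

reader = PdfReader("redacted.pdf")  # hypothetical badly-redacted file
for page in reader.pages:
    # extract_text() reads the content stream itself, so text that was
    # merely highlighted black comes out in the clear.
    print(page.extract_text())
```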

[–] KyuubiNoKitsune@lemmy.blahaj.zone 1 points 1 hour ago (1 children)

That's not how images like PNGs or JPGs work.

[–] unmagical@lemmy.ml 1 points 1 hour ago (1 children)

In the case of what wound up on Roman Numeral Ten (formerly Twitter), that's correct, but given the actual PDF dump from the government, if they just slapped an annotation on top of the image, it would be possible to remove it and reveal what's underneath.

[–] KyuubiNoKitsune@lemmy.blahaj.zone 1 points 42 minutes ago

I didn't realise that they released the images as PDFs too.

[–] tehn00bi@lemmy.world 1 points 1 hour ago

So my company was involved in a lawsuit, and I was asked to help review files and redact information. They used specific software that all the files were loaded into, and the software performed the redactions and saved the redacted files. It really is mind-blowing that the government wouldn't use a similar process.
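
The difference with purpose-built tools is that they destroy the underlying content before saving, rather than drawing over it. A rough sketch of that principle in plain Python, with a hypothetical helper that stands in for no particular product:

```python
def redact_text(text: str, secrets: list[str]) -> str:
    """Replace sensitive strings outright, so no overlay can be peeled off."""
    for secret in secrets:
        text = text.replace(secret, "\u2588" * len(secret))  # solid blocks
    return text

print(redact_text("Passenger: J. Doe, seat 2A", ["J. Doe"]))
# Passenger: ██████, seat 2A
```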

[–] melsaskca@lemmy.ca 7 points 2 hours ago (1 children)

Of course they are. Who's left on Twitter nowadays? Elon acolytes?

[–] pkjqpg1h@lemmy.zip 2 points 1 hour ago

When I realized that tweets from paid accounts always get stuck at the top? Really?? I immediately stopped using it.

[–] nymnympseudonym@piefed.social 12 points 3 hours ago (1 children)

I doubt any of these people are accessing X over Tor. Their accounts and IPs are known.

In a sane world, they'd be prosecuted.
In MAGAMERICA, they are protected by the Spirit of Epstein

[–] clay_pidgin@sh.itjust.works 2 points 1 hour ago

What crime do you imagine they would be committing?

I don't know what they hope to gain by seeing the kid's face, unless they think they can match it up with an Epstein family member or something (seems unlikely to be their goal).

[–] pkjqpg1h@lemmy.zip 73 points 6 hours ago (6 children)

unblur the face with 1000% accuracy

They have no idea how these models work :D

[–] criss_cross@lemmy.world 6 points 1 hour ago (1 children)

It’s the same energy as “don’t hallucinate and just say if you don’t know the answer”

[–] pkjqpg1h@lemmy.zip 3 points 1 hour ago

and don't forget "make no mistakes" :D

[–] pkjqpg1h@lemmy.zip 101 points 5 hours ago (2 children)
[–] cupcakezealot@piefed.blahaj.zone 53 points 5 hours ago

biblically accurate cw casting

[–] TheBat@lemmy.world 12 points 5 hours ago

Barrett O'Brien

[–] annoyed_onion@lemmy.world 31 points 5 hours ago

Though it is 2026. Who's to say Elon didn't feed the unredacted files into Grok while out of his face on ket 🙃

[–] otter@lemmy.ca 18 points 5 hours ago (1 children)

It feels like being back on the playground

"nuh uh, my laser is 1000% more powerful"

"oh yea, mine is googleplex percent more powerful"

[–] albbi@piefed.ca 2 points 2 hours ago (1 children)

Wait, what? My son has been using "googleplex" when he wants a really big number. I thought it was a weird word he made up. I guess it's a thing....

[–] CannonFodder@lemmy.world 5 points 1 hour ago* (last edited 1 hour ago)

It is, with a slightly different spelling. A googol is 10^100, and a googolplex is 10^(googol), or, written conventionally, a one followed by a metric shit ton of zeros.
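
In Python's arbitrary-precision integers, a googol is easy to check; a googolplex, written out, would need a googol zeros, which is why it only ever appears in exponent form:

```python
googol = 10 ** 100
print(len(str(googol)))  # 101 digits: a one followed by 100 zeros

# A googolplex is 10 ** googol; writing it out would take googol + 1
# digits, so it is only ever handled symbolically.
```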

[–] Armand1@lemmy.world 9 points 5 hours ago

Or percentages

[–] SpicyLizards@reddthat.com 4 points 3 hours ago

And Grok, being trained on Elon's web history, doesn't need to be asked to find, let alone unblur, said images.

[–] My_IFAKs___gone@lemmy.world 2 points 3 hours ago
[–] Sims@lemmy.ml 3 points 5 hours ago

"Bellingcat" paid for 'damage-control' ?