[–] xxce2AAb@feddit.dk 36 points 2 days ago* (last edited 2 days ago) (4 children)

They wanted everyone to scan their faces to look at legal porn featuring consensual performers, but now that it's synthetic CSAM featuring other people's kids, eh, they're not quite sure if intervention is really warranted?

Edit: Why, I'm starting to think it was never about "thinking of the children", or maybe I took it entirely the wrong way and they meant something completely different from what I'd assumed.

[–] ragepaw@lemmy.ca 19 points 2 days ago

It's never been about the children, but if you enact something "in the name of child safety" you can shout down any opposition by screaming "child predator" at anyone who objects.

After all, anyone who opposes a child safety bill must be a predator, right?

[–] eestileib@lemmy.blahaj.zone 8 points 2 days ago

Yuuuup. Ofcom wants to control you, not protect children.

[–] FishFace@piefed.social 2 points 2 days ago (1 children)

Eh? It's the exact same law being used, with the exact same problems. The reason it hasn't been done yet is that Twitter is not - or was not - a porn site, so it escaped scrutiny. But sites failing to comply with the Online Safety Act aren't just deleted from the internet immediately; there's an investigation first.

[–] UnspecificGravity@piefed.social 4 points 2 days ago (1 children)

It is a website that hosts a shit ton of porn and even features a tool for creating porn. How is that not a porn site?

[–] FishFace@piefed.social 0 points 2 days ago (1 children)

As far as I understand (going from what was reported in the last couple of days), Grok's ability to create porn is recent, so that explains that.

I don't use it, but my impression was that, Grok aside, the content is primarily not porn, which would make it not a porn site, surely.

[–] UnspecificGravity@piefed.social 4 points 2 days ago (1 children)

Twitter has always hosted a considerable amount of porn; Grok being deliberately developed into a CSAM machine is just the most recent thing.

To be clear, Twitter (and now X) explicitly permit pornographic content by policy:

https://help.x.com/en/rules-and-policies/adult-content

It is a porn site. Not a site where people sometimes break the rules and post porn, but a site that deliberately and intentionally hosts pornography.

[–] FishFace@piefed.social 1 points 2 days ago (1 children)

Fair enough. But it also (I just checked) requires age verification like regular porn sites, so I don't really get why the treatment of Twitter is being held up as some kind of double standard.

[–] UnspecificGravity@piefed.social 2 points 2 days ago (1 children)

They ARE NOT being subjected to the same age verification process as regular porn sites. That is the entire thing that we are talking about here.

[–] FishFace@piefed.social 0 points 2 days ago (1 children)

Do you mean that Twitter itself is not forcing all users to undergo ID verification, like for example Pornhub does?

Because that can be explained by the law not requiring a site that hosts adult content to go to those lengths, as long as that content isn't shown to users whose age isn't reliably known. If you think the law is being applied unfairly, maybe it would be worth being specific about which exact provision of the law is being applied to porn sites and not to Twitter.

[–] UnspecificGravity@piefed.social 1 points 2 days ago (1 children)

Congratulations, you just caught up to where the rest of us started this discussion.

[–] FishFace@piefed.social 2 points 2 days ago

Right, so this is people getting upset for no good reason.

The OSA was always a shit law, but the fact that it hasn't forced Twitter users to undergo ID verification is not evidence of that. It's not a problem that the law doesn't force every site hosting adult content to age-verify all users, because those sites can instead simply not show the adult content to unverified users.

But because it's a bad law, people will validate literally any complaint about it because it aligns with their other opinions.

[–] fonix232@fedia.io 2 points 2 days ago

tbf that face-scanning BS was 100% on the companies contracted to do the verification, because the moronic law set absolutely no controls on HOW a user's age must be verified or how the data used to verify it should be stored.