this post was submitted on 01 Dec 2025
98 points (95.4% liked)

Buy European
There's a lot of talk about the new Chat Control regulation here, with many scary statements but few details, so I decided to check what's actually in it. The best way to learn about it is to just read it:

https://data.consilium.europa.eu/doc/document/ST-15318-2025-INIT/en/pdf

It's long but not hard to understand. If you don't have the time, here's a short summary for you.

First, two points:

  1. There's no TLDR. It's complex legislation and it can't be summarized in two sentences.
  2. I will only say what's in the law. I don't care about how you think corporations will break it or conspiracy theories about what's really behind it. Feel free to post them, but I will just ignore them.

So, to the point. The goal of the law is to prevent and combat child sexual abuse, and it applies to hosting providers and publicly available communication services. At the core of the regulation is risk assessment. Each service will have to assess the risk that it will be used to distribute CSAM or to groom children. The risk is based on factors like:

  • are there known cases of the service being used for such things
  • is there a mechanism to report CSAM by the users
  • does it have parental control features
  • do kids use it
  • can you search for users and identify underage users
  • can you send photos and videos using private chats
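The factors above read like a checklist. Purely as an illustration (the regulation prescribes no algorithm, scoring formula, or thresholds; every name and weight below is hypothetical), the assessment could be modeled like this:

```python
# Illustrative only: the regulation defines no scoring formula.
# All field names, weights, and the example values are hypothetical.
from dataclasses import dataclass

@dataclass
class RiskFactors:
    known_abuse_cases: bool           # known cases of CSAM distribution/grooming
    has_user_reporting: bool          # users can report CSAM
    has_parental_controls: bool       # parental control features exist
    used_by_minors: bool              # kids use the service
    user_search_exposes_minors: bool  # you can search for and identify underage users
    private_media_sharing: bool       # photos/videos can be sent in private chats

def risk_score(f: RiskFactors) -> int:
    """Count risk-raising factors; subtract mitigations already in place."""
    score = sum([f.known_abuse_cases, f.used_by_minors,
                 f.user_search_exposes_minors, f.private_media_sharing])
    score -= sum([f.has_user_reporting, f.has_parental_controls])
    return score

# A Signal-like service: E2EE media sharing, no user search, reporting exists.
signal_like = RiskFactors(
    known_abuse_cases=True, has_user_reporting=True,
    has_parental_controls=False, used_by_minors=False,
    user_search_exposes_minors=False, private_media_sharing=True,
)
print(risk_score(signal_like))  # 1 -> relatively low
```

In reality the assessment is a written document the provider submits, not a number, but the inputs are the same kind of yes/no questions.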

Once the risk assessment is done, providers will have to address the identified risks. They can do this by implementing things like:

  • moderation (having proper tools and staffing)
  • letting users report child abuse
  • letting users control what personal information is visible to other users and how other users can contact them
  • voluntarily doing things covered by Chat Control 1.0 (i.e. client-side message scanning)

If the provider identifies that the service can be used for grooming, it should implement age verification. The regulation says that age verification "shall be privacy preserving, respecting the principles relating to the processing of personal data, notably the principles of lawfulness, purpose limitation and data minimisation".
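The regulation doesn't prescribe a mechanism, but "data minimisation" points toward attestation schemes where the service learns only a yes/no answer, not your identity or date of birth. A hypothetical sketch of the idea (all names are invented; real schemes, such as those planned around the EU Digital Identity Wallet, use asymmetric credentials, not a shared secret like this demo):

```python
import hashlib
import hmac
import json

# Hypothetical: an age-verification issuer signs a minimal claim.
# Real deployments would use asymmetric signatures (e.g. ECDSA), not an
# HMAC key shared between issuer and service. Demo key is a placeholder.
ISSUER_KEY = b"demo-issuer-key"

def issue_token(is_adult: bool) -> dict:
    """Issuer checks the user's ID privately and signs ONLY a boolean."""
    claim = json.dumps({"over_18": is_adult}).encode()
    sig = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": claim.decode(), "sig": sig}

def verify_token(token: dict) -> bool:
    """Service verifies the signature; it never sees a name or birth date."""
    expected = hmac.new(ISSUER_KEY, token["claim"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False  # tampered or forged token
    return json.loads(token["claim"])["over_18"]

token = issue_token(True)
print(verify_token(token))  # True: the service learns "adult", nothing else
```

The point of the sketch: the service storing your date of birth is one possible implementation, not something the text requires.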

Regarding E2EE it clearly states that "the regulation shall not prohibit, make impossible, weaken, circumvent or otherwise undermine cybersecurity measures, in particular encryption, including end-to-end encryption".

So, does it allow all services to require an ID and store all your personal data? No. Can messengers break E2EE and scan all your messages? Also no.

What can happen is that some services will verify your age and store your date of birth. Anything beyond that will still be illegal and protected by GDPR. Providers can keep doing whatever they have been doing under Chat Control 1.0 (which has applied since 2021), but E2EE is still protected.

Knowing all that, let's think about how it will apply to some example services. This is just my best guess, but I think these are reasonable assumptions:

Signal: it does not let you search for other users, you don't share any personal information with anyone, and strangers can't message you. There's very low risk that it will be used for grooming, so it doesn't have to do any age verification. It allows you to share videos, has E2EE, and I believe there were known cases of it being used to share CSAM. Based on that, it COULD scan media client-side as allowed by Chat Control 1.0, but that's voluntary. It will have to implement tools to report CSAM.

Roblox: users of different ages interacting with each other, known cases of grooming on the platform: it should implement age verification.

Pornhub: low risk of grooming: no age verification. High risk of distributing CSAM: moderation and reporting.

Lemmy: my guess would be it's not used by many kids, instances don't have communities targeting children, it doesn't have tools that let you easily find kids on the platform: low risk of grooming and no age verification necessary. It can be used to publish CSAM and it has happened in the past: it should have proper moderation and reporting functionality and it can scan media and compare it against known CSAM hashes.
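Comparing media against known CSAM hashes, as mentioned for Lemmy, means hashing each upload and looking the digest up in a blocklist maintained by a hotline or clearinghouse. A minimal sketch, assuming a plain SHA-256 set (real systems use perceptual hashes like PhotoDNA or PDQ so that re-encoded or resized copies still match; this exact-match version does not do that):

```python
import hashlib

# Hypothetical blocklist: in practice these hashes come from a database
# maintained by a hotline/clearinghouse, not hardcoded in the service.
KNOWN_BAD_HASHES = {
    # SHA-256 of the empty file, used here as a harmless example entry:
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_match(file_bytes: bytes) -> bool:
    """Exact-match check: hash the upload, look it up in the blocklist.

    A cryptographic hash only catches byte-identical files; real
    deployments add perceptual hashing to survive re-encoding.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

print(is_known_match(b""))         # True  (matches the example entry)
print(is_known_match(b"cat.jpg"))  # False
```

Note this kind of hash matching on uploaded (unencrypted) media is the server-side scanning that services like image hosts already do voluntarily; it's a different thing from scanning inside E2EE chats.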

That's pretty much it. This is what Chat Control 2.0 is all about.

[–] 87Six@lemmy.zip 10 points 1 month ago

Control was never and never will be the answer to fucking anything, and education will always be the answer to all