November 26, 2025

Safety vs. snooping: choose your fighter

EU council reaches position on Chat Control

EU child-safety rules: no chat snooping, or just not yet

TLDR: EU member states backed child-safety rules requiring tech firms to assess risks and remove illegal content, with a new agency to help victims. Commenters split: some see no chat snooping; others fear "high risk" labels could pressure encrypted apps and startups. Parliament still has a say.

Europe just took a big step on its so-called “Chat Control” plan, backing rules to fight child abuse online by forcing tech companies to assess risks, curb grooming, and remove or block illegal content. A new EU Center would help victims and push platforms to act, while voluntary scanning stays on the table. That’s the headline — but the comments section lit up.

One camp is breathing easy. "No more intrusions into chats," cheered one user, saying this mostly builds help lines for victims. Another added, "Seems… fine," noting no explicit attack on encryption, then worried out loud that red tape could squeeze startups rather than predators. The vibe: a cautious thumbs-up, with a side-eye at the compliance burden.

Then came the plot twist. Skeptics say the devil lives in those risk assessments. If authorities can label a service “high risk,” they fear apps like Signal get cornered with “mitigation” demands and fines — a stealth path to pressure encrypted chats without saying “backdoor.” Cue the meme of the day: “competent authority,” which commenters turned into a running joke about who’s actually, well, competent.

Procedural geeks chimed in too: is this the final text, or will Parliament rewrite it? Translation: the cliffhanger isn’t over.

Key Points

  • EU member states agreed on the Council’s position for a regulation to prevent and combat child sexual abuse online.
  • Digital companies must prevent the dissemination of CSAM and the solicitation of children; authorities can order removals, blocking, and de-indexing.
  • A new EU agency—the EU Centre for the Prevention and Combating of Child Sexual Abuse—will support implementation and victims.
  • Providers must conduct risk assessments, implement mitigation measures, and face potential fines for non-compliance.
  • Online services are classified into high, medium, and low risk; high-risk providers can be required to help develop technologies to reduce risks; voluntary CSAM scanning becomes permanent.

Hottest takes

"Seems… fine? At least i dont see any invasion of privacy or encryption related obligations" — thecopy
"The crux is in those „risk assessments”, to be approved by authorities" — throw_a_grenade
"‘competent authority’ seems to conflate two traits that do not necessarily co-habit" — jacknews
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.