January 13, 2026
Minority Report for your DMs
UK Expands Online Safety Act to Mandate Preemptive Scanning
Phones will judge your pics before you do — “1984 vibes”
TLDR: UK now requires apps to pre-scan messages and photos to block cyberflashing and self-harm content. Commenters blasted it as surveillance creep, joking about “pre-cog” tech and asking who decides what’s “unwanted,” while a smaller crowd backed the move for safety—fueling a big privacy vs. protection fight.
The UK just flipped the switch on a bigger Online Safety Act: platforms must scan messages, pics, and searches in real time to block “cyberflashing” and content that encourages serious self-harm. Officials showed a promo phone that flags an “unwanted nude,” and the internet instantly turned into a courtroom. Commenters piled in with “UK has fallen” and “Pre-cog, you say?”, riffing on Minority Report and asking how an app can tell what’s “wanted.” The fines, up to 10% of global turnover or £18M, only dialed the drama higher.
Supporters cheered the aim of safer spaces for women and girls and pointed to the government’s press release and the expanded Online Safety Act. But the loudest chorus warned of a slide into “always-on surveillance,” saying private chats are becoming monitored hallways. One user snarked that every platform already has a block button, while another dropped the “Ministry of Truth” meme. The big fight: safety vs. privacy. Is preemptive scanning smart protection, or a machine making moral calls it can’t possibly understand? The meme factory delivered “Dick-Pic Defense Grid” jokes and side-eye at the word “unwanted,” with users arguing that algorithms can’t read consent or context. However it’s framed, this is a major test of trust in tech, and in government.
Key Points
- The UK enacted the Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2025 on January 8, 2026.
- “Cyberflashing” and “encouraging or assisting serious self-harm” are now designated as priority offences under the OSA.
- Platforms must proactively scan, detect, and block prohibited content before users can see it.
- Compliance is expected to rely on automated scanning, content detection algorithms, and AI models.
- Non-compliance can result in fines of up to 10% of global turnover or £18 million (whichever is greater) and potential blocking of the service in the UK.
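The proactive-blocking model described in the points above boils down to a gate that runs before content is ever delivered to the recipient. As a rough illustration only, here is a minimal Python sketch: the `classify_risk` function, the keyword heuristic, the `BLOCK_THRESHOLD` value, and the category labels are all hypothetical stand-ins, not anything specified by the Act or used by any real platform (real deployments would rely on trained image and text classifiers).

```python
# Illustrative sketch only: how a pre-delivery scanning gate might be wired.
# Every name and value here is a hypothetical stand-in, not a real system.

from dataclasses import dataclass


@dataclass
class ScanResult:
    score: float   # classifier confidence that content is prohibited (0.0-1.0)
    category: str  # e.g. "self-harm", "cyberflashing", or "none"


def classify_risk(message: str) -> ScanResult:
    """Toy keyword heuristic standing in for a trained classifier."""
    flagged = {"selfharm-trigger": "self-harm"}  # hypothetical label map
    for keyword, category in flagged.items():
        if keyword in message:
            return ScanResult(score=0.95, category=category)
    return ScanResult(score=0.02, category="none")


BLOCK_THRESHOLD = 0.8  # policy tuning parameter, assumed for illustration


def deliver(message: str) -> str:
    """The gate runs before the recipient ever sees the content."""
    result = classify_risk(message)
    if result.score >= BLOCK_THRESHOLD:
        return f"blocked ({result.category})"
    return "delivered"
```

The commenters' core objection maps directly onto `BLOCK_THRESHOLD` and the label map: someone has to decide what counts as “unwanted,” and a fixed rule cannot read consent or context.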