The European Commission unveiled a new plan to combat child sexual abuse material Wednesday, and it's already drawing a backlash from privacy experts who say it would create a new and invasive surveillance regime in Europe.
The proposal would require tech companies in Europe to scan their platforms and products for CSAM and report their findings to law enforcement. Lots of tech companies already do some form of this, of course, using hashed versions of known CSAM to automatically block new uploads matching that content. But the European plan would take that work a step further, allowing EU countries to ask the courts to require tech companies to seek out and report new instances of CSAM. The plan also proposes using AI to detect patterns of language associated with grooming.
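For readers unfamiliar with how hash-matching works, here is a minimal sketch of the idea. It uses a plain SHA-256 digest purely for illustration; production systems such as Microsoft's PhotoDNA rely on perceptual hashes that still match after resizing or re-encoding, and the hash value and function names below are invented for the example.

```python
import hashlib

# Hypothetical set of hashes of known CSAM, as distributed by a
# clearinghouse such as NCMEC. The value below is a placeholder.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def should_block(upload_bytes: bytes) -> bool:
    """Return True if an uploaded file exactly matches a known hash."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

The key point of the controversy is that this kind of matching only catches already-known material; detecting new CSAM or grooming language, as the EU proposal envisions, requires analyzing the content of messages themselves.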
“We are failing to protect children today,” EU commissioner for Home Affairs Ylva Johansson said at a press conference.
But critics argue these requirements would risk breaking end-to-end encryption and would force companies to peer into the personal communications of all users. "This document is the most terrifying thing I’ve ever seen," Matthew Green, an associate professor at Johns Hopkins' Information Security Institute, tweeted after a draft of the proposal leaked. "Once you open up 'machines reading your text messages' for any purpose, there are no limits."
"Today is the day that the European Union declares war upon end-to-end #encryption, and demands access to every persons private messages on any platform in the name of protecting children," tweeted Alec Muffett, a leading security expert and former Facebook software engineer.
What the EU is proposing bears some resemblance to Apple's child safety plan, which the company introduced last summer only to retract it a few months later. At the time, Apple said it would scan iMessages for users under 17 and warn them if they were about to send or receive what Apple's systems deemed to be "sexually explicit" imagery. If those kids were under 13 and opted into family plans, Apple would notify their parents or guardians too. Apple also proposed scanning iCloud content for known CSAM and alerting the National Center for Missing and Exploited Children, or NCMEC, when the amount of matching content in a single account crossed a certain threshold.
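The thresholding piece of Apple's plan can be illustrated with a toy sketch like the one below. Apple's actual design used NeuralHash together with private set intersection and threshold secret sharing, so the company could learn nothing about an account until the threshold was crossed; the plain counter, function names, and threshold value here are simplifying assumptions for illustration only.

```python
from collections import defaultdict

# Assumed threshold for illustration; not Apple's published parameter.
MATCH_THRESHOLD = 30

# Running count of known-hash matches per account (toy, in-memory).
match_counts: dict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Record one known-hash match for an account; return True once
    the account reaches the reporting threshold."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD
```

In the real system, the counting happened under cryptographic cover rather than in a plain dictionary, which is precisely what made the design both novel and, to critics, worrying as a precedent.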
But Apple put the plan on pause following fierce opposition from privacy groups, as well as from advocates for LGBTQ+ youth, who said kids could be at even more risk if Apple were to out them to abusive authority figures.
"This is Apple all over again," Green tweeted.
There are some parts of the plan that may prove less controversial, like the creation of an EU version of NCMEC, which has become an important repository of known CSAM in the U.S. The need to combat CSAM is, after all, urgent and growing, and it's critical for companies to coordinate efforts.
It may be years before a final version of the proposal is approved by member states and the European Parliament. Until then, tech giants and privacy groups are likely to fight it with everything they've got.