More than 90 civil liberties organizations around the world sent a letter to Apple CEO Tim Cook on Thursday, urging him to walk back the company's plans to use machine learning to automatically detect child sexual abuse material on users' devices.
"Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children," the organizations wrote.
In the weeks since Apple's announcement, researchers and activists have worried that a feature that automatically notifies parents when their child under 13 sends or receives sexually explicit material could put LGBTQ+ youth in jeopardy. The groups reiterated that concern in their letter, writing, "An abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child's safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk."
The groups also worried about the slippery-slope effect of this feature. "Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit," the groups wrote.
Another feature, which automatically scans photos in iCloud for known child sexual abuse material and alerts the National Center for Missing and Exploited Children once a certain number of photos have been flagged, has been similarly controversial. "Once this capability is built into Apple products, the company and its competitors will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable," the groups wrote.
Apple has sought to address some of these concerns in the press. In an interview with The Wall Street Journal last week, Apple's senior vice president of software engineering, Craig Federighi, said the tools would be auditable. "If any changes were made that were to expand the scope of this in some way — in a way that we had committed to not doing — there's verifiability, [security researchers] can spot that that's happening," he said.