Meta has partnered with a U.K. nonprofit to launch a tool that allows people to create unique identifiers — known as hashes — of sexually explicit or nude photos and submit those hashes to a nonprofit database. Tech platforms, including Facebook and Instagram, can then use that database to detect when one of those photos is posted or shared.
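In code terms, the flow looks roughly like the sketch below. It uses Python's built-in SHA-256 as a stand-in for the image hash and an in-memory set as a stand-in for the StopNCII.org database; both are illustrative assumptions, since the announcement doesn't describe the actual hashing algorithm or storage, but the submit-then-match idea is the same.

```python
import hashlib

# Illustrative stand-ins: an in-memory set plays the role of the StopNCII.org
# hash database, and SHA-256 plays the role of the image hash. Both are
# assumptions for demonstration only.
submitted_hashes: set[str] = set()

def hash_image(image_bytes: bytes) -> str:
    """Turn a photo's bytes into a fixed-length identifier (a hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def submit_report(image_bytes: bytes) -> None:
    """A person reports an image: only the hash is stored, never the photo."""
    submitted_hashes.add(hash_image(image_bytes))

def platform_checks_upload(upload_bytes: bytes) -> bool:
    """A participating platform hashes an upload and checks for a match."""
    return hash_image(upload_bytes) in submitted_hashes

# Hypothetical usage with a local file path.
with open("my_photo.jpg", "rb") as f:   # hypothetical file
    photo = f.read()
submit_report(photo)
print(platform_checks_upload(photo))    # True: the exact same bytes match
```

Note that a cryptographic hash like SHA-256 only catches byte-identical copies; real image-matching systems typically use perceptual hashes so that resized or recompressed copies still match.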
The tool — called StopNCII.org — is intended for people over the age of 18 who suspect an image of them may have been posted on a social platform without consent, a form of abuse known as nonconsensual intimate imagery (NCII) and commonly called revenge porn. It's launching with Facebook and Instagram as participating partners, with plans to expand to other platforms.
Facebook's previous attempt to build a revenge porn reporting system drew widespread condemnation. A 2017 pilot asked users to send sexually explicit photos of themselves through Messenger so the images could be proactively flagged if anyone tried to share them without consent. But users were wary of trusting Facebook with the very photos they were trying to keep from being shared in the first place.
The new tool, operated by the U.K. Revenge Porn Helpline rather than Meta, sidesteps that problem: people create the hashes directly on their own phones, so the photos themselves are never uploaded. "While participating companies use the hash they receive from StopNCII.org to identify images that someone has shared or is trying to share on their platforms, the original image never leaves the person's device. Only hashes, not the images themselves, are shared with StopNCII.org and participating tech platforms," Antigone Davis, Facebook's global head of safety, wrote in the announcement.

"This feature prevents further circulation of that content and keeps those images securely in the possession of the owner," Davis added.
Major tech companies already check content against another database of hashed photos, in that case known child sexual abuse material (CSAM), in an effort to curb its spread across the internet. Apple has also faced criticism from privacy advocates over its plan to scan personal iCloud accounts for photos matching known CSAM hashes, though it delayed the planned rollout in September following backlash over how the scanning could violate people's privacy and have unintended consequences.