As pressure mounts on Big Tech to do more to protect its youngest users, Snap is launching a new family center that will allow parents to see who their kids are friends with on Snap and report suspicious accounts.
It’s part of a wave of new kid safety features being launched by tech giants, including Meta and Apple. But Snap arguably has a bigger hill to climb in implementing these features than either of those companies. To get parents of teens to use Snap’s parental controls, those parents first need to actually use Snap.
Snap has spent a little over a year working on the family center, conducting a study with more than 9,000 parents and teens around the world to assess their needs and, hopefully, avoid the privacy pitfalls that have derailed other companies’ efforts. Snap landed on a tool that will give parents access to their kids’ friend lists, but not the content of their conversations — a setup Snap says mirrors the dynamic parents and teens often already have offline. “Parents will know who's coming over and be able to ask questions about who their teen is hanging out with, but they won't be in the basement with them listening to their conversation,” said Nona Farahnik Yadegar, Snap’s director of platform policy.
The new family center will be rolled out first in the U.S., U.K., Australia, New Zealand and Canada before expanding globally this fall. It includes features that lawmakers and regulators around the world either already require of platforms or are considering requiring. The Kids Online Safety Act, which recently advanced out of the Senate Commerce Committee, would require platforms to make it easy for parents to report harm to minors. The U.K.’s Age Appropriate Design Code, meanwhile, requires certain parental controls and disclosures to children about those features.
Yadegar insists the family center isn’t a response to the legislative landscape. (“Tech moves at a speed that doesn’t always wait for regulation,” she said.) But considering these tools are coming more than a decade into Snapchat’s existence, following so much hand-wringing over its impact on kids, it’s hard to imagine the sudden uptick in regulatory pressure didn’t have just a little something to do with it.
Snap is hardly alone. Instagram recently introduced a new facial analysis tool to weed out underage users and rolled out nudges that encourage younger users to stop scrolling. TikTok is using filters to keep kids from being served mature content. And then, of course, there was Apple’s disastrous attempt to combat child sexual abuse material with new features that would scan users’ iCloud photos for CSAM and alert parents to “sexually explicit” images their child was sending or receiving. Apple paused the project amid concerns about kids’ safety and privacy, but it has since implemented a watered-down version that scans kids’ iMessages for nudity and serves them a warning notice without alerting their parents or guardians.
Unlike Instagram and Apple, though, Snap skews young, meaning the company will need to do more than just send existing users a notification letting them know this thing exists. Snap is also working with parenting bloggers and running targeted videos promoting the tool on other social media platforms to get the word out. “Parents are not typically on Snapchat,” Yadegar said. “We’re really going to do our best to make sure we speak to parents in other avenues, so they learn about this tool and can use it.”
And if that push happens to boost Snap’s user base, well, it wouldn’t hurt.