After months of hearings and debates, Sens. Richard Blumenthal and Marsha Blackburn have introduced their long-promised legislation to require tech platforms to implement new controls for minors and their parents and make changes to mitigate harms to kids online.
The Kids Online Safety Act applies to any app or online service that could be used by kids 16 and younger. Under the bill, those platforms would have a duty to prevent the promotion of certain harmful behaviors, including suicide and self-harm, eating disorders, substance abuse and more. They would also have to give parents and users under 16 the ability to opt out of algorithmic recommendations, prevent third parties from viewing a minor's data and limit the time kids spend on the platform, among other things. Those controls would have to be turned on by default. The bill also includes provisions regarding platforms' disclosure policies and advertising systems.
"Big Tech has brazenly failed children and betrayed its trust, putting profits above safety. Seared in my memory — and motivating my passion — are countless harrowing stories from Connecticut and across the country about heartbreaking loss, destructive emotional rabbit holes, and addictive dark places rampant on social media," Blumenthal said in a statement. "The Kids Online Safety Act would finally give kids and their parents the tools and safeguards they need to protect against toxic content — and hold Big Tech accountable for deeply dangerous algorithms. Algorithms driven by eyeballs and dollars will no longer hold sway."
Momentum for this kind of legislation has been growing since whistleblower Frances Haugen came forward with internal research on Instagram's impact on teens' mental health. Congress has since called Instagram head Adam Mosseri and Facebook's head of safety, Antigone Davis, to testify on those findings. Leaders from TikTok, Snap and YouTube have also testified about the well-being of kids on their apps.
"This legislation is the product of those hours of hearings, those hours of work that have gone into answering that question: How do we make certain that our children are safer online?" Sen. Blackburn said on a call with reporters Wednesday.
In addition to the changes the bill would make to the platforms themselves, it would also bring some much-needed transparency to what platforms know about their impact on kids. That, of course, was part of the shock value of Haugen's revelations: the idea that Instagram had discovered one thing in private and that its leaders were saying another thing in public. The Kids Online Safety Act would require covered companies to issue an annual public report about potential risks to minors.
It would also require the National Telecommunications and Information Administration to set up a program under which researchers interested in studying kids' safety on a given platform could apply for data sets that companies would then have to hand over. And it would give safe harbor to researchers who collect data on harms to minors on their own. Tech giants have recently clashed with the research community over data scraping practices.
Sen. Blumenthal said on the call that while the White House has not offered any formal support for the bill, "informally there have been very encouraging comments from the White House."
The bill follows the implementation of the U.K.'s Age Appropriate Design Code last fall, which has already compelled platforms to make changes with regard to how they treat kids' data. TikTok, meanwhile, announced this month that it's testing age-based safety ratings to filter out mature content for younger users.
This story has been updated to include additional details from the press call.