Meta still hasn’t impressed Congress.
On Wednesday, Instagram chief Adam Mosseri addressed a Senate hearing on the safety of young users on the app, in an appearance that lawmakers demanded in the wake of revelations by whistleblower Frances Haugen about Meta’s effects on kids and teens.
Mosseri touted changes that Instagram had announced on Tuesday: new controls for teens meant as an offering to senators who have blasted Facebook and Instagram’s parent company, particularly after Haugen testified in front of the same subcommittee in October.
Mosseri made more concessions in his remarks, too, aimed at the longtime demands of the subcommittee’s chairman, Democratic Sen. Richard Blumenthal, who has pressed for more insight into the company’s operations and effects on users, and for changes to tech’s prized liability shield.
Blumenthal’s response, which was shared by almost every lawmaker who spoke, was that it wasn’t good enough, at least not anymore.
“These simple time-management and parental-oversight tools should have — they could have — been announced years ago,” he said. “These changes fall way short.”
Instagram’s new controls, some of which the company had previously floated, included limiting the types of recommendations it makes to teens. The app also said it would be nudging those users away from topics they’re dwelling on and urging young users in several countries to take a break from the app.
“A nudge? A break?” Blumenthal said at one point. “That ain’t going to save kids.”
Instagram’s changes respond to revelations in internal documents from Haugen, which suggested that the app made body image problems worse for teen girls. The documents also showed that some teen users essentially can’t shut the app off despite being aware it negatively affects them.
In response, many members of the panel holding the hearing said Meta can’t be trusted and demanded access to documents and research from the company, and from other social media platforms including Snapchat and TikTok. Other services have agreed to share some of their materials, and Mosseri did promise “meaningful access to data to third-party researchers outside the company.”
Yet Meta’s existing relationships with researchers have proven highly fraught when those researchers produced unflattering findings, and some lawmakers slammed Mosseri when he declined to embrace new legal requirements on transparency, tough enforcement mechanisms or other measures on privacy, competition and liability.
“This is now the fourth time in the past two years that we have spoken with someone from Meta,” said Sen. Marsha Blackburn, the top Republican on the panel. “I feel like the conversation continues to repeat itself ad nauseam.”
‘The new tobacco’
While lawmakers’ threats to regulate Big Tech have so far proven all bark and no bite, the safety of young users is a bipartisan issue and social media companies face more than just Congress. The platforms are also dealing, for instance, with international efforts to make the internet safer and more private for teens in addition to kids. (The pushes have also prompted occasional concerns about denying young users the benefits of social media.)
Teens’ experience online has previously differed little from adults’, and companies including Meta have unveiled a series of policy responses to try to get ahead of regulators’ desire for something like a PG-13 internet — more mature than what’s allowed for children, but not the free-for-all that those over 18 enjoy.
In July, for instance, as the U.K. prepared for its new “Age Appropriate Design Code” to go into effect, Meta said it would limit the options for targeting ads to users under 18 on all its properties. The company also said it would make private accounts the default for Instagram users under 16, although Blackburn told Mosseri that her staff had discovered young users could get around the setting. Mosseri said his team would correct a bug on the web interface.
In addition, the Federal Trade Commission has hinted that social media giants’ commitments abroad on teens should be brought to the U.S., and on Tuesday the surgeon general advised limits on “children’s exposure to harmful online content” as a way to combat youth mental health concerns.
Yet Instagram’s new policies on teen users — which will also let parents set time limits on teens’ usage starting next year — did little to allay lawmakers’ concerns about both teens and younger kids.
Meta, which is struggling to stay relevant with the young users who will make up the consumers of the future, has paused its plans for an Instagram Kids for those under age 13, but Mosseri repeatedly dodged on Wednesday when asked to end the project completely. The company has described the proposed app as a careful way to introduce kids to the social media services they’ll encounter as they grow.
“We all want teens to be safe online,” Mosseri said. “The internet isn’t going away.”
However, many critics of the idea, including several lawmakers and most state attorneys general, portray Instagram Kids as an effort to hook children with no regard for the harm the apps can cause.
“You’re the new tobacco whether you like it or not, and you’ve gotta stop selling the tobacco — in quotation marks — to kids,” Republican Sen. Mike Lee said. “Don’t let them have it. Don’t give it to them.”
Earning trust
Mosseri tried to add to Tuesday’s announcements live at the hearing. He said, for instance, the app is hoping to launch an option for people to view their feeds in chronological order next year. Haugen had urged the option in her testimony, suggesting a return to the time-ordered feeds that defined social media’s early years would dampen the frequency of shocking and extreme content that pops up in engagement-based orderings. Experts have suggested such changes might have a more modest effect, however, and would come with their own discontents.
Meta’s relationship with lawmakers has been deteriorating for years, and as Mosseri tried to pile on commitments and proposals, members of the subcommittee frequently shot him down, at times even dismissing ideas they might have celebrated before.
He suggested, for instance, “an industry body that will determine best practices” on age verification, age-appropriate design and parental controls. He added that services should have to earn some of their liability protections for user content under Section 230 by following the body’s standards. The idea echoed the EARN IT Act, a bill Blumenthal introduced in 2020, but Blumenthal himself scoffed at Mosseri’s suggestion as mere self-regulation.
“Self-policing depends on trust,” he said. “The trust is gone.”