[Image: FTC building exterior. Caption: The FTC has already expressed an interest in protecting teens. Photo: bpperry/Getty Images]

Who needs Congress? The FTC is already taking on teen privacy.

Senators are due to hear from companies like TikTok and YouTube on Tuesday, but the FTC has already made moves that could radically change how the social media sites think about the privacy of kids and teens.

Congress is still feeling out the best approaches to making rules for young social media users, but the FTC is already moving forward with plans that could make the internet safer for children and teenagers.

On Tuesday, the same senators who heard from Facebook whistleblower Frances Haugen about mental health concerns for teens on Instagram will follow up by hearing from Snapchat, YouTube and TikTok about similar worries on their platforms.

The hearing, however, comes more than a month after the FTC declared it was making investigations of issues relating to teens a priority for the next 10 years.

It was a quiet announcement — just one of more than a dozen new priority areas where the commission has said it would expedite its ability to compel information from companies. Yet as Congress dithers, the announcement highlights the ways an ambitious and tech-skeptical FTC is trying to respond to changing assumptions about kids and teens online and the policies that social media services built themselves around.

"There are a lot of levers that the FTC can pull," said Phyllis Marcus, now a partner at Hunton Andrews Kurth who as an FTC staffer oversaw the commission's last rewrite of its regulations on online privacy for kids under 13.

For years, the rules Marcus worked on were essentially the last word in the privacy of young Americans online. Congress passed the Children's Online Privacy Protection Act in 1998, which enabled the FTC to write rules requiring sites directed to children under 13 to obtain parental consent to collect kids' information. The latest amendments to the rule were adopted in 2013.

The rules are more or less why the major social media services say they don't allow users 12 and under to sign up or use their platforms. For users 13 and up, data collection, whether for ad-targeting or for-profit edtech, is fair game — though the sites are well aware they have plenty of underage users, and YouTube and TikTok both paid fines over alleged COPPA violations in 2019.

Now, however, new information, such as Haugen's data about Instagram's effects on teenage girls, is leading lawmakers to reconsider how best to protect teens online.

"There may have been a viewpoint in some areas that teens were more like adults and kids were kids," Marcus said. "I think that notion is changing."

The FTC is mulling a full-scale regulation governing online privacy, although the move would likely generate significant blowback, and getting the votes to launch the rulemaking will probably require seating a third Democratic commissioner, who is still awaiting Senate confirmation. The FTC is also still processing answers to questions it sent in 2020 to Facebook, Snap, TikTok, YouTube, Twitter, Discord and other companies about their data practices, including those involving children and teens.

Any new rules the FTC adopts under the auspices of COPPA must, by law, focus on children 12 and under. That means the answers the agency receives from the companies could suggest ways that the commission could expand what information the rule covers and how it treats the responsibilities of platforms versus those of creators with regard to children who do use Big Tech services. The FTC could also weigh in on how the regulations should treat technologies that have emerged since 2013.

In 2019, for instance, the FTC asked for comments on how its children's online privacy regulations should approach "interactive gaming, or other similar interactive media" as well as the role of creators. The latter question is particularly important because, as part of YouTube's response to its record settlement, channel owners bear the responsibility for flagging their content as primarily directed to children under the FTC's definitions. That burden, and the resulting loss of the data that feeds ad revenue, falls to creators even though the platform has far greater access to user information and far deeper legal resources.

Results from the FTC's survey about data practices could also inform any future commission actions and guidance on privacy for teens, although any such rules would not fall under COPPA as it currently stands.

In addition, some children's privacy advocates in Congress are pushing companies to adopt in the U.S. the protections they already offer teens in a handful of countries with stronger regulation, with consequences from the FTC if the companies fall short.

Since September, companies operating in the U.K. have had to comply with the country's 15-point "Age Appropriate Design Code," which urges the strictest privacy protections for the youngest children, with progressively looser guardrails suggested up to age 17. Three Democratic lawmakers led by Sen. Ed Markey, who as a member of the House was key to the passage of COPPA, have urged companies to offer the same protections to kids and teens in the U.S. Some social media sites have even announced they'll bring certain changes they made in response to the U.K.'s rules to the U.S.

In response to the companies' voluntary commitments in the U.S., the lawmakers wrote to FTC chair Lina Khan earlier in October and urged the commission "to use all its authority to ensure that these powerful companies comply with their new policies, to hold them accountable if they fail to do so, and to prioritize the protection of children's and teen's privacy."

The three Democrats cited in particular the FTC's authority to punish companies that violate their public promises, a power that, thanks to the ubiquity of privacy policies but dearth of privacy laws, has made the commission the de facto federal privacy regulator for the U.S. Any new probes, meanwhile, would be fast-tracked under the FTC's announced policy of prioritizing matters relating to kids and teens.

The lawmakers, who also included Reps. Kathy Castor and Lori Trahan, seemed to acknowledge that Congress itself has fallen behind public concerns, while noting that in the meantime the FTC already has the power to fill some of the gaps.

"These policy changes are no substitute for congressional action on children's privacy," the three lawmakers wrote, "but they are important steps towards making the internet safer for young users."