TikTok is experimenting with a rating system that vets the age-appropriateness of videos in order to keep more mature content from reaching younger users.
The system, which is currently being tested on a small scale, borrows from the familiar rating system already used for film — G, PG, PG-13 and so on. But instead of showing those ratings to users, TikTok will use the ratings internally to categorize videos on the back end and automatically decide which users to show them to.
“When the system is fully launched, content that we've identified as containing overtly mature themes could be restricted from teens,” Tracy Elizabeth, TikTok’s global issue policy lead, said on a call with reporters. “And for content that has less overtly mature themes, or even just less mature themes, our community members are going to be able to choose from the comfort zones or content maturity that they would prefer to skip.”
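TikTok hasn't published how the back-end gating actually works, but the mechanics Elizabeth described suggest a filter over a user's feed candidates: a hard cap for teens, plus an optional stricter cap the user picks. The following is a minimal sketch of that idea in Python; the rating labels, age thresholds and function names are all hypothetical assumptions, not TikTok's actual system.

```python
from enum import IntEnum
from typing import Optional

# Hypothetical internal labels loosely modeled on film ratings; TikTok
# has not published its actual categories or thresholds.
class Maturity(IntEnum):
    G = 0
    PG = 1
    PG_13 = 2
    MATURE = 3

def eligible(video_rating: Maturity, user_age: int,
             user_cap: Optional[Maturity] = None) -> bool:
    """Decide whether a rated video may enter a user's feed candidates.

    Teens are hard-capped below MATURE (the "restricted from teens"
    case); any user may opt into a stricter cap of their own (the
    "comfort zones" case). Both thresholds are illustrative guesses.
    """
    cap = Maturity.MATURE if user_age >= 18 else Maturity.PG_13
    if user_cap is not None:
        cap = min(cap, user_cap)
    return video_rating <= cap

# A 15-year-old never sees MATURE-rated videos; an adult who chooses to
# skip PG-13 and above is filtered down to their chosen comfort zone.
print(eligible(Maturity.MATURE, 15))                       # False
print(eligible(Maturity.PG_13, 30, user_cap=Maturity.PG))  # False
```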
The company is also working on ways to allow creators themselves to target mature audiences and prevent their videos from being shown to young users. “We are drawing upon the types of content rating standards and maturity ratings expectations that regulators across the globe already have in place,” said Elizabeth. “So we're learning from them in order to make that type of approach applicable to our unique platform.”
Lawmakers and regulators around the world have been stepping up pressure on social media platforms, particularly TikTok, to protect kids online. The U.K.’s Age Appropriate Design Code, which went into effect in September, prompted a broad range of tech companies to change the way they deal with data associated with teenage users.
TikTok’s experiment with rating videos tries to tackle a different problem: Once kids turn 13, they largely have the exact same experience on social media that adult users have. Adjusting the algorithm to promote more age-appropriate content to younger users could address some of those concerns.
Of course, in order for that to work, TikTok still needs to know how old its users are — and if the numbers are any indication, that’s a particularly difficult problem for TikTok. The company’s most recent content enforcement report, which also went live Tuesday, shows TikTok removed more than 12 million suspected underage accounts in the third quarter of 2021. That’s up from more than 11 million accounts in the prior quarter. Facebook, by contrast, has previously reported removing about 600,000 underage Instagram accounts in a quarter, a significantly lower number for a substantially larger company.
To account for the contrast, Elizabeth said the company is aggressive in policing underage accounts and that “when in doubt, we're going to err on the side of being conservative and cautious.”
But determining users’ ages isn’t the only hurdle. The new rating system also opens TikTok up to misjudging the content itself, which could wind up limiting the reach of videos labeled as mature.
YouTube, for one, faced widespread backlash when users alleged that its Restricted Mode feature was blocking certain videos with LGBTQ+ themes. “Our system sometimes makes mistakes in understanding context and nuances when it assesses which videos to make available in Restricted Mode,” the company said at the time, before manually changing its restriction on a handful of videos. The company has continued to face allegations of censorship by YouTubers, who say the company unfairly demonetizes videos by queer creators. As TikTok expands the ratings test, it will no doubt face similar questions.
TikTok’s latest report also showed the company removed more than 91 million videos from the platform in the third quarter of 2021, the vast majority of which were removed before users reported them and before they had received any views.
In addition to removal stats, the company announced changes to its content policies, including banning content that includes dead-naming, misgendering or misogyny, or that promotes conversion therapy. The company will also no longer promote videos related to disordered eating.
On the call with reporters, TikTok’s head of U.S. Safety, Eric Han, talked about using this year “to mature as a safety team and as a platform.” While he didn’t say so outright, that includes emulating systems that more established U.S. tech companies, most notably Meta, already have in place. One example Han mentioned is a “prevalence” measurement: a number that shows not how many videos have been removed from the platform, but how much violative content users are actually seeing.
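Meta reports prevalence as roughly the share of sampled content views that land on violating content, rather than as a count of removals. As a rough sketch of the distinction Han drew, assuming that same views-based definition (the figures below are illustrative only, not TikTok's or Meta's actual numbers):

```python
def prevalence(violative_views: int, total_views: int) -> float:
    """Share of all sampled views that landed on violative content.

    Unlike a removal count, this measures how much rule-breaking
    content users actually saw before (or despite) enforcement.
    """
    return violative_views / total_views if total_views else 0.0

# Illustrative: 40,000 violative views out of 100 million sampled views.
print(f"{prevalence(40_000, 100_000_000):.3%}")  # 0.040%
```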
“Our approach has to become more sophisticated, especially as we grow and mature as a platform,” Han said.
Update: The headline of this article was updated to better reflect the scope of the filters that TikTok is testing. Updated Feb. 8, 2022.