TikTok is pushing COVID-19 misinformation to children and teens within minutes of their creating a new account, whether or not they actively engage with videos on the platform, a new report has found.
The report, published Wednesday by the media rating firm NewsGuard, raises questions not only about how effectively TikTok is enforcing its medical misinformation policies, but also about how its own recommendation algorithms are actively undermining those policies.
In the report, NewsGuard researchers describe the results of a limited experiment they conducted in August and September, in which they asked nine kids between the ages of 9 and 17 to create brand-new TikTok accounts and record their experiences on the app over the course of 45 minutes. In that time, and under the supervision of their parents, eight of the nine kids were shown COVID-19 misinformation, the report says. That was true even for the four kids who were told not to follow any accounts or interact with the videos they saw.
In all, the kids in the experiment were shown 32 COVID-19 misinformation videos over the 45-minute period, including videos that claimed COVID-19 vaccines kill people and that COVID-19 is actually "the name of the international plan for the control and reduction of populations." One of the teens in the experiment who was instructed to engage heavily with health-related content on TikTok was "almost exclusively shown misinformation" within a 30-minute time frame, the report reads.
"The thing that surprised me was the speed," said Alex Cadier, NewsGuard's U.K. managing director, who co-authored the report. "With limited engagement, eight minutes in, you started seeing conspiracy theories related to COVID-19."
In a statement, a TikTok spokesperson said: "The safety and well-being of our community is our priority, and we work diligently to take action on content and accounts that spread misinformation while also promoting authoritative content about COVID-19 and educating users about media literacy."
The NewsGuard experiment builds on similar investigations, including one by The Wall Street Journal, which recently used 100 bot accounts to analyze how TikTok's recommendation algorithm uses "subtle cues" to drive users toward increasingly niche content, including videos that glorify eating disorders and depression.
While TikTok may have defenses in place for redirecting active searches for COVID-19 misinformation, the report's findings suggest that its hypersensitive recommendation algorithm may be actively pushing those videos anyway. TikTok's community guidelines prohibit posting COVID-19 and vaccine misinformation, and the company says it redirects searches associated with both to its community guidelines. But Cadier said many of the videos surfaced in the experiment did not have labels, and those that did directed users to national health authorities' websites, rather than engaging with the substance of the videos themselves.
The report also calls into question TikTok's defenses against underage signups. The platform nominally prohibits anyone under 13 from creating an account, and TikTok did remove 7 million suspected underage accounts this year. The company also offers a separate version of its platform, with additional safeguards, for users under 13.
But children under 13 can easily sign up for the regular TikTok experience by giving a fake date of birth. That's not a problem unique to TikTok, but given the platform's popularity with young people and its fledgling content-moderation operation, NewsGuard's researchers argue the issue is particularly concerning there. "It's of concern across the board," Cadier said, "but with TikTok, it's compounded by the fact that once you're through the fig-leaf safeguard that is the date-of-birth check, the information seems a lot more unchecked. Once you're through, TikTok can slightly feel like the Wild West of misinformation."