As the midterm election nears, TikTok has faced unrelenting scrutiny about the role it plays in spreading misinformation and the way influencers and political operatives skirt its advertising rules. But according to seven former employees from TikTok's trust and safety team, the company may have an even more basic problem inhibiting its efforts to secure the midterm election: High turnover among the employees who are supposed to carry out that work.
TikTok is still the new kid on the social media block. In 2018, the up-and-comer was barely a blip in the conversation about U.S. elections. By the 2020 election, it had built out its trust and safety team. But since that time, former employees told Protocol, members of that team have scattered, leaving TikTok with limited muscle memory just when it needs it most. “Since so many people are new, they don’t necessarily have the history or institutional knowledge,” one former employee said.
These former employees attributed the trust and safety team’s high attrition to TikTok’s grueling work culture and a lack of transparency from leadership about access to and policies around user data. “If they don’t stem the tide of all the people they’re losing, it’s going to be hard for them to be effective,” another former employee told Protocol.
TikTok refused to say exactly how many employees have left its trust and safety teams, or how many are employed now. The company also declined to specify exactly how the U.S. trust and safety team is structured, and if that structure has changed since 2020. In September, chief operating officer Vanessa Pappas told Congress that trust and safety is “our largest labor expense for TikTok’s U.S. operations,” with “thousands of people working across safety, privacy, and security on a daily basis.”
Protocol identified 239 people worldwide on LinkedIn who left TikTok’s trust and safety operations since 2021, with 94 of those leaving just this year and 67 based in the U.S. (LinkedIn member information may not fully represent staffing or attrition, due to the potential for fake accounts.) The company posted listings seeking to fill election misinformation roles as recently as October.
“We encourage a culture of transparency and feedback, and are committed to building an equitable platform and business that allows both our community and our employees to thrive,” a TikTok spokesperson told Protocol.
Why misinformation is particularly thorny on TikTok
Civil rights groups have been sounding the alarm about election misinformation for weeks. The Leadership Conference on Civil and Human Rights recently addressed a letter to social media companies urging them to tamp down the “Big Lie,” the false claim that President Joe Biden lost the 2020 election to former President Donald Trump, in addition to new misinformation. Conspiracy theories about general fraud within the U.S. election system abound not only on social media but also among a majority of Republican candidates on the ballot this fall.
Every platform is working on ways to tackle mis- and disinformation related to elections, but a few factors make it even harder on TikTok than elsewhere. For one, video is a challenging medium to analyze: It’s harder to extract information and search for keywords in images and audio than in text. YouTube faces similar challenges, but it has had years longer than TikTok to build up its defenses.
And then, of course, there are the challenges that have nothing to do with technology and everything to do with humans. Karan Lala, a fellow at the Integrity Institute and former software engineer at Meta, noted that mass enforcement is difficult because people might tell the same lie in completely different ways.
“Let’s say you review one video and say, ‘oh the content of this video was a lie,’” Lala said. “How do you effectively link that decision to all the videos that might be coming from different creators that phrased the lie in a different way?”
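The linking problem Lala describes is, in engineering terms, a semantic similarity search: once a reviewer labels one phrasing of a claim, a system has to find near-paraphrases across millions of other videos. The sketch below shows one common industry approach using sentence embeddings. It is not a description of TikTok’s actual systems; the model name, the similarity threshold, and the assumption that each video’s audio has already been transcribed are all illustrative.

```python
# A minimal sketch of claim matching with sentence embeddings: one common
# industry approach, not a description of TikTok's actual systems.
# Assumes video audio has already been transcribed by a speech-to-text step.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice

# A claim a human reviewer has already labeled as misinformation.
labeled_claim = "The 2020 election was stolen through mass ballot fraud."

# Transcripts of new videos, some phrasing the same lie differently.
transcripts = [
    "They rigged the vote in 2020; the ballots were fake.",
    "Here's how I make sourdough bread, starting with the starter.",
    "Millions of fraudulent ballots decided the last presidential election.",
]

claim_vec = model.encode([labeled_claim])[0]
transcript_vecs = model.encode(transcripts)

# Cosine similarity between the labeled claim and each transcript.
sims = transcript_vecs @ claim_vec / (
    np.linalg.norm(transcript_vecs, axis=1) * np.linalg.norm(claim_vec)
)

THRESHOLD = 0.6  # assumed cutoff; real systems tune this carefully
for text, sim in zip(transcripts, sims):
    verdict = "route to review" if sim >= THRESHOLD else "ignore"
    print(f"{sim:.2f}  {verdict}  {text}")
```

In practice, a score above the threshold would typically route a video to a human reviewer rather than trigger automatic removal, since paraphrase matching produces false positives.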
TikTok’s algorithm also largely displays content from strangers, which means information spreads far beyond a person’s social circle. You don’t need a following to go viral, so any video can reach an enormous audience. Because of this, and because researchers don’t have API access to TikTok (though the company promises to release an API soon), it’s especially hard for outside researchers and experts to recreate what an average “For You” page might look like.
“How do you keep TikTok’s For You page from picking a relatively obscure video that is harmful and blasting it to millions of people?” Odanga Madung, a researcher with the Mozilla Foundation, asked. “Because that's essentially what I was seeing on a consistent basis.”
Empowering trust and safety teams is critical to halting misinformation, but TikTok is not very transparent internally; it doesn’t even provide an organizational chart to employees. Several former employees told Protocol they felt disconnected from the teams working on TikTok’s algorithm in China, often having to wait for engineers there to respond to crises, such as critical keywords going unflagged.
“You need that team to have as much power and be as closely situated to the team that is building the algorithm in and of itself,” Lala said. “You need to be able to disrupt or slightly tweak the outcome of an algorithm based on integrity signals.”
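Lala is describing a standard integrity pattern: a classifier produces a risk signal, and the ranking layer demotes content in proportion to it. Here is a minimal, hypothetical sketch of that interaction; every field name, weight, and score is invented for illustration, not drawn from TikTok internals.

```python
# A minimal sketch of demoting content based on an integrity signal.
# Purely illustrative: field names, weights, and scores are assumptions,
# not TikTok internals.
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    engagement_score: float     # output of the main recommendation model
    misinfo_probability: float  # integrity signal from a separate classifier

def final_score(c: Candidate) -> float:
    # Demote in proportion to the integrity signal instead of hard-blocking,
    # so borderline content loses distribution without being removed.
    return c.engagement_score * (1.0 - 0.8 * c.misinfo_probability)

candidates = [
    Candidate("a", engagement_score=0.90, misinfo_probability=0.70),
    Candidate("b", engagement_score=0.60, misinfo_probability=0.05),
]

ranked = sorted(candidates, key=final_score, reverse=True)
# "b" outranks "a": the integrity signal outweighs raw engagement.
print([c.video_id for c in ranked])
```

The point Lala is making is organizational as much as technical: a demotion like this only works if the integrity team can actually get its signal into a ranking function that another team, possibly on another continent, owns.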
TikTok’s ties to China have also led to heightened scrutiny of its content moderation decisions, even above and beyond accusations of “censorship” that routinely get leveled at other platforms. “TikTok has received a huge amount of criticism for both moderating too much and moderating too little,” said Casey Fiesler, an online communities professor at University of Colorado, Boulder.
Trust and safety needs institutional memory
All of this makes for a complex trust and safety landscape inside TikTok, which was just beginning to take shape during the 2020 election. Like other platforms at the time, TikTok’s employees had to decide how to handle videos discussing Hunter Biden’s laptop. “Do we just take it down and assume it’s all misinformation because it’s unverified?” a former employee told Protocol. “What do we do so that we’re not ‘big bad China’ and we’re not censoring everyone?” The team eventually decided not to fully take those videos down, unless they seemed egregious.
Another challenge during the 2020 election: handling political party-based hype houses. The former employee told Protocol that the Republican Hype House was especially difficult to deal with. The house members kept referencing unsubstantiated QAnon-related claims, but TikTok didn’t want to be accused of suppressing partisan speech. TikTok employees warned the house several times, but never took the account down.
Several former employees told Protocol they were worried about the company’s ability to handle these types of issues amid high turnover. One said that only one of the U.S.-based employees currently working on threat analysis was at TikTok during the 2020 election. TikTok declined to comment on this claim.
David Polgar, founder of All Tech Is Human, said one of the reasons for high trust and safety attrition more broadly is the explosion of the field. Every tech company worth its salt is looking for quality trust and safety employees, he said: “If you’re doing trust and safety for a major platform that is on the up-and-up like TikTok, you are also a really hot commodity for any startup.”
Burnout is common among trust and safety professionals and may also be a factor in the high churn: Even employees who aren’t frontline content moderators work in a constant flow of disturbing content.
That type of turnover isn’t always a bad thing, said Katie Harbath, founder of Anchor Change and former public policy director at Facebook. “You are seeing people that were incubated in some of these other platforms, particularly your Metas and Googles, and also your Twitters and TikToks,” Harbath said. “They’re able to take that experience to other companies that may actually have nothing.”
TikTok, for its part, says it’s learned a lot from 2020 and is putting that knowledge to use this year. The company has already publicly released some of the lessons it learned after the 2020 election, including the need to improve its disinformation detection systems and to educate creators on its ban on political ads. TikTok launched its in-app election center six weeks earlier this year than in 2020 and is labeling content related to the 2022 midterms with a button leading to that center. Hashtags like #elections2022 also lead to the center and TikTok’s community guidelines. And while content is being fact-checked, it is excluded from For You feeds.
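That last measure is, mechanically, a simple eligibility gate on the recommendation pipeline. Here is a minimal sketch of the behavior the company describes, assuming a per-video fact-check status field; the status names, and the handling of debunked content, are assumptions rather than anything TikTok has published.

```python
# A minimal sketch of the fact-check gate described above: while a video's
# fact check is unresolved, it stays viewable but is held out of For You.
# Status names and debunk handling are assumptions for illustration.
from enum import Enum

class FactCheckStatus(Enum):
    NONE = "none"          # never flagged for checking
    PENDING = "pending"    # sent to fact-checkers, no verdict yet
    VERIFIED = "verified"  # checked and cleared
    DEBUNKED = "debunked"  # checked and found false

def eligible_for_for_you(status: FactCheckStatus) -> bool:
    return status in (FactCheckStatus.NONE, FactCheckStatus.VERIFIED)

assert not eligible_for_for_you(FactCheckStatus.PENDING)
assert not eligible_for_for_you(FactCheckStatus.DEBUNKED)
assert eligible_for_for_you(FactCheckStatus.VERIFIED)
```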
Around the world, TikTok has hired more trust and safety employees, opening a Europe, Middle East, and Africa hub in Dublin and expanding its hub in San Francisco. It also launched content advisory councils in the U.S., Asia, and Europe. But former employees fear none of that will be enough without a battle-tested team in place.
Two years after the 2020 election, misinformation about its process and outcome still abounds. Mozilla’s Madung said TikTok will need to remain vigilant in the weeks following the midterms. The overarching goal is to avoid violence on or around election day, but Madung said TikTok also needs to think about the deeper, pervasive damage misinformation causes. Lies are persistent.
“Platforms often tend to start the interventions too late, and then exit too early, and don’t recognize that election misinformation is an incredibly durable piece of information,” Madung said.