As the midterm election nears, TikTok has faced unrelenting scrutiny about the role it plays in spreading misinformation and the way influencers and political operatives skirt its advertising rules. But according to seven former employees from TikTok's trust and safety team, the company may have an even more basic problem inhibiting its efforts to secure the midterm election: High turnover among the employees who are supposed to carry out that work.
TikTok is still the new kid on the social media block. In 2018, the up-and-comer was barely a blip in the conversation about U.S. elections. By the 2020 election, it had built out its trust and safety team. But since that time, former employees told Protocol, members of that team have scattered, leaving TikTok with limited muscle memory just when it needs it most. "Since so many people are new, they don't necessarily have the history or institutional knowledge," one former employee said.
These former employees attributed the trust and safety team's high attrition to TikTok's grueling work culture and a lack of transparency from leadership about access to and policies around user data. "If they don't stem the tide of all the people they're losing, it's going to be hard for them to be effective," another former employee told Protocol.
TikTok refused to say exactly how many employees have left its trust and safety teams, or how many are employed now. The company also declined to specify exactly how the U.S. trust and safety team is structured, and whether that structure has changed since 2020. In September, chief operating officer Vanessa Pappas told Congress that trust and safety is "our largest labor expense for TikTok's U.S. operations," with "thousands of people working across safety, privacy, and security on a daily basis."
Protocol identified 239 people worldwide on LinkedIn who left TikTok's trust and safety operations since 2021, with 94 of those leaving just this year and 67 based in the U.S. (LinkedIn member information may not fully represent staffing or attrition, due to the potential for fake accounts.) The company posted listings seeking to fill election misinformation roles as recently as October.
"We encourage a culture of transparency and feedback, and are committed to building an equitable platform and business that allows both our community and our employees to thrive," a TikTok spokesperson told Protocol.
Why misinformation is particularly thorny on TikTok
Civil rights groups have been sounding the alarm about election misinformation for weeks. The Leadership Conference on Civil and Human Rights recently addressed a letter to social media companies urging them to tamp down on the "Big Lie," false claims that President Joe Biden lost the 2020 election to former President Donald Trump, in addition to new misinformation. Conspiracy theories about general fraud within the U.S. election system abound not only on social media but also among a majority of Republican candidates on the ballot this fall.
Every platform is working on ways to tackle mis- and disinformation related to elections, but a few factors make it even harder on TikTok than elsewhere. For one, video can be a challenging medium to analyze: It's harder to extract information and search for keywords in images and audio than in text. YouTube faces similar challenges, but it's been around for far longer than TikTok.
And then, of course, there are the challenges that have nothing to do with technology and everything to do with humans. Karan Lala, a fellow at the Integrity Institute and former software engineer at Meta, noted that mass enforcement is difficult because people might tell the same lie in completely different ways.
"Let's say you review one video and say, 'oh, the content of this video was a lie,'" Lala said. "How do you effectively link that decision to all the videos that might be coming from different creators that phrased the lie in a different way?"
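The linking problem Lala describes can be sketched in a few lines. The toy example below is not TikTok's actual system; real moderation pipelines use learned embeddings over video, audio, and text, while this sketch stands in simple word overlap (Jaccard similarity), with hypothetical captions and a made-up 0.4 threshold:

```python
# Toy sketch of the claim-linking problem: given one caption labeled as a
# lie, try to find other captions making the same claim. Word overlap is a
# crude stand-in for the semantic matching real systems attempt; captions
# and the threshold are illustrative only.

def jaccard(a: str, b: str) -> float:
    """Share of unique words the two captions have in common."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

labeled_lie = "the 2020 election was stolen"

candidates = [
    "they stole the 2020 election",   # light paraphrase
    "trump actually won in 2020",     # same claim, almost no shared words
    "my favorite pasta recipe",       # unrelated
]

matches = [c for c in candidates if jaccard(labeled_lie, c) > 0.4]
print(matches)  # only the light paraphrase clears the threshold
```

The second candidate expresses the same underlying claim yet shares almost no vocabulary with the labeled example, so surface matching misses it entirely. That gap between how a lie is worded and what it asserts is precisely why mass enforcement is hard.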
TikTok's algorithm also largely displays content from strangers, which means information spreads far beyond a person's social circle. You don't necessarily need a following to go viral, so any video might have infinite reach. Because of this (and because researchers don't have access to TikTok via an API, though TikTok promises to release one soon), it's especially hard for outside researchers and experts to recreate what an average "For You" page might look like.
"How do you keep TikTok's For You page from picking a relatively obscure video that is harmful and blasting it to millions of people?" Odanga Madung, a researcher with the Mozilla Foundation, asked. "Because that's essentially what I was seeing on a consistent basis."
Empowering trust and safety teams is critical in halting misinformation. TikTok is not very transparent internally, as it doesn't provide an organizational chart to employees. Several former employees told Protocol they felt disconnected from the teams working on TikTok's algorithm in China, often having to wait for engineers in China to respond to crises such as unflagging critical keywords.
"You need that team to have as much power and be as closely situated to the team that is building the algorithm in and of itself," Lala said. "You need to be able to disrupt or slightly tweak the outcome of an algorithm based on integrity signals."
TikTok's ties to China have also led to heightened scrutiny of its content moderation decisions, even above and beyond the accusations of "censorship" that routinely get leveled at other platforms. "TikTok has received a huge amount of criticism for both moderating too much and moderating too little," said Casey Fiesler, an online communities professor at the University of Colorado, Boulder.
Trust and safety needs institutional memory
All of this makes for a complex trust and safety landscape inside TikTok, one that was just beginning to take shape during the 2020 election. Like other platforms at the time, TikTok had to decide how to handle videos discussing Hunter Biden's laptop. "Do we just take it down and assume it's all misinformation because it's unverified?" a former employee told Protocol. "What do we do so that we're not 'big bad China' and we're not censoring everyone?" The team eventually decided not to fully take those videos down unless they seemed egregious.
Another challenge during the 2020 election: handling political party-based hype houses. The former employee told Protocol that the Republican Hype House was especially difficult to deal with. The house members kept referencing unsubstantiated QAnon-related claims, but TikTok didn't want to be accused of suppressing partisan speech. TikTok employees warned the house several times, but never took the account down.
Several former employees told Protocol they were worried about the company's ability to handle these types of issues amid high turnover. One former TikTok employee said that only one of the U.S.-based employees currently working on the threat-analyst side was at TikTok during the 2020 election. TikTok declined to comment on this claim.
David Polgar, founder of All Tech Is Human, said one of the reasons for high trust and safety attrition more broadly is the explosion of the field. Every tech company worth its salt is looking for quality trust and safety employees, he said: "If you're doing trust and safety for a major platform that is on the up-and-up like TikTok, you are also a really hot commodity for any startup."
Burnout is common among trust and safety professionals and may also be a factor in high churn. Even if you're not a direct content moderator, you're working in a constant flow of disturbing content.
That type of turnover isn't always a bad thing, said Katie Harbath, founder of Anchor Change and former public policy director at Facebook. "You are seeing people that were incubated in some of these other platforms, particularly your Metas and Googles, and also your Twitters and TikToks," Harbath said. "They're able to take that experience to other companies that may actually have nothing."
TikTok, for its part, says it's learned a lot from 2020 and is putting that knowledge to use this year. The company has already publicly released some of the lessons it learned after the 2020 election, including the need to improve its disinformation detection systems and to educate creators on TikTok's zero political ads policy. TikTok released its in-app election center six weeks earlier this year than in 2020 and is labeling content related to the 2022 midterms with a button leading to that center. Hashtags like #elections2022 will also lead to the center and TikTok's community guidelines. While content is being fact-checked, it is excluded from For You feeds.
Around the world, TikTok has hired more trust and safety employees, opening a Europe, Middle East, and Africa hub in Dublin and expanding its hub in San Francisco. It also launched content advisory councils in the U.S., Asia, and Europe. But former employees fear none of that will be enough without a battle-tested team in place.
Two years since the 2020 election, misinformation about the process and outcome still abounds. Mozilla's Madung said TikTok will need to remain vigilant in the weeks following the midterms. The overarching goal is to avoid violence on or around election day, but Madung said TikTok needs to think about the deeper, pervasive damage caused by misinformation. Lies are persistent.
"Platforms often tend to start the interventions too late, and then exit too early, and don't recognize that election misinformation is an incredibly durable piece of information," Madung said.