Content moderation is challenging, largely hidden work all over the globe. China’s tens of thousands of content moderators are no exception: They’re an invisible, overworked workforce who keep social apps from Douyin and WeChat to Bilibili and Weibo running. But unlike other overworked groups, such as gig delivery workers, their plight was nearly unknown to the public until the recent tragic death of a worker at one of China’s leading video platforms.
On Feb. 4, a 25-year-old content moderator for Bilibili died from a brain hemorrhage. Screenshots that his colleagues at Bilibili leaked to a tech blogger, which later circulated online, showed he had been asked to work extra shifts during the Lunar New Year, when almost the entire country’s workforce was on holiday. But Bilibili denied in an internal email that he had worked overtime before his death, emphasizing he had only worked regular eight-hour shifts the previous week.
Bilibili’s response to the unexpected death and ensuing public criticism was to hire 1,000 additional moderators to reduce individuals’ workloads, and to offer additional health screening and mental health counseling to the existing censorship workers.
The tragic death startled tens of millions of Chinese web users, sparking heated online discussions about the demanding nature of low-level work in Chinese tech companies, as well as the pervasiveness of censorship in China. The incident also prompted many former and current censors for various Chinese internet platforms to come out and talk about their own grueling experiences.
Content moderation is critical to keeping platforms running, but the individuals toiling away behind the screens are certainly not treated as essential workers. One former Bilibili content moderator, using the pseudonym Chen Rou, told Chinese news aggregator Sohu that cameras are installed throughout the office to monitor them 24/7, and team leaders micromanage workers’ lunch breaks. The worst part, they added, is the endless overtime and performance assessments: To pass the monthly assessments, a content moderator must spend no more than 24 seconds screening each video clip and must screen at least 1,500 clips per day. A new hire who misses the target too often is fired during their six-month probation period. “To save their jobs, new hires often have to clock in voluntary hours to make up for the set quota,” the worker said.
The salaries don’t match the stress of the job, Chen added, telling Sohu that a front-line content moderator’s monthly take-home pay after taxes is about 4,000 RMB ($631). In comparison, the 2021 average monthly base salary for private sector workers in Shanghai was 8,528 RMB ($1,348). And if someone fails the monthly exam, they’ll only get half that pay, another pseudonymous former Bilibili content moderator, Zhou Zhuo, said. This wage has barely budged from what a content moderator was paid 10 years ago, according to Liu Lipeng, who worked as an internet censor and a content quality manager at several Chinese tech companies for nearly 10 years.
As with other tech jobs in China, logging overtime is commonplace. White-collar tech workers are theoretically no longer held to the so-called “996” schedule, but content moderators face an even more exhausting one. According to the former Bilibili censorship workers, they had to work a 12-hour shift every other day. On top of that, they were required to provide support from home during their off hours, and shifts that ran long by less than two hours earned no overtime pay. “To meet their KPIs, some people might log in an extra six to eight hours,” Zhou told Sohu.
Moderation and censorship
Content moderation jobs are some of the most labor-intensive at any social platform in any country. Censorship workers at Chinese companies describe an unforgiving work environment much like what their peers at U.S. firms such as Facebook have reported: an overly taxing, yet low-paying, job that leaves them barely any breaks throughout a long shift. The workers are heavily exposed to spam, crime, abuse, violence and more. In the U.S., a former content moderator sued Facebook (now Meta) in 2018, alleging that she developed PTSD on the job; the company later paid $52 million in a class-action settlement.
But in China, the stakes are much higher. Not only do moderators have to handle objectionable content, but they also have to contend with outsized demand for political censorship. Workplace trauma is rarely discussed in public, even though Chinese content moderators arguably work under more extreme circumstances than their peers globally. In the U.S., a moderator missing something major could result in a bad PR cycle and maybe a congressional hearing. In China, the consequences are graver: The entire platform could be immediately shut down.
Even though many Chinese web users self-censor, and political speech comprises only a tiny part of deleted online content, “company executives are always on edge,” a former ByteDance tech worker told Protocol.
To mitigate risk, social platforms have developed AI-powered tools to make content moderators’ work more efficient. Still, they rely heavily on armies of censors to manually screen everything posted online, and, in the case of videos, to review clips before they go live.
TikTok’s parent company, ByteDance, employs the biggest number of content moderators among Chinese tech companies by far: It has about 20,000 contract and in-house censors.
Not every Chinese tech company is as deep-pocketed as ByteDance. Bilibili, the firm at the center of the current controversy, revealed in a prospectus submitted to the Hong Kong Stock Exchange that it employed 2,413 moderators at the end of 2020, or 28% of its total workforce at the time. In comparison, Meta has more than 15,000 content moderators handling content on its platforms, though many are employed by contracting firms.
Like Meta and other U.S. social media platforms, Chinese tech companies are increasingly outsourcing content moderation to lower costs. Contract content moderation firms have sprung up over the past five years in second- and third-tier Chinese cities such as Jinan, Tianjin, Chengdu and Xi’an. To expand their content moderation forces, the companies have also lowered education requirements for the workers. Liu told Protocol that when he was first hired as a content moderator for Weibo in 2011, all the new hires had college degrees. Today, it’s common for graduates of vocational schools or even high schools to get those jobs, he said.
“To some degree, a Chinese tech company's censorship mechanism determines the kind of social media product they can offer,” said Liu. “ByteDance is able to achieve a host of features in their products because of the size of their content moderation team.”