Protocol | China

I helped build ByteDance's censorship machine

I wasn't proud of it, and neither were my coworkers. But that's life in today's China.


A view from outside ByteDance's headquarters in Beijing.

Photo: Emmanuel Wong / Contributor via Getty Images

This is the story of Li An, a pseudonymous former employee at ByteDance, as told to Protocol's Shen Lu.

It was the night Dr. Li Wenliang struggled for his last breath in the emergency room of Wuhan Central Hospital. I, like many Chinese web users, had stayed awake to refresh my Weibo feed constantly for updates on his condition. Dr. Li was an ophthalmologist who sounded the alarm early in the COVID-19 outbreak. He soon faced government intimidation and then contracted the virus. When he passed away in the early hours of Friday, Feb. 7, 2020, I was among many Chinese netizens who expressed grief and outrage at the events on Weibo, only to have my account deleted.

I felt guilt more than anger. At the time, I was a tech worker at ByteDance, where I helped develop tools and platforms for content moderation. In other words, I had helped build the system that censored accounts like mine. I was helping to bury myself in China's ever-expanding cyber grave.

I hadn't received explicit directives about Li Wenliang, but Weibo was certainly not the only Chinese tech company relentlessly deleting posts and accounts that night. I knew ByteDance's army of content moderators was using the tools and algorithms I had helped develop to delete content, change the narrative and alter memories of the suffering and trauma inflicted on Chinese people during the COVID-19 outbreak. I couldn't help feeling, every day, like a tiny cog in a vast, evil machine.

ByteDance is one of China's largest unicorns and creator of short video-sharing app TikTok, its original Chinese version Douyin and news aggregator Toutiao. Last year, when ByteDance was at the center of U.S. controversy over data-sharing with Beijing, it cut its domestic engineers' access to products overseas, including TikTok. TikTok has plans to launch two physical Transparency Centers in Los Angeles and Washington, D.C., to showcase content moderation practices. But in China, content moderation is mostly kept in the shadows.

I was on a central technology team that supports the Trust and Safety team, which sits within ByteDance's core data department. The data department is mainly devoted to developing technologies for short-video platforms. As of early 2020, the technologies we created supported the entire company's content moderation in and outside China, including Douyin at home and its international equivalent, TikTok. About 50 staff worked on the product team, and between 100 and 150 software engineers worked on the technical team. Additionally, ByteDance employed about 20,000 content moderators to monitor content in China. They worked at what are known internally as "bases" (基地) in Tianjin, Chengdu (in Sichuan), Jinan (in Shandong) and other cities. Some were ByteDance employees, others contractors.

My job was to use technology to make the low-level content moderators' work more efficient. For example, we created a tool that allowed them to throw a video clip into our database and search for similar content.
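The article doesn't say how this clip-matching tool worked under the hood. One common technique for near-duplicate video lookup is perceptual hashing: reduce each frame to a tiny fingerprint and compare fingerprints by Hamming distance. The sketch below is purely illustrative; the function names and the 8x8-frame representation are assumptions, not ByteDance's actual system.

```python
def average_hash(frame):
    """Hash an 8x8 grayscale frame (list of 8 rows of 8 ints, 0-255)
    into a 64-bit fingerprint: each bit is 1 if that pixel is brighter
    than the frame's mean."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def find_similar(query_hash, database, max_distance=10):
    """Return ids of stored clips whose fingerprint is within
    max_distance bits of the query's fingerprint."""
    return [clip_id for clip_id, h in database.items()
            if hamming(query_hash, h) <= max_distance]
```

In practice, a production system would hash many frames per clip and index the fingerprints for fast lookup, but the core idea is the same: visually similar frames produce fingerprints that differ in only a few bits.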

When I was at ByteDance, we received multiple requests from the bases to develop an algorithm that could automatically detect when a Douyin user spoke Uyghur, and then cut off the livestream session. The moderators had asked for this because they didn't understand the language. Streamers speaking ethnic languages and dialects that Mandarin speakers don't understand would receive a warning to switch to Mandarin. If they didn't comply, moderators would manually cut off the livestreams, regardless of the actual content. But with an algorithm that did this automatically for Uyghur, the moderators would no longer risk being held responsible for missing content that authorities could deem to have instigated "separatism" or "terrorism." We eventually decided not to build it: We didn't have enough Uyghur-language data points in our system, and the most popular livestream rooms were already closely monitored.

The truth is, political speech comprised a tiny fraction of deleted content. Chinese netizens are fluent in self-censorship and know what not to say. ByteDance's platforms — Douyin, Toutiao, Xigua and Huoshan — are mostly entertainment apps. We mostly censored content the Chinese government considers morally hazardous — pornography, lewd conversations, nudity, graphic images and curse words — as well as unauthorized livestreaming sales and content that violated copyright.

But political speech still looms large. What Chinese user-generated content platforms most fear is failing to delete politically sensitive content that later puts the company under heavy government scrutiny. It's a life-and-death matter. Occasionally, ByteDance's content moderation system would go down for a few minutes. It was nerve-wracking because we didn't know what kind of political disaster could occur in that window. As a young unicorn, ByteDance does not have strong government relationships like other tech giants do, so it's walking a tightrope every second.

The team I was part of, content moderation policymakers, plus the army of about 20,000 content moderators, have helped shield ByteDance from major political repercussions and achieve commercial success. ByteDance's powerful algorithms not only can make precise predictions and recommend content to users — one of the things it's best known for in the rest of the world — but can also assist content moderators with swift censorship. Not many tech companies in China have so many resources dedicated to moderating content. Other user-generated content platforms in China have nothing on ByteDance.

Many of my colleagues felt uneasy about what we were doing. Some of them had studied journalism in college. Some were graduates of top universities. They were well-educated and liberal-leaning. We would openly talk from time to time about how our work aided censorship. But we all felt that there was nothing we could do.

A dim light of idealism still burned, of course. Perhaps it was naive of me — I had thought if I tried a bit harder, maybe I could "raise the muzzle of the gun an inch," as they say in Chinese: to let a bit more speech sneak through. Eventually, I learned how limited my influence really was.

When it comes to day-to-day censorship, the Cyberspace Administration of China would frequently issue directives to ByteDance's Content Quality Center (内容质量中心), which oversees the company's domestic moderation operation: sometimes over 100 directives a day. They would then task different teams with applying the specific instructions to both ongoing speech and to past content, which needed to be searched to determine whether it was allowed to stand.

During livestreaming shows, every audio clip would be automatically transcribed into text, allowing algorithms to compare the transcripts against a long, constantly updated list of sensitive words, dates and names, and to run them through natural language processing models. Algorithms would then analyze whether the content was risky enough to require individual monitoring.
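The matching step described above can be sketched roughly as follows. This is an illustration of the general technique, not ByteDance's code; the placeholder terms and the `flag_transcript` function are hypothetical.

```python
SENSITIVE_TERMS = {"term_a", "term_b"}  # hypothetical placeholder list

def flag_transcript(transcript, terms=SENSITIVE_TERMS):
    """Scan a transcribed audio clip against a term list.
    Return (term, start_index) pairs for every occurrence found,
    sorted by position, so a reviewer can jump to the matching
    point in the clip."""
    hits = []
    for term in terms:
        start = transcript.find(term)
        while start != -1:
            hits.append((term, start))
            start = transcript.find(term, start + len(term))
    return sorted(hits, key=lambda h: h[1])
```

A real system would use an automaton (e.g. Aho-Corasick) to match thousands of terms in one pass, plus the language models the author mentions, but the output is the same shape: a list of flagged spans handed to a human moderator.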

If a user mentioned a sensitive term, a content moderator would receive the original video clip and the transcript showing where the term appeared. If the moderator deemed the speech sensitive or inappropriate, they would shut down the ongoing livestreaming session and even suspend or delete the account. Around politically sensitive holidays, such as Oct. 1 (China's National Day), July 1 (the birthday of the Chinese Communist Party) or major political anniversaries like the anniversary of the 1989 protests and crackdown in Tiananmen Square, the Content Quality Center would generate special lists of sensitive terms for content moderators to use. Influencers enjoyed some special treatment — there were content moderators assigned specifically to monitor certain influencers' channels in case their content or accounts were mistakenly deleted. Some extremely popular influencers, state media and government agencies were on a ByteDance-generated white list, free from any censorship — their compliance was assumed.
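The routing logic this paragraph describes, with expanded term lists around sensitive dates and a whitelist that bypasses review entirely, might look something like the sketch below. All names, terms and accounts are hypothetical; this is a simplified illustration of the described workflow, not actual ByteDance logic.

```python
BASE_TERMS = {"term_a"}
HOLIDAY_TERMS = BASE_TERMS | {"term_b"}  # expanded list for sensitive dates
WHITELIST = {"state_media_account"}      # accounts whose compliance is assumed

def route_stream(account, transcript, sensitive_period=False):
    """Decide what happens to a livestream segment: whitelisted
    accounts skip review, everyone else is checked against the
    term list in force, and any hit goes to a human moderator."""
    if account in WHITELIST:
        return "allow"                    # no review at all
    terms = HOLIDAY_TERMS if sensitive_period else BASE_TERMS
    if any(term in transcript for term in terms):
        return "send_to_moderator"        # human decides: warn, cut or ban
    return "allow"
```

The notable design point, per the author's account, is that the whitelist check comes first: for favored accounts, no amount of flagged speech ever reaches a moderator.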

Colleagues on my team were not in direct contact with content moderators or internet regulators. The Content Quality Center came up with moderation guidelines and worked directly with base managers on implementation. After major events or sensitive anniversaries, colleagues from the operational side would debrief everyone on what worked and what needed improvement. We were in those meetings to see what we could do to better support the censorship operation.

Our role was to make sure that low-level content moderators could find "harmful and dangerous content" as soon as possible, like fishing needles out of an ocean. We were also tasked with improving censorship efficiency: that is, using as few people as possible to detect as much content as possible that violated ByteDance's community guidelines. I do not recall any major political blowback from the Chinese government during my time at ByteDance, meaning we did our jobs.

It was certainly not a job I'd tell my friends and family about with pride. When they asked what I did at ByteDance, I usually told them I deleted posts (删帖). Some of my friends would say, "Now I know who gutted my account." The tools I helped create can also help fight dangers like fake news. But in China, one primary function of these technologies is to censor speech and erase collective memories of major events, however infrequently this function gets used.

Dr. Li warned his colleagues and friends about an unknown virus that was encroaching on hospitals in Wuhan. He was punished for that. And for weeks, we had no idea what was really happening because of authorities' cover-up of the severity of the crisis. Around this time last year, many Chinese tech companies were actively deleting posts, videos, diaries and pictures that were not part of the "correct collective memory" that China's governments would later approve. Just imagine: Had any social media platform been able to reject the government's censorship directives and retain Dr. Li's and other whistleblowers' warnings, perhaps millions of lives would have been saved.

The metaverse is coming, and Robinhood's IPO is here

Plus, what we learned from Big Tech's big quarter.

Image: Roblox

On this episode of the Source Code podcast: First, a few takeaways from another blockbuster quarter in the tech industry. Then, Janko Roettgers joins the show to discuss Big Tech's obsession with the metaverse and the platform war that seems inevitable. Finally, Ben Pimentel talks about Robinhood's IPO, and the company's crazy route to the public markets.

For more on the topics in this episode:

David Pierce

David Pierce ( @pierce) is Protocol's editor at large. Prior to joining Protocol, he was a columnist at The Wall Street Journal, a senior writer with Wired, and deputy editor at The Verge. He owns all the phones.

After a year and a half of living and working through a pandemic, it's no surprise that employees are sending out stress signals at record rates. According to a 2021 study by Indeed, 52% of employees today say they feel burnt out. Over half of employees report working longer hours, and a quarter say they're unable to unplug from work.

The continued swell of reported burnout is a concerning trend for employers everywhere. Not only does it harm mental health and well-being, but it can also impact absenteeism, employee retention and — between the drain on morale and high turnover — your company culture.

Crisis management is one thing, but how do you permanently lower the temperature so your teams can recover sustainably? Companies around the world are now taking larger steps to curb burnout, with industry leaders like LinkedIn, Hootsuite and Bumble shutting down their offices for a full week to allow all employees extra time off. The CEO of Okta, worried about burnout, asked all employees to email him their vacation plans in 2021.

Stella Garber
Stella Garber is Trello's Head of Marketing. She has led marketing at Trello for the last seven years, from its early-stage startup days through its acquisition by Atlassian in 2017 and beyond. Stella was an early champion of remote work, having led remote teams for more than a decade.

Facebook wants to be like Snapchat

Facebook is looking to make posts disappear, Google wants to make traffic reports more accurate, and more patents from Big Tech.

Facebook has ephemeral posts on its mind.

Image: Protocol

Welcome to another week of Big Tech patents. Google wants to make traffic reports more accurate, Amazon wants to make voice assistants more intelligent, Microsoft wants to make scheduling meetings more convenient, and a ton more.

As always, remember that the big tech companies file all kinds of crazy patents, and though most never amount to anything, some end up defining the future.

Karyne Levy

Karyne Levy ( @karynelevy) is the West Coast editor at Protocol. Before joining Protocol, Karyne was a senior producer at Scribd, helping to create the original content program. Prior to that she was an assigning editor at NerdWallet, a senior tech editor at Business Insider, and the assistant managing editor at CNET, where she also hosted Rumor Has It for CNET TV. She lives outside San Francisco with her wife, son and lots of pets.

Protocol | China

China’s edtech crackdown isn’t what you think. Here’s why.

It's part of an attempt to fix education inequality and address a looming demographic crisis.

In the past decade, China's private tutoring market has expanded rapidly as it's been digitized and bolstered by capital.

Photo: Getty Images

Beijing's strike against the private tutoring and ed tech industry has rattled the market and led observers to try to answer one big question: What is Beijing trying to achieve?

Sweeping policy guidelines issued by the Central Committee of the Chinese Communist Party on July 24 and the State Council now mandate that existing private tutoring companies register as nonprofit organizations. Extracurricular tutoring companies will be banned from going public. Online tutoring agencies will be subject to regulatory approval.

Shen Lu

Shen Lu is a reporter with Protocol | China. She has spent six years covering China from inside and outside its borders. Previously, she was a fellow at Asia Society's ChinaFile and a Beijing-based producer for CNN. Her writing has appeared in Foreign Policy, The New York Times and POLITICO, among other publications. Shen Lu is a founding member of Chinese Storytellers, a community serving and elevating Chinese professionals in the global media industry.

It’s soul-destroying and it uses DRM, therefore Peloton is tech

"I mean, the pedals go around if you turn off all the tech, but Peloton isn't selling a pedaling product."

Is this tech? Or is it just a bike with a screen?

Image: Peloton and Protocol

One of the breakout hits from the pandemic, besides Taylor Swift's "Folklore," has been Peloton. With upwards of 5.4 million members as of March and nearly $1.3 billion in revenue that quarter, a lot of people are turning in their gym memberships for a bike or a treadmill and a slick-looking app.

But here at Protocol, it's that slick-looking app, plus all the tech that goes into it, that matters. And that's where things got really heated during our chat this week. Is Peloton tech? Or is it just a bike with a giant tablet on it? Can all bikes be tech with a little elbow grease?

Karyne Levy

