Protocol | China

I helped build ByteDance's vast censorship machine

I wasn't proud of it, and neither were my coworkers. But that's life in today's China.


A view from outside ByteDance's headquarters in Beijing.

Photo: Emmanuel Wong/Contributor via Getty Images

This is the story of Li An, a pseudonymous former employee at ByteDance, as told to Protocol's Shen Lu.

It was the night Dr. Li Wenliang struggled for his last breath in the emergency room of Wuhan Central Hospital. I, like many Chinese web users, had stayed awake to refresh my Weibo feed constantly for updates on his condition. Dr. Li was an ophthalmologist who sounded the alarm early in the COVID-19 outbreak. He soon faced government intimidation and then contracted the virus. When he passed away in the early hours of Friday, Feb. 7, 2020, I was among many Chinese netizens who expressed grief and outrage at the events on Weibo, only to have my account deleted.

I felt guilt more than anger. At the time, I was a tech worker at ByteDance, where I helped develop tools and platforms for content moderation. In other words, I had helped build the system that censored accounts like mine. I was helping to bury myself in China's ever-expanding cyber grave.

I hadn't received explicit directives about Li Wenliang, but Weibo was certainly not the only Chinese tech company relentlessly deleting posts and accounts that night. I knew ByteDance's army of content moderators was using the tools and algorithms I had helped develop to delete content, change the narrative and alter memories of the suffering and trauma inflicted on Chinese people during the COVID-19 outbreak. I couldn't help but feel every day that I was a tiny cog in a vast, evil machine.

ByteDance is one of China's largest unicorns and creator of short video-sharing app TikTok, its original Chinese version Douyin and news aggregator Toutiao. Last year, when ByteDance was at the center of U.S. controversy over data-sharing with Beijing, it cut its domestic engineers' access to products overseas, including TikTok. TikTok has plans to launch two physical Transparency Centers in Los Angeles and Washington, D.C., to showcase content moderation practices. But in China, content moderation is mostly kept in the shadows.

I was on a central technology team that supports the Trust and Safety team, which sits within ByteDance's core data department. The data department is mainly devoted to developing technologies for short-video platforms. As of early 2020, the technologies we created supported the entire company's content moderation in and outside China, including Douyin at home and its international equivalent, TikTok. About 50 staff worked on the product team, and between 100 and 150 software engineers worked on the technical team. Additionally, ByteDance employed about 20,000 content moderators to monitor content in China. They worked at what are known internally as "bases" (基地) in Tianjin, Chengdu (in Sichuan), Jinan (in Shandong) and other cities. Some were ByteDance employees, others contractors.

My job was to use technology to make the low-level content moderators' work more efficient. For example, we created a tool that allowed them to throw a video clip into our database and search for similar content.
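
The article doesn't say how that similarity search worked under the hood. A common baseline for near-duplicate detection is perceptual hashing of sampled frames; the following is a minimal sketch along those lines, assuming grayscale frames arrive as 2-D numpy arrays larger than the thumbnail size. All names (downscale, dhash, ClipIndex) are invented for illustration, not ByteDance's internals.

```python
import numpy as np

def downscale(img: np.ndarray, rows: int, cols: int) -> np.ndarray:
    """Crude block-average downscale; assumes img is larger than (rows, cols)."""
    h, w = img.shape
    img = img[: h - h % rows, : w - w % cols]
    return img.reshape(rows, img.shape[0] // rows,
                       cols, img.shape[1] // cols).mean(axis=(1, 3))

def dhash(frame: np.ndarray, size: int = 8) -> int:
    """Difference hash: 64 bits recording whether each pixel in a
    (size x size+1) thumbnail is brighter than its left neighbor."""
    small = downscale(frame.astype(float), size, size + 1)
    bits = small[:, 1:] > small[:, :-1]
    return sum(1 << i for i, b in enumerate(bits.flatten()) if b)

class ClipIndex:
    """Toy index: one hash per sampled frame, queried by Hamming distance."""
    def __init__(self) -> None:
        self.entries: list[tuple[int, str]] = []

    def add(self, frame: np.ndarray, clip_id: str) -> None:
        self.entries.append((dhash(frame), clip_id))

    def find_similar(self, frame: np.ndarray, max_distance: int = 5) -> set[str]:
        h = dhash(frame)
        return {cid for stored, cid in self.entries
                if bin(stored ^ h).count("1") <= max_distance}
```

A production system at this scale would more likely use learned video embeddings and an approximate-nearest-neighbor index rather than a linear scan, but the retrieval idea is the same: hash the query clip and return anything within a small distance.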

When I was at ByteDance, we received multiple requests from the bases to develop an algorithm that could automatically detect when a Douyin user spoke Uyghur, and then cut off the livestream session. The moderators had asked for this because they didn't understand the language. Streamers speaking ethnic languages or dialects that Mandarin speakers don't understand would receive a warning to switch to Mandarin. If they didn't comply, moderators would cut off the livestreams manually, regardless of the actual content. But for Uyghur specifically, an algorithm that did this automatically meant the moderators wouldn't have to be responsible for missing content that authorities could deem to have instigated "separatism" or "terrorism." We eventually decided not to build it: We didn't have enough Uyghur-language data points in our system, and the most popular livestream rooms were already closely monitored.

The truth is, political speech comprised a tiny fraction of deleted content. Chinese netizens are fluent in self-censorship and know what not to say. ByteDance's platforms — Douyin, Toutiao, Xigua and Huoshan — are mostly entertainment apps. We mostly censored content the Chinese government considers morally hazardous — pornography, lewd conversations, nudity, graphic images and curse words — as well as unauthorized livestreaming sales and content that violated copyright.

But political speech still looms large. What Chinese user-generated content platforms most fear is failing to delete politically sensitive content that later puts the company under heavy government scrutiny. It's a life-and-death matter. Occasionally, ByteDance's content moderation system would go down for a few minutes. It was nerve-wracking because we didn't know what kind of political disaster could occur in that window. As a young unicorn, ByteDance does not have strong government relationships like other tech giants do, so it's walking a tightrope every second.

The team I was part of, the content moderation policymakers, together with the army of about 20,000 content moderators, helped shield ByteDance from major political repercussions and achieve commercial success. ByteDance's powerful algorithms not only can make precise predictions and recommend content to users — one of the things it's best known for in the rest of the world — but can also assist content moderators with swift censorship. Not many tech companies in China have so many resources dedicated to moderating content. Other user-generated content platforms in China have nothing on ByteDance.

Many of my colleagues felt uneasy about what we were doing. Some of them had studied journalism in college. Some were graduates of top universities. They were well-educated and liberal-leaning. We would openly talk from time to time about how our work aided censorship. But we all felt that there was nothing we could do.

A dim light of idealism still burned, of course. Perhaps it was naive of me — I had thought if I tried a bit harder, maybe I could "raise the muzzle of the gun an inch," as they say in Chinese: to let a bit more speech sneak through. Eventually, I learned how limited my influence really was.

When it comes to day-to-day censorship, the Cyberspace Administration of China would frequently issue directives to ByteDance's Content Quality Center (内容质量中心), which oversees the company's domestic moderation operation: sometimes over 100 directives a day. The center would then task different teams with applying the specific instructions both to ongoing speech and to past content, which had to be searched to determine whether it could remain up.

During livestreaming shows, every audio clip would be automatically transcribed into text. Algorithms would compare the transcripts against a long, constantly updated list of sensitive words, dates and names, and run them through natural language processing models, then assess whether the content was risky enough to require individual monitoring.
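
As a rough illustration of the term-matching step only (not ByteDance's actual pipeline), here is a minimal sketch; the term list, the Flag structure and the routing rule are invented placeholders, and the NLP risk models are omitted entirely.

```python
import re
from dataclasses import dataclass

@dataclass
class Flag:
    term: str
    position: int  # character offset where the term appears in the transcript

# Placeholder entries; per the account above, the real lists held sensitive
# words, dates and names and were constantly updated.
SENSITIVE_TERMS = {"example-term-a", "example-term-b"}

def screen_transcript(transcript: str) -> list[Flag]:
    """Return a Flag for every occurrence of a listed term."""
    lowered = transcript.lower()
    flags = []
    for term in SENSITIVE_TERMS:
        for match in re.finditer(re.escape(term), lowered):
            flags.append(Flag(term=term, position=match.start()))
    return flags

def needs_human_review(flags: list[Flag]) -> bool:
    # Any hit routes the clip and its transcript to a moderator queue.
    return bool(flags)
```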

If a user mentioned a sensitive term, a content moderator would receive the original video clip and the transcript showing where the term appeared. If the moderator deemed the speech sensitive or inappropriate, they would shut down the ongoing livestreaming session and could even suspend or delete the account. Around politically sensitive holidays, such as Oct. 1 (China's National Day), July 1 (the birthday of the Chinese Communist Party) or major political anniversaries like the anniversary of the 1989 protests and crackdown in Tiananmen Square, the Content Quality Center would generate special lists of sensitive terms for content moderators to use. Influencers enjoyed some special treatment: content moderators were assigned specifically to monitor certain influencers' channels so that their content or accounts weren't mistakenly deleted. Some extremely popular influencers, state media and government agencies were on a ByteDance-generated white list, free from any censorship — their compliance was assumed.
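
The date-keyed lists and the white list could layer onto the screening step above roughly as follows. This is again only a sketch: every date entry, term and account ID here is invented for illustration.

```python
import datetime

BASE_TERMS = {"example-term-a"}

# Extra lists switched on around sensitive dates; entries are placeholders.
DATE_TERMS = {
    (10, 1): {"example-national-day-term"},   # Oct. 1, National Day
    (7, 1): {"example-party-founding-term"},  # July 1, CCP founding
    (6, 4): {"example-anniversary-term"},     # Tiananmen anniversary
}

# Accounts exempt from moderation; IDs are invented placeholders.
WHITELIST = {"state-media-account", "top-influencer-account"}

def active_terms(today: datetime.date) -> set[str]:
    """Base list plus any special list keyed to today's month and day."""
    return BASE_TERMS | DATE_TERMS.get((today.month, today.day), set())

def should_screen(account_id: str) -> bool:
    """Whitelisted accounts skip screening; their compliance is assumed."""
    return account_id not in WHITELIST
```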

Colleagues on my team were not in direct contact with content moderators or internet regulators. The Content Quality Center came up with moderation guidelines and worked directly with base managers on implementation. After major events or sensitive anniversaries, colleagues from the operational side would debrief everyone on what worked and what needed improvement. We were in those meetings to see what we could do to better support the censorship operation.

Our role was to make sure that low-level content moderators could find "harmful and dangerous content" as soon as possible, like fishing needles out of an ocean. We were also tasked with improving censorship efficiency: that is, using as few people as possible to detect as much content as possible that violated ByteDance's community guidelines. I do not recall any major political blowback from the Chinese government during my time at ByteDance, which means we did our jobs.

It was certainly not a job I'd tell my friends and family about with pride. When they asked what I did at ByteDance, I usually told them I deleted posts (删帖). Some of my friends would say, "Now I know who gutted my account." The tools I helped create can also help fight dangers like fake news. But in China, one primary function of these technologies is to censor speech and erase collective memories of major events, however infrequently this function gets used.

Dr. Li warned his colleagues and friends about an unknown virus that was encroaching on hospitals in Wuhan. He was punished for that. And for weeks, we had no idea what was really happening because the authorities covered up the severity of the crisis. Around this time last year, many Chinese tech companies were actively deleting posts, videos, diaries and pictures that were not part of the "correct collective memory" that China's government would later approve. Just imagine: Had any social media platform been able to reject the government's censorship directives and retain Dr. Li's and other whistleblowers' warnings, perhaps millions of lives would have been saved.

Power

The video game industry is bracing for its Netflix and Spotify moment

Subscription services promise to upend gaming. The jury's out on whether that's a good thing.

It's not clear what might fall through the cracks if most of the biggest game studios transition away from selling individual games and instead embrace a mix of free-to-play and subscription bundling.

Image: Christopher T. Fong/Protocol

Subscription services are coming for the game industry, and the shift could shake up the largest and most lucrative entertainment sector in the world. These services started as small, closed offerings typically available on only a handful of hardware platforms. Now, they're expanding to mobile phones and smart TVs, and promising to radically change the economics of how games are funded, developed and distributed.

Of the biggest companies in gaming today, Amazon, Apple, Electronic Arts, Google, Microsoft, Nintendo, Nvidia, Sony and Ubisoft all operate some form of game subscription. Far and away the most ambitious of them is Microsoft's Xbox Game Pass, which features more than 100 games for $9.99 a month and includes some brand-new titles on the day they're released. As of January, Game Pass had more than 18 million subscribers, and Microsoft's aggressive investment in a subscription future has become a catalyst for an industrywide reckoning on the likelihood and viability of such a model becoming standard.

Nick Statt
Nick Statt is Protocol's video game reporter. Prior to joining Protocol, he was news editor at The Verge covering the gaming industry, mobile apps and antitrust out of San Francisco, in addition to managing coverage of Silicon Valley tech giants and startups. He now resides in Rochester, New York, home of the garbage plate and, completely coincidentally, the World Video Game Hall of Fame. He can be reached at nstatt@protocol.com.

Over the last year, financial institutions have experienced unprecedented demand from their customers for exposure to cryptocurrency, and we've seen an inflow of institutional dollars driving bitcoin and other cryptocurrencies to record prices. Some banks have already launched cryptocurrency programs, but many more are evaluating the market.

That's why we've created the Crypto Maturity Model: an iterative roadmap for cryptocurrency product rollout, enabling financial institutions to evaluate market opportunities while addressing compliance requirements.

Caitlin Barnett, Chainalysis
Caitlin’s legal and compliance experience encompasses both cryptocurrency and traditional finance. As Director of Regulation and Compliance at Chainalysis, she helps leading financial institutions strategize and build compliance programs in order to adopt cryptocurrencies and offer new products to their customers. In addition, Caitlin helps facilitate dialogue with regulators and the industry on key policy issues within the cryptocurrency industry.
Protocol | Policy

Lina Khan wants to hear from you

The new FTC chair is trying to get herself, and the sometimes timid tech-regulating agency she oversees, up to speed while she still can.

Lina Khan is trying to push the FTC to corral tech companies

Photo: Graeme Jennings/AFP via Getty Images

"When you're in D.C., it's very easy to lose connection with the very real issues that people are facing," said Lina Khan, the FTC's new chair.

Khan made her debut as chair before the press on Wednesday, showing up to a media event carrying an old maroon book from the agency's library and calling herself a "huge nerd" on FTC history. She launched into explaining how much she enjoys the open commission meetings she's pioneered since taking over in June. That's especially true of the marathon public comment sessions that have wrapped up each of the two meetings so far.

Ben Brody

Ben Brody (@BenBrodyDC) is a senior reporter at Protocol focusing on how Congress, courts and agencies affect the online world we live in. He formerly covered tech policy and lobbying (including antitrust, Section 230 and privacy) at Bloomberg News, where he previously reported on the influence industry, government ethics and the 2016 presidential election. Before that, Ben covered business news at CNNMoney and AdAge, and all manner of stories in and around New York. He still loves appearing on the New York news radio he grew up with.

Protocol | Fintech

Beyond Robinhood: Stock exchange rebates are under scrutiny too

Some critics have compared the way exchanges attract orders from customers to the payment for order flow system that has enriched retail brokers.

The New York Stock Exchange is now owned by the Intercontinental Exchange.

Photo: Aditya Vyas/Unsplash

As questions pile up about how powerful and little-known Wall Street entities rake in profits from stock trading, the exchanges that handle vast portions of everyday trading are being scrutinized for how they make money, too.

One mechanism in particular — exchange rebates, or payments from the exchanges for getting certain trades routed to them — has raised concerns with regulators and members of Congress.

Tomio Geron

Tomio Geron (@tomiogeron) is a San Francisco-based reporter covering fintech. He was previously a reporter and editor at The Wall Street Journal, covering venture capital and startups. Before that, he worked as a staff writer at Forbes, covering social media and venture capital, and also edited the Midas List of top tech investors. He has also worked at newspapers covering crime, courts, health and other topics. He can be reached at tgeron@protocol.com or tgeron@protonmail.com.

Protocol | Workplace

The Activision Blizzard lawsuit has opened the floodgates

An employee walkout, a tumbling stock price and damning new reports of misconduct.

Activision Blizzard is being sued for widespread sexism, harassment and discrimination.

Photo: Bloomberg/Getty Images

Activision Blizzard is in crisis mode. The World of Warcraft publisher was the subject of a shocking lawsuit filed by California's Department of Fair Employment and Housing last week over claims of widespread sexism, harassment and discrimination against female employees. The resulting fallout has only intensified by the day, culminating in a 500-person walkout at the headquarters of Blizzard Entertainment in Irvine on Wednesday.

The company's stock price has tumbled nearly 10% this week, and CEO Bobby Kotick acknowledged in a message to employees Tuesday that Activision Blizzard's initial response was "tone deaf." Meanwhile, there has been a continuous stream of new reports unearthing horrendous misconduct as more and more former and current employees speak out about the working conditions and alleged rampant misogyny at one of the video game industry's largest and most powerful employers.

Nick Statt