I helped build ByteDance's vast censorship machine

I wasn't proud of it, and neither were my coworkers. But that's life in today's China.

A view from outside ByteDance's headquarters in Beijing. Photo: Emmanuel Wong/Contributor via Getty Images

This is the story of Li An, a pseudonymous former employee at ByteDance, as told to Protocol's Shen Lu.

It was the night Dr. Li Wenliang struggled for his last breath in the emergency room of Wuhan Central Hospital. I, like many Chinese web users, had stayed awake to refresh my Weibo feed constantly for updates on his condition. Dr. Li was an ophthalmologist who sounded the alarm early in the COVID-19 outbreak. He soon faced government intimidation and then contracted the virus. When he passed away in the early hours of Friday, Feb. 7, 2020, I was among many Chinese netizens who expressed grief and outrage at the events on Weibo, only to have my account deleted.

I felt guilt more than anger. At the time, I was a tech worker at ByteDance, where I helped develop tools and platforms for content moderation. In other words, I had helped build the system that censored accounts like mine. I was helping to bury myself in China's ever-expanding cyber grave.

I hadn't received explicit directives about Li Wenliang, but Weibo was certainly not the only Chinese tech company relentlessly deleting posts and accounts that night. I knew ByteDance's army of content moderators was using the tools and algorithms that I helped develop to delete content, change the narrative and alter memories of the suffering and trauma inflicted on Chinese people during the COVID-19 outbreak. I couldn't help but feel every day like I was a tiny cog in a vast, evil machine.

ByteDance is one of China's largest unicorns and creator of short video-sharing app TikTok, its original Chinese version Douyin and news aggregator Toutiao. Last year, when ByteDance was at the center of U.S. controversy over data-sharing with Beijing, it cut its domestic engineers' access to products overseas, including TikTok. TikTok has plans to launch two physical Transparency Centers in Los Angeles and Washington, D.C., to showcase content moderation practices. But in China, content moderation is mostly kept in the shadows.

I was on a central technology team that supports the Trust and Safety team, which sits within ByteDance's core data department. The data department is mainly devoted to developing technologies for short-video platforms. As of early 2020, the technologies we created supported the entire company's content moderation in and outside China, including Douyin at home and its international equivalent, TikTok. About 50 staff worked on the product team, and between 100 and 150 software engineers worked on the technical team. Additionally, ByteDance employed about 20,000 content moderators to monitor content in China. They worked at what are known internally as "bases" (基地) in Tianjin, Chengdu (in Sichuan), Jinan (in Shandong) and other cities. Some were ByteDance employees, others contractors.

My job was to use technology to make the low-level content moderators' work more efficient. For example, we created a tool that allowed them to throw a video clip into our database and search for similar content.
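As an illustration only (none of these names, hashes or thresholds come from ByteDance), a similar-content lookup of the kind described above can be sketched with perceptual frame hashes compared by Hamming distance; a production system would more likely use learned video embeddings and an approximate-nearest-neighbor index:

```python
# Hypothetical sketch of near-duplicate clip search. Each clip is
# represented as a list of 64-bit perceptual hashes, one per sampled frame.
from typing import Dict, List


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two integer hashes."""
    return bin(a ^ b).count("1")


def clip_distance(clip_a: List[int], clip_b: List[int]) -> float:
    """Average Hamming distance across aligned frame hashes."""
    n = min(len(clip_a), len(clip_b))
    return sum(hamming(x, y) for x, y in zip(clip_a, clip_b)) / n


def find_similar(query: List[int],
                 database: Dict[str, List[int]],
                 threshold: float = 2.0) -> List[str]:
    """Return IDs of stored clips whose frames hash close to the query."""
    return [clip_id for clip_id, hashes in database.items()
            if clip_distance(query, hashes) <= threshold]
```

In this toy version the moderator "throws a clip into the database" by hashing its frames and calling `find_similar`; anything under the distance threshold comes back as a match.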

When I was at ByteDance, we received multiple requests from the bases to develop an algorithm that could automatically detect when a Douyin user spoke Uyghur, and then cut off the livestream session. The moderators had asked for this because they didn't understand the language. Streamers speaking ethnic languages and dialects that Mandarin speakers don't understand would receive a warning to switch to Mandarin; if they didn't comply, moderators would manually cut off the livestreams, regardless of the actual content. With an algorithm that did this automatically for Uyghur, the moderators wouldn't have to be responsible for missing content that authorities could deem to have instigated "separatism" or "terrorism." We eventually decided not to do it: We didn't have enough Uyghur-language data points in our system, and the most popular livestream rooms were already closely monitored.

The truth is, political speech comprised a tiny fraction of deleted content. Chinese netizens are fluent in self-censorship and know what not to say. ByteDance's platforms — Douyin, Toutiao, Xigua and Huoshan — are mostly entertainment apps. We mostly censored content the Chinese government considers morally hazardous — pornography, lewd conversations, nudity, graphic images and curse words — as well as unauthorized livestreaming sales and content that violated copyright.

But political speech still looms large. What Chinese user-generated content platforms most fear is failing to delete politically sensitive content that later puts the company under heavy government scrutiny. It's a life-and-death matter. Occasionally, ByteDance's content moderation system would go down for a few minutes. It was nerve-wracking because we didn't know what kind of political disaster could occur in that window. As a young unicorn, ByteDance does not have strong government relationships like other tech giants do, so it's walking a tightrope every second.

The team I was part of, content moderation policymakers, plus the army of about 20,000 content moderators, have helped shield ByteDance from major political repercussions and achieve commercial success. ByteDance's powerful algorithms not only can make precise predictions and recommend content to users — one of the things it's best known for in the rest of the world — but can also assist content moderators with swift censorship. Not many tech companies in China have so many resources dedicated to moderating content. Other user-generated content platforms in China have nothing on ByteDance.

Many of my colleagues felt uneasy about what we were doing. Some of them had studied journalism in college. Some were graduates of top universities. They were well-educated and liberal-leaning. We would openly talk from time to time about how our work aided censorship. But we all felt that there was nothing we could do.

A dim light of idealism still burned, of course. Perhaps it was naive of me — I had thought if I tried a bit harder, maybe I could "raise the muzzle of the gun an inch," as they say in Chinese: to let a bit more speech sneak through. Eventually, I learned how limited my influence really was.

When it comes to day-to-day censorship, the Cyberspace Administration of China would frequently issue directives to ByteDance's Content Quality Center (内容质量中心), which oversees the company's domestic moderation operation: sometimes over 100 directives a day. They would then task different teams with applying the specific instructions to both ongoing speech and to past content, which needed to be searched to determine whether it was allowed to stand.

During livestreaming shows, every audio clip would be automatically transcribed into text, allowing algorithms both to compare the transcript against a long, constantly updated list of sensitive words, dates and names, and to run it through natural language processing models. Algorithms would then analyze whether the content was risky enough to require individual monitoring.
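The matching step described above can be sketched roughly as follows. This is a minimal illustration, not ByteDance's code: the term list, weights and threshold are invented placeholders, and a real pipeline would layer NLP models on top of plain string matching.

```python
# Hypothetical transcript scanner: match a transcribed audio segment
# against a weighted list of sensitive terms and decide whether the
# stream needs individual human monitoring.
from typing import Dict, List, Tuple

# Invented placeholder terms and risk weights.
SENSITIVE_TERMS: Dict[str, int] = {"term_a": 3, "term_b": 1}


def scan_transcript(text: str,
                    terms: Dict[str, int] = SENSITIVE_TERMS
                    ) -> Tuple[List[Tuple[str, int]], int]:
    """Return (matched term, position) pairs and a total risk score."""
    hits: List[Tuple[str, int]] = []
    score = 0
    for term, weight in terms.items():
        position = text.find(term)
        if position != -1:
            hits.append((term, position))
            score += weight
    return hits, score


def needs_review(text: str, threshold: int = 2) -> bool:
    """Flag a transcribed segment for individual monitoring."""
    _, score = scan_transcript(text)
    return score >= threshold
```

Segments scoring above the threshold would be routed to a human moderator along with the positions where the terms appeared, mirroring the workflow the article describes next.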

If a user mentioned a sensitive term, a content moderator would receive the original video clip and the transcript showing where the term appeared. If the moderator deemed the speech sensitive or inappropriate, they would shut down the ongoing livestreaming session and could even suspend or delete the account. Around politically sensitive holidays, such as Oct. 1 (China's National Day), July 1 (the birthday of the Chinese Communist Party) or major political anniversaries like that of the 1989 protests and crackdown in Tiananmen Square, the Content Quality Center would generate special lists of sensitive terms for content moderators to use. Influencers enjoyed some special treatment: Content moderators were assigned specifically to monitor certain influencers' channels to guard against their content or accounts being mistakenly deleted. Some extremely popular influencers, state media and government agencies were on a ByteDance-generated white list, free from any censorship; their compliance was assumed.
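Putting the pieces of that paragraph together, the routing decision might look something like the sketch below. All names and return values are hypothetical; the point is only the order of checks: whitelist exemption first, then the base term list augmented by any holiday-specific additions, with humans making the final call on flagged streams.

```python
# Hypothetical moderation routing: whitelisted accounts skip checks
# entirely; everyone else is screened against the active term lists.
from typing import FrozenSet, Set


def route_stream(account: str,
                 transcript: str,
                 whitelist: Set[str],
                 base_terms: Set[str],
                 holiday_terms: FrozenSet[str] = frozenset()) -> str:
    if account in whitelist:
        return "exempt"  # compliance assumed, no checks at all
    active_terms = base_terms | holiday_terms
    if any(term in transcript for term in active_terms):
        return "send_to_moderator"  # a human decides: warn, cut or ban
    return "pass"
```

Around sensitive anniversaries, the `holiday_terms` set would be swapped in from the special lists the Content Quality Center generated, widening the net without touching the base list.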

Colleagues on my team were not in direct contact with content moderators or internet regulators. The Content Quality Center came up with moderation guidelines and worked directly with base managers on implementation. After major events or sensitive anniversaries, colleagues from the operational side would debrief everyone on what worked and what needed improvement. We were in those meetings to see what we could do to better support the censorship operation.

Our role was to make sure that low-level content moderators could find "harmful and dangerous content" as soon as possible, just like fishing out needles from an ocean. And we were tasked with improving censorship efficiency. That is, use as few people as possible to detect as much content as possible that violated ByteDance's community guidelines. I do not recall any major political blowback from the Chinese government during my time at ByteDance, meaning we did our jobs.

It was certainly not a job I'd tell my friends and family about with pride. When they asked what I did at ByteDance, I usually told them I deleted posts (删帖). Some of my friends would say, "Now I know who gutted my account." The tools I helped create can also help fight dangers like fake news. But in China, one primary function of these technologies is to censor speech and erase collective memories of major events, however infrequently this function gets used.

Dr. Li warned his colleagues and friends about an unknown virus that was encroaching on hospitals in Wuhan. He was punished for that. And for weeks, we had no idea what was really happening because of authorities' cover-up of the severity of the crisis. Around this time last year, many Chinese tech companies were actively deleting posts, videos, diaries and pictures that were not part of the "correct collective memory" that China's government would later approve. Just imagine: Had any social media platform been able to reject the government's censorship directives and retain Dr. Li's and other whistleblowers' warnings, perhaps millions of lives could have been saved.
