Policy

One year in, Meta’s civil rights team still needs a win

“The real question is: Are Sheryl and Mark going to listen?”

Like everyone, Roy Austin Jr. saw President Trump’s Facebook post that went up in May 2020 — the one where he threatened to shoot looters in Minnesota following George Floyd’s murder. And, like a lot of people, he was baffled by Facebook’s refusal to take it down.

“I honestly was like, ‘Why is that there?’” Austin remembered.

Austin started his career working on police abuse cases and has spent the better part of three decades on civil rights issues, including as a top official in the DOJ’s Civil Rights Division and as a member of President Obama’s Domestic Policy Council in the White House.

He recognized the words Trump was quoting — “when the looting starts, the shooting starts” — and how they’d been used as pretext for police violence in the past. “I saw it for what many other civil-rights-oriented people saw it for,” Austin said.

But unlike a lot of those people, or the Facebook employees who protested or the advertisers who boycotted in disgust, Facebook’s response didn’t alienate Austin. It only made him wonder how a decision like that got made and, more importantly, what might be done to ensure a decision like that wouldn’t get made again.

It turned out he was part of the answer — or at least, he hopes to be.

Roy Austin Jr., Meta’s vice president of civil rights and deputy general counsel, speaks during the Bloomberg Equality Summit in New York on March 22, 2022. Photo: Jeenah Moon/Bloomberg via Getty Images

A little more than a year ago, Austin stepped into a new role at Facebook as vice president of civil rights — a position created only after a bruising third-party civil rights audit, which dropped shortly after the Trump post, accused the company of taking a “reactive and piecemeal” approach to civil rights.

That a major U.S. tech company would create a role like this was its own kind of victory for civil rights advocates. By all accounts, Facebook was the first tech giant to do so. The announcement, months later, that someone as serious as Austin was going to take the job — and hire a bunch of other serious people to help him — was an added bonus. “Roy is a person who likes a challenge. He likes to fix things,” said Lisa Rice, CEO of the National Fair Housing Alliance, which sued Facebook over alleged discrimination in housing ads in 2018. “I was elated that he took the position.”

In its first year, the team has made some initial progress, including doing away with sensitive ad targeting categories and making a major public commitment to study Facebook’s impact on people of different races, a topic that has been a third rail inside the company for years.

But despite these and other steps, interviews with a dozen people who have worked with Meta on civil rights issues both inside and outside of the company suggest that while Austin and his team have a lot of people rooting for them, they also have a lot left to prove. “It’s pretty clear that more meaningful and substantial work must be done by them and platforms like them,” said Dave Toomey, a voting rights and tech fellow at the Leadership Conference on Civil and Human Rights.

The same people who pushed Facebook to hire someone like Austin are now eager for evidence that he and his team will actually be empowered to act on the problems they inevitably unearth. “I have every confidence [Roy’s] heart is in the right place. He’s a real person for this job. He’s not a figurehead,” said Aaron Rieke, managing director of tech-focused civil rights group Upturn. “The real question is: Are Sheryl and Mark going to listen to him?”

“Keep seeing videos about Primates?”

Austin wondered that too, when he was considering taking the position.

In his interviews with Sheryl Sandberg and Facebook’s chief legal officer Jennifer Newstead, Austin said he peppered them with questions about whether the civil rights role would be more than just a rubber stamp to placate Facebook’s critics. “Do you really mean I can build a team? Do you really mean I have support? Are you serious about the civil rights audit?” Austin remembers asking.

He also consulted with his old colleague Laura Murphy, the woman who had conducted the Facebook civil rights audit and who arguably played a more instrumental role than anyone else in pushing Facebook to hire a civil rights lead. “I was always a cheerleader for this position,” said Murphy, who encouraged Austin to take the job. “I think if you really care about civil rights, then this is one means to have a tremendous impact.”

Murphy specifically liked Austin for the job, too. “When you work at the White House, you gain an appreciation of how hard it is to get a project across the finish line,” she said. “I wouldn’t want to have seen anybody in that position who didn’t know how to deal with bureaucracies.”

Austin ended up accepting the gig in a meeting with Zuckerberg himself, which Austin took as a sign that his team would have support from the highest levels of the company. But, he said, he never did bring up his concerns about the Trump posts. “I saw my role as more forward-looking than relitigating the past decisions the company made,” he said.

A good part of Austin’s first year was spent hiring a team of eight, luring talent away from the ACLU and academia with the promise they could make a global difference. His first hire, Cynthia Deitle, spent nearly two decades fighting hate crimes at the FBI and was about to embark on what she calls her “swan song” as a college professor when Austin called her with another offer — one that sounded a lot like what Sean Parker might have said to Mark Zuckerberg in “The Social Network” if Zuckerberg and Parker had been do-gooders, not mercenaries.

“You could take this [teaching] job and reach hundreds of people,” Deitle remembers Austin saying. “Or you could think about joining me and reach billions of people.” The pitch worked, and Deitle joined the company in March 2021.

Since then, a not-insignificant part of their job has involved putting out fires as they arise. Last year, when Facebook placed a prompt that read, “Keep seeing videos about Primates?” on a video of two Black men, Austin and Deitle held an emergency meeting to figure out how to fix it. More recently, when there was a content moderation dispute over whether a Black rapper on the site should be able to use the n-word, staffers sought out Austin’s advice. Austin decided it was being used as a term of endearment and let it slide. “We’ve made it very clear we as a team are open to anybody reaching out to us,” Austin said. “I don’t look at the size of the decision as to whether or not I’m going to get involved.”

Both former Facebook employees and civil rights leaders outside the company said they soon started to see Austin’s team’s imprint on small decisions, like whether to sign on to a letter supporting voting legislation being proposed last year. For years, Facebook had been wary of “poking Republicans in the eye,” said Crystal Patterson, Facebook’s former public policy manager, who was involved in the discussions around the letter. Without Austin’s input, Patterson said, “There’s no way we would have gotten traction to sign on to it,” which the company ultimately did.

Austin’s team was also instrumental in getting Facebook to write a separate letter publicly flogging the LAPD for its social media surveillance program, which was discovered by the Brennan Center for Justice. “I think the civil rights team is the reason this letter existed in the first place,” said Rachel Levinson-Waldman, deputy director of the Brennan Center’s liberty and national security program.

It’s not that there was no one at Facebook who could make these judgment calls before Austin came along — civil rights issues had been a little part of a lot of people’s jobs at the company for years. But former employees said they appreciated having a clear sense of who to turn to with these questions. “I care a lot about human rights and civil rights, but it’s not my background. It’s not the first way I think,” said Brian Fishman, Facebook’s former head of counterterrorism, who worked with Austin’s team on issues related to white supremacists. “You want someone there [for whom] it is the first way they think.”

We don’t talk about “bias,” no, no, no

But Facebook didn’t hire Austin or his team to sign letters or be crisis responders. It hired them to prevent those crises from happening in the first place, which is a much tougher — maybe even totally impossible — job.

A big part of prevention is understanding where problems exist to begin with, which explains how the team came to its most ambitious undertaking yet. Last fall, Meta made a detailed and public commitment to investigate Facebook’s impact on users of different races. “Are Black users treated differently than other users? Being able to measure that and be very transparent and forthright about the fact we are measuring that was incredibly important to me,” Austin said.

If that sounds like an incremental step, it may be — but not for Meta. The topic had been all but off-limits inside Facebook for years. In mid-2019, the company went so far as to shut down research that found a new automated account-removal system on Instagram was disproportionately suspending accounts likely belonging to Black users. At other times, Facebook instructed employees not to even use the words “discrimination” or “bias” in their work.

The company’s argument all along has been that since Facebook doesn’t explicitly collect racial data about its users, it’s got to be awfully careful about using other data as a proxy, not to mention the potential invasiveness of attempting to infer someone’s race.

Facebook had already begun trying to figure out a method for studying race that was both accurate and protective of user privacy before Austin started at the company. But by all accounts, it was Austin’s team that pushed the idea over the line and prompted the company last year to publish a technical paper on how it plans to go about this work.

“I am really quite impressed that Roy helped get that out,” said Upturn’s Rieke. Upturn’s own research has found that, even though Facebook has tried to prevent advertisers from explicitly excluding users of certain races or genders from receiving housing and employment ads, the ad delivery algorithms still end up skewing which groups actually see those ads. One of Upturn’s co-authors on that study, Miranda Bogen, now works at Meta and co-authored the technical paper about the race measurement study.

But papers and blog posts alone aren’t enough, said Rieke. What he and other civil rights leaders want to know is: What will Meta be willing to share about what it uncovers? “Now that they’ve made this decision to do this, and their methodology is public, Facebook needs to be responsive to the question of: What are you seeing?” Rieke said.

That question is particularly relevant in the aftermath of whistleblower Frances Haugen’s disclosures regarding internal research from Facebook’s civic integrity team. Like the civil rights team, the civic integrity team was supposed to be the squeaky wheel at Facebook — the team that found the platform’s worst abuses and worked to fix them. But Haugen’s disclosures and subsequent reporting have called into question how much fixing the team was actually allowed to do before it was ultimately disbanded.

“It’s great these researchers are doing this and announcing they’re doing this, but it leaves a lot of wiggle room on what data they share,” Patterson said of the race measurement study.

Indeed, internal documents obtained by The Washington Post last year reveal that in 2020, Facebook researchers found that the most egregious hate speech on the platform was being primarily directed at minorities — but executives worried that taking certain actions to address the problem would prompt backlash from “conservative partners.” The company also reportedly withheld those findings from Murphy and the other civil rights auditors.

Austin, for his part, said, “It is our intent to be open about what we find.” But doing so could risk putting Meta on tricky legal footing. Facebook is currently facing a lawsuit over discrimination in its ad delivery system. If this new research finds that Facebook’s accusers are right, will Austin, who reports to Meta’s chief legal officer, really be empowered to admit it? “I see my job as a lawyer as getting things right,” Austin said. “Doing things that are legal and getting things right.”

“There’s one Julie”

Whatever Austin’s intentions, there’s a lot that is, and will always be, out of his control. Like the fact that there are nine full-time staffers on his team but more than 3 billion users across Meta’s platforms.

The civil rights team is hardly alone in feeling the enormity of that gap. “There’s never enough [people]. That’s the paradox,” said Fishman, who had been in charge of maintaining Facebook’s list of dangerous individuals and organizations before he left the company in November. “I’m describing to you the challenge of an organization that has grown too quickly and is, in some ways, too big. And there’s only nine of them.”

Right now, the civil rights group’s associate general counsel, Julie Wenah, meets with teams as they’re developing new products to ensure they’re applying a civil rights lens to that work. But as Deitle acknowledged, there are a whole lot of developers at Meta, and “there’s one Julie.” To address their one-Julie problem, the civil rights team is holding trainings and working on developing what Deitle described as “a flow chart” that engineers at the company will use to ensure they’re building civil rights thinking into the product development process. But that effort — nicknamed Project Height — is still light on details and very much a work in progress.

As someone who has navigated Facebook’s vast bureaucracy, Fishman said it’s critical that Austin’s team not try to solve every problem at once, but instead seek out some tangible victories. “You get a win around a discrete thing, and then you’ve got credibility,” Fishman said.

Meta’s civil rights team needs that credibility as much inside the company as it does outside. Austin may be the most senior civil rights executive the company has ever had, but there are still lots of bosses above him who may need some convincing when protecting users’ civil rights risks pissing off the wrong people. That’s especially true at a time when the company’s core business model is facing an existential threat, and lawmakers of all political stripes around the world appear more ready than ever to pounce on Meta for any perceived political slight.

And those bosses — namely, Zuckerberg, Sandberg and, more recently, Nick Clegg — have been almost militant in their resistance to making decisions that appear to benefit one group over another, prioritizing “free expression” above all else. As Patterson put it, “They think if they treat everybody the same, that’s lifting up everybody the same.”

So far, Austin said, Sandberg has been deeply involved in the civil rights team’s work, and he’s been in many meetings with Zuckerberg himself. “All I can say is that there hasn’t been an idea that me and my team have presented that hasn’t been taken seriously and that there hasn’t been a discussion where I feel our voices weren’t heard,” Austin said.

Ultimately, whatever progress Austin and his team are able to make matters not just to Meta and its billions of users, but also to the tech industry as a whole. For all its faults — and it has many — Meta has tended to be a trendsetter in tech when it comes to experimenting with new forms of governance and transparency. It’s been selective in its efforts, and it hasn’t always gotten them right — sometimes even snuffing out that work when it undermines the company’s other interests. But if Austin’s team can stick around and prove it’s capable of making more than incremental changes inside Meta, then those changes, and the team behind them, could very well become a model for everyone else, too.


Correction: This story has been updated to reflect that Austin was a top official at the DOJ's Civil Rights Division, not head of the division.
