Meta has agreed to settle a long-standing lawsuit filed by the Department of Housing and Urban Development alleging discrimination in Facebook's housing ad system. As part of the settlement, Meta vowed to change the way ads for housing, as well as employment and credit opportunities, are delivered on its platforms, and to pay a $115,054 fine.
"Discrimination in housing, employment and credit is a deep-rooted problem with a long history in the US, and we are committed to broadening opportunities for marginalized communities in these spaces and others," Roy Austin Jr., Meta's vice president of civil rights, wrote in a blog post.
The case, which was originally brought under then-HUD secretary Ben Carson in 2018, accused Facebook of allowing advertisers to discriminate on the basis of race and other protected characteristics when targeting housing ads. It also accused Facebook itself of discriminating through the actual delivery of ads. Research has found that even when housing and employment ads are targeted in a neutral way, Facebook's algorithms can wind up skewing which demographics actually get to see the ads.
As part of the settlement, Meta is committing to institute what it's calling a “variance reduction system” to ensure that the intended target of an ad and the actual audience to which an ad is delivered align. "Today’s announcement reflects more than a year of collaboration with HUD to develop a novel use of machine learning technology that will work to ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad," Austin wrote in his statement.
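Neither the settlement nor Meta's statement discloses how the variance reduction system will actually be built. As a purely illustrative sketch of the underlying idea, the snippet below measures the gap between the demographic mix of the audience an ad was delivered to and the mix of the population eligible to see it, using total variation distance; all names, numbers, and the metric itself are hypothetical, not Meta's method.

```python
def demographic_variance(delivered_counts, eligible_counts):
    """Total variation distance between the demographic share of the
    delivered audience and that of the eligible population (0 = perfect
    match, 1 = maximally skewed). Purely illustrative, not Meta's metric."""
    delivered_total = sum(delivered_counts.values())
    eligible_total = sum(eligible_counts.values())
    groups = set(delivered_counts) | set(eligible_counts)
    return 0.5 * sum(
        abs(delivered_counts.get(g, 0) / delivered_total
            - eligible_counts.get(g, 0) / eligible_total)
        for g in groups
    )

# Hypothetical example: an ad whose eligible audience is split 50/50
# between two groups, but whose delivered audience skewed to 70/30.
eligible = {"group_a": 5000, "group_b": 5000}
delivered = {"group_a": 700, "group_b": 300}
print(f"variance = {demographic_variance(delivered, eligible):.2f}")  # 0.20
```

A system like the one Meta describes would presumably monitor a statistic of this kind during delivery and adjust which users see the ad when the gap grows too large, though the settlement does not specify the mechanism.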
Meta is also agreeing to do away with a feature called Special Ad Audiences that allows advertisers to target lookalike audiences of users for housing-, employment- and credit-related ads. The company has previously removed certain ad targeting categories from housing-, employment- and credit-related ads and added those ads to its public ad archive.
“This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit," Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division said in a statement.
Austin himself is a veteran of the DOJ's civil rights division. He joined Meta last year after a damning civil rights audit called Facebook out for everything from its failure to police former President Trump's posts to its history of enabling housing discrimination through ads.
Under Austin's leadership, Meta is undertaking a novel study that will analyze its impact on users of different races. "Are Black users treated differently than other users? Being able to measure that and be very transparent and forthright about the fact we are measuring that was incredibly important to me," Austin told Protocol earlier this year. The techniques Meta has developed to estimate users' races as part of that work will be crucial to the company's ability to vet whether discrimination exists in its ad delivery.
Meta now has until the end of the year to implement the changes, at which point the Department of Justice will have an opportunity to approve the changes. "If Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation,” U.S. Attorney Damian Williams for the Southern District of New York said in a statement.