Meta’s fight over a rap video will test police ties to Big Tech

U.K. police asked Instagram to remove a rap video. Instagram obliged. The battle over whether to restore it could shake the fraught relationship between platforms and police.

A Metropolitan Police badge on an officer's helmet in London, September 2022.


Photo: Mike Kemp/In Pictures via Getty Images

In January of this year, an Instagram account dedicated to British music posted a 21-second clip of a music video by a U.K. drill rapper named Chinx (OS). Within two days, Instagram took it down.

The lyrics to the song, "Secrets Not Safe," make reference to a real-world gang shooting, and Instagram removed the video under its policy against inciting violence. The original poster appealed the decision, and the clip was restored, but eight days later it was removed again.

It’s the kind of thing that happens millions of times a year on Instagram, not to mention on the rest of the web. But what makes this particular post notable is how it got on Instagram’s radar in the first place — not by way of automated detection or user reports, but via a referral from the police.

In recent years, the London Metropolitan Police and other so-called Internet Referral Units have pummeled platforms including Facebook, Instagram, and, most notably, YouTube with notifications about content that supposedly violates those companies’ terms of service but that isn’t necessarily illegal. The Met’s IRU has placed a particular emphasis on music videos. Last year alone, the unit reportedly recommended YouTube take down 510 music videos, and YouTube complied nearly 97% of the time.

Critics have argued the IRU operates in a dangerous legal no man’s land, where law enforcement agencies use platforms’ own terms to circumvent the judicial system in an affront to free speech. Even so, the prevalence of these units has only grown, with similar divisions popping up in Israel and across Europe. Even where formal IRUs don’t exist, some platforms, including Facebook, have dedicated channels for government agencies like the U.S. Department of Homeland Security to flag content.

But the Chinx (OS) video stands to be a turning point in this relationship between platforms and police. Earlier this year, Meta referred the case to its Oversight Board, which will soon decide the fate of the post in question. The board’s accompanying recommendations could also help answer a much bigger question facing Meta and other tech giants: How can they most effectively push back against growing pressure from law enforcement without sacrificing public safety?

“What the Oversight Board does will have consequences not just for Facebook, but for governments,” said Daphne Keller, director of the Program on Platform Regulation at Stanford’s Cyber Policy Center.

The rise of IRUs is a relatively recent phenomenon, with the Met’s so-called Operation Domain initiative, which focuses on “gang-related content,” having launched in 2015. That same year, following the Charlie Hebdo attacks in France, Europol stood up its own IRU division.

But the biggest test of what IRUs could get away with — and how tech platforms were enabling them — came in a 2019 case out of Israel. In that case, two human rights groups asked the Israeli Supreme Court to shut down the country’s so-called Cyber Unit, arguing that this "alternative enforcement" mechanism violated people’s constitutional rights. The court ultimately rejected the petition last year, in part because of Facebook’s failure to tell users it was removing posts in response to Cyber Unit referrals. Without that information, the plaintiffs couldn’t prove the Cyber Unit was responsible for alleged censorship. Besides, the court reasoned, Facebook voluntarily removed the posts under its own terms.

To civil liberties experts, the case illustrated the role tech companies’ decisions play in shielding IRUs from accountability. “If law enforcement can hide behind this veil of, ‘It’s just company action,’ they can do things, including systematically target dissent, while completely severing the ability for people to hold them accountable in court,” said Emma Llansó, director of the Free Expression Project at the Center for Democracy and Technology, which is funded in part by Meta and the Chan Zuckerberg Initiative.


The Chinx (OS) case presents another test, and a chance for Meta to do things differently. Though the board didn’t specify which police agency requested the removal, according to its summary, the U.K. police warned Meta that the video in question “could contribute to a risk of offline harm.” The company appeared to agree and removed the video not once, but twice, after it was restored on appeal. But the decision clearly didn’t sit well with Meta. It referred the case to the board because, the company wrote in a blog post, “we found it significant and difficult because it creates tension between our values of voice and safety.”

The board has since asked for comments on the cultural significance of drill music in the U.K., and also on how social media platforms in general should handle law enforcement requests about lawful speech.

The case has drawn attention from leading civil liberties groups and internet governance experts, including CDT and the ACLU, which have both submitted comments to the board urging it to recommend stronger safeguards against the growing creep of IRUs. “Government-initiated removals — especially those that rely entirely on private content policies to take down lawful content — are a danger to free expression,” reads one comment by the ACLU and Keller.

There is, of course, good reason for the government and tech platforms to communicate. Law enforcement and government agencies often have better insights into emerging threats than platforms do, and platforms increasingly rely on those agencies for guidance.

The Met, for its part, has said it “works only to identify and remove content which incites or encourages violence” and that it “does not seek to suppress freedom of expression through any kind of music.” The agency also touted the effectiveness of the program in comments to the U.K.’s Information Commissioner’s Office, writing, “The project to date has brought to light threats and risk that would otherwise not have been identified through other policing methods.”

But these systems are also ripe for abuse, Llansó said. For one thing, companies may not always feel empowered to reject law enforcement referrals. “There can be a sense within a company that it’s better to be seen as a constructive and collaborative player, rather than one that’s always rejecting requests,” she said.

The fact that these requests are happening out of public view also prevents users from understanding how government agencies are seeking to censor them, and provides no recourse for them to challenge it. “From a company-centric perspective there’s potentially a lot of benefit to having people with expertise, including law enforcement, know about material you want to take down from your service,” said Llansó. “From a user-centric perspective it’s an entirely different story.”

In the U.K., where the Met’s IRU has focused on drill music specifically, these referrals may also disproportionately target Black communities engaged in entirely legal speech. “It’s so subjective,” said Paige Collings, a senior speech and privacy activist at EFF, who recently wrote about the fraught relationship between the London police and YouTube. “It’s really racially oriented, racially driven.” Collings points to the widespread use of rap lyrics in court as part of a “much wider structural issue” of police attempting to use songs as evidence. “It’s not a testimony or evidence of crimes,” Collings said. “[Songs] are artistic expressions.”

Both CDT and the ACLU are calling on the board to urge Meta to notify users when their content is removed in response to a law enforcement request, and to publish detailed reports about these takedowns. Collings also believes platforms should publish examples of the content being removed and list the formal and informal relationships they have with law enforcement units.

The ACLU and Keller also recommended that Meta be more discerning about which law enforcement agencies it trusts and refuse fast-track reporting channels to IRUs that make bad faith or inaccurate referrals. The Internet Archive has, in the past, called out IRUs for making faulty referrals, including the French IRU, which the Internet Archive said improperly flagged over 500 URLs as terrorist propaganda in 2019. “Governments should have, and maybe do have, an obligation to not be so sloppy about this,” Keller said.

While the board’s recommendations aren’t binding, the fact that Meta referred this case to the board at all suggests that the company is looking for help — or at least, backup — as it decides how to handle such requests in the future. And the volume of requests could soon increase. Under Europe’s Digital Services Act, platforms must have “trusted flagger” programs, like the one YouTube already runs, which allows law enforcement agencies and other public and private entities to refer content for removal.

For Meta and other companies operating in Europe, figuring out how to deal with this potential uptick in referrals without stifling users’ ability to speak freely is becoming increasingly urgent, Llansó said. The board’s recommendations stand to give Meta cover for changes it may have wanted to make anyway. “This case could be a way for [Meta] to get the fact that this is happening out on the record,” said Llansó. “If Facebook does want to roll out more transparency, they could use some political backing for that.”

