Policy

Transparency can help fix social media — if anyone can define it

The latest buzzword in tech policy promises to give users more insight into and power over social media services, but mainly signals just how much more we need to figure out.

[Image: Social media app icons blurred behind translucent white text reading "TRANSPARENCY"]

Social media companies, lawmakers and tech skeptics all say they want more visibility into how the sites work.

Image: dole777/Unsplash; Protocol

It's the one and only thing nearly everyone in tech and tech policy can agree on. Facebook and Twitter want it, as does the Facebook Oversight Board. Whistleblower Frances Haugen suggested it to Congress, and several lawmakers who heard her testimony agreed. Even the FTC is on board.

The vogue in tech policy is "transparency," the latest buzzword for addressing concerns about social media's reach, breadth and social effects. Companies, academics, regulators and lawmakers on both sides of the aisle all embrace transparency as a cure-all, or at least a necessary first step. But that agreement obscures a deeper problem: The various camps have widely differing notions of what the vague term actually means and of what the public should do with any insights increased transparency might produce.

The idea that we should have more visibility into how companies such as Facebook and Google make their decisions has gone through periods of popularity before, especially after the Snowden revelations about government surveillance became public in 2013. Haugen's testimony that Facebook suppressed conclusions about its effect on a range of real-world problems, however, has put the term back in the spotlight — even as Congress continues its years-long struggle to agree on more comprehensive regulation, such as a federal privacy law.

The case for transparency basically goes like this: Powerful institutions, from countries to giant companies, should be held accountable as they deal with citizens and customers, particularly around individuals' ability to express themselves. Transparency seems like it would lead to that accountability, as well as create a path toward redress when these mega-actors do something untoward. After all, social media services can face legal consequences from government regulators like the FTC or SEC for misleading consumers or investors. While enforcement has often been uneven, many tech skeptics say the possibility of these consequences is particularly vital as opaque algorithms drive more of our digital lives. Increased transparency could potentially clarify which online problems are most urgent and how they can be fixed.

"Transparency becomes a building block under which you enable people to understand what's happening, to build trust," Nick Pickles, Twitter's senior director for global public policy strategy, development and partnerships, told Protocol. "You give people an understanding of what's happening on a service so that they can make decisions about appropriate policies."

Twitter recently issued what it called "guiding principles for regulation," which addressed issues like competition but said users should be able to understand platforms' rules and that lawmakers should guide social media by providing "suitable flexibility for valuable disclosures," including to researchers.

Several major companies have already enacted reforms meant to increase transparency, after a fashion. Facebook discloses aggregate figures about takedowns of some harmful content and bot networks. It also has called for the reform of website liability shields, saying such reform would incentivize transparency. Facebook, Google and other sites allow users to see why they were served a particular ad. Twitter's Transparency Center puts out an array of data, including aggregate information on COVID-19 misinformation. And all major social media companies publish terms of service, often including separate community standards documents.

Yet Facebook's Oversight Board, which the company set up but which operates independently, complained about Facebook's "opaque rules" in the wake of Haugen's disclosures. In a blog post, the board said users need access to more information about which specific rules they may have broken and about takedown requests from governments. It also said the general public should have terms of service accurately translated into more languages, along with insight into the "whitelisting" of prominent and newsworthy accounts that exempts them from certain content moderation decisions.

The board titled its post: "To treat users fairly, Facebook must commit to transparency."

Nate Persily, co-director of the Stanford Cyber Policy Center, said being able to check the platforms' work would mean they can't lie to the public, and could prompt them to pull back from actions they felt the need to hide.

"If you force the platforms to open themselves to outside review, it will change their behavior," said Persily, who has proposed legislation to let scholars access information companies such as Facebook hold. "They will know they're being watched."

Persily also led an independent effort to get Facebook to open up more data to researchers, though he resigned from that role last year. The company eventually released data, but it was flawed.

Sara Collins, policy counsel at tech policy advocacy group Public Knowledge, argued, however, that while offering the public a look into how Facebook, Twitter and other companies function can be useful for studying the sites, it may do little to combat individual users' concerns over issues like extremism online.

"I don't know that that meaningfully changes behaviors," Collins said. "I don't know that it reduces harm in any significant way, and it sure doesn't incentivize the companies to change anything about what they're doing."

Collins compared transparency measures to privacy notices, which users rarely read and even more rarely understand. With few choices, users usually just click whatever they need to in order to install an app or use a service. Collins noted, however, "transparency" goes beyond how social media companies handle individuals' posts.

Lawmakers, researchers and advocates often push for deeper information about the advertising that generates revenue and the algorithms that structure everything from users' feeds to checks for copyrighted material. One bill from earlier this year, for instance, would require online platforms "to maintain detailed records describing their algorithmic process for review" by the FTC. Another bill would force large platforms to give researchers and the FTC access to more detailed ad libraries than companies currently put out, including a description of the target audience and information about how many people interacted with the ad.
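The details of those disclosures matter: an ad library entry that omits targeting criteria or engagement counts tells researchers little. As a rough illustration (the field names below are hypothetical, not drawn from any bill text or platform schema), a machine-readable record of the kind these proposals envision might look something like this:

# A hypothetical sketch of an ad library record of the kind the disclosure
# bills describe. Field names are illustrative, not taken from any bill or
# any platform's actual schema.
from dataclasses import dataclass
from datetime import date

@dataclass
class AdLibraryRecord:
    advertiser: str        # who paid for the ad
    ad_text: str           # the creative shown to users
    first_shown: date
    last_shown: date
    target_audience: dict  # e.g. {"age_range": "18+", "regions": ["US"]}
    impressions: int       # how many times the ad was displayed
    engagements: int       # clicks, shares and other interactions

record = AdLibraryRecord(
    advertiser="Example PAC",
    ad_text="Vote on Tuesday",
    first_shown=date(2021, 10, 1),
    last_shown=date(2021, 10, 31),
    target_audience={"age_range": "18+", "regions": ["US"]},
    impressions=120_000,
    engagements=4_500,
)

Even a simple schema like this surfaces the policy questions the bills leave open: how granular the targeting description should be, and whether engagement figures are reported as exact counts or as ranges.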

Democratic Sen. Richard Blumenthal, who led the hearings where Haugen testified, echoed her suggestion that policy-makers should have access to the kinds of internal research she disclosed, which looked into the effects of Instagram on young users, among other issues. At the hearing, Blumenthal said he planned to pursue "compelling recommendations about requiring disclosures of research and independent reviews of these platforms' algorithms."

All the buzz about transparency reflects several concerns bubbling up at once, with some as small as a single post taken down for hate speech and others as momentous as how algorithms might drive political polarization and violence around the world.

"All of these conversations are kind of happening in parallel to each other," said Caitlin Vogus, deputy director of the Free Expression Project at the Center for Democracy & Technology. "They're all different strains of transparency, but the devil is in the details."

Many of the definitions of transparency could require companies to hand over vast amounts of data, some of it proprietary. Twitter allows researchers to gather huge datasets and has considered a shift to open-source ranking algorithms, but said privacy safeguards are necessary in any transparency offering and burdensome disclosure mandates could hurt small businesses.
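Twitter's researcher access today runs largely through its public API. A minimal sketch of that kind of collection, assuming the third-party tweepy library and a bearer token approved for full-archive academic research search, might look like the following; the query is purely illustrative:

# Sketch of researcher data collection through Twitter's API using tweepy.
# Assumes a bearer token approved for full-archive search; the query and
# fields shown here are illustrative only.
import tweepy

client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN")

response = client.search_all_tweets(
    query="covid vaccine lang:en -is:retweet",
    max_results=100,
    tweet_fields=["created_at", "public_metrics"],
)

for tweet in response.data or []:
    # public_metrics includes retweet, reply, like and quote counts
    print(tweet.created_at, tweet.public_metrics["retweet_count"])

Collection like this is exactly where the privacy tension shows up: the same interface that lets academics study misinformation at scale also moves user data off the platform, which is why Twitter pairs its openness with calls for safeguards.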

In some cases, social media sites have already balked at disclosing more, particularly when the information ends up in the hands of people with little incentive to portray the companies in a positive light.

Over the summer, for instance, Facebook suspended the accounts of New York University researchers who had been studying disinformation and political ads on the platform. The move prompted accusations that the company was trying to squash unflattering conclusions. Facebook claimed its $5 billion settlement with the FTC, for privacy violations in the wake of the Cambridge Analytica scandal, required it to block the research.

The FTC eventually weighed in, blasting Facebook's rationale and siding with the academics in favor of more transparency.

"The consent decree does not bar Facebook from creating exceptions for good-faith research in the public interest," Samuel Levine, the acting director of the bureau of consumer protection, wrote in a letter to Mark Zuckerberg. "Indeed, the FTC supports efforts to shed light on opaque business practices, especially around surveillance-based advertising."

As the FTC suggested, access for researchers to the inner workings of companies has become the version of transparency that many tech skeptics hope for. They say that groups of specially vetted academics — or even a new U.S. government regulator — could bring expertise to examining massive, complex algorithms or working on cross-platform problems like the spread of disinformation. Limiting access to researchers or the government could also lessen concerns about the privacy of so much data and analysis circulating in the world.

Yet even when companies offer more data to researchers, that visibility can only go so far toward "solving" social media's problems. Advocates for transparency say the added clarity can help hold companies to account, but it doesn't replace further action, such as a federal privacy law.

"We need this transparency so lawmakers actually know what's going on," Vogus said.

Issie Lapowsky contributed reporting.



