Policy

The FTC’s new enforcement weapon spells death for algorithms

It may have found a new standard for penalizing tech companies that violate privacy and use deceptive data practices: algorithmic destruction.

Forcing companies to delete algorithmic systems built with ill-gotten data could become a more routine approach.

Illustration: CreepyCube/iStock/Getty Images Plus; Protocol

The Federal Trade Commission has struggled over the years to find ways to combat deceptive digital data practices using its limited set of enforcement options. Now, it’s landed on one that could have a big impact on tech companies: algorithmic destruction. And as the agency gets more aggressive on tech by slowly introducing this new type of penalty, applying it in a settlement for the third time in three years could be the charm.

In a March 4 settlement order, the agency demanded that WW International — formerly known as Weight Watchers — destroy the algorithms or AI models it built using personal information collected through its Kurbo healthy eating app from kids as young as 8 without parental permission. The agency also fined the company $1.5 million and ordered it to delete the illegally harvested data.

When it comes to today’s data-centric business models, algorithmic systems and the data used to build and train them are intellectual property, products that are core to how many companies operate and generate revenue. While in the past the FTC has required companies to disgorge ill-gotten monetary gains obtained through deceptive practices, forcing them to delete algorithmic systems built with ill-gotten data could become a more routine approach, one that modernizes FTC enforcement to directly affect how companies do business.

A slow rollout

The FTC first used the approach in 2019, amid scandalous headlines that exposed Facebook’s privacy vulnerabilities and brought down political data and campaign consultancy Cambridge Analytica. The agency called on Cambridge Analytica to destroy the data it had gathered about Facebook users through deceptive means along with “information or work product, including any algorithms or equations” built using that data.

It was another two years before algorithmic disgorgement came around again, when the commission settled a case with photo-sharing app company Everalbum. The company was charged with using facial recognition in its Ever app to detect people’s identities in images without allowing users to turn it off, and with using photos uploaded through the app to help build its facial recognition technology.

In that case, the commission told Everalbum to destroy the photos, videos and facial and biometric data it gleaned from app users and to delete products built using it, including “any models or algorithms developed in whole or in part” using that data.

Technically speaking, the term “algorithm” can cover any piece of code that can make a software application do a set of actions, said Krishna Gade, founder and CEO of AI monitoring software company Fiddler. When it comes to AI specifically, the term usually refers to an AI model or machine-learning model, he said.
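Gade’s distinction matters for disgorgement: a plain algorithm is just code, but a trained model also contains parameters derived from the training data, so deleting the data alone does not remove its value. A hypothetical Python sketch (not from the article or any company’s actual system) illustrates the difference:

```python
# A plain "algorithm" is just code that maps inputs to actions;
# it contains no information derived from any dataset.
def bmi_category(weight_kg, height_m):
    bmi = weight_kg / height_m ** 2
    return "under" if bmi < 18.5 else "over" if bmi > 25 else "normal"

# An ML "model" is an algorithm plus parameters *learned from data* --
# here, a one-variable least-squares fit done by hand.
def fit_linear(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    # These two numbers distill the training data; they persist
    # even if the raw (xs, ys) records are later deleted.
    return slope, intercept

slope, intercept = fit_linear([1, 2, 3, 4], [2, 4, 6, 8])
```

Because the learned parameters survive deletion of the underlying records, an order to delete only the ill-gotten data would leave the model’s value intact, which is why the FTC’s orders also require destroying the models themselves.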

Clearing the way for algorithmic destruction inside the FTC

It hasn’t always been clear that the FTC might use algorithmic disgorgement more regularly.

“Cambridge Analytica was a good decision, but I wasn’t certain that that was going to become a pattern,” Pam Dixon, executive director of World Privacy Forum, said regarding the requirement for the company to delete its algorithmic models. Now, Dixon said, algorithmic disgorgement will likely become a standard enforcement mechanism, just like monetary fines. “This is definitely now to be expected whenever it is applicable or the right decision,” she said.

The winds inside the FTC seem to be shifting. “Commissioners have previously voted to allow data protection law violators to retain algorithms and technologies that derive much of their value from ill-gotten data,” former FTC Commissioner Rohit Chopra, now director of the Consumer Financial Protection Bureau, wrote in a statement related to the Everalbum case. He said requiring the company to “forfeit the fruits of its deception” was “an important course correction.”

“If Ever meant a course correction, Kurbo means full speed ahead,” said Jevan Hutson, associate at Hintze Law, a data privacy and security law firm.

FTC Commissioner Rebecca Slaughter has been a vocal supporter of algorithmic destruction as a way to penalize companies for unfair and deceptive data practices. In a Yale Journal of Law and Technology article published last year, she and FTC lawyers Janice Kopec and Mohamad Batal highlighted it as a tool the FTC could use to foster economic and algorithmic justice.

“The premise is simple: when companies collect data illegally, they should not be able to profit from either the data or any algorithm developed using it,” they wrote. “The authority to seek this type of remedy comes from the Commission’s power to order relief reasonably tailored to the violation of the law. This innovative enforcement approach should send a clear message to companies engaging in illicit data collection in order to train AI models: Not worth it.”

Indeed, some believe the threat to intellectual property value and tech product viability could make companies think twice about using data collected through unscrupulous means. “Big fines are the cost of doing business. Algorithmic disgorgement traced to illicit data collection/processing is an actual deterrent,” David Carroll, an associate professor of media design at The New School’s Parsons School of Design, said in a tweet. Carroll sued Cambridge Analytica in Europe to obtain his 2016 voter profile data from the now-defunct company.

Forecast: Future privacy use

When people sign up to use the Kurbo healthy eating app, they can choose a fruit- or vegetable-themed avatar such as an artichoke, pea pod or pineapple. In exchange for health coaching and help tracking food intake and exercise, the app requires personal information about its users such as age, gender, height, weight and their food and exercise choices, data the company also uses to improve the app.

In its case against WW, the FTC said that until late 2019, Kurbo users could sign up for the service either by indicating that they were a parent signing up for their child or by claiming that they were over the age of 13 and registering for themselves. The agency said the company failed to ensure that the people signing up were actually parents or adult guardians rather than kids pretending to be adults. It also said that from 2014 to 2019, hundreds of users who signed up for the app originally claiming they were over age 13 later changed their profile birth dates to indicate they were actually under 13, but continued to have access to the app.

The fact that algorithmic disgorgement was used by the FTC in relation to one of the country’s only existing federal privacy laws could be a sign that it will be used again, legal and policy experts said. While the Cambridge Analytica and Everalbum cases charged those companies with violating the FTC Act, the Kurbo case added an important wrinkle, alleging that WW violated both the FTC Act and the Children’s Online Privacy Protection Act (COPPA). Both are important pieces of legislation under which the agency can bring consumer protection cases against businesses.

“This means that for any organization that has collected data illegally under COPPA that data is at risk and the models built on top of it are at risk for disgorgement,” Hutson said.

The use of COPPA could be a foundational precedent paving the way for the FTC to require destruction of algorithmic models under future legislation, such as a would-be comprehensive federal privacy law. “It stands to reason it would be leveraged in any other arena where the FTC has enforcement authority under legislation,” Hutson said.

Application of algorithmic disgorgement in the COPPA context is “a clear jurisdiction and trigger of enforcement through a law that exists and explicitly protects kids’ data, [so] if there was a corollary law for everyone it would allow the FTC to enforce in this way for companies that are not just gathering kids’ data,” said Ben Winters, a counsel for the Electronic Privacy Information Center.

He added, “It shows it would be really great if we had a privacy law for everybody, in addition to kids.”
