Policy

The FTC’s new enforcement weapon spells death for algorithms

The agency may have found a new standard for penalizing tech companies that violate privacy and use deceptive data practices: algorithmic destruction.

Forcing companies to delete algorithmic systems built with ill-gotten data could become a more routine approach.

Illustration: CreepyCube/iStock/Getty Images Plus; Protocol

The Federal Trade Commission has struggled over the years to find ways to combat deceptive digital data practices using its limited set of enforcement options. Now, it’s landed on one that could have a big impact on tech companies: algorithmic destruction. And as the agency gets more aggressive on tech by slowly introducing this new type of penalty, applying it in a settlement for the third time in three years could be the charm.

In a March 4 settlement order, the agency demanded that WW International — formerly known as Weight Watchers — destroy the algorithms or AI models it built using personal information collected through its Kurbo healthy eating app from kids as young as 8 without parental permission. The agency also fined the company $1.5 million and ordered it to delete the illegally harvested data.

When it comes to today’s data-centric business models, algorithmic systems and the data used to build and train them are intellectual property, products that are core to how many companies operate and generate revenue. While in the past the FTC has required companies to disgorge ill-gotten monetary gains obtained through deceptive practices, forcing them to delete algorithmic systems built with ill-gotten data could become a more routine approach, one that modernizes FTC enforcement to directly affect how companies do business.

A slow rollout

The FTC first used the approach in 2019, amid scandalous headlines that exposed Facebook’s privacy vulnerabilities and brought down political data and campaign consultancy Cambridge Analytica. The agency called on Cambridge Analytica to destroy the data it had gathered about Facebook users through deceptive means along with “information or work product, including any algorithms or equations” built using that data.

It was another two years before algorithmic disgorgement came around again when the commission settled a case with photo-sharing app company Everalbum. The company was charged with using facial recognition in its Ever app to detect people’s identities in images without allowing users to turn it off, and for using photos uploaded through the app to help build its facial recognition technology.

In that case, the commission told Everalbum to destroy the photos, videos and facial and biometric data it gleaned from app users and to delete products built using it, including “any models or algorithms developed in whole or in part” using that data.

Technically speaking, the term “algorithm” can cover any piece of code that can make a software application do a set of actions, said Krishna Gade, founder and CEO of AI monitoring software company Fiddler. When it comes to AI specifically, the term usually refers to an AI model or machine-learning model, he said.
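Gade's distinction can be sketched in a few lines of Python. The sketch below is purely illustrative and hypothetical, not code from any company in this story: a hand-written rule is an "algorithm" in the broad sense, while a trained model's parameters are derived from its training data, which is why an order to delete the raw data alone would leave the learned model intact.

```python
# A plain "algorithm" in the broad sense: a hand-written rule,
# no training data involved.
def is_minor(age: int) -> bool:
    return age < 13

# A minimal ML "model": its single parameter is fit from training data.
def fit_slope(xs: list[float], ys: list[float]) -> float:
    """Least-squares slope through the origin, learned from the data."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Hypothetical user data stands in for illegally collected records.
training_data = ([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
slope = fit_slope(*training_data)  # the learned parameter encodes the data

del training_data  # deleting the raw data...
print(slope)       # ...leaves the learned parameter intact: 2.0
```

This is the intuition behind algorithmic disgorgement: because the model's parameters are a derivative of the data, regulators order destruction of the models themselves, not just the records they were trained on.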

Clearing the way for algorithmic destruction inside the FTC

It hasn’t always been clear that the FTC might use algorithmic disgorgement more regularly.

“Cambridge Analytica was a good decision, but I wasn’t certain that that was going to become a pattern,” Pam Dixon, executive director of World Privacy Forum, said regarding the requirement for the company to delete its algorithmic models. Now, Dixon said, algorithmic disgorgement will likely become a standard enforcement mechanism, just like monetary fines. “This is definitely now to be expected whenever it is applicable or the right decision,” she said.

The winds inside the FTC seem to be shifting. “Commissioners have previously voted to allow data protection law violators to retain algorithms and technologies that derive much of their value from ill-gotten data,” former FTC Commissioner Rohit Chopra, now director of the Consumer Financial Protection Bureau, wrote in a statement related to the Everalbum case. He said requiring the company to “forfeit the fruits of its deception” was “an important course correction.”

“If Ever meant a course correction, Kurbo means full speed ahead,” said Jevan Hutson, associate at Hintze Law, a data privacy and security law firm.

FTC Commissioner Rebecca Slaughter has been a vocal supporter of algorithmic destruction as a way to penalize companies for unfair and deceptive data practices. In a Yale Journal of Law and Technology article published last year, she and FTC lawyers Janice Kopec and Mohamad Batal highlighted it as a tool the FTC could use to foster economic and algorithmic justice.

“The premise is simple: when companies collect data illegally, they should not be able to profit from either the data or any algorithm developed using it,” they wrote. “The authority to seek this type of remedy comes from the Commission’s power to order relief reasonably tailored to the violation of the law. This innovative enforcement approach should send a clear message to companies engaging in illicit data collection in order to train AI models: Not worth it.”

Indeed, some believe the threat to intellectual property value and tech product viability could make companies think twice about using data collected through unscrupulous means. “Big fines are the cost of doing business. Algorithmic disgorgement traced to illicit data collection/processing is an actual deterrent,” David Carroll, an associate professor of media design at The New School’s Parsons School of Design, said in a tweet. Carroll sued Cambridge Analytica in Europe to obtain his 2016 voter profile data from the now-defunct company.

Forecast: Future privacy use

When people sign up to use the Kurbo healthy eating app, they can choose a fruit- or vegetable-themed avatar such as an artichoke, pea pod or pineapple. In exchange for health coaching and help tracking food intake and exercise, the app requires personal information about its users such as age, gender, height, weight and their food and exercise choices, data that also feeds back into improving the app.

In its case against WW, the FTC said that until late 2019, Kurbo users could sign up for the service either by indicating that they were a parent signing up for their child or that they were over the age of 13 and registering for themselves. The agency said the company failed to ensure that the people signing up were actually parents or adult guardians rather than kids pretending to be adults. It also said that from 2014 to 2019, hundreds of users who signed up for the app originally claiming they were over age 13 later changed their profile birth dates to indicate they were actually under 13, but continued to have access to the app.

The fact that algorithmic disgorgement was used by the FTC in relation to one of the country’s only existing federal privacy laws could be a sign that it will be used again, legal and policy experts said. While the Cambridge Analytica and Everalbum cases charged those companies with violating the FTC Act, the Kurbo case added an important wrinkle, alleging that WW violated both the FTC Act and the Children’s Online Privacy Protection Act. Both are important pieces of legislation under which the agency can bring consumer protection cases against businesses.

“This means that for any organization that has collected data illegally under COPPA, that data is at risk and the models built on top of it are at risk for disgorgement,” Hutson said.

The use of COPPA could be a foundational precedent paving the way for the FTC to require destruction of algorithmic models under future legislation, such as a prospective comprehensive federal privacy law. “It stands to reason it would be leveraged in any other arena where the FTC has enforcement authority under legislation,” Hutson said.

Application of algorithmic disgorgement in the COPPA context is “a clear jurisdiction and trigger of enforcement through a law that exists and explicitly protects kids’ data, [so] if there was a corollary law for everyone it would allow the FTC to enforce in this way for companies that are not just gathering kids’ data,” said Ben Winters, a counsel for the Electronic Privacy Information Center.

He added, “It shows it would be really great if we had a privacy law for everybody, in addition to kids.”

Climate

Sealed finds a market in home decarbonization

Sealed offers homeowners the chance to save money and help protect the planet.

Sealed is convincing homeowners to look at their HVAC systems and insulation in order to save energy and money.

Photo: Gabe Souza/Portland Press Herald via Getty Images

Shiny silver panels hug the walls of Andy Frank’s attic; they vaguely remind me of a child’s robot Halloween costume. A sticky-looking foam lines both the gaps in the attic’s floorboards and the roof, plugging up holes where squirrels could have once taken shelter.

The space is positively sweat-inducing, even for the mere minute I have my head poking above the trapdoor.

Lisa Martine Jenkins

Lisa Martine Jenkins is a senior reporter at Protocol covering climate. Lisa previously wrote for Morning Consult, Chemical Watch and the Associated Press. Lisa is currently based in Brooklyn, and is originally from the Bay Area. Find her on Twitter (@l_m_j_) or reach out via email (ljenkins@protocol.com).

Now that most organizations are returning to the office, responses span the extremes. Some leaders demand that employees return to the office, with some employees revolting and others rejoicing at being together again. On the other hand, some companies have closed physical offices and made remote work permanent, prompting a sigh of relief from some employees and frustration from others.

Most of us are somewhere in between, trying our best to take a measured approach to building the right hybrid strategy tailored to company culture. Some seemingly have begun to crack the code, while the majority are grappling with the when, how, why and who of this new hybrid work reality.

Nathan Coutinho

Nathan Coutinho leads Logitech's global conferencing business strategy and analyst relations. A Swiss company focused on innovation and quality, Logitech designs products and experiences that have an everyday place in people's lives. Coutinho leads strategy and execution of Logitech's video conferencing solutions, from personal solutions to highly scalable conference rooms. Coutinho has more than 25 years of experience in the IT industry with various roles in executive leadership, consulting, engineering, marketing and technical sales.

Workplace

Experts say tech companies need to prepare for the next SCOTUS decision

HR experts said companies need to be proactive about protections for contraception, privacy and LGBTQ+ rights.

Experts say tech leaders need to start thinking about future Supreme Court rulings.

Photo: Anna Moneymaker/Getty Images

Tech companies are still trying to prepare for a post-Roe world. But it might already be time to think about what the Supreme Court is planning next.

When the Supreme Court overturned Roe v. Wade Friday, Justice Clarence Thomas wrote in a concurring opinion that the court should also reconsider rulings protecting contraception and same-sex relationships, citing Griswold, Lawrence and Obergefell. If those decisions were ever overruled, it would have massive implications for everyone, but especially for employees living in states where same-sex marriage is at risk of becoming illegal without a federal shield.

Lizzy Lawrence

Lizzy Lawrence (@LizzyLaw_) is a reporter at Protocol, covering tools and productivity in the workplace. She's a recent graduate of the University of Michigan, where she studied sociology and international studies. She served as editor in chief of The Michigan Daily, her school's independent newspaper. She's based in D.C., and can be reached at llawrence@protocol.com.

Policy

What’s next for tech in a post-Roe world

From employee support to privacy concerns, tech companies play a critical role in what’s to come for abortion access in the U.S.

States banning abortion means that tech will play a critical role in what’s to come for abortion access in the U.S.

Photo: Al Drago/Bloomberg via Getty Images

The end of Roe v. Wade has sent the world of tech scrambling. Many companies are now trying to quickly figure out how to protect workers in states where abortion will be banned, while also facing potential privacy and legal ramifications.

Here’s a look at tech companies’ roles and responses to the ruling. We will update this page as news and events change.

Alex Eichenstein

Alex Eichenstein (@alexeichenstein) is Protocol's social media editor. Previously, she managed social media and audience engagement efforts at the Center for Public Integrity. She earned a B.A. in English, women and gender studies and political science from the University of Delaware. She lives in Washington, D.C.

Fintech

You’re thinking about Apple Pay Later all wrong

Apple’s “buy now, pay later” product has a distinctly different distribution strategy that means it doesn’t directly threaten Affirm, Klarna and Afterpay.

Apple Pay Later emerges as a distinctly different product than what Klarna and Affirm offer.

Image: Apple; Protocol

Apple’s entry into the “buy now, pay later” market was one of its worst-kept secrets: Analysts had been predicting the company’s rollout of a pay-later service as early as 2020. The most common read on the move was predictable: Apple was here to smash the competition. The company has a track record of jumping into new sectors late and still managing to come out on top — the iPod came out when there were tons of MP3 players on the market.

But some analysts have a starkly different view. When you look under the hood, Apple Pay Later emerges as a distinctly different product from what Klarna and Affirm offer, they say — and one that isn’t much of a market predator.

Veronica Irwin

Veronica Irwin (@vronirwin) is a San Francisco-based reporter at Protocol covering fintech. Previously she was at the San Francisco Examiner, covering tech from a hyper-local angle. Before that, her byline was featured in SF Weekly, The Nation, Techworker, Ms. Magazine and The Frisc.
