How to kill an algorithm
Hello and welcome to Protocol Enterprise! Today: How the FTC’s algorithm death penalties will actually work in practice, a major earthquake in Japan disrupts chip production, and this week’s enterprise tech moves.
Spin up
The early days of cloud computing gave rise to the notion of “rogue IT,” or skunkworks projects that nobody told the CIO about. Something similar is happening in AI and machine learning: Only 39% of companies track the ML projects being used in their organizations and check those projects for effectiveness and bias, according to a recent report from Eversheds Sutherland.
Algorithms all the way down
“The premise is simple,” FTC Commissioner Rebecca Slaughter and FTC lawyers wrote last year.
They were talking about a little-used enforcement tool called algorithmic disgorgement, a penalty the agency can wield against companies that used deceptive data practices to build algorithmic systems like AI and machine-learning models. The punishment: They have to destroy ill-gotten data and the models built with it.
- But while privacy advocates and critics of excessive data collection are praising the concept in theory, in practice it could be anything but simple to implement.
- “Once you delete the algorithm, you delete the learning. But if it’s entangled with other data, it can get complex pretty fast,” said Rana el Kaliouby, a machine-learning scientist and deputy CEO of driver-monitoring AI firm Smart Eye.
- The FTC’s March 3 settlement order against WW, the company formerly known as Weight Watchers, marked the most recent time the agency has demanded a company destroy algorithmic systems.
- But the order provides little detail about how the company must comply or how the FTC will know for sure it did.
The order is “profoundly vague,” said Pam Dixon, executive director of the World Privacy Forum. “We’re not usually talking about a single algorithm. I would like to have seen more in their materials about what it is that is being disgorged specifically.”
- Companies decommission algorithmic models by taking them out of production all the time. In some cases, an algorithm is just a simple piece of code: something that tells a software application how to perform a set of actions.
- If WW used the data it was ordered to delete to build just one machine-learning model used in one particular feature of its app, for example, deleting the code for that feature could be a relatively straightforward process, el Kaliouby said.
- But algorithmic systems using AI and machine or deep learning can involve large models or families of models involving extremely complex logic expressed in code.
- Algorithmic systems used in social media platforms, for example, might incorporate several different intersecting models and data sets all working together.
But it’s not so easy to decouple data from algorithmic systems, in part because data used to train and feed them hardly ever sits in one place.
- Data obtained through deceptive means may end up in a data set that is then sliced and diced into multiple data set “splits,” each used at a different stage of the machine-learning development process: model training, testing and validation, said Anupam Datta, co-founder and chief scientist at TruEra, which provides a platform for explaining and monitoring AI models.
- And once a model has been deployed, it might blend ill-gotten data along with additional information from other sources, such as data ingested through APIs or real-time data streams.
In addition to data blending, data copying adds more layers of complexity to the removal process. Data often is replicated and distributed so it can be accessed or used by multiple people or for multiple purposes.
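The splitting and copying Datta describes can be sketched in a few lines. This is a hypothetical illustration, not any company’s actual pipeline: the names `TAINTED_SOURCE` and `make_splits` are invented, and real ML pipelines do this at far larger scale across many systems.

```python
# Hypothetical sketch: one "tainted" source dataset is sliced into the
# train/test/validation splits used during model development. All names
# here are invented for illustration.

TAINTED_SOURCE = ["record_%d" % i for i in range(10)]

def make_splits(records, train=0.6, test=0.2):
    """Slice one dataset into train, test and validation splits."""
    n = len(records)
    n_train = int(n * train)
    n_test = int(n * test)
    return {
        "train": records[:n_train],
        "test": records[n_train:n_train + n_test],
        "validation": records[n_train + n_test:],
    }

splits = make_splits(TAINTED_SOURCE)

# Every split still contains records descended from the tainted source,
# so deleting the original dataset alone does not remove the data from
# the copies scattered through the pipeline.
assert all(set(s) <= set(TAINTED_SOURCE) for s in splits.values())
```

The point of the sketch is the last line: once the source is split and copied, the ill-gotten records live on in every derived artifact, which is why disgorgement reaches the models as well as the data.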
- Krishnaram Kenthapadi, chief scientist at machine-learning model monitoring company Fiddler, called this problem — deleting algorithmic models built with ill-gotten information — one of data provenance.
- “You want to track all the downstream applications that touched or may have used this data,” he said.
- Many companies do not have processes set up to automatically attach lineage information to data they collect and use in building algorithmic systems, said Kevin Campbell, CEO of Syniti, a company that provides data technologies and services for things like data migration and data quality.
- “If you don’t have a centralized way of capturing that information, you have to have a whole bunch of people chase it down,” said Campbell. “A whole lot of people are going to write a lot of queries.”
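The centralized lineage capture Campbell describes can be sketched as a small parent-tracking registry. The dataset names and the `downstream_of` helper below are hypothetical, and production lineage systems track far more metadata, but the traversal is the core idea: with lineage recorded, finding everything a tainted source touched is one query rather than a manual hunt.

```python
from collections import defaultdict

# Hypothetical lineage registry: each derived artifact records its
# parents. Dataset and model names are invented for illustration.
parents = {
    "raw_signups": [],
    "cleaned_signups": ["raw_signups"],
    "train_split": ["cleaned_signups"],
    "clickstream": [],
    "engagement_model_v3": ["train_split", "clickstream"],
}

def downstream_of(source, lineage):
    """Return every artifact that transitively derives from `source`."""
    children = defaultdict(list)
    for child, ps in lineage.items():
        for p in ps:
            children[p].append(child)
    found, stack = set(), [source]
    while stack:
        node = stack.pop()
        for child in children[node]:
            if child not in found:
                found.add(child)
                stack.append(child)
    return found

print(sorted(downstream_of("raw_signups", parents)))
# → ['cleaned_signups', 'engagement_model_v3', 'train_split']
```

Without a registry like this attached at collection time, reconstructing the same answer after the fact means exactly what Campbell warns about: a lot of people writing a lot of queries.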
As data use and AI become increasingly complex, monitoring for compliance could be difficult for regulators, said el Kaliouby.
- “It’s not impossible,” she said, but “it’s just hard to enforce some of these things, because you have to be a domain expert.”
A MESSAGE FROM DATAIKU

Dataiku is the only AI platform that connects data and doers, enabling anyone to transform data into real business results — from the mundane to the moonshot. Because AI can do so much, but there's no soul in the machine, only in front of it. Without you, it's just data.
Japan’s chip industry damaged by powerful earthquake
A powerful earthquake on Wednesday disrupted semiconductor production at several manufacturing sites in Japan, threatening to further strain an already fragile supply of automotive and memory chips.
The magnitude 7.3 quake disrupted output at memory-maker Kioxia’s plant in Japan, according to industry analyst TrendForce. The K1 factory accounts for about 8% of Kioxia’s production, which was suspended so the company could conduct inspections.
The market for memory chips is a weird business: Since memory is treated as a commodity, the less of it there is, the more expensive it becomes. That means disruptions to the supply can boost revenue and even profit for some companies — as customers bear the brunt of the costs.
Kioxia’s K1 fab, which the company operates as part of a joint venture with Western Digital, suffered material contamination in February. Western Digital said production had returned to normal levels in early March, and it was not immediately clear what effect this week’s earthquake may have.
Most of the other fabs in Japan are operating normally, except for several run by auto-chipmaker Renesas, according to TrendForce. The quake caused manufacturing problems at three of Renesas’ fabs and could further hurt Japan’s vehicle producers. Globally, automakers have already halted production in some cases and asked consumers to buy vehicles without normally standard features.
Enterprise moves
Over the past week, Google and AMD poached talent from rivals AWS and Intel, Luminous named a new president, and both MemryX and Netlify announced a string of leadership changes. Here’s what else is happening with the people of enterprise tech.
Duncan Lennox joined Google as VP of Engineering. Lennox was previously the GM of the elastic file system group at AWS.
Michael Hochberg was named president of Luminous Computing. Hochberg previously founded four companies, including Luxtera and Elenion, which were acquired by Cisco and Nokia respectively.
Mike Burrows joined AMD as corporate vice president. Burrows joined AMD from Intel, where he was CTO and director of the gaming and graphics group.
Mark Dorsi is Netlify’s new CISO. Dorsi most recently led security efforts at HelloSign, which was acquired by Dropbox. The appointment comes alongside three other leadership appointments at Netlify.
La’Naia J. Jones is the CIA’s new CIO. Jones was previously deputy CIO at the National Security Agency and was acting CIO within the Office of the Director of National Intelligence.
Puneet Singh joined MemryX as VP of Engineering. Singh was previously a senior director of technology at Qualcomm, and worked as a design manager at Intel before that.
Roger Peene also joined MemryX, as VP of Product and Business Development. Peene was most recently VP and GM of Datacenter Storage at Micron and previously worked at Intel.
Around the enterprise
GitHub suffered a fairly widespread outage during the East Coast morning hours, which is otherwise known in software development circles as a “snow day.”
Snowflake joined several enterprise tech companies by launching a new version of its core product designed for the health care industry.
Thanks for reading — see you tomorrow!