Government and private-sector organizations want to update data privacy and management approaches.
Dense privacy policies and misleading website cookie notices are legacies of a bygone era. Today, data collection is increasingly ambient, often happening in places where there is no way to post a notice at all. Digital experiences have expanded beyond our phones and web browsers, and data is now collected in virtual and augmented environments, whether through IoT devices on city streets or in our homes.
Enter data intermediaries.
These third parties would serve as links between people and entities collecting their data, or between businesses and their partners. They could take on many forms, from digital agents to fiduciaries representing people and their data privacy choices. Or they might be trusts, facilitating collective data negotiations on behalf of individual people or businesses. Data intermediaries could even make automated decisions using artificial intelligence.
The World Economic Forum published a report on Tuesday giving an overview of these approaches as assessed by a data intermediaries task force. “The task force has had as their mission the idea that we would explore various different ways of looking not just at data privacy, but improving data availability,” said Anne Flanagan, project lead of Data Policy at WEF, during an event presenting the “Advancing Digital Agency: The Power of Data Intermediaries” report.
The goal of the report, Flanagan said, is “to improve rights for people, and simultaneously to de-risk and improve certainty for companies and organizations that need access to data; to build innovation and to fuel the world that we all live in today, where we all do business, everything that we're doing online, particularly in a post-pandemic era.”
An AI data agent
One potentially risky scenario floated in the report: the automated digital agent, which would use AI to decide an individual’s data-permission preferences. The goal of this approach would be to remove the burden of having to make case-by-case data decisions by replacing humans with AI that learns from previous data choices and then takes over to automate future decisions.
However, algorithmic models can make flawed decisions if they are trained with inaccurate, incomplete or biased data sets. They are also at risk of drifting from their original state once deployed, and making decisions in ways that were not intended.
“The key to success here lies in the quality of the automated decision-making and the underlying algorithm,” noted the report.
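The report does not specify how such an agent would work, but the idea it describes, learning from a person's past data-sharing choices and automating future ones, can be sketched in a few lines. The design below is purely illustrative and hypothetical: a majority vote over past decisions stands in for whatever learned model a real agent would use, and the agent defers to the human when its evidence is thin.

```python
# Hypothetical sketch of an automated consent agent (not from the WEF report):
# it learns from a person's past data-sharing decisions and automates future
# ones, deferring to the user when it has too little evidence.
from collections import Counter
from typing import NamedTuple, Optional

class DataRequest(NamedTuple):
    category: str   # e.g. "location", "health", "browsing"
    purpose: str    # e.g. "ads", "research", "service"

class ConsentAgent:
    def __init__(self, min_evidence: int = 3):
        self.history: list[tuple[DataRequest, bool]] = []
        self.min_evidence = min_evidence  # below this, ask the human instead

    def record(self, request: DataRequest, allowed: bool) -> None:
        """Store a decision the person made themselves."""
        self.history.append((request, allowed))

    def decide(self, request: DataRequest) -> Optional[bool]:
        """Return True/False to automate, or None to escalate to the user.

        Majority vote over past decisions in the same data category is a
        stand-in for a learned model. Note the risk the report flags: a
        skewed or stale history produces skewed automated decisions.
        """
        votes = Counter(
            allowed for past, allowed in self.history
            if past.category == request.category
        )
        total = sum(votes.values())
        if total < self.min_evidence:
            return None  # not enough evidence: hand the choice back to the person
        return votes[True] > votes[False]
```

Even this toy version shows where the report's warning bites: the agent's output is only as good as the decision history it was trained on, and if the person's preferences shift after deployment, the agent keeps answering from a record that no longer reflects them.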
The WEF report follows similar work from the U.N., EU and U.K. In general, all these groups are in evaluation mode, assessing a variety of models for enabling secure and trustworthy data connections and sharing as the digital landscape evolves.
The U.K. Department for Digital, Culture, Media and Sport published a paper last year focused on the role of data intermediaries such as data trusts, nonprofit or commercial data exchanges and co-ops for purposes such as medical data sharing.
In November, the EU passed the Data Governance Act, which established a framework for data sharing among companies as well as data management and consent controls for people via data intermediation services like data wallets.
Questioning for-profit data intermediaries
Despite a growing body of research, few practical data intermediary approaches are in place thus far. But people like Richard Whitt, former Google corporate director of Strategic Initiatives, are ready to take a stab at steering business investment and early development toward updated approaches to data stewardship. Whitt, currently president of GLIA Foundation and a member of the task force that spoke at the event, is also founder and CEO of Deeper Edge, a “proof-of-concept” company developing digital fiduciary services.
Digital fiduciaries might resemble doctors, lawyers or financial advisers, professions that are licensed and abide by standard codes of conduct. While digital fiduciaries might “self-certify” by incorporating guarantees in their terms of service or privacy policies, said Whitt during the panel discussion, “those promises may amount to little” without industry standards or uniform codes of conduct that emerge through competition.
“Over time, these collaborations could turn into an actual profession,” said Whitt.
Some privacy advocates — including FTC Chairwoman Lina Khan — have opposed the data fiduciary concept, particularly in relation to digital platforms. Khan argued in a 2019 Harvard Law Review article that the fiduciary framework is not an apt response to “problems associated with outsized market share and business models built on pervasive surveillance” and rather “invites an enervating complacency toward online platforms’ structural power and a premature abandonment of more robust visions of public regulation.”
The WEF, whose task force included academics and representatives of companies including Mastercard and Facebook parent Meta, briefly questioned for-profit and private data intermediary models in its report. Risks and conflicts of interest can emerge even in public sector or nonprofit models, Flanagan suggested, adding, “It’s a more baseline regulatory question around how do we build regulation and what types of checks and balances are put in place.”
However, she continued, “It is likely that a for-profit intermediary might need to have additional checks and balances to prevent corruption. But regulatory capture, or capture by other entities, is possible across all types of regulatory models.”