
What are the most important elements for building and maintaining trust in the digital ecosystem, and why is that trust necessary?
Trusted Future is a non-profit organization dedicated to the belief that we need smarter, better-informed efforts to enhance trust in today's digital ecosystem in order to expand opportunities for tomorrow.
Adam Golodner

Co-Chair, Trusted Future
The most important element for building trust in the digital ecosystem is for producers of products and services to dedicate themselves to infusing trust throughout the lifecycle of those products and services. Only with trust can we maintain a global information infrastructure and realize the full benefits of technology into the future. That means building trust into software and hardware development, the supply chain, product security and incident response, threat modeling, certification, vulnerability management, the treatment of customer data, corporate governance, relations with governments globally and security information sharing with others in the ecosystem, all underpinned by a systems engineering approach to trust management. This holistic approach is the cornerstone of trust going forward.
To get there, we are creating the first holistic Trust Framework that existing and emerging technology producers and users can use to instantiate trust and answer a simple question: should I trust this product or service in my infrastructure, or with my data? As producers adopt this Trust Framework and users demand it, the trust needle will move.
Jim Kohlenberger

Co-Chair, Trusted Future
For those who are designing, developing or deploying cutting-edge technologies, trust has now emerged as a central factor in determining whether, and how fast, transformative new technologies will be adopted in the marketplace. We’ve all seen how technology has transformed the way we work, live and learn. And just over the horizon, a new set of breakthroughs offers even more potential.
Innovators across the country are unlocking new technological frontiers using AI, 5G, IoT and the cloud to create opportunities never before possible that fundamentally expand our ability to solve important problems: technologies that can improve health outcomes, cut greenhouse gases and make factories more competitive. But we risk squandering or even delaying these opportunities when people lack the foundational trust necessary to take full advantage of their potential. For example, a small factory owner may not adopt smart manufacturing technologies to improve the business if they don’t trust those technologies to protect them from factory-idling ransomware or a breach of customer data. It is more important than ever that we infuse strong privacy, robust security and inclusive design into our technologies from the start, so that we can trust that the technologies of tomorrow will be even better than today’s.
Edward “Smitty” Smith

Chair of DLA Piper’s U.S. Regulatory and Government Affairs Practice
Trust, by its nature, is a function of human expectations about the reliability of another party (or parties) in the context of some relationship. For our purposes, building trust in the digital ecosystem requires consideration of the existing relationships that parties in that ecosystem have with each other.
In the United States, the trust relationships between many public and private institutions that are prominent participants in the digital ecosystem (from government agencies to tech and telecom companies) and communities of color are notoriously weak.
These relationships are frequently strained by deep-seated and historically reinforced suspicion, trauma and fear of exploitation or persecution. For example, survey data shows that many in communities of color harbor particularly strong distrust of the government when it comes to their personal information. This data also shows stark differences in community perceptions of how much control individuals have over personal data, privacy risks and concerns about information disclosure.
In order to foster greater trust in the digital ecosystem, we must approach our task from a culturally and historically informed perspective. This approach is essential to effectively communicate to all consumers how holders of data are managing the security and privacy of their data, to best develop and implement policy and to understand the efficacy of those policies.
Daniel Weitzner

3Com Founders Principal Research Scientist at the MIT Computer Science and Artificial Intelligence Lab, Founding Director of the MIT Internet Policy Research Initiative and an adviser to Trusted Future
Trust depends on knowing that those who hold data about us are managing the security and privacy of that data. Yet you can’t manage what you don’t measure. We’ve made real progress on digital privacy and security: we know more about how to build more-secure systems, many organizations have chief privacy officers and chief security officers, and new laws like the GDPR demand that companies pay more attention to privacy.
But we’re still in the Stone Age when it comes to our ability to actually verify that systems are worthy of public (or regulatory) trust.
We’re measuring all the wrong things. For example, in cybersecurity, we count the number of threats detected and averted, the number of systems with up-to-date software and the number of potential vulnerabilities visible to adversaries. But that only tells us how much effort we are putting in, not whether the billions of dollars we spend on cybersecurity are pointed in the right direction. What we really should care about is which potential risks are likely to cause actual losses. The only way to do this is to also collect data about actual losses from among the billions of otherwise harmless attacks. It’s understandable that firms don’t want to share sensitive (and embarrassing) information, but without it we are flying blind. In my research group at MIT, we’re closing this gap with a tool called SCRAM that allows us to collect data on security losses privately and then analyze which specific failures caused those losses. That will allow firms to make better investment decisions, and regulators to craft more sensible, efficient cybersecurity rules. The result will be systems that we can all trust.
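To make the idea of collecting loss data privately more concrete, the sketch below shows one generic way such aggregation can work, using additive secret sharing so that a combined loss total is computed without any firm disclosing its own figure. The firm names, loss amounts and function names are hypothetical illustrations, and this is a simplified stand-in for the general technique rather than the actual SCRAM protocol.

```python
import secrets

# Illustrative sketch: additive secret sharing for privacy-preserving
# aggregation of cyber-loss figures. Firm names and loss values are
# hypothetical; this is NOT the SCRAM protocol, only the general idea.

MODULUS = 2**61 - 1  # all arithmetic is done modulo a large prime


def share(value: int, n_parties: int) -> list[int]:
    """Split a private value into n random shares that sum to it mod MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    last = (value - sum(shares)) % MODULUS
    return shares + [last]


def aggregate(all_shares: list[list[int]]) -> int:
    """Each party sums the shares it received; the partial sums combine to the total."""
    n_parties = len(all_shares[0])
    partial_sums = [
        sum(firm_shares[p] for firm_shares in all_shares) % MODULUS
        for p in range(n_parties)
    ]
    return sum(partial_sums) % MODULUS


# Hypothetical annual ransomware losses (in dollars) reported by three firms.
losses = {"firm_a": 250_000, "firm_b": 40_000, "firm_c": 1_100_000}

shared = [share(v, n_parties=3) for v in losses.values()]
total = aggregate(shared)
print(f"Aggregate loss across firms: ${total:,}")  # 1,390,000
```

Because each firm hands out only random-looking shares, no single recipient learns an individual firm’s loss, yet the shares still combine to the true aggregate that analysts and regulators need.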
Maureen K. Ohlhausen

Chair, Baker Botts’ Global Antitrust and Competition Practice
Strong privacy protections that safeguard consumers’ personal information are essential to building trust so that consumers may feel confident enjoying the many benefits of innovative products offered in today’s dynamic marketplace. Consumers need to be able to trust that their sensitive personal information, such as health and financial information, real-time precise geo-location information, Social Security numbers and children’s information, will not be used or disclosed in ways that could result in harm. It’s also vital that businesses honor the privacy commitments they make to the public. A false promise to provide certain privacy or data security protections undermines consumer choice about whether to use a product or service, and erodes consumers’ trust in the ability of businesses to protect their information. Without this trust, many consumers may be less willing to share their data or participate in the benefits of the Internet economy.