Having senior executives active in the process or articulating the value of data can help, while expecting perfection or weaponizing data can work against an organization, members of the Braintrust say.
Chief Strategy Officer at DataStax
The least effective protocol is to attack this as a technical problem. Large organizations are slow to change, and a new SDK, API or library is not going to enable new behavior. A few motivated developers may pick them up and build apps that become useful and popular, but a bottom-up approach doesn't change people's rewards, and therefore doesn't change behavior at scale.
The most effective protocol is to recruit the CEO as the change agent for becoming great at using data. I saw this at Autodesk in 2018 when I worked as part of Andrew Anagnost's change strategy to move the company from desktop software to data services, and I see it in JPMorgan Chase as Jamie Dimon presents publicly on the business value of data and his mandate to all business managers to make AI part of their business plan for the next fiscal year.
The CEO-led approach must be paired with effective, usable technology, or it will be an empty mandate that generates cynicism. It must also carry a robust set of measurable goals and rewards: Promotions for technical and business leaders must include a "data change leadership" flag. This sends the message that good or great performance by the old standards is no longer enough, and that the bar is moving.
This can create the necessary conditions for employees to self-organize into guilds or communities of practice, encourage each other, and shift the culture to continuously evolve towards higher competency with data.
President at VMware
The most effective way to help organizations get better at using data is to focus on three key areas: relevancy, accessibility and a single source of truth.
One of the problems with data is that there is often a lot of it to sort through. This is where relevancy comes in: it is critical to identify metrics that are relevant to the decision-making process, so organizations are looking at the right measurements when making a decision. This applies to both results and early indicators.
For any data strategy to be effective, data needs to be easily accessible across the organization. If, for example, every report requires an analyst to pull it, the process is not operationally viable, because complexity has been inserted into it.
Finally, there needs to be a single source of truth. It is paramount for organizations to ensure that everyone is working off a single, trusted version of the data.
As far as ineffective strategies go, organizations need to accept that on some level data will never be perfect. Waiting for perfection only impedes progress, so there is a need to be comfortable making decisions with the data you have rather than waiting for what's missing. And while metrics are important, as stated above, it is possible to have too many; it is critical to take the time to define which metrics really matter rather than sifting through every available data point.
Global Chief AI Officer at IBM
Too many data and AI projects fail because their creators rush into the technical implementation of solutions without first clearly defining what real-world problems they are trying to solve and what success looks like for their business. That's a losing strategy.
Using data properly means aligning the business strategy to the data and AI strategy to provide solutions that solve real human problems. A successful data strategy-setting process should:
- Set intent: Discover the intent by uncovering the targeted data-driven business opportunities and the unanswered questions that are critical to achieving the objectives.
- Identify: Define the use cases, selecting the types of data and solutions needed by the users.
- Evaluate: Take a critical look at the data sources needed to implement DataOps.
- Plan: Set concrete actions by using statements of intent as a guide for the technical implementation.
Chief Technology Officer at Talend
Data management must change. It must move beyond storage to focus on reliability and integrity. Talend believes data quality is vital to a successful enterprise data strategy, and the analogy it draws is health care. What does it take to build a protocol based on data health?
- Proactive inoculation. Vaccines teach the body to recognize and fight a pathogen before infection begins. For data infrastructure, machine learning can play a similar role: trained on what healthy data looks like, it flags anomalies before they spread.
- Regular monitoring. In the data world we use terms like assessment or profiling, but the practice is analogous to patient monitoring: regular checkups that catch problems early.
- Identification of risk factors. Some risks are endogenous, such as the company's applications, processes and employees, while others (partners, suppliers, customers) come from the outside. By recognizing the areas that present the most risk, we can more effectively prevent dangers before they arise.
- Prevention programs. Good data hygiene requires good data practices and disciplines. The Talend Trust Score™ lets us assess and control the intake of data, producing information that is easier to understand and harder to ignore.
- Efficient treatments. Data quality checks can introduce extra steps into the process and slow things down. Talend masters this balancing act.
- Protocols for continuous prognosis. Doctors prescribe the right therapy when they know what to treat. Prognosis is a model that requires constant revision and improvement. Data health is a continuously improving model too.
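The monitoring and profiling steps above can be sketched in a few lines. This is a hypothetical illustration of profiling-style health checks, not Talend's actual Trust Score; the column name, sample values and scoring formula are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class ColumnProfile:
    name: str
    null_rate: float      # fraction of missing/empty values
    distinct_rate: float  # fraction of distinct non-null values

def profile_column(name, values):
    """Compute basic health metrics for one column of records."""
    total = len(values)
    nulls = sum(1 for v in values if v in (None, ""))
    distinct = len({v for v in values if v not in (None, "")})
    return ColumnProfile(name, nulls / total, distinct / total)

def trust_score(profile):
    """Toy 0-100 score based on completeness alone (illustrative only)."""
    return round(100 * (1 - profile.null_rate))

# Sample "patient monitoring" run over a column with some missing values.
emails = ["a@x.com", None, "b@x.com", "c@x.com", None,
          "d@x.com", "e@x.com", "", "f@x.com", "g@x.com"]
p = profile_column("email", emails)
print(p.null_rate, trust_score(p))  # 0.3 70
```

A real data-health pipeline would run checks like these on a schedule and alert when a score drops, the same way a monitor alarms on a patient's vital signs.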
Data health is Talend's vision for a holistic system of preventative measures, effective treatments and a supportive culture necessary to ensure that the data fueling decisions and direction is sound.
Wendy M. Pfeiffer
Chief Information Officer at Nutanix
Value of Data: Data is the fuel that empowers digital transformation, but many organizations limit the collection and use of operational data because they do not have a focus on data monetization. Once an organization has articulated the operational and competitive value of its data, data practices can be purpose-built to support those values, and costs can be justified. Instrumenting a business's core operating model to ensure it is collecting, analyzing and retaining operational and performance data ensures that this fuel is available at capacity to energize the company's path forward.
Disintermediation of Data: In many organizations, data resides in the hands of just a few — the data scientists and executives — who seemingly "weaponize" data to influence employee and customer actions. Disintermediating this process by incorporating modern NLP technologies within core data service environments enables every team member to interact directly and uniquely with operational and market data. This direct and customized access optimally utilizes the organization's rich operational data to inform and refine the myriad atomic human and autonomous activities upon which companies rely.
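The idea of NLP-mediated access can be sketched as follows. This is a toy illustration: simple keyword matching stands in for a real language model, and the dataset and question forms are invented for the example.

```python
# Toy illustration of NLP-style self-service data access: plain-English
# questions are mapped onto aggregations over operational data.
# A production system would use a real NLP model, not keyword matching.
records = [
    {"region": "east", "revenue": 120},
    {"region": "west", "revenue": 95},
    {"region": "east", "revenue": 80},
]

def answer(question):
    q = question.lower()
    regions = {r["region"] for r in records}
    mentioned = {reg for reg in regions if reg in q}
    rows = [r for r in records if not mentioned or r["region"] in mentioned]
    vals = [r["revenue"] for r in rows]
    if "total" in q or "sum" in q:
        return sum(vals)
    if "average" in q or "mean" in q:
        return sum(vals) / len(vals)
    if "highest" in q or "max" in q:
        return max(vals)
    return None

print(answer("What is the total revenue in the east?"))  # 200
print(answer("Highest revenue?"))                        # 120
```

Even in this toy form, the point is visible: no analyst sits between the question and the data, so every team member can query it directly.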
CEO at Informatica
I firmly believe data is the soul of digital transformation. You first need to create a centralized repository of all data that exists throughout the organization. With so much data coming from disparate sources, it's important to look for a solution that can automate data cataloging tasks. Second, by applying AI and machine learning to automatically discover what may be at risk and where hidden potential lies, you'll drive greater use of the data for analytical and operational processes. Lastly, it's imperative that both business and technical stakeholders are aligned to ensure you can achieve your objectives. CDOs, CISOs, risk and compliance officers and their teams need to understand the data and prioritize business agendas. If you don't align on key drivers for your data management, then you won't agree on how to empower value creation across the business.
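Automated cataloging of the kind described above can be sketched in miniature. This is a hypothetical example, not Informatica's product: it infers a simple schema for each dataset by inspecting sample values and registers the result in a central catalog that downstream tools could query. The dataset and field names are invented.

```python
# Minimal sketch of automated data cataloging: infer column types from
# sample values and register each dataset's metadata in one central place.
def infer_type(values):
    non_null = [v for v in values if v is not None]
    if all(isinstance(v, bool) for v in non_null):
        return "boolean"
    if all(isinstance(v, (int, float)) and not isinstance(v, bool)
           for v in non_null):
        return "number"
    return "string"

def catalog_dataset(name, rows):
    columns = {}
    for row in rows:
        for col, val in row.items():
            columns.setdefault(col, []).append(val)
    return {
        "dataset": name,
        "row_count": len(rows),
        "columns": {col: infer_type(vals) for col, vals in columns.items()},
    }

catalog = {}  # the centralized repository of metadata
orders = [{"id": 1, "amount": 9.5, "rush": True},
          {"id": 2, "amount": None, "rush": False}]
catalog["orders"] = catalog_dataset("orders", orders)
```

At enterprise scale the same pattern runs as a crawler over databases, files and APIs, which is why automation matters: no team could maintain such a catalog by hand.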
By democratizing data and empowering more users with trusted, timely and actionable data insights at scale, organizations can digitally transform business initiatives from customer experience, to ecommerce, to financial transformation, etc.
Organizations fall short when they use solutions that work for only one cloud, only on-premises, or not at all in a hybrid environment. Another issue is tying data governance programs solely to security and compliance requirements rather than actual value creation. By aligning to revenue-generating initiatives — such as enhancing customer loyalty or democratizing data analytics — it's easier to demonstrate the value of increased transparency and reduced risks in order to accelerate data-driven digital transformation programs.
Kevin McAllister ( @k__mcallister) is a Research Editor at Protocol, leading the development of Braintrust. Prior to joining the team, he was a rankings data reporter at The Wall Street Journal, where he oversaw structured data projects for the Journal's strategy team.