Politics

New DNC data tool aims to give an edge to campaigns light on tech expertise

Called Blueprint, the new tool will help down-ballot campaigns figure out which voters to target, the latest step in a multi-year push to fix the Democrats' data operation.

The DNC's new Blueprint tool will help campaigns more easily target voters with ads and other outreach.

Photo: Bloomberg/Getty Images

In its latest effort to update its creaky digital systems, the Democratic National Committee is launching a new tool that will help campaigns and state parties across the country — particularly those with little tech expertise — tap into a trove of voter data to better target their messages as they work to wrangle seats at every level of government.

The DNC is touting Blueprint, a tool to help Democratic campaigns sift through data on hundreds of millions of people across the U.S., as a step toward shoring up the Democratic Party's digital capabilities at a time when campaigning in person is impossible. It will help campaigns decide who to call, text, email and target with digital advertisements during this contentious and unprecedented election year and beyond.

"We're taking this rich set of data we have about voters, and we're making it really easy for campaigns to use that data," the DNC's chief technology officer Nell Thomas told Protocol in a phone interview. Now, rather than having to comb through the party's voter database themselves, campaigns at every level will have access to ready-made tables on voters' addresses, ethnicities, and voting history, among other data points, in a given area.

Thomas said Blueprint is a boon especially for down-ballot candidates, who often don't have the luxury of an in-house tech team focused on using data to mobilize prospective voters and volunteers. "Especially at smaller campaigns … the data person is also the IT person and is also the digital person," Thomas said.

"This gives those building blocks to users right away so they can spend less time doing that data-cleaning work and more time doing what really matters to campaigns, which is reaching voters," she said.






Blueprint is part of a multi-year effort to modernize the party's data infrastructure, which was built back in 2011 for President Obama's reelection campaign. By 2016, that system, known as Vertica, became overloaded, and its near-daily breakdowns proved to be a frequent source of frustration for the staffs of Hillary Clinton's campaign and other Democratic campaigns. Meanwhile, the Republican Party had spent the years since the 2012 race building itself a brand-new data operation called Data Trust, which allowed outside groups and the party to exchange information they were collecting on voters.

In the wake of the 2016 loss, the DNC resolved to catch up to the Republicans' much-touted data operation that helped deliver the White House to President Trump, launching a range of new projects designed to improve upon its data infrastructure.

As part of that, the DNC and state parties agreed to create a new, outside entity called the Democratic Data Exchange, which would mirror the Republican Data Trust, allowing for data swapping between campaigns and outside groups. Last year, the party also announced it was deprecating Vertica and replacing it with a new cloud-based data warehouse called Phoenix, which is based on Google's BigQuery technology. That, the DNC's tech team said at the time, would at least provide campaigns with a more stable system that wouldn't crash under the pressure of so many campaigns using it at once.

And yet, even then, some worried that the new data warehouse would still require a data analyst with SQL coding skills to operate. Blueprint is an effort to solve that problem. The committee's tech and data teams spent months cleaning up the existing data in its warehouse. They looked for places where there were gaps — like, say, where voters' addresses were missing — and worked to fill them. It was an intensely manual project, Thomas said, but one that she hopes will save time-strapped campaigns down the line. Now, rather than doing that cleanup work themselves, campaigns can browse the Blueprint tables and use that information to drive voter outreach.
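As a rough illustration of the gap-filling work described above, a cleanup pass might backfill a missing field from a secondary source. The field names, fallback logic, and data here are invented for this sketch; the DNC has not published Blueprint's actual schema or cleaning rules.

```python
# Hypothetical sketch of the gap-filling step: backfill missing
# addresses from a secondary record set keyed by voter ID.
# All field names and data are invented for illustration.

def fill_missing_addresses(voters, county_records):
    """Return a copy of the voter rows with missing addresses
    filled in from county-level records where available."""
    filled = []
    for voter in voters:
        if not voter.get("address"):
            fallback = county_records.get(voter["voter_id"], {})
            voter = {**voter, "address": fallback.get("address")}
        filled.append(voter)
    return filled

voters = [
    {"voter_id": "TX001", "address": "12 Oak St"},
    {"voter_id": "TX002", "address": None},
]
county_records = {"TX002": {"address": "44 Elm Ave"}}

cleaned = fill_missing_addresses(voters, county_records)
```

In practice this kind of reconciliation ran across many sources at once, which is why Thomas described it as intensely manual.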

The DNC has been piloting Blueprint in a number of states over the past several months, including Texas.

Last month, the Texas Democratic Party introduced a new model that scores every Texas voter on a scale of 1 to 100 according to how likely they are to vote for a Democratic candidate. Lauren Pully, the data and analytics director for the Texas Democratic Party, said her team used Blueprint to help expand that model and make it more effective.

This so-called "partisanship model" helps the state party more efficiently identify and target voters who might be inclined to help turn red areas blue. And Pully said Texas has used Blueprint's data to make its scoring system "even smarter," pointing out that the DNC's tool allowed the state party to access demographic and consumer data that it didn't already have. That includes information on a voter's ethnicity, neighborhood and income.
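A minimal sketch of how a campaign might act on such a 1-to-100 partisanship score is below. The score range comes from the article; the cutoffs, field names, and the idea of splitting voters into turnout versus persuasion targets are assumptions for illustration, not the Texas party's actual logic.

```python
# Hypothetical sketch: using a 1-100 partisanship score to pick
# outreach targets. The thresholds below are invented; the Texas
# Democratic Party has not published its targeting rules.

def persuasion_targets(voters, low=40, high=60):
    """Return voters in the middle of the partisanship range --
    those worth persuading rather than simply turning out."""
    return [v for v in voters if low <= v["score"] <= high]

voters = [
    {"name": "A", "score": 85},  # likely Democrat: turnout target
    {"name": "B", "score": 52},  # middle of the range: persuadable
    {"name": "C", "score": 15},  # likely Republican: deprioritized
]

targets = persuasion_targets(voters)
```

The value of the extra demographic and consumer data Pully described is that it feeds the scoring itself, making the inputs to a filter like this more accurate.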

Before Blueprint, Pully said, campaigns lost significant time processing data from dozens of different sources. Now, she said, "not every campaign that's going to use that data needs a data scientist to do it."

"This is one of many things that's being done to really modernize the infrastructure that all of the Democratic campaigns have available to them," Pully said. "Doing this at a national level means I don't have to do this in Texas while Florida does the same thing while Arizona does the same thing."

Given the party's calamitous security breach in 2016, Thomas said the DNC is taking extra security precautions to protect this data. That's one reason why the DNC chose Google's BigQuery technology. "We feel confident in Google's security program," she said. The party requires users to implement two-factor authentication before accessing the data warehouse. Users also sign agreements to abide by guidelines that prevent them from using the voter data for purposes other than campaigning.






"We believe that privacy and security are just as critical to being good custodians of the data as making the data high-quality," Thomas said.

The DNC is working on other data projects, including its own "partisanship model" that predicts the likelihood that someone is a Democrat or a Republican, and a system that more accurately tracks voters who have moved to different states.

Ultimately, it's unclear whether any of this will help Democrats compete with Republicans, who have, after all, been building a data program since the 2012 election. They also benefit from having a sitting president in the White House whose 2020 re-election campaign began as soon as he was inaugurated and whose massive rallies over the last four years have been data gold mines for Republicans.

But whatever happens this November, it's clear the DNC's investment in data will be critical in the years to come.

Correction: This story originally misidentified the Texas Democratic Party's data and analytics director as Laura Pully. Her name is Lauren.
