Twitter's global data protection officer on trying to build a privacy-conscious culture
Damien Kieran has one of the hardest jobs in tech.
When Damien Kieran joined Twitter's legal team in 2016, he was set to handle some of the company's litigation and regulatory affairs. He took the job, after a stint at Google and instead of an offer at Facebook, because he thought he could have more impact at Twitter. "When I left Google, the legal team was over 1,000 people already," he says. "Twitter's was like 35."
Not long after Kieran started, a new job emerged. With GDPR looming, Twitter — like every other tech company — had to figure out how to operate in a new era of privacy regulation. Kieran, who is French and Irish and started his law career in Dublin, had a history of working with European regulators. So in early 2018, he was named Twitter's first-ever global data protection officer, reporting directly to Twitter's board of directors.
This interview has been condensed and edited for clarity.
I would think it would be complicated for a company like Twitter to make the internal declaration that "privacy matters to us" in such a way that there's a team with a person in charge of it, and then have to integrate that into how Twitter works every day. Was there a phase where you're the enemy of every other team in Twitter, who feel like you're just going to hamstring them?
Privacy is a really diverse and polarizing thing. And we have people who work at Twitter from all parts of the world, with all kinds of cultural and educational experiences. They view privacy in different ways. So there's obviously a learning curve for people in different respects to figure out how Twitter does things. And we're constantly evolving: Building out a privacy review process so that all of our products get reviewed before they get shipped — for some teams, that was a new thing. For other teams, it was something they'd long done. So it's just finding that balance between each of these.
Twitter is not a historically top-down company. It's pretty flat, and people are encouraged to be innovative and to think holistically about how our product will interact and do things. The challenge with that, though, from a privacy and data protection perspective, is that whether we like it or not, a component of this is compliance. My team's goal is to make sure that we're doing the things to protect people.
How do you do that without being heavy-handed? Or being the downer in the room every time, sitting there saying, "No, we can't do that, no, we can't do that"?
First of all, every new hire gets privacy and data protection training. Every new hire that comes through the door — because everybody goes to San Francisco for their onboarding week, and as part of that, they get PDP training.
And then what we did in December 2018 was announce and launch the Doves program. The Doves are an internal group of stewards. Any employee on any team can opt into it, and they get two days of deeper data privacy, data security, and data management training. After that, they meet every two weeks to discuss privacy and data-protection topics, both within Twitter and outside of it. And then we bring in outside speakers. The goal is to make these things interesting and informative, but also to keep the topics at the top of people's minds.
The goal is to weave this into the fabric of the way we think about things, so it's not a compliance exercise. If you're a product manager, you're just going to be naturally thinking about these things. But when you also go back into your team as a Dove, you're a recognized sort of expert.
I don't want people to come to the lawyers. If they're coming to the lawyers, we failed, because it is then a compliance exercise, right? Instead of just giving people the tools, making sure that we're there to support them, but ultimately empowering them to do the right thing.
So what does the training look like?
It's over two days, spread out — several sessions in the morning and in the afternoon. It starts with very basic stuff, and then it builds up. Some of it is talking about just philosophically how we view the world. Then we talk about technical debt: the long-term projects we've undertaken to resolve technical debt we've discovered over the years, how we're going to build products going forward, and the steps we have in place to make sure that people are doing that the right way — how they can get support and resources to think through the problems.
And then there are more tactical things: training around how we manage data and how people need to be aware of how they manage data, plus a variety of tactical topics like privacy review and our data map — who to contact for specific questions related to those things. And then we encourage debate around all of those topics.
Is there, like, a big final exam at the end?
No. We don't want it to be a test. But what we do offer is anybody who becomes a Dove can also go and do one of the external certifications for things like IAPP. It will be paid for, and then they can get their accreditation.
The goal with that is twofold. One is to encourage people to get that additional externally validated training. The other thing that we did was we built the responsibilities that are in the Doves' remit into all of the career ladders.
So it doesn't say "if you're a Dove." But if you're really good at data mapping and data hygiene, and you're developing your line of products with PDP in mind, those things are taken into account in your promotion cycles and your year-end evaluations. So we're trying to incentivize these things as opposed to making it a stick.
If GDPR was the forcing function that started a lot of this work, and now there's a team and a strategy, what happens now when something like CCPA happens? Is there a meeting where you read aloud the whole bill together?
Yes, we stand and do a dramatic reading. We dress up in old wigs. No, so, candidly, most of the work that we did for GDPR prepared us for CCPA. So tactically there wasn't a ton to do. There was some stuff around the edges.
Was that on purpose? It sounds like it was your assumption all along that whatever else is going to happen is likely going to follow sort of the broad strokes of GDPR.
Do you think that'll keep being true? What if we get to a point where 50 states and 85 countries all have different privacy laws? Are you going to just have to follow the strictest one and hope it trickles down?
Yes and no. So it is obviously something that Twitter's very mindful of, because we can't effectively provide a single service to the world if we have to comply with 50 U.S. state laws, a Brazilian version of GDPR, an Indian version, and the European original. After Brexit, what happens to the UK?
We absolutely support strong and robust privacy laws. What we don't want is things that are in conflict and that make it very hard for us to provide that same privacy experience irrespective of where you go.
So some of my time and my team's resources are spent going to talk to legislators as they consider changes to their national and state-level laws.
It's a little early to say how it will work out. For now, we continue to try and provide that same experience. I don't want to say that we go to the lowest or highest common denominator, but to some extent that is what necessarily happens.
But I'm optimistic about it. I truly believe that the services that will flourish are those that are offering the best privacy and data protection they can provide to people irrespective of where they live. And that's why when we talk about these things internally, we don't talk about GDPR, we don't talk about CCPA. The lawyers do, but what we talk about internally is PDP. It is privacy and data protection. It's about protecting the people that use our service.
And the goal was to, as simply as we could, distill down the rights and obligations that both sides have with respect to the data we collect, how it's used, and when it's shared. That was the goal: those three things.
The tension we ran into was that certain things mean different things to different people, and no matter how often we change the wording, we're confronted by that ambiguity. So we started to look at things like graphics and pop-ups within the policy to help people understand visually what's going on.
I want to make sure that if people decide to sit down and do it, it's as good as it can be.
But we still can't get anyone to read it.