Overturning Roe could change how digital advertisers use location data. Can they regulate themselves?

Over the years, the digital ad industry has been resistant to restrictions on the use of location data. But that may be changing.


Illustration: Christopher T. Fong/Protocol

When the Supreme Court overturned Roe v. Wade on Friday, the possibility that location data could be used against people suddenly shifted from a mostly hypothetical scenario to a realistic threat. Although location data serves a variety of purposes — from helping municipalities assess how people move around cities to giving reliable driving directions — it’s the voracious appetite of digital advertisers for location information that has fueled the creation and growth of a sector selling data showing who visited specific points on the map, when, what places they came from and where they went afterward.

The overturning of Roe not only puts the wide availability of location data for advertising in the spotlight; it could also serve as a turning point, compelling the digital ad industry to limit data associated with sensitive places before the government does.

Friday’s Supreme Court decision has heightened the significance of risks associated with location data reflecting sensitive places, said Grace Briscoe, senior vice president of Client Development at digital ad company Basis Technologies.

“It wasn't as sensitive a month ago as it suddenly is now,” Briscoe said. “It appears that the industry organizations have been slow to act, and I do think there's opportunity for them to tighten the location data policies, and prohibiting collection of certain types of sensitive location data and putting some consistent industry-wide guidance around that would be really valuable.”

There is moderate movement in this direction already. The Network Advertising Initiative, a digital ad industry trade group, last week announced a new set of voluntary guidelines prohibiting member companies that adopt them from using, selling or sharing any information about device or user activity associated with locations deemed sensitive, such as fertility or abortion clinics, mental health treatment facilities, places of religious worship, correctional facilities, addiction treatment centers, immigration centers, military bases or payday loan institutions.

The NAI wants to preempt an outright ban on location data, said David LeDuc, vice president of Public Policy at the NAI. “The notion that all the data should just be eliminated — I just don’t think it passes the test. It’s really fair how concerned people are, but we just don’t think throwing all the data away is where we should end up,” he said.

Pharmaceutical makers or other advertisers do use location data associated with health care facilities, Briscoe said. “Finding people who are at the hospital every day — that is actually a really good way to identify people who work in health care or frontline workers,” she said. Briscoe said she was not aware of industry-imposed restrictions on use of location data associated with health or medical facilities, although there are some limits on location data reflecting school locations or places of worship.

The NAI’s standards did not come together in response to last week’s Supreme Court decision, of course. They had been in the works for months, propelled in part by news last July that mobile app records obtained by Catholic news outlet The Pillar showed a Wisconsin priest visited gay bars and private residences while using location-based hookup app Grindr. The publication’s investigation led to his resignation.

“We as an industry need to take proactive steps to make sure we are being more responsible and raising the bar and preventing this type of data leakage,” said LeDuc, noting that the NAI wanted to help prevent bounty hunters or law enforcement from obtaining data associated with sensitive locations that could be used to out or penalize people.

“We recognize that self-regulation hasn’t solved all the problems and we can do more and we need to do more — and that’s where this came from,” LeDuc said regarding the voluntary standards.

But there are gaps in this self-regulatory regime.

Only three member companies have publicly agreed to implement the principles, including Foursquare. Google, an NAI member, did not sign on publicly to adopt the standards. After a draft Supreme Court opinion presaging the eventual overturning of Roe v. Wade was leaked in May, legislators wrote to Google urging the company to “stop unnecessarily collecting and retaining customer location data, to prevent that information from being used by right-wing prosecutors to identify people who have obtained abortions.” Google declined to comment for this story.

The NAI does not categorize Google as a precise location information solution provider, the narrow category the group carved out for applying the new standard. “We didn’t seek to include Google because it’s a different business model,” said LeDuc. He added that “it’s fair” to ask why Google or other companies outside the narrow location provider category have not adopted the standards.

Other NAI member companies that sell or use location data, including Place Exchange and Ubimo, also did not publicly agree to the standards.

Fodder for abortion bounty hunters

As the industry introduces new self-regulatory limits on data reflecting sensitive locations, pressure from actual regulators is mounting.

Senators recently demanded that location data providers SafeGraph and Placer.ai give details about their data collection practices related to abortion clinics. After it was reported in May that SafeGraph sold information showing where groups of people visiting family planning and abortion clinics had traveled from, how long they stayed and where they went afterward, the company said it would stop selling data associated with family planning center locations.

After Roe was overturned last week, lawmakers signaled their intentions to establish new laws that could affect data use in their states. When announcing his plan to push for a constitutional amendment to protect abortion rights in Washington state on Saturday, Governor Jay Inslee said, “We are going to be very alert to plug any gaps in our privacy laws, so that no one can expose private information from a Washington citizen or a citizen of a different state, who comes here for services. We are not going to allow that data to get back to Texas or Missouri or Idaho.”

It’s not just Google, Apple or mobile carriers that have been subject to law enforcement demands for location data. “We have received subpoenas in the past, and we’ve had to provide information,” said Elizabeth Hein, associate general counsel of privacy, product and compliance and global data protection officer at Foursquare. “We do only respond when we have the appropriate legal documentation; they would have to have a subpoena or a court order,” Hein said.

Foursquare has limited its use of information associated with sensitive places for the past few years. Hein said the company’s list of sensitive places now includes 1.5 million locations throughout the U.S.

Foursquare already imposes on itself the limitations put in place by the NAI’s voluntary standards. In practice, that means that while a Foursquare app user can still “check in” at a sensitive place, say a church, doctor’s office or gay bar, Foursquare does not ingest check-in information associated with places on the sensitive list into its system. It also prevents partners that use its location data and services in their apps from getting that information. That means ads cannot be targeted using that information, and customers can’t use it to measure the performance of their ads.

“This information is just too sensitive to be sharing or using in products downstream,” Hein said.

A history of fighting regulation while pushing for more location data collection

The recent Supreme Court decision may have made the harmful implications of location data more palpable, but the ad industry has faced pressure from lawmakers for more than a decade to put better protections and limits on the location data flowing through its ad systems.

“Location information is extremely sensitive. But it’s not being protected the way it should be,” Minnesota Democratic Sen. Al Franken said during a 2014 Senate Judiciary Committee hearing, where he pushed for passage of his Location Privacy Protection Act, a bill originally introduced in 2011. The goal of the bill was to protect people from stalking facilitated by location-tracking apps.

During the hearing, Lou Mastria, then executive director of the Digital Advertising Alliance, a consortium of advertising trade associations, emphasized that existing industry self-regulation “is not intended to prevent criminal activity.” He added, “The DAA does not believe that such new legislation is needed at this time.”

Even then, the DAA did require companies to get consent from people before collecting and sharing precise location data “or obtain reasonable assurances that the app developer or owner has obtained consent to that data collection.” However, at the time, the DAA and ad industry at large lobbied against federal privacy legislation, arguing that their own self-regulatory approaches were preferable.

Two years after that hearing, the Interactive Advertising Bureau, the digital ad industry’s most prominent trade group and a member of the DAA, actively encouraged its digital publisher members to monetize mobile location data, which it called “a treasure trove in the marketing world.” A 2016 IAB guide said publishers could generate 20% to 30% more ad revenue when location data is used to target ads.

Without mentioning anything about the potential to exploit data associated with visits to places people may consider sensitive or private, the guide told publishers how to sell location data through licensing agreements. Partnerships between mobile location data providers and the mobile app publishers they gather location data from are typically shielded from the public by non-disclosure agreements, making it nearly impossible to know which entities actually supply the data. The IAB guide did note that publishers should consult the DAA’s self-regulatory principles and the NAI’s code of conduct when considering how to get user permission for location data collection.

The guide also explained that location data was available for the taking through ad requests in open ad exchanges or through the software development kits plugged into mobile apps, a process sometimes referred to as bidstream siphoning. The ad industry and location data providers often downplay that process because it allows data to be collected and used without explicit consent. To this day, location data remains available for collection from ad bidding systems.

Last month, the Irish Council for Civil Liberties called the exposure of location data in the real-time bidding systems that run the digital ad market “the biggest data breach.” It reported, “On average, a person in the U.S. has their online activity and location exposed 747 times every day by the RTB industry.”

Since the DAA testified about location data before Congress in 2014, a slew of state privacy laws has been enacted, prompting industry groups including the DAA and IAB to change their tune on a federal privacy law. The groups now argue that a single federal law superseding state laws would be better than the jumble of differing requirements and restrictions they face today.

A post-Roe window cracks open to industry change

Today the DAA, IAB and NAI support a framework for federal privacy legislation that would prohibit companies from collecting geolocation information without people’s express consent. However, the framework does not create specific rules for data associated with sensitive locations.

When asked why the IAB does not have self-regulatory standards encouraging member companies to adopt the principles in the framework, Lartease Tiffith, executive vice president for Public Policy at IAB, told Protocol, “We don’t come out with a proposal if we don't think that that's what our members need to do and they ought to do and we have the support to do. I think it's actually pretty clear that we are doing those things; we just don't have everything in the public sphere for you to sort of pull down off our websites.”

However, the IAB did make a statement on its website Monday about providing employees with access to reproductive health care.

“For women employees who live in states that are restricting access to reproductive healthcare, IAB will fund travel to locations that provide it,” the group wrote. “The U.S. Supreme Court’s decision to overturn Roe v. Wade permits states to significantly restrict women’s ability to support their families, make crucial healthcare choices, and continue to participate in the economy and society. Furthermore, this ruling directly and disproportionately harms poor women and communities of color.”

Now, the IAB is promising there’s more to come. “We actually are looking at what additional things could we do beyond pushing for this in legislation and regulation. [A lot] of people are, so we don't want to stop there, but we want to make sure that we're taking the right approach,” said Tiffith.

“There’s openness in the industry to taking action around this,” Briscoe said. “The moment gives us some real specifics to address and some momentum around making sure that we solve for some of these potential abuses of the data that's been created.”

Correction: This story was updated to reflect the fact that Factual has adopted the NAI’s sensitive location data standards because it was acquired by Foursquare and is no longer a standalone company. This update was made June 29, 2022.

