Policy

Apple just reignited the encryption fight. Everyone should have seen it coming.

Apple's new child safety features revived privacy advocates' worries about encryption, but questions about how to handle the most dangerous online content, in the U.S. and abroad, were already shifting the thinking on how to balance security and digital harm.

Measures to protect kids on iPhones drew concern from privacy and security advocates.

Photo: Jonathan Brady/PA Images via Getty Images

Apple's announcement that it would take new measures to protect children on iMessage and in users' photo accounts has generated worry in the tech policy community that the phone-maker has abandoned its commitment to privacy following years of pressure from governments worldwide.

Virtually everyone wants to protect children, but in recent years questions about how to handle the spread of controversial content, from dangerous COVID-19 misinformation to political dissent that repressive regimes want to silence, have grown broader and more politically fraught, both in the U.S. and around the world.

The struggle to balance privacy and security is a long-running tension in the U.S. The most recent round began to play out last Thursday, when Apple said it would let parents receive warnings when kids under 13 send or receive explicit images and that it would scan photo uploads to iCloud for known child sexual abuse material (CSAM). Apple said its goal was "to help protect children from predators who use communication tools to recruit and exploit them." Privacy, security and free speech advocates countered that, while CSAM is the most horrific of online content, compromising robust encryption means compromising the kind of information it protects, from health and financial data to the identities of LGBTQ+ youth and pro-democracy activists all over the world.
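
At a technical level, the iCloud feature amounts to checking uploads against a database of fingerprints of already-known images rather than reading or classifying new content. The sketch below is a hypothetical Python illustration of that basic idea; the hash value, file paths and flag_if_known helper are invented, and the exact SHA-256 comparison is a deliberate simplification of the on-device perceptual hashing and reporting threshold Apple has described, neither of which is reproduced here.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of fingerprints for known abuse images, standing in
# for the hash databases maintained by groups like NCMEC. Value is invented.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def flag_if_known(image_path: Path) -> bool:
    """Return True if this image's fingerprint matches a known entry.

    Simplification: an exact SHA-256 digest only matches byte-identical
    copies. Real systems use perceptual hashes so resized or re-encoded
    copies still match, plus human review and reporting thresholds that
    are not modeled here.
    """
    digest = hashlib.sha256(image_path.read_bytes()).hexdigest()
    return digest in KNOWN_HASHES

if __name__ == "__main__":
    # Assumes a local "uploads" folder of images queued for upload.
    for path in Path("uploads").glob("*.jpg"):
        if flag_if_known(path):
            print(f"{path} matches a known hash and would be escalated for review")
```

Matching against a fixed list of known material is a narrower operation than classifying everything a user sends, which is part of why the single word "scanning" can describe very different levels of intrusion in this debate.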

So far, the tension has largely, if uneasily, been resolved in favor of end-to-end encryption. Security and privacy have prevailed in part because of concerns that governments could use safety measures to chill all kinds of speech, and in part because the First and Fourth Amendments set a high bar for restricting speech and conducting searches in the U.S., even when those protections mean wrongdoers go uncaught. Such concerns, for instance, gave Apple political cover when it resisted unlocking a dead terrorist's iPhone after the 2015 San Bernardino attack, and helped slow measures like the bipartisan EARN IT Act, which could have exposed companies to liability for CSAM if they employed encryption.

The tension has only deepened in 2021. Concerns about dangerous posts, pictures and videos have grown as the traditional emphasis on preventing child abuse imagery, terrorism and other crime has expanded to cover election disinformation, lies about COVID-19 vaccines and treatments, violent racism and other harmful content.

Invisible content

"The content questions are already having a moment at the same time that we're seeing encryption come back into the fore," said Emma Llansó, director of the Free Expression Project at the Center for Democracy and Technology, which was highly critical of Apple's moves.

In the U.S., some of those questions are being driven by the Biden administration, which has been critical of online misinformation about COVID-19, its treatments and the efficacy of masks and vaccines. Two Democratic senators also recently led an effort to limit a liability shield for online platforms whose algorithms boost certain kinds of health misinformation.

The concerns don't come only from the Democratic side. Former President Trump, during his tenure, tried repeatedly to stop online criticism of himself, and members of Congress from both parties have introduced bills and issued statements on online terrorism, civil rights violations, political misinformation, drug sales, counterfeits and more. Alongside the bipartisan concerns are perennial allegations by Republicans that online platforms censor conservative speech. And on top of all that, social media platforms themselves maintain extensive community standards that limit or ban many kinds of speech, even when it's legal.

Government officials from both parties and child safety advocates around the world have also argued in recent years that massively popular end-to-end encrypted services such as Apple's iMessage and Facebook-owned WhatsApp allow the most horrific content to spread rapidly, leaving the public and law enforcement with little ability to spot, trace or correct it.

"This puts our citizens and societies at risk by severely eroding a company's ability to detect and respond to illegal content and activity, such as child sexual exploitation and abuse, terrorism, and foreign adversaries' attempts to undermine democratic values," top government officials from the U.S., U.K. and Australia wrote in a October 2019 letter to Mark Zuckerberg, criticizing his plans to bring end-to-end encryption to Facebook Messenger.

The letter said a switch to full encryption on all of Facebook's platforms would imperil many of the 16.8 million reports that the company made to the National Center for Missing & Exploited Children in 2018 — even as the children's group said that tips it received on potential CSAM had grown more than fourfold from 2015 to 2018. Its data also suggested child abuse material circulating online was becoming more graphic and explicit.

At the time, the heads of Messenger and WhatsApp responded to the government officials that "the 'backdoor' access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes."

Last week, WhatsApp chief Will Cathcart weighed in again on Apple's latest changes, assuring users that WhatsApp wouldn't be adopting a similar system.

"We've had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content," Cathcart said in a tweet thread. "It's not how technology built in free countries works."

The world watches

Apple and Facebook have for several months been locked in a battle over the way iOS privacy changes have affected Facebook's advertising business. But Cathcart's invocation of the global scope of Apple's measures appears to reflect how governments around the world have also expressed worries about what users in their countries are sharing.

The U.K. government is pushing to have more platforms take responsibility for illegal content and other online harms, which many civil liberties advocates fear could force companies to weaken encryption. The European Union wants platforms, especially the largest ones, to do more to rein in illegal content and protect children online.

Some nations appear to have more repressive goals in mind, such as India's recent clashes with social media over content critical of the ruling party; the country's new rules could require the tracing of messages. Digital rights advocates have also raised concerns about laws and proposals in China, Singapore, Russia, Turkey and elsewhere, some of them major markets whose governments are eager to leverage their economic importance to press technology companies into shutting down citizen dissent.

Although Apple said it would "refuse" any demands to look for images beyond CSAM, some tech-watchers also suggested that the company may have taken its new measures to get ahead of regulations dealing with illegal content, especially those coming from liberal democracies.

"They're looking at regulatory pressure here [in the U.S.] and in Brussels and the U.K. and Australia, and they're seeing that the gig is up," said Hany Farid, a professor at the University of California, Berkeley, and an expert on digital forensics.

Calling Apple's measures the "bare minimum," Farid added, "Maybe they looked at the landscape and thought, 'OK, this is going to be bad for us. Let's get out ahead.'"

Farid compared Apple's moves to the scans for CSAM or viruses that many cloud, email and messaging providers already perform — albeit not usually on the user's own device or on platforms that companies brand as end-to-end encrypted. But, he said, Apple's changes represent a corporate recognition that many people want online safety and privacy to be weighed against each other differently than they have been.

"We all want privacy, but we also recognize online harms," Farid said. "The trick is, how do you find a balance?"
