
How social media became a 'debate-themed video game' and why the internet is destroying democracy

Justin E. H. Smith, author of “The Internet Is Not What You Think It Is: A History, a Philosophy, a Warning,” spoke with Protocol about his genealogy of the internet.

“The Internet Is Not What You Think It Is: A History, a Philosophy, a Warning,” by Justin E. H. Smith. In the book, Smith argues that the internet as we know it is addictive, undemocratic and “shapes human lives algorithmically.”

Image: Princeton University Press; Protocol

“The internet is simultaneously our greatest affliction and our greatest hope; the present situation is intolerable, but there is also no going back,” Justin E. H. Smith writes in his new book, “The Internet Is Not What You Think It Is: A History, a Philosophy, a Warning.”

Smith, a professor of history and philosophy of science at the University of Paris, outlines the affliction of the internet in detail. He argues that the internet as we know it is addictive, undemocratic and “shapes human lives algorithmically, and human lives under the pressure of algorithms are not enhanced but rather warped and impoverished.”

How the internet can become the “greatest hope” of fixing our predicament is less clear, but that’s also not why Smith set out to write this book. “I'm doing genealogy and identifying problems, and then leaving it to brighter, younger people to find our way out of this,” Smith told me on a Zoom call from a café in Paris. His approach, which he said draws on Foucault and Leibniz, tries to situate the internet within “a vastly broader and deeper picture, both in time and in nature.”

In an interview with Protocol, Smith discussed how social media acts as a “debate-themed video game,” why dating apps often preclude “the deep mystery of love” and how the U.S. is quietly drifting toward a corporation-driven, Chinese-style social credit system.

This interview has been edited and condensed for clarity.


Your book supposes that any remedy to the problems of the internet will have to come from the internet itself. Does the idea that an external belief system could bring about change resonate with you at all?

I wouldn't describe my view as pessimistic; I would describe it as an aporia, or aporetic. That is to say, I don't identify any obvious solutions to the current predicament. Not because I'm a pessimist, but just because I don't really see that as part of the purview of the project. I'm doing genealogy and identifying problems, and then leaving it to brighter, younger people to find our way out of this. And in that respect also, I feel like I echo Foucault here — my job is to show problems and how we inherited them and leave it to other people to provide a positive program.

That said, I could give you a glimpse of what a positive program would look like, and it would be a total destruction of the current economic model on which what I call the phenomenological internet is based. We need a real digital public space. What we have right now is a digital pseudo-public space that is simply not a viable forum for the exchange of ideas or for rational deliberation of the sort that a democracy needs. It's a pseudo-public space in that it allows people to play as if they are exchanging ideas, when in fact, what they're doing is playing a video game — racking up points in the form of likes and followers, based on figuring out how to game the algorithm.


As long as that's the only game in town for talking about things like freedom of speech — or critical race theory, or whatever it might be — we're all doomed. Whatever side you defend, we are all doomed because it's not actually a forum for debate: It's a debate-themed video game.

So how do you solve the problem? I don't know. Smash the system and start over again? Seize the social media companies? Well, the truth is, I honestly don't think that's a good idea. I think, de facto, social media are a public utility like electricity or water. And it would be better if we recognized that and treated them that way. That said, I don't necessarily think things would work out all that well if, in light of that, the government were to seize Facebook and Twitter and to run them accordingly. However, I think, in some way or other, there will have to be democratic oversight over how the algorithms work in order for us to have any hope of ever using social media for the purposes of deliberative democracy.

In your book you use the term “normies.” Often the most vital areas of culture seem to come not from normies but from the internet's niche, Extremely Online spheres. Why do you think that is?

It's a real problem, right? Because it is addictive [and] exploitative. It is a chronophage … it eats up your time. … Nonetheless, I really can't help but feel that it's also the vanguard of [our culture]. And it would have been absurd to try to write a book like this one without going deep inside. In that respect, it's like writing an ethnography of heroin addiction — you're probably going to have to get addicted yourself in order to write any kind of compelling account of it. It's very much the same.

Whatever I was before, I can pretty confidently say at this point that I'm not a normie. My departure from the normies was part of the idea that I have to understand Extremely Online culture in order to write about it. And what I understand about it now is both that it's harming me and destroying me and also that it’s way ahead of the curve of everything else that happens in society at this point.

Justin E. H. Smith. Photo: Justin E. H. Smith

I go visit my mother, for example, and they've got MSNBC blasting 20 hours a day in the house, and I can't avoid it. This is the only time I see television at this point. What strikes me about MSNBC for elderly normies of, say, my mother's generation is that all it really is is a social media filter. On MSNBC, you hear Rachel Maddow, or whoever else, talking a day or two later about what people were already talking about on Twitter. So I have this weird experience when I'm in the presence of normies of being like: “Yeah, I know. Yeah, I know. I know. I know.” And feeling like it's just coming with a sort of delay, in a filtered-down and diluted form that's maybe easier to digest and that keeps you sane.

I don't know how to deal with this. You can hear the conflict within me. I'm not, as I insisted in the book, a so-called neo-Luddite. I'm not on the same page as people like Jaron Lanier who think you need to shut down your social media accounts. But I'm also really, really worried about the deleterious effects of social media based on their current manifestation, which is to say: hidden-algorithm-driven, for-profit operations.

You mentioned the difficulty of unplugging from the internet. Was that intended to apply to the societal level? As an individual, what are the costs and benefits of reducing your time on social media?

What I want to say when people tell me they don't have social media is actually: Yes, you do — you just don't know it. Because you're living in a society that is at this point largely structured by algorithmic forces that have their paradigmatic expression in social media. And it's going to be increasingly so as, for example, logistics and health care and perhaps even the economy are increasingly modeled on the forms of information-processing exchange that were first tried out in online forums. So trains won't run on fixed timetables anymore, they'll run on algorithmically determined, flexible timetables — just like, for example, Uber pricing is not a flat rate, but is determined by algorithms.

So you have this algorithmic creep: something that was honed on Facebook and Twitter extends into all sorts of gamified domains of life, like car-sharing services, and ultimately there is no limit to how far it can extend in shaping the way our society works. So when someone tells me they're not on social media, I want to say, “Who gives a shit? That's not the question.” Whether you are on social media or not has nothing to do with the way social media-like technologies are transforming our world.

What are some of the things that are lost when this gamification happens? You cite Foucault as a big influence, so I wonder what you think about the effects on dating, in particular.

Dating is a good example. I'm not familiar with that directly — I'm more familiar with listening to music; that's something I'm still able to do. But in both cases, we have the same thing, right? If you're browsing a record store — especially a used record store or a thrift shop, the kinds of places where stuff that doesn't belong together gets thrown together anyway — that’s where you can really cultivate a kind of musical aesthetic sensibility that the “You May Also Like” function of the sort you see on Spotify or YouTube takes away. [The algorithms] take away the responsibility for cultivating your own aesthetic sensibilities.

When it comes to dating, I would argue that the deep mystery of love is that you can end up loving someone who, on paper or on a digital platform like Tinder, you really ought not to love. The fact that people are now matching with profiles that include stuff like their political commitments — like, who cares about political commitments? Love is so much deeper than that! People are missing out on the potential to experience it because they're mistaking this for some kind of algorithmically plottable game. And indeed, that is extremely harmful to human thriving.


With politics as well, the algorithmization is hollowing out our idea of what it is to have political commitment. People end up simply following the map, so to speak, of adjacencies that they know they want to adopt or to avoid for reasons of maintaining their social standing. So, you know, the whole thing about avoiding not only people who are, say, right-wing extremists, but avoiding people who are right-wing extremist adjacent, or people who are friends with people who are right-wing extremist adjacent … and soon enough, you've got a pretty tightly built fortress of people who share your political commitments. But those aren't actually political commitments — those are just your in-group. So social media is making it really hard to come by political commitments through, let’s say, rational reflection, rather than through algorithmic plotting.

When it comes to abstaining from technology, different nations are taking different approaches. China, for instance, sharply restricts how much time minors can spend on video games. Could those national differences yield a way of reining in tech?

My own suspicion is that in one way or another, every country is converging on the same model, even if they're using different terminology to describe it. The dreariest way of putting that is: Like it or not, the U.S. is veering toward a Chinese-style social credit system. In spite of China's video game restrictions, I think the social credit system remains the big video game of life itself. That's how you have to understand what they're doing.

We call it by other names, and it's more in the hands of private companies than the government. But one way or another, we're moving towards a condition in which your social standing, and the range of possibilities open to you — and even perhaps, eventually, your economic standing — will be based on your digital record. I think that's emerging already.
