“The internet is simultaneously our greatest affliction and our greatest hope; the present situation is intolerable, but there is also no going back,” Justin E. H. Smith writes in his new book, “The Internet Is Not What You Think It Is: A History, a Philosophy, a Warning.”
Smith, a professor of history and philosophy of science at the University of Paris, outlines the affliction of the internet in detail. He argues that the internet as we know it is addictive and undemocratic, and that it “shapes human lives algorithmically, and human lives under the pressure of algorithms are not enhanced but rather warped and impoverished.”
How the internet can become the “greatest hope” of fixing our predicament is less clear, but that’s also not why Smith set out to write this book. “I'm doing genealogy and identifying problems, and then leaving it to brighter, younger people to find our way out of this,” Smith told me on a Zoom call from a café in Paris. His approach, which he said draws on Foucault and Leibniz, tries to situate the internet within “a vastly broader and deeper picture, both in time and in nature.”
In an interview with Protocol, Smith discussed how social media acts as a “debate-themed video game,” why dating apps often preclude “the deep mystery of love” and how the U.S. has quietly embraced a corporation-driven Chinese-style social credit system.
This interview has been edited and condensed for clarity.
Your book supposes that any remedy to the problems of the internet will have to come from the internet itself. Is there anything to the idea that an external belief system could bring about change?
I wouldn't describe my view as pessimistic; I would describe it as an aporia, or as aporetic. That is to say, I don't identify any obvious solutions to the current predicament. Not because I'm a pessimist, but just because I don't really see that as part of the purview of the project. I'm doing genealogy and identifying problems, and then leaving it to brighter, younger people to find our way out of this. And in that respect also, I feel like I echo Foucault here — my job is to show problems and how we inherited them, and to leave it to other people to provide a positive program.
That said, I could give you a glimpse of what a positive program would look like, and it would be a total destruction of the current economic model on which what I call the phenomenological internet is based. We need a real digital public space. What we have right now is a digital pseudo-public space that is simply not a viable forum for the exchange of ideas or for rational deliberation of the sort that a democracy needs. It's a pseudo-public space in that it allows people to play as if they are exchanging ideas, when in fact, what they're doing is playing a video game — racking up points in the form of likes and followers, based on figuring out how to game the algorithm.
As long as that's the only game in town for talking about things like freedom of speech — or critical race theory, or whatever it might be — we're all doomed. Whatever side you defend, we are all doomed because it's not actually a forum for debate: It's a debate-themed video game.
So how do you solve the problem? I don't know. Smash the system and start over again? Seize the social media companies? Well, the truth is, I honestly don't think that's a good idea. I think, de facto, social media are a public utility like electricity or water, and it would be better if we recognized that and treated them that way. That said, I don't necessarily think things would work out all that well if, in light of that, the government were to seize Facebook and Twitter and run them accordingly. However, I think, in some way or other, there will have to be democratic oversight over how the algorithms work in order for us to have any hope of ever using social media for the purposes of deliberative democracy.
In your book you use the term “normies.” Often the most vital areas of culture seem to come from these niche spheres of the internet. Why do you think that is?
It's a real problem, right? Because it is addictive [and] exploitative. It is a chronophage … it eats up your time. … Nonetheless, I really can't help but feel that it's also the vanguard of [our culture]. And it would have been absurd to try to write a book like this one without going deep inside. In that respect, it's like writing an ethnography of heroin addiction — you're probably going to have to get addicted yourself in order to write any kind of compelling account of it. It's very much the same.
Whatever I was before, I can pretty confidently say at this point that I'm not a normie. My departure from the normies was part of the idea that I have to understand Extremely Online culture in order to write about it. And what I understand about it now is both that it's harming me and destroying me and also that it’s way ahead of the curve of everything else that happens in society at this point.
I go visit my mother, for example, and they've got MSNBC blasting 20 hours a day in their house, and I can't avoid it. This is the only time I see television at this point. What strikes me about MSNBC for elderly normies of, say, my mother's generation is that all it really is is a social media filter. Like on MSNBC, you hear Rachel Maddow, or whoever else, talking a day or two later about what people were already talking about on Twitter. So I have this weird experience when I'm in the presence of normies of being like: “Yeah, I know. Yeah, I know. I know. I know.” And feeling like it's all just coming with a delay, in a filtered-down and diluted form that's maybe easier to digest and keeps you sane.
I don't know how to deal with this. You can hear the conflict within me. I'm not, as I insisted in the book, a so-called neo-Luddite. I'm not on the same page as people like Jaron Lanier who think you need to shut down your social media accounts. But I'm also really, really worried about the deleterious effects of social media in their current manifestation, which is to say: hidden-algorithm-driven, for-profit operations.
You mentioned the difficulty of unplugging from the internet. Was that intended to apply to the societal level? As an individual, what are the costs and benefits of reducing your time on social media?
What I want to say when people tell me they don't have social media is actually: Yes, you do — you just don't know it. Because you're living in a society that is at this point largely structured by algorithmic forces that have their paradigmatic expression in social media. And it's going to be increasingly so as, for example, logistics and health care and perhaps even the economy are increasingly modeled on the forms of information processing and exchange that were first tried out in online forums. So trains won't run on fixed timetables anymore; they'll run on algorithmically determined, flexible timetables — just like, for example, Uber pricing is not a flat rate but is determined by algorithms.
So you have this algorithmic creep from something that was honed on Facebook and Twitter that then extends into all sorts of gamified domains of life like car-sharing services — and that ultimately has no limit to how far it can extend in shaping the way our society works. So when someone tells me they're not on social media, I want to say, “Who gives a shit? That's not the question.” Whether you are on social media or not has nothing to do with the way social-media-like technologies are transforming our world.
What are some of the things that are lost when this gamification happens? You cite Foucault as a big influence, so I wonder what you think about the effects on dating, in particular.
Dating is a good example. I'm not familiar with that directly — I'm more familiar with listening to music; that's something I'm still able to do. But in both cases, we have the same thing, right? If you're browsing a record store — especially a used record store or a thrift shop, the kinds of places where stuff that doesn't belong together gets thrown together anyway — that’s where you can really cultivate a kind of musical aesthetic sensibility that the “You May Also Like” function you see on Spotify or YouTube takes away. [The algorithms] take away the responsibility for cultivating your own aesthetic sensibilities.
When it comes to dating, I would argue that the deep mystery of love is that you can end up loving someone who, on paper or on a digital platform like Tinder, you really ought not to love. The fact that now people are matching with profiles that include stuff like their political commitments — like, who cares about political commitments? Love is so much deeper than that! People are missing out on the potential to experience it because they're mistaking this for some kind of algorithmically plottable game. And indeed, that is extremely harmful to human thriving.
With politics as well, the algorithmization is hollowing out our idea of what it is to have political commitment. People end up simply following the map, so to speak, of adjacencies that they know they want to adopt or to avoid for reasons of maintaining their social standing. So, you know, the whole thing about avoiding not only people who are, say, right-wing extremists, but avoiding people who are right-wing extremist adjacent, or people who are friends with people who are right-wing extremist adjacent … and soon enough, you've got a pretty tightly built fortress of people who share your political commitments. But those aren't actually political commitments — those are just your in-group. So social media is making it really hard to come by political commitments through, let’s say, rational reflection, rather than through algorithmic plotting.
When it comes to abstaining from technology, different nations are taking different approaches. China, for instance, has largely banned video gaming for minors. Could those national differences yield a way of reining in tech?
My own suspicion is that in one way or another, every country is converging on the same model, even if they're using different terminology to describe it. The dreariest way of putting that is: Like it or not, the U.S. is veering toward a Chinese-style social credit system. In spite of China's ban on video games, I think the social credit system remains something like the big video game of life itself. That's how you have to understand what they're doing.
We call it by other names, and it's more in the hands of private companies than the government. But one way or another, we're moving towards a condition in which your social standing, and the range of possibilities open to you — and even perhaps, eventually, your economic standing — will be based on your digital record. I think that's emerging already.