Therapeutic AI and the Simulation of Discontent
A new Siri-for-therapy trains humans to mimic a pale version of human thought.
Since it premiered in 2011, Black Mirror has been the go-to point of comparison for our ambivalence about new technologies. Anything novel and intrusive, from facial recognition to social media, is inevitably said to be “like an episode of Black Mirror.”
What makes the show a safe popular reference point isn’t only our shared uncertainty about the outsized role new tech plays in the organization of our lives, but also the strange sense of powerlessness we have to shape our own future. New products feel forced on us from above (or at least the outside) and always come with hidden costs, be it the harvesting of our most sensitive data or the exposure of our children to online perversity.
But ambivalence, by definition, runs in two directions at once. The ubiquity of these intrusive and controlling technologies isn’t only something that’s forced on an unwilling public. They’re popular because they also appeal to us in some way. We enjoy mindlessly scrolling through Facebook or spouting off on Twitter. Amazon might have helped kill the local store, but that doesn’t stop us from using it to order everything from groceries to books to clothes. And then there’s porn, of course.
The point is, these technologies remain popular in part because they take advantage of the fact that what we want and what we require to flourish are often in tension. Sometimes we even enjoy undermining ourselves. And a new technology such as Replika, the AI product you’re meant to confide in as a sort of bestie and therapist rolled into one, is the perfect example of this sort of thing. By playing to our basest desires and most tender insecurities, these products trap us in a solipsistic narcissism. They distract us from engaging with and changing the world around us, while also conning us into thinking that we’re able to find our telos, our final purpose, within ourselves rather than in a more profound spiritual purpose.
New tech products are usually marketed to us in heady, moralistic terms, and the story of Replika is much like that of any other tech start-up. Someone has a magnanimous vision of a better world and they want to use it to sell us something. In the case of Replika, it was founded “with the idea to create a personal AI that would help you express and witness yourself by offering a helpful conversation. It’s a space where you can safely share your thoughts, feelings, beliefs, experiences, memories, dreams – your ‘private perceptual world.’” Forbes described it as “an app that lets users create a digital avatar with the name or gender of their choosing. The more they talk to it, the more it learns about them.” Imagine an Alexa that doesn’t just respond to prompts, but answers back in an attempt to mimic thoughtful dialogue.
At first glance Replika might appear harmless, or even auspicious. And there are definitely some positive aspects to it. The software itself is astonishingly complex and certainly represents the cutting edge of AI programming. There’s some virtue, even if it’s misguided, in doing anything well. What’s more, the app is actually responding to a very real need. In an online environment which constantly makes us feel lonely and disconnected, Replika is a good-faith attempt to design something which actually pushes back against social anomie. As the website Popsugar reported, in an interview with Replika co-founder Eugenia Kuyda, “there’s a strong case for Replika as a healthy alternative to social media networks that can make us feel more alone. ‘You’d be amazed how lonely people are feeling now … it doesn’t matter if they have a lot of friends or have a cool job. They feel disconnected from other people and from life,’ [Kuyda] said, adding that it’s not meant to take the place of human engagement, instead serving to make that human engagement feel a little less daunting.”
As wonderful as that sounds, it’s difficult to see how Replika is essentially all that different from other social media. In fact, it seems as if Replika is just part of the problem: people being sold tech products that purport to reconnect them with people and life while in fact embedding them more deeply into their own isolated experiences. Replika even seems a bit worse than older kinds of social media for the simple fact that you never actually get to engage with another human at all; on TikTok or Twitter, at least, another person exists behind the scrim of the mediated experience. If anything, Replika presents in its purest form the problem with relying on social media and AI to repair our fractured social bonds and articulate some sense of purpose in our lives. At best, we’re engaging in a strange kind of psychological solipsism. At worst, we’re possibly being intimately manipulated by a product.
It seems obvious, but it’s worth noting that AI isn’t “conscious” in any sense. It doesn’t even approximate thinking, because the human mind isn’t doing what a computer does. When Alan Turing invented the first “computer,” it was actually a discrete-state machine mimicking human thought. But the human mind is not a discrete-state machine. However, as Roberto Calasso tells us in The Celestial Hunter, “To think of [the brain] as a discrete-state machine was an enormously fruitful and revelatory idea, even though, strictly speaking, it was false. Why? The answer is somewhat disconcerting: the brain is not and never can be a discrete-state machine but in various circumstances and for various reasons, it simulates being so—and succeeds remarkably well in doing so. Even though, in many cases, it is less effective than machines … Such a machine would therefore simulate an entity (the brain) caught in the act of simulating.”
So in essence, Replika is a machine simulation of a brain simulating what the machine does. Whatever reassurance murmuring into this echo chamber might give us, it’s a solipsism already degraded by degrees of separation from the real thing. The problem is much worse than merely seeking solace from a conversation with ourselves; we’re seeking solace from an already much-degraded version of ourselves. We’re essentially training our minds to mimic a pale version of human thought.
Charmed by a digital caricature of ourselves, we’re also distracted from the fact that we’re not doing anything to make the world a better place for humans to flourish. Things like Replika, as well-intentioned as they might be, essentially function as a way to acclimate individuals to our brave new world of social isolation rather than changing the structure which creates the problem in the first place. C. Wright Mills wrote in “The Professional Ideology of Social Pathologists” that there’s a certain class (we might call them the professional managerial class) that defines social change in the most attenuated terms, mostly in terms of adapting individuals to the society that they’ve created for them. Instead of changing society to meet the needs of human nature, they create ad hoc ways (typically products, either technological or psychological) of forcing us to conform to their inhumane culture. Things like Replika are attempts to fine-tune the human personality to passing progressive social fashions of the day.
There’s a reason the show Black Mirror doesn’t engage deeply with the spiritual or metaphysical. Those things play no role as either a problem or a solution in the worldview of our new elites. The show wants to critique, but not too profoundly. And like Replika, it’s an artifact of the same denuded society it claims to help us heal from.
Scott Beauchamp’s work has appeared in the Paris Review, Bookforum, and Public Discourse, among other places. His book Did You Kill Anyone? is forthcoming from Zero Books. He lives in Maine.