
Mark Zuckerberg Takes on His Facebook Frankenstein Monster

A new profile shows he's become more circumspect about the platform he created. It's time we all did the same.

The past two years have shattered Mark Zuckerberg’s faith in social media, or humanity, or both.

At least, that’s one of the primary takeaways from Wired magazine’s feature last week on Facebook and fake news, written by Nicholas Thompson and Fred Vogelstein. In it, Thompson and Vogelstein consider Facebook’s many slip-ups and struggles in publishing and curating news stories over the past two years—from allegedly disfavoring conservative coverage to allowing Russian bots to influence news feeds during the 2016 presidential election.

Throughout the story, one theme remains strong: Facebook’s employees and CEO are only now beginning to understand their responsibility to the world and the dangerous power of the platform they’ve created. Zuckerberg, like many in our enlightened and progressive age, believes in human perfectibility. He believes in circumstantial error, not sin. He believes (or at least believed) in humankind’s ability to craft the perfect scenarios and platforms in which to break down all malice and misunderstanding, while not necessarily believing that evil is endemic to the human condition.

Which means that, when Facebook came under fire for the spread of misinformation and fake news in early 2017, he didn’t respond by acknowledging the flaws in his global empire or in its users; he doubled down on the hopeful perfectibility of his platform and its users:

Amid sweeping remarks about “building a global community,” [Zuckerberg] emphasized the need to keep people informed and to knock out false news and clickbait. Brown and others at Facebook saw the manifesto as a sign that Zuckerberg understood the company’s profound civic responsibilities. Others saw the document as blandly grandiose, showcasing Zuckerberg’s tendency to suggest that the answer to nearly any problem is for people to use Facebook more.

Of course, Zuckerberg’s hopes for his creation are understandable—none of us want to acknowledge the inherent flaws in something we’ve dedicated our lives to curating and crafting. And it’s understandable—even laudable, in a way—that Zuckerberg would refuse to let the confusion and vitriol of the 2016 general election shake his faith in humanity.

But Facebook-as-global-community isn’t working. Since creating the social media network, Zuckerberg has always emphasized his hope that it would foster greater empathy and understanding around the world: that it would enable us to understand one another better, to bridge racial and partisan divides, and to instill greater temperance in our interactions.

Instead, the opposite has happened: Facebook, via human choice and algorithmic tendency, has fostered ideological and political bubbles among its users, not global community and rapport. We join groups filled with like-minded people, like pages that give us the news we want to hear, and unfollow or unfriend those who annoy us or make us uncomfortable. Based on the posts we spend the most time liking, commenting on, and watching, Facebook gives us more of the same, rather than encouraging us to expand our ideological repertoires.

These tendencies are extremely difficult to circumvent. We can follow accounts that challenge our beliefs, keep listening to the friends who annoy us (and engage them via comments and messages when we find it appropriate)—but there remains a degree of disconnection between us and this virtual content. That disconnection makes it easy to tune out, or to respond with more anger or bombast than we might employ in personal conversation. Additionally, we too often indulge in clickbait on Facebook or other online sites rather than forcing ourselves to absorb more thoughtful, well-crafted content. Facebook’s algorithm recognizes this tendency—and passively fosters it. Thompson and Vogelstein write:

People like Alex Hardiman, the head of Facebook news products and an alum of The New York Times, started to recognize that Facebook had long helped to create an economic system that rewarded publishers for sensationalism, not accuracy or depth. “If we just reward content based on raw clicks and engagement, we might actually see content that is increasingly sensationalist, clickbaity, polarizing, and divisive,” she says. A social network that rewards only clicks, not subscriptions, is like a dating service that encourages one-night stands but not marriages.

To some extent, Facebook cannot control or circumvent this tendency. It is the result of our sinful natures, our ability to shape a product according to our own flawed image. Facebook isn’t innocent in the way it curates and shapes our news feeds, but its creators and curators will always struggle to steer between excess and defect in their attempts to foster a virtuous online community. Do too much, and they risk playing an alarmingly intrusive role in people’s lives and mental formation. Do too little, and they encourage an anarchic atmosphere of disinformation, vitriol, and offensive content.

Besides, many will ask (and rightfully so) what right Facebook has to determine what is and isn’t propagated on its site. It is meant to be a platform for connection, not a social policeman. Facebook didn’t force its users to absorb the Russian-propagated fake news of 2016. “Sure, you could put stories into people’s feeds that contradicted their political viewpoints, but people would turn away from them,” note Thompson and Vogelstein. “The problem, as Anker puts it, ‘is not Facebook. It’s humans.’”

Michael Brendan Dougherty pointed this out in an article for National Review last week, astutely contrasting Thompson and Vogelstein’s piece with a recent op-ed in The New York Times in which Ross Douthat argued that we should ban pornography. While the first (well-received) article suggested that Facebook (and we ourselves) can and should curate an online atmosphere of virtue and respect, the second was met with anger and disdain, as even many who respected the Wired article argued that—at least when it comes to porn—we should be able to do what we want. This ironic contradiction, Dougherty argues, stems from the same fantastical attitude toward online life:

Our culture-makers seem to believe in a neatly cleaved human nature. In one realm, we can expect ourselves to act as angels, and do the disinterested thing. In another, perhaps to let off some steam, we must give the Devil his due.

But perhaps the defenders of porn should consider that the common purveyors and sharers of fake news across social media are also engaged in a form of self-abuse, combined with titillation, and fantasy life. They no more believe that Barack Obama is running guns to ISIS than that the surgically enhanced 30-year-old woman in a plaid skirt is a very bad Catholic-school girl. It’s just a reality they prefer to envision. One where they can gaze into a backlit screen, click around, and imagine they aren’t wasting their lives clicking around on a backlit screen.

We love the internet because it enables us to craft the world we want, instead of forcing us to confront the world as it is. We can doctor our Instagram selfies, visit news sites that foster our less temperate and virtuous passions, follow the Facebook users who will puff up our self-esteem, dabble in the darker delights of the internet—all without ever asking ourselves whether it’s “right” or “wrong.”

But at some point, we must confront the world we’ve made online—because whether we like it or not, it will infect and alter our physical reality.

Zuckerberg has been immersed in that world longer (and on a deeper level) than most of us—which means he’s reached this moment of crisis and self-confrontation earlier than most of us. Thompson and Vogelstein write that “people who know him say that Zuckerberg has truly been altered in the crucible of the past several months. …‘This whole year has massively changed his personal techno-­optimism,’ says an executive at the company. ‘It has made him much more paranoid about the ways that people could abuse the thing that he built.’”

In recent months, the authors write, Facebook has decided it’s “putting a car in reverse that had been driving at full speed in one direction for 14 years.” Although Zuckerberg has always intended “to create another internet, or perhaps another world, inside of Facebook, and to get people to use it as much as possible,” he’s now encouraging changes to the platform that, he believes, will make people use Facebook less. Perhaps he’s realizing he needs to design for fallibility, addiction, and bombast—not for perfectibility, innocence, and self-control.

But Zuckerberg cannot edit human sin out of Facebook’s algorithms. We are ultimately responsible for our own choices, online and off, which means we all need to have our own Zuckerberg moments in coming days. That means we must realize that the internet does not, in fact, bring out our best selves, and must be approached with temperance and caution. We must confront the beasts we’ve created online: our own miniaturized Facebook Frankenstein monsters, whether they be unkind profile posts, ruined friendships, malicious blog posts, or toxic news feeds. If we don’t, we risk deeper consequences than fake news and nasty election cycles. We risk giving up real community and real virtue for fantastical and fake counterfeits.

Gracy Olmstead is a writer and journalist located outside Washington, D.C. She’s written for The American Conservative, The Week, National Review, The Federalist, and the Washington Times, among others.
