Facebook isn’t a passive service, a mindless automaton empty of agenda or motive. Nicholas Carr reminds us of this in an essay for the Los Angeles Review of Books:
We have had a hard time thinking clearly about companies like Google and Facebook because we have never before had to deal with companies like Google and Facebook. They are something new in the world, and they don’t fit neatly into our existing legal and cultural templates. Because they operate at such unimaginable magnitude, carrying out millions of informational transactions every second, we’ve tended to think of them as vast, faceless, dispassionate computers — as information-processing machines that exist outside the realm of human intention and control. That’s a misperception, and a dangerous one. Modern computers and computer networks enable human judgment to be automated, to be exercised on a vast scale and at a breathtaking pace. But it’s still human judgment. Algorithms are constructed by people, and they reflect the interests, biases, and flaws of their makers.
People are increasingly concerned with the way Facebook manipulates its users, and Carr’s piece reflects a lot of these concerns. People had similar worries when the television first came out (see Neil Postman’s Amusing Ourselves to Death). It seemed to utilize its own “algorithm,” its own amalgamation of entertainment and horror in the news cycle. It, too, began to shape and mold the way humans thought about their world.
But the Internet is different. It isn’t just our resource for news and Netflix. It contains our entire worlds—everything from our bank accounts to our friend networks, our discussions with family, our work emails, travel plans, and social calendars. Other modes of communication and interaction haven’t disappeared from our lives—but many of them increasingly rely on Facebook. This isn’t necessarily bad: it illustrates the utility of the network, and of the Internet as a whole. But we must consider the implications of this newfound dependency—both for the way we think and the way we interact. What should (or can) we do about it?
First, note the “rich get richer” logic Facebook uses to sort news feeds. Those you chat with, who appear in pictures with you, who comment on or like your posts: they’re the ones who will dominate your news feed. The less you interact with someone, the less likely you are to see their content. So, the logic goes, if you want to start seeing more updates from a given person, you should search for them, like their posts, scroll through their pictures, comment on their links, and maybe send them a chat message. That should work, right?
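The “rich get richer” logic above can be sketched in a few lines of code. Everything here is my invention for illustration—the weights, the interaction categories, the field names—since Facebook’s actual ranking system is proprietary; the point is only that weighting posts by past interaction produces exactly the feedback loop described:

```python
# Hypothetical sketch of an interaction-weighted feed ranker.
# The weights and interaction types are invented for illustration;
# Facebook's real ranking algorithm is proprietary and far more complex.

# Invented weights: how much each kind of interaction counts.
WEIGHTS = {"chat": 3.0, "comment": 2.0, "like": 1.0, "photo_tag": 2.5}

def affinity(interactions):
    """Score a friend by summing their weighted interaction counts."""
    return sum(WEIGHTS.get(kind, 0.0) * count
               for kind, count in interactions.items())

def rank_feed(posts, history):
    """Sort posts so friends you interact with most appear first."""
    return sorted(posts,
                  key=lambda post: affinity(history.get(post["author"], {})),
                  reverse=True)

# A friend you chat with often vs. one you barely touch:
history = {
    "Alice": {"chat": 5, "like": 10},  # frequent contact
    "Bob":   {"like": 1},              # rarely interacted with
}
posts = [
    {"author": "Bob", "text": "..."},
    {"author": "Alice", "text": "..."},
]
ranked = rank_feed(posts, history)
# Alice's post outranks Bob's—and every like she earns as a result
# only raises her affinity score further: the rich get richer.
```

The loop is self-reinforcing: interacting raises a friend’s score, which surfaces more of their posts, which invites more interaction.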
But what if you don’t balance your page-visiting appropriately, and other people begin to disappear from your news feed as you try to bring new people in? Or what if, in the midst of your efforts to level the news-feed playing field, you end up spending an egregious amount of time on Facebook, scrolling endlessly through content on friends’ profile pages? What if that’s what Facebook is trying to make you do the whole time? (I mostly tease, but I suppose it’s not impossible.)
All joking aside, there are some things that could plausibly change how Facebook’s algorithm treats us—and the way we, as individuals, interact on the network. I’m going to experiment with them in the next few weeks, to see what happens:
- To cultivate a more cohesive community on Facebook, it seems it would help to keep your friend pool as small as possible—and to check your friend list occasionally, looking for faces you haven’t seen in a while.
- Don’t mindlessly like things. Consider what you click, and whether you want Facebook to continue giving you constant updates from that user, news site, or advertiser.
- Write before you like: as Rose Eveleth wrote for The Atlantic, “liking” a link or status often becomes a stand-in for actual engagement with that person or material. “The Like button isn’t just about you being able to shout ‘I like this’ into the Facebook void,” she writes. “It’s also about your posts getting Liked, and the external validation that comes from the little red numbers that pool in your toolbar.” Removing that option from your mind forces you to find more “intimate” ways to connect, affirm, and validate your friends.
- When talking to people you care about, use Facebook’s private messages more often than the actual Facebook wall. I’m curious how this influences Facebook’s algorithms; I’ve noticed people I chat with jump to immediate prominence in my news feed. But additionally, it seems we could cultivate more closeness if we wrote more private messages. Saying, “You’re an awesome person, and I really like you!” publicly, with hundreds of friends “listening in,” takes all the intimacy out of it.
Many Facebook users—myself included—have used Facebook rather naïvely in the past. We blindly adopted its services without considering the consequences or implications for our private, cognitive, and communal lives. As Carr writes, “We have a right and an obligation to understand how we, and our information, are being manipulated.” But knowledge must facilitate action, else we’re docile puppets, allowing ourselves to be manipulated. Perhaps we ought to begin taking social media, and its dangers, more seriously.