- The American Conservative - https://www.theamericanconservative.com -

Why Facebook Wants You to Be Happy

People see Facebook as a neutral platform they can use to have conversations. But in reality it’s a company, and thus it has motivations, strategies, and interests in how its users use the network. Facebook has commercial reasons behind its desire to curate specific experiences, happy or otherwise—and of late, it’s been working rather hard to curate a positive, uplifting (or “Upworthy”) experience for its users, as Casey Johnston reports in an Ars Technica post [1]:

As the protests in Ferguson, Missouri [2] over police fatally shooting [3] 18-year-old Mike Brown have raged through the past several nights, more than a few people [4] have noticed how relatively quiet [5] Facebook news feeds have been on the matter. While #Ferguson is a trending hashtag, Zeynep Tufekci pointed out at Medium [6] that news about the violence was, at best, slow to percolate [7] through her own feed, despite people posting liberally about it.

While I’ve been seeing the same political trending tags, my feed is mundane as usual: a couple is expecting a baby. A recreational softball team won a league championship. A few broader feel-good posts about actor Chris Pratt’s ice-bucket challenge [8] to raise awareness and money for ALS, another friend’s ice-bucket challenge, another friend’s ice-bucket challenge… in fact, way more about ice bucket challenges [9] than Ferguson or any other news-making event. In my news feed organized by top stories over the last day, I get one post about Ferguson. If I set it to organize by “most recent,” there are five posts in the last five hours.

This harkens back to Facebook’s recently released news feed manipulation study [10], which tracked users’ responses to positive versus negative content. Facebook altered its algorithm to give different users different experiences, then documented the effect those disparate feeds had on users’ emotional wellbeing. All of this, the company purports, was in order to curate a better user experience. Johnston adds:

One of the things Facebook thrives on is users posting and discussing viral content from other third-party sites, especially from sources like [11] BuzzFeed, Elite Daily, Upworthy, and their ilk. There is a reason that the content users see tends to be agreeable to a general audience: sites like those above are constantly honing their ability [12] to surface stuff with universal appeal [13]. Content that causes dissension and tension can provide short-term rewards to Facebook in the form of heated debates, but content that creates accord and harmony is what keeps people coming back.
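The dynamic Johnston describes can be sketched as a toy scoring rule. This is a hypothetical illustration, not Facebook’s actual ranking model (which is not public): “harmony” signals such as likes and shares carry durable weight, while heated debate yields only a small, capped short-term boost.

```python
# Toy feed-scoring sketch of the dynamic Johnston describes.
# All weights and caps are invented for illustration only.

def score_post(likes, shares, heated_comments):
    # Accord-and-harmony signals carry heavy, uncapped weight;
    # dissension gives a small short-term boost that is capped,
    # so a flame war can never outrank broadly likable content.
    return 2.0 * likes + 3.0 * shares + min(heated_comments, 10) * 0.5

feed = {
    "ice bucket challenge video": score_post(likes=400, shares=120, heated_comments=3),
    "Ferguson protest report":    score_post(likes=40,  shares=15,  heated_comments=250),
}
# The feel-good post outranks the news story despite far more debate.
print(max(feed, key=feed.get))  # → "ice bucket challenge video"
```

Under a rule like this, no volume of argument in the comments can compensate for a deficit of likes and shares, which is one plausible mechanism for the mundane, agreeable feeds the article describes.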

Facebook is not the only company that curates a given experience for commercial gain. Most online tools use algorithms for similar curation—Google, for instance, doesn’t show you all potential search results for the keyword you give it: it shows you the specific results it thinks you will like, ones that accord with its idea of your personality or character [14]. And while this can be a useful service, it also has its drawbacks. All of these tools have a way of perpetuating specific traits or bents (virtuous or otherwise) in users: if you used to search Google for pornography, it’s likely to continue offering you pornography, even if you are trying to avoid it. If you looked at an old ex’s photo album on Facebook, perhaps in a moment of jealousy or curiosity, their postings will continue showing up in your news feed for days or weeks to come. Beyond the personal, we should pay attention to the effect these sites have on our perception of the outside world: as Eli Pariser noted in a March 2011 TED Talk [15], two users could Google the word “Egypt” and get completely different results—one would get news of protests and the Arab Spring and Morsi, while the other would see only tourism information and pictures. Pariser noted that Facebook does the same:

I’m progressive, politically—big surprise—but I’ve always gone out of my way to meet conservatives. I like hearing what they’re thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, it had edited them out. They disappeared.
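The behavior Pariser describes can be illustrated with a minimal sketch, assuming a simple click-rate cutoff. The names, numbers, and threshold below are all invented; Facebook’s real edge-ranking logic is far more complex and undisclosed.

```python
# Hypothetical illustration of the filtering Pariser describes:
# friends whose links you rarely click silently drop out of the feed.
# The friends, counts, and 25% cutoff are invented for this example.

def filter_feed(posts_shown, links_clicked, min_click_rate=0.25):
    """posts_shown: {friend: posts surfaced}; links_clicked: {friend: clicks}."""
    visible = {}
    for friend, shown in posts_shown.items():
        rate = links_clicked.get(friend, 0) / shown if shown else 0.0
        if rate >= min_click_rate:  # below the cutoff, the friend is edited out
            visible[friend] = shown
    return visible

posts_shown = {"liberal_friend": 10, "conservative_friend": 10}
links_clicked = {"liberal_friend": 6, "conservative_friend": 1}
# The rarely clicked friend disappears without the user being consulted.
print(filter_feed(posts_shown, links_clicked))  # → {'liberal_friend': 10}
```

The point of the sketch is that the user never chose the cutoff, or even knows it exists: the filtering is a silent side effect of past clicking behavior.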

We see this trend continuing today with Ferguson coverage and reporting. People are getting more updates on the #IceBucketChallenge and its positive, peppy, social-media-centric coverage than they are about the dire situation in Ferguson.

How can we prevent this from happening? It’s a frustrating reality for those who, like Pariser, want a holistic understanding of what’s going on in the world. That is hard enough to do without algorithms picking out our quirks and vices, building bubbles around our perceived ideologies and communal circles. And sadly, opting out of this sort of algorithm-run curation is becoming more and more difficult. As Jeff Hancock, co-author of Facebook’s recent social media experiment, told The Atlantic [16]:

“If you think about Google search and you say, ‘I don’t want to be experimented on,’ then the question is, well, what does that mean? Google search is presumably being tested all the time … They’re constantly needing to tweak their algorithm. If I say, ‘I want to opt out of that,’ does that put me back to Google search 2004? Once you say ‘I’m opting out,’ you’re stuck with that version of that algorithm.”

The situation is extremely tricky, and it’s not anti-technology to consider carefully the repercussions of these effects. Social media is a new and potent tool, one that has quickly and systematically infected almost every facet of our social lives, news consumption, and commercial pursuits. We must weigh the consequences of our consumption—and consider how better to hold accountable the companies that operate such algorithms.

Comments Disabled To "Why Facebook Wants You to Be Happy"

#1 Comment By I Don’t Matter On August 21, 2014 @ 7:16 am

One does not have to have a Facebook page. One does not have to use Google for search.
You get what you paid for. Wait…

#2 Comment By FatHappy SouthernBoy On August 21, 2014 @ 8:53 am

“Once you say ‘I’m opting out,’ you’re stuck with that version of that algorithm.”

Wrong. There is a big difference between testing and deliberate manipulation.

#3 Comment By Aaron Paolozzi On August 21, 2014 @ 9:14 am

I believe that this is a mistaken way to see things. These are businesses (as you point out) that do what they will to create revenue. To create revenue they may want to keep you happy, edit out your supposed dislikes, and do anything else they deem fit. Why should we expect them to change their practices if two things are true: 1) it works and creates the desired effect and revenue, and 2) we as consumers vote with our mouths (simply talking about not liking something) rather than with our web navigation (going someplace else)?

I think this is an important issue, but the way you view it is either unrealistic or naive. Even if we get the ‘opt-out’ method there will always be ways around it or we will be so hobbled it makes the site worthless. No, what will end up being the most effective is as a user realizing what is going on and if you intend to stay informed then seek out news and social interaction from multiple places. If you have a one stop shop for everything then you are already doing it wrong, and are participating in that cultural bubble you speak of. Once again we see the responsibility is truly with us consumers to be judicious of what and how we consume. Sadly I don’t see things panning out like this, but one can dream. I and mine will at least understand what they are looking at.

#4 Comment By tz On August 21, 2014 @ 9:50 am

With Google (and Bing) you only need to log out and clear cookies, or use a “private browsing” session.

But you must log in to Facebook to see anything, and afaik they don’t have an internal, neutral search engine.

Enforcing the echo chamber.

#5 Comment By collin On August 21, 2014 @ 10:45 am

Is this really so bad? Facebook is creating more links to feel-good stories like the ice bucket challenge versus the Ferguson protests. Frankly, I don’t see why this is bad, since it is not very hard to find coverage of the Ferguson protests or, better yet, find articles that agree with your POV.

#6 Comment By philadelphialawyer On August 21, 2014 @ 4:02 pm

I agree this is a real issue, and that technology, per se, is not the culprit. As other posters mentioned, it is the for-profit nature of Google (to me, Facebook is such a stupid waste of time that I won’t even bother with it) that leads to the problem. But there actually should be a search engine that you can use that will not keep track of your searches and will not steer you in any direction. Perhaps a fee-based engine could be developed. Or, government and academia could put together such an engine, and funding could come from a special tax on Google, Facebook, et al. It would not at all be hard to develop such an engine, and, indeed, the earlier versions of Google, etc. did not have these issues. As has been said, this is being done on purpose, and there is no need to go back to 2004 or whenever to get around it.

#7 Comment By EliteCommInc. On August 21, 2014 @ 6:20 pm

I am still at a loss as to what the issue is. If the members of TAC decided to compile a set of identifiers from comments and assign them some value as to personality type, what is my complaint if I participate? I have no doubt that commenters here make assessments of personality based on comments. Suppose TAC hired or enlisted psychiatric personnel to conduct a similar survey and then posted the results —

I am unclear what the complaint would be. Part of issue selection is not merely an author’s preference, but what might best elicit responses from readers.

I am not a member of Facebook. I generally reject social media arenas that are not magazine/article based. It seems to me a hard case to make. “They violated my expectations” seems mighty thin.

Given the different venues offered and the rather strange and unseemly practices available, an incognito study sounds tame. Unless they posted names alongside their analysis, there doesn’t sound like much beef on that stick.

#8 Comment By Chestertonian On August 21, 2014 @ 6:51 pm

DuckDuckGo is the preferred choice for those who want an “uncurated” content-neutral search engine.

#9 Comment By philadelphialawyer On August 22, 2014 @ 5:46 pm

Thanks for the tip, Chestertonian!

EC, Inc., I think the point is not so much that anyone’s “rights” are being violated as that the practice of tailoring search results to what the search engine (again, I don’t care about FB either) thinks the searcher wants to see is limiting. As I guess you know, despite all my posting on this site, I’m a liberal. I like to get opinions from all sides, and I like to see arguments that challenge the conventional wisdom of liberal notions, ideology, solutions, claims, etc. So I don’t want conservative views filtered out. On the other hand, I don’t want the time I spend here to somehow filter out liberal views either, just because Google must think I am a conservative!

I want a “neutral,” “uncurated” search engine, such as Chestertonian claims that DuckDuckGo is. (I just started using DDG, so I can’t vouch for it, or for the accuracy of the claim.) I may have no actual “right” to such an engine, but that doesn’t mean my wanting one is illegitimate. If Google thinks it needs to do what it does in terms of filtering and slanting to run its business, so be it. But I would like there to be other alternatives, and, apparently, there are.