People tend to see Facebook as a neutral platform they can use to have conversations. In reality, it's a company, and thus it has motivations, strategies, and interests bound up in how its users use the network. Facebook has commercial reasons for wanting to curate specific experiences, happy or otherwise. Lately it has been working rather hard to curate a positive, uplifting (or "Upworthy") experience for its users, as Casey Johnston reports in an Ars Technica post:

As the protests in Ferguson, Missouri over police fatally shooting 19-year-old Mike Brown have raged through the past several nights, more than a few people have noticed how relatively quiet Facebook news feeds have been on the matter. While #Ferguson is a trending hashtag, Zeynep Tufekci pointed out at Medium that news about the violence was, at best, slow to percolate through her own feed, despite people posting liberally about it.

While I’ve been seeing the same political trending tags, my feed is mundane as usual: a couple is expecting a baby. A recreational softball team won a league championship. A few broader feel-good posts about actor Chris Pratt’s ice-bucket challenge to raise awareness and money for ALS, another friend’s ice-bucket challenge, another friend’s ice-bucket challenge… in fact, way more about ice bucket challenges than Ferguson or any other news-making event. In my news feed organized by top stories over the last day, I get one post about Ferguson. If I set it to organize by “most recent,” there are five posts in the last five hours.

This harkens back to Facebook's recently published news-feed manipulation study, which tracked users' responses to positive versus negative content. Facebook altered its algorithm to give different users different experiences, then documented the effect these disparate feeds had on users' emotional wellbeing. All of this, the company claims, was done to curate a better user experience. Johnston adds,

One of the things Facebook thrives on is users posting and discussing viral content from other third-party sites, especially from sources like BuzzFeed, Elite Daily, Upworthy, and their ilk. There is a reason that the content users see tends to be agreeable to a general audience: sites like those above are constantly honing their ability to surface stuff with universal appeal. Content that causes dissension and tension can provide short-term rewards to Facebook in the form of heated debates, but content that creates accord and harmony is what keeps people coming back.

Facebook is not the only company that curates an experience for commercial gain. Most online tools use algorithms for similar curation. Google, for instance, doesn't show you every potential result for the keyword you give it: it shows you the specific results it thinks you will like, ones that accord with its model of your personality or character. And while this can be a useful service, it also has its drawbacks. All of these tools tend to perpetuate specific traits or bents (virtuous or otherwise) in their users: if you used to search Google for pornography, it's likely to keep offering you pornography, even if you are trying to avoid it. If you looked at an old ex's photo album on Facebook, perhaps in a moment of jealousy or curiosity, their posts will keep showing up in your news feed for days or weeks to come.

Beyond the personal, we should pay attention to the effect these sites have on our perception of the outside world: as Eli Pariser noted in a March 2011 TED Talk, two users could Google the word "Egypt" and get completely different results. One would get news of the protests and the Arab Spring, while the other would see only tourism information and pictures. Pariser noted that Facebook does the same:
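The self-reinforcing quality of this kind of personalization can be sketched crudely: rank each item by how often the user has engaged with similar items before, so that past clicks compound into future prominence. The following is a minimal, hypothetical illustration only; the field names and scoring are invented, and real feed-ranking systems are vastly more complex and proprietary:

```python
from collections import Counter

def rank_feed(posts, click_history):
    """Order posts by affinity: topics the user clicked before rank higher.

    posts: list of dicts like {"id": ..., "topic": ...}
    click_history: list of topic labels the user previously clicked.
    """
    affinity = Counter(click_history)  # past clicks become future weight
    # Posts on never-clicked topics score 0 and sink to the bottom, so the
    # "filter bubble" reinforces itself a little more with every click.
    return sorted(posts, key=lambda p: affinity[p["topic"]], reverse=True)

# A user who has mostly clicked "liberal" links, echoing Pariser's example:
history = ["liberal", "liberal", "sports"]
feed = [
    {"id": 1, "topic": "conservative"},
    {"id": 2, "topic": "liberal"},
    {"id": 3, "topic": "sports"},
]
ranked = rank_feed(feed, history)
# The "conservative" post falls to the end of the feed; click it rarely
# enough and, in a system like this, it effectively disappears.
```

The point of the sketch is only that nothing malicious is required: a bland popularity-by-affinity sort, applied repeatedly, produces exactly the quiet vanishing Pariser describes below.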

I’m progressive, politically—big surprise—but I’ve always gone out of my way to meet conservatives. I like hearing what they’re thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, it had edited them out. They disappeared.

We see the same dynamic today with Ferguson: people are getting more updates on the #IceBucketChallenge, with its positive, peppy, social-media-centric coverage, than they are about the dire situation in Ferguson.

How can we prevent this from happening? It's a frustrating reality for those who, like Pariser, want a holistic understanding of what's going on in the world. That is hard enough without algorithms picking out our quirks and vices, building bubbles around our perceived ideologies and communal circles. And sadly, opting out of this sort of algorithm-run curation is becoming more and more difficult. As Jeff Hancock, co-author of Facebook's recent social-media experiment, told The Atlantic,

“If you think about Google search and you say, ‘I don’t want to be experimented on,’ then the question is, well, what does that mean? Google search is presumably being tested all the time … They’re constantly needing to tweak their algorithm. If I say, ‘I want to opt out of that,’ does that put me back to Google search 2004? Once you say ‘I’m opting out,’ you’re stuck with that version of that algorithm.”

The situation is extremely tricky, and it is not anti-technology to consider carefully the repercussions of these systems. Social media is a new and potent tool, one that has quickly and systematically infected almost every facet of our social lives, news consumption, and commercial pursuits. We must weigh the consequences of our consumption, and consider how better to hold accountable the companies that operate these algorithms.