A plethora of recent studies have delved into the effect social media has on our brains. Part of a widespread neurological fascination in social science, these reports raise various concerns about the effects new technological platforms have on our social consciousness.
The University of Michigan reported last month that Facebook, while cultivating an addictive fixation in its users, also fosters depression. Why? “Based on the responses from the participants,” TIME reports, “the scientists speculated that the Facebook users were comparing themselves with their peers, and many were feeling inferior as a result. Users also reported frustration and a ‘lack of attention’ from having fewer comments, likes and feedback compared with their Facebook friends.”
Social scientists are also musing over the growing “selfie” trend – and trying to determine its ethical implications. Is our pouty-picture-taking symptomatic of typical youthful immaturity, or is it indicative of some deeper cultural phenomenon?
These are just a few minor grace notes in a growing public dialogue on “narcissism”: whether we are more narcissistic than in ages past, how social media affects our self-absorption, and whether we need to change. Dr. Jean Twenge, a professor of psychology at San Diego State University, believes that younger generations are “increasingly entitled, self-obsessed, and unprepared for the realities of adult life,” according to a recent New York Times article. Slate author Katy Waldman felt inclined to agree with her:
Look at the rise of plastic surgery, our painstaking attention to Facebook and Twitter profiles, our selfies, our relentless focus on self-improvement, and “soccer teams that give every kid a trophy” … Could it be that she is onto something? How else to explain that Twenge’s thesis just feels right?
Some believe this “epidemic” of narcissism is a product of our digital era. But according to another study, this one conducted by the University of California, Los Angeles, the language in books shows the trend developing over the past 200 years. “The currently discussed rise in individualism is not something recent but has been going on for centuries as we moved from a predominantly rural, low-tech society to a predominantly urban, high-tech society,” said psychology professor Patricia Greenfield, who conducted the study. Words like “unique,” “individual,” “self,” “feel,” “choose,” and “get” increased significantly in frequency over time, while words like “authority,” “belong,” and “pray” became rarer.
The “selfie” is a commonly cited example of the narcissistic tendency in social media. One psychologist told TIME that the self-captured image could be beneficial, since it allows “young adults and teens to express their mood states and share important experiences.” But not everyone sees the “selfie” as an accurate depiction of the self. Brett McCracken, in a Mere Orthodoxy blog post, argued that the self-projections we present on social media are deceptive both to others and to ourselves:
Social media’s “what are you doing now?” invitation to pose, pontificate and consume conspicuously only amplifies the narcissistic presentism of the generation depicted in The Bling Ring. It makes it easier than ever to tell the world exactly what you want them to know about you. Through a carefully cropped and color-corrected selfie, depicting whatever glamorized “now” we think paints us in the best light, we can construct a public persona as we see fit.
Are Instagram selfies and Twitter posts truly making us more narcissistic? Or are we merely witnessing the greater publication of old human tendencies?
I would argue the latter. While the digital era could be making us more narcissistic, it seems more likely that social media is doing exactly what it was created to do: showcase the self in all its glory. As long as humans have roamed the planet, they have had selfish tendencies. But prior to the rise of social media, they did not have a global platform for this grandstanding. High school, with all its intense and sticky drama, was documented profusely in letters and diaries, but it rarely appeared in public discourse. Now, via Facebook, young adults have a communal outlet for their emotionally turbulent lives. Facebook is, in many ways, the new diary.
In addition, arguing that young adults are more narcissistic than ever seems too simplistic. Many young people are immature and tend to be self-interested. It is only with experience and age that we learn the planet does not revolve around us. Once again, this generation has a wider global platform for self-pontification than ever before. This may skew the narcissism balance in their favor. Not to mention that the young are less adept at disguising their selfishness. With age, we learn how to hide our self-absorption behind facades of one sort or another. Time makes some humble; it makes others clever.
It is true that individualism and life disassociated from family and community are more widespread today than in past civilizations. It could be that this individualism has fostered our egotism. The person who lives with family or roommates must learn patience, self-control, and sacrifice. The individual, however, need not reconcile his own wants and desires with anyone else’s. Personal wants reign supreme. This obviously cultivates a different set of values and perceptions in the individual. But one must caution against the tendency to stereotype and predict behavior based on such perceived “trends.”
The human mind is not one massive mound of ever-growing egotism. It is a complex, varied, wondrous thing — scarred with the pains of human experience, spotted with sin and excess, ever shifting with the cadences of life. While we should be wary of social media’s influence on our mental and spiritual growth, we needn’t discount the online experience because of a few paltry “selfies.” One of the beauties of social media is its ability to connect and share – to teach us more about each other.
Ali Eteraz, author of Children of Dust, wrote on Medium last Tuesday that words are losing their potency and power. He believes the mighty Instagram, with its frozen pixelated memories, is our future. Though the war is perhaps waged unconsciously, he said, “Instagram and its cousins represent an undeclared war on writing. On words.”
Eteraz believes that in the beginning, the Internet encouraged words. Text statuses reigned supreme in the first days of Facebook. But then, a change slowly began to develop:
First, by progressively smaller bursts of text (websites became blogs, became status updates, became 140-character tweets), and then through the enthronement of the image. Whether it is moving pictures (YouTube, Vimeo, LiveLeak), or photo-sharing sites like Instagram, Pinterest, and Snapchat, it goes without saying that we are well on our way to communicating with each other by way of pictures.
There is one important difference, however, between the sorts of “words” projected on the Internet and those utilized by storytellers throughout time. In the print era, newspapers and books told stories of the other: of the powerful in Washington or on Wall Street, of a neighbor’s child who won a spelling bee, of the golfer Bobby Jones or General Robert E. Lee. Besides the private introspection of the diary or the public thoughtfulness of the memoir, stories of self were at least somewhat limited.
But in the world of the Internet and social media, another narrative has begun to reign supreme: self-narration. Facebook and Twitter statuses create constant self-publication. This public diary has wooed us away from the storybooks; after all, between General Lee and myself, whom will my ego find more fascinating?
With the advent of Instagram, self-describing words became self-describing pictures. Instead of striving to help the user know and understand “who I am and how I feel” via statuses, the Instagram image asks the viewer merely to see who I am, and to know “me” on that basis. Eteraz does not view this as an alarming trend – except perhaps selfishly, he surmises, as a writer who wants to preserve his income. For most of society, he supposes, such a trend is normal:
After all, we are descendants of cavemen that told their stories upon stone walls by way of images. And we are descended of societies where the primary language was the hieroglyph, which is nothing more than words represented in imagistic forms. From this perspective we shouldn’t show much concern if our societies transition away from words and move to communicating by way of the image.
But here again, Eteraz does not seem to notice that the very nature of what is communicated has changed. Hieroglyphics and cave paintings did not contain self-musing diary entries. Many contained histories and chronicles of kingdoms and clans, as well as ceremonial and religious messages. The “hieroglyphics” of Instagram rarely contain any of these things. As the image grows omnipresent online, the stories we tell are changing. News has increasingly shifted from text-focused outlets like the New York Times to the image-based commentary of Buzzfeed. Rather than reading “long-winded” titles, we watch long-winded Netflix shows. Eteraz sees this trend toward the image as inevitable:
…Most realists among the wordsmiths already know that short of some massive cataclysm that lays to waste the electronic grid that makes the delivery of images so easy, we are pretty much done for. Whenever I walk past the cinemas and the cafes with flat screen TV’s and look at our children tapping away at pictures on iPads and think about how no one cares about reading the lengths of Proust, or Yukio Mishima, or Qurratulain Hyder, the thoroughness of the wordsmith’s dispossession comes to mind…
But, he surmises, writers’ laments on the subject are merely selfish, as we cannot “live with the thought that [we] are also irrelevant. If there are no words there are no smiths.” This negates all the inherent and transcendent merits of words themselves. He does recognize – and devotes a single sentence to – the thought that some writers believe “words have something to contribute to this world, something important.” But he does not expand on this thought: What do words contribute to this world? Why do they matter?
There is neither time nor space to fully explore the importance of words in this post. But perhaps this short list will identify some reasons for words’ preeminence throughout time as the highest form of communication:
Noah Millman joined a lively HuffPost Live panel this week to discuss Luke Epplin’s critique of the ubiquitous “magic feather” theme in children’s movies, which emphasizes “the importance of never giving up on your dreams, no matter how irrational, improbable, or disruptive to the larger community”:
As with the titular character in Walt Disney’s 1943 animated feature Dumbo, these movies revolve around anthropomorphized outcasts who must overcome the restrictions of their societies or even species to realize their impossible dreams. Almost uniformly, the protagonists’ primary liability, such as Dumbo’s giant ears, eventually turns into their greatest strength. But first the characters must relinquish the crutch of the magic feather–or, more generally, surmount their biggest fears–and believe that their greatness comes from within. …
In addition to disparaging routine labor, these films discount the hard work that enables individuals to reach the top of their professions. Turbo and Dusty don’t need to hone their craft for years in minor-league circuits like their racing peers presumably did. It’s enough for them simply to show up with no experience at the world’s most competitive races, dig deep within themselves, and out-believe their opponents. They are, in many ways, the perfect role models for a generation weaned on instant gratification.
Should women “opt out” of the workforce, or – in author Sheryl Sandberg’s words – “lean in”? Several journalists have discussed the issue this week, most championing women’s right to career acclaim. An August 8 New York Times story profiled women who left prestigious jobs to start families, but are now trying to re-enter the workforce. The story prompted a plethora of contributors to ask whether women should always put career first, or whether there are benefits to “opting out.”
The Times article said it was difficult – if not impossible – for women returning to the workforce to procure jobs at the status level they previously enjoyed. Forbes writer Deborah Jacobs finds this reasonable:
“I would like to feel empathetic, but find none of this surprising. The corporate world values work experience, and no matter how you spin the story about your PTA service and volunteer work, staying home with the kids is not work experience.”
Jacobs wrote an article on this subject for the Times in 1994, entitled “Back From the Mommy Track.” She described mothers “working their way back from the sidelines,” seeking a career after years at home. The opportunity costs of their years of mothering, she said, were extensive. Now, she believes, women “must be prepared to live with our decisions.”
But does “leaning in” guarantee the alternative gratification these women imply? Perhaps not: the Times also published a story Saturday analyzing whether prestigious jobs truly enrich women’s lives. The findings generally demonstrated that women derive less pleasure from career power than men do:
“Men tend to perceive more intrinsic rewards either from feeling influential or from having authority. For women, by contrast, both conditions seem to be necessary to get this reward. This matters. For women, just having authority may not be enough (as it seems to be for many men). And so even when women do occupy the ‘corner suite,’ so to speak, they aren’t guaranteed the personal and professional rewards men garner.”
The authors blame most of this dissatisfaction on “broader societal norms” and “stigma” that harm women’s ability to “lean in.” They call this a “psychosocial rewards gap,” and suggest it may dissuade women from pursuing high-powered careers.
Most of these stories paint “opt out” women as discontented with their place “on the sidelines,” pursuing the thankless work of motherhood. Yet many of these mothers – who unfortunately occupy a less vocal position in the media – enjoy their lives and find them worthwhile. Meanwhile, those who pursue a career do not always find ultimate fulfillment either: even if they attain that “corner suite,” they may be disappointed with its rewards. In that moment, they may wish they had “opted out” of the office and “leaned in” a little to their families.
It is important to note that this decision confronts fathers as well. Even if men face less stigma than women in the workplace, they too must choose between home-centric and career-centric pursuits. This is not to suggest that women never face a “psychosocial rewards gap.” Rather, it is possible that this very gap exists for men in the inverse: they may feel pressure to pursue career achievement rather than family involvement.
Regardless of the vocation pursued, there are opportunity costs involved. Whether pursuing a life of “opting out,” or striving for the top-dog position at a Fortune 500 company, one must decide what costs are worth paying.
Should there be rules about how we conduct ourselves in movie theaters? The debate ratcheted up this week, after tech blogger Hunter Walk suggested theaters should allow audiences to use other media. He wants “the ability to multitask” and suggested theaters offer more light, wifi, electricity outlets, and even a second-screen experience. Anil Dash agreed, adding that “shushers” are a “textbook case of cultural conservatism,” with arguments akin to those claiming “women should not wear pants” or “defending slavery…” He sounds rather like a modern Punchinello, crying to theater snobs, “Possibly it may disturb your neighbors; but you do not ask them to hear it … Isn’t this a free country?”
It’s true that the idea of sitting quietly and watching a performance is rather antiquated. It used to be a sign of respect – not merely to fellow observers, but also to the artists themselves. Classical musicians were prone to “shush” audiences who clapped at the wrong moment; they believed their hard work and skill should command a level of audience respect. One can only imagine the outrage when ringing, buzzing phones appeared on the scene. But Walk and Dash belong to a generation that rarely experiences this classical culture. They have likely never attended a ballet, opera, or orchestral performance. Most Americans live in a world where every entertaining event is accompanied by Instagram, Facebook, and Twitter. If Americans can tweet at rock concerts and rodeos, why not at the movie theater? Moreover, movies are not live events, which understandably impedes any sense of “respect” one might feel for absent actors and directors.
However, Atlantic Wire’s Richard Lawson believes silence is more than artistic “respect” – it is an act of community: “Dash says the shushers are trying to block out the world, when I think it’s the opposite. Being considerate of those around you — recognizing that they might want to watch a movie in the quiet dark — is an act of communion.” Quiet consideration thus defined requires accepting and respecting the place you inhabit. Ballet reviewer Reese Thompson compared theater and ballet to a waltz between artists and audience. A lack of audience energy or participation directly affects the quality of the performance. No one, she writes, “whether they’re watching a comic perform in a bar or sitting in the orchestra section of the Metropolitan Opera,” should ever answer their cell phone during a performance.
In the age of distraction, it’s hard to resist multitasking and constant outward connection. But Fast Company writer Laura Vanderkam argues that “monotasking” is key to conscientious flourishing, and she encourages readers to embrace focus once more. To “monotask” during an artistic event (be it movie or concert) is an act of simple respect. But it is also something more: it is a moment of community between audience members, in which they allow their imaginations to become entranced with another world. As a child, I remember losing myself so completely in a movie that I emerged feeling slightly foreign in my own skin. But that sort of enchantment is harder nowadays; you never know when your neighbor might break the spell with a text or phone call.
The Economist’s Erasmus blog marked Wednesday as the 100th day since rebels abducted two Christian bishops from Aleppo. No one knows their fate. “As the blood-letting in Syria grows ever more polarised between extremes,” writes the author, “the position of anyone who stands in the middle is becoming increasingly unbearable.”
This quote shows the messy and tragic side of Syria’s war that the media so often fails to capture. In the fight for freedom, sectarian violence and religious persecution have plagued various areas of the country. Massacres and cruelty have marked both sides. Unfortunately, the media and political figures have been quick to adopt “a side,” rather than encapsulate the growing tragedy of the situation.
It is this twisted and chaotic reality that freelance reporter Francesca Borri addressed in a Columbia Journalism Review article last week. After more than a year of freelancing in Syria, she writes despairingly of distorted media coverage and frenzied violence:
“This is a dirty war, a war of the last century; it’s trench warfare between rebels and loyalists who are so close that they scream at each other while they shoot each other. The first time on the frontline, you can’t believe it, with these bayonets you have seen only in history books. Today’s wars are drone wars, but here they fight meter by meter, street by street, and it’s f—-g scary.”
President Bashar al-Assad is certainly to blame for much of the country’s turmoil. When the country’s revolution began in 2011, he repeatedly used lethal violence against protesters. He has allowed indiscriminate attacks on combatants and civilians, including the massacre of 45 women and children in Homs last year. Those who support the call to arm rebels quickly point out these accounts of regime atrocity.
However, the Syrian resistance is leaderless and fragmented. Some, according to Senator Rand Paul, are jihadist groups. There is no organized, cohesive movement in Syria to support: “Any attempt to aid the Syrian rebels would be complicated and dangerous, precisely because we don’t know who these people are,” Paul wrote for Politico. “…No question, the Assad regime has committed monstrous atrocities. But this does not mean that the Syrian rebels are in any sense the ‘good guys.’”
Who are the good guys in Syria? Unfortunately, many are men like Father Paolo Dall’Oglio, a critic of the Assad regime reportedly kidnapped by rebels close to al-Qaeda. They are the men and women who stand “in the middle,” the very place that is “becoming increasingly unbearable” in this polarized country.
Their stories of quiet and fearful dissent paint a more vivid and complex picture of Syrian warfare. But as Borri writes, “The editors back in Italy only ask us for the blood, the bang-bang. I write about the Islamists and their network of social services, the roots of their power—a piece that is definitely more complex to build than a frontline piece. I strive to explain, not just to move, to touch, and I am answered with: ‘What’s this? Six thousand words and nobody died?’”
You’ve probably seen a clip of it already: Fox News aired a cringe-worthy interview with the author of the latest Jesus tell-all book on Friday, much to the delight of many on the internet. In the now-viral interview, Fox News anchor and religion correspondent Lauren Green shows zero interest in the arguments or content of scholar Reza Aslan’s new book Zealot: The Life and Times of Jesus of Nazareth.
Instead, she leads off the interview with “You’re a Muslim, so why did you write a book about the founder of Christianity?” Aslan’s eyebrows threaten to rise right off of his face, but he comports himself honorably in a painful ten-minute conversation that never moves past this misguided line of questioning: “It still begs the question though, why would you be interested in the founder of Christianity?”
But even if Green’s line of questioning weren’t laced with xenophobia, ignorant about the purpose of scholarship, or breathtakingly incurious, it would still be problematic. There is a deeper philosophical problem behind focusing on the fact that Aslan is a Muslim.
Let’s suppose for the sake of argument the following: Reza Aslan brings personal biases and prejudices from his Muslim faith to his study of the historical Jesus; the liberal media is breathlessly excited by Aslan’s book, even though it merely rehashes debates that have been going on in historical Jesus studies for decades, because that media tends to be hostile to traditional Christian faith.
In fact, there may very well be reason to believe those things. But to think that they have anything to do with the merits of Aslan’s arguments about Jesus is to engage in a logical fallacy that C.S. Lewis called Bulverism. He explains:
You must show that a man is wrong before you start explaining why he is wrong. The modern method is to assume without discussion that he is wrong and then distract his attention from this (the only real issue) by busily explaining how he became so silly… Assume that your opponent is wrong, and explain his error, and the world will be at your feet. Attempt to prove that he is wrong or (worse still) try to find out whether he is wrong or right, and the national dynamism of our age will thrust you to the wall.
Bulverism is a great way to score points while getting no closer to the truth, and it comprises perhaps 95 percent of writing about religion on the internet.
If you’re actually interested in Zealot, you shouldn’t care about Aslan or Fox, but about the man from Galilee: what was he like? what did he teach? was he the Christ? If you’re looking for answers to those questions, Aslan’s Muslim faith, Fox’s hostility, and any number of dreary facts about America’s cultural grievances are strictly irrelevant.
Textual criticism and historical methodology can be boring and hard. Questioning motives and feigning outrage are always fun and easy, and serve as a particularly shallow way for people to engage in intellectual triage. That’s why interesting subjects only suffer when they get dragged into the culture wars.
Dear Barnes & Noble,
When you announced the resignation of your C.E.O. and the failure of the Nook, some may have called it the beginning of your end. Idea Logical’s Mike Shatzkin said you could only hope to “make the slide into oblivion more gradual.” But take note: not everyone is so pessimistic about your future. The New Yorker’s James Surowiecki argued that print books are still “an exceptionally good piece of technology—easy to read, portable, durable, and inexpensive,” and he referenced Codex Group findings that 97 percent of those who read e-books are still “wedded to print.”
So perhaps you aren’t a dying relic after all, and merely need some revamping. Over the past several days, commentators have burst forth with a cacophony of competing ideas for your revival. The following list contains some potentially promising options for you to consider:
#1. Just be a bookstore.
Rather than trying to reinvent yourself, you should “focus on something truly radical: being a bookstore.”
- James Surowiecki, The New Yorker
#2. Cultivate your ‘secret sauce’: the serendipitous experience of discovery.
Add opportunity for discovery by using more tables than bookshelves. “Physical discovery is the secret sauce of retailing and publishing.”
- Simon Lipskar, Writers House, Wall Street Journal
Create “a special mix of smart curation and easy browsing” with new-arrival bookshelves and interesting sections, to build the serendipitous experience.
- Virginia Postrel, Bloomberg
#3. Hire happy, friendly nerds.
Hire people “with the express intention of making them booksellers, not team members or employees who aren’t excited about what they’re selling.”
- Jason Diamond, Flavorwire
#4. Create more ‘destination activities.’
Create an environment that people want to visit: “That means more destination activities, such as book groups, author readings.”
- Gerald Storch, Target, Inc., Wall Street Journal
#5. Transform bookstores into subscription showrooms.
Separate “the discovery and atmospheric value” of your store from the traditional “book-warehousing” model. Make stores smaller, with inventory limited to “examination copies — one copy per title.” Charge a daily, monthly, or annual membership that allows customers “to hang out, browse the shelves, buy snacks and use the Wi-Fi.”
- Virginia Postrel, Bloomberg
#6. Become a mini-mall.
Turn each store into a “mini-mall,” with each category leased to an “expert in that particular category.” You act as a systems integrator, “putting together a multi-category offering that attracts sufficient customers into the mini-mall.”
- Roger Martin, Harvard Business Review
#7. Downsize your image.
Create a “community franchise,” like Washington, D.C.’s Ace Hardware stores: “The owner names each of them after its neighborhood, but the small chain benefits from the buying power and branding of a national distribution network.”
- Lydia DePillis, Washington Post
The chattering class is not happy with the Boy Scouts.
This Monday, thousands of Boy Scouts gathered in West Virginia for the National Jamboree, 10 days of camping and outdoor activities like rappelling, canoeing, and biking.
That’s about as good old-fashioned Scouting as it gets, in contrast with a year so far filled with public debate and strife. This June, the Scouts voted to allow openly gay youth members while continuing to exclude openly gay leaders—thereby inviting scorn from all sides.
Yet even the Jamboree has become an occasion for Scout-shaming, as word has gone out that the Scouts are persecuting their heftier members. In accordance with a policy announced two years ago, Scouts with a body mass index (BMI) of 40 or higher have been excluded from the Jamboree, and Scouts with a BMI of 32 to 39 had to submit additional health information before being cleared to participate.
David Plotz, online editor of Slate, pegs the BSA as all-purpose discriminators (“Since they allow gay scouts, they had to find someone else to exclude”), while Lesley Kinzel, in the longest and most outraged critique of the policy, huffs:
It seems that the organization is trying to model itself on the boys’ most feared middle-school bullies, gamely prowling the halls between classes and ensuring that no boys exhibit the slightest inkling of weak, unathletic, or “girly” behavior.
Just like the Boy Scouts assume gay folks cannot possibly serve as good leaders and role models for kids, they also assume that all fat people—or rather, people with a BMI over a certain level—can’t walk a couple miles up a hill.
First, a quibble: anyone who’s participated in Scouts knows that it consists largely of fat kids walking up hills.
Furthermore, a brief consideration of the policy shows that neither its effects nor its intent should be construed as fat-shaming. As J. Bryan Lowder, himself an Eagle Scout, points out, while BMI is a flawed measure of fitness, any teenager (as opposed to an NFL lineman) with a BMI over 40 will almost certainly be unable to participate in or enjoy the Jamboree. And for the Scouts in the gray area,
I agree…that the higher scrutiny on BMI is out of whack, but having been to scout camp many times, I also doubt that the screening will be as harsh in practice as it sounds on paper. Scout leaders and site staff want, above all, for as many scouts as possible to have fun and be active, and so if there is a way for an obese scout to participate in a given activity, they are going to try to find it….
Nor is the emphasis on physical fitness in place to “shame” anyone:
Scout camps are usually remote and difficult to access, meaning that if a health crisis or injury does occur, it can be exceedingly difficult to get the victim to a hospital. Having seen a fellow scout airlifted by helicopter out of a gorge after falling during a climb, I can attest that this is a real concern that has nothing to do with shaming anyone.
The BSA’s national commissioner, not exactly a willowy figure, even publicly challenged himself to get in shape for the Jamboree! Even if clumsily framed and articulated, the policy clearly comes from a spirit of concern and motivation, and not out of meanness—no matter what people would like to believe about the Scouts.
Conventional perceptions of the abortion debate took a hit this week, as Politico and the New York Times exposed skewed abortion coverage, including the media’s glorification of Texas state senator Wendy Davis and her abortion-bill filibuster.
Politico contributor Gary Bauer said coverage of Davis’s filibuster “ranged from frivolous to fawning”: one Huffington Post contributor described her as “brilliant, dogged, intrepid,” “eloquent, persuasive,” and “epic.” NPR’s Wade Goodwyn called her a “tiny ray of hope” for Texas. Davis herself said the GOP was “selfish” and “out-of-touch,” arguing that they refuse “to listen to real families with real hopes.”
But Bauer retorts that “The vast majority of Americans don’t come close to sharing Wendy Davis’s support for late-term abortion.” Rather, polls show most Americans oppose abortion after 20 weeks. A University of Texas/Texas Tribune poll found 62 percent of Texas voters supported the abortion bill. According to these numbers, Davis may be more “out-of-touch” than the GOP. But she is not the only one: most Americans, the data suggests, are out-of-touch with themselves. Bauer continues,
In May, Gallup published a survey titled Americans Misjudge U.S. Abortion Views. It found that more Americans (48 percent) describe themselves as ‘pro-life’ than as ‘pro-choice’ (45 percent). But Gallup also found that most Americans (51 percent) believe the public is mostly ‘pro-choice.’ … The sense that America supports abortion is so pervasive that even a plurality of self-described ‘pro-lifers’ believe most Americans are pro-choice.
Why this prevalent misperception? Bauer believes partisan abortion coverage has created a false picture of the issue: “The fact is that the country is deeply divided over abortion, though it overwhelmingly opposes the types of abortions Davis was filibustering to save. Unfortunately, most Americans don’t realize how much of an outlier Davis is.”
Shortly after the Texas bill passed, the New York Times ran its story on the complexity of abortion rights public opinion. “Both parties have a claim on public opinion. Maybe more to the point, both can make a strong case that the other party has an extreme view,” David Leonhardt wrote. MSNBC host and Politico blogger Joe Scarborough quickly responded: “Leonhardt’s suggestion that Republicans are no more extreme on the politics of abortion than Democrats is one rarely expressed by national media outlets, and one that more editors and news directors should note.”
When USA Today contributor Kirsten Powers lambasted the media for their “deafening silence” on the Kermit Gosnell trial this spring, abortion biases became clear. As Dave Weigel acknowledged, “Social conservatives are largely right about the Gosnell story.” The media’s silence on the Gosnell case, predominantly pro-choice coverage, and laudatory filibuster reporting all show widespread prejudice on this topic. The Gallup poll Bauer references is an example of the media’s impact on public perception, as our understanding of the world is heavily conditioned by the news.
Walter Lippmann wrote in Public Opinion of the way humans amplify fears and hatreds into a “fabrication of a system of all evil, and of another which is the system of all good.” “Real space, real time, real numbers, real connections, real weights are lost,” he said. “The perspective and the background and the dimensions of action are clipped and frozen in the stereotype.”
When it comes to “women’s rights” or the “rights of the unborn,” journalists rarely report the real complexities beneath the stereotypes. One hopes these Times and Politico stories signal more balanced coverage of a deeply complex and troubling issue. As Scarborough said, news outlets should stop “mindlessly following a left-leaning narrative … more interested in promoting political agendas than reporting on political realities.”