
WEIRDoes Murder To Dissect

Here is your weekend read: a terrific Ethan Watters story on the WEIRD phenomenon.  This won’t be entirely new to longtime readers of this blog, but me, I can’t get enough of thinking about this stuff. Start here, with this passage from the Watters piece:

A modern liberal arts education gives lots of lip service to the idea of cultural diversity. It’s generally agreed that all of us see the world in ways that are sometimes socially and culturally constructed, that pluralism is good, and that ethnocentrism is bad. But beyond that the ideas get muddy. That we should welcome and celebrate people of all backgrounds seems obvious, but the implied corollary—that people from different ethno-cultural origins have particular attributes that add spice to the body politic—becomes more problematic. To avoid stereotyping, it is rarely stated bluntly just exactly what those culturally derived qualities might be. Challenge liberal arts graduates on their appreciation of cultural diversity and you’ll often find them retreating to the anodyne notion that under the skin everyone is really alike.

If you take a broad look at the social science curriculum of the last few decades, it becomes a little more clear why modern graduates are so unmoored. The last generation or two of undergraduates have largely been taught by a cohort of social scientists busily doing penance for the racism and Eurocentrism of their predecessors, albeit in different ways. Many anthropologists took to the navel gazing of postmodernism and swore off attempts at rationality and science, which were disparaged as weapons of cultural imperialism.

Economists and psychologists, for their part, did an end run around the issue with the convenient assumption that their job was to study the human mind stripped of culture. The human brain is genetically comparable around the globe, it was agreed, so human hardwiring for much behavior, perception, and cognition should be similarly universal. No need, in that case, to look beyond the convenient population of undergraduates for test subjects. A 2008 survey of the top six psychology journals dramatically shows how common that assumption was: more than 96 percent of the subjects tested in psychological studies from 2003 to 2007 were Westerners—with nearly 70 percent from the United States alone. Put another way: 96 percent of human subjects in these studies came from countries that represent only 12 percent of the world’s population.

To cut to the chase: the team of researchers behind the WEIRD studies discovered that people raised in the West — and especially Americans — are actually outliers in human psychological experience. Scientists studying the human brain and mind had been wrongly assuming that the perceptual experience of Westerners was universal. This is not true. At all. We in the West are what these scholars call WEIRD: Western, Educated, Industrialized, Rich and Democratic. The experience of growing up in this particular culture conditions our perceptions at a far deeper level than we previously thought.

Let me make this clear: it’s not just that culture changes what we think about what we perceive, though it certainly does; it’s that it changes perception, period. People raised in our culture take in and sort information differently from people raised in other cultures, and do so in a largely unconscious way. Scientists can measure this. From the story:

The growing body of cross-cultural research that the three researchers were compiling suggested that the mind’s capacity to mold itself to cultural and environmental settings was far greater than had been assumed. The most interesting thing about cultures may not be in the observable things they do—the rituals, eating preferences, codes of behavior, and the like—but in the way they mold our most fundamental conscious and unconscious thinking and perception.

For instance, the different ways people perceive the Müller-Lyer illusion likely reflects lifetimes spent in different physical environments. American children, for the most part, grow up in box-shaped rooms of varying dimensions. Surrounded by carpentered corners, visual perception adapts to this strange new environment (strange and new in terms of human history, that is) by learning to perceive converging lines in three dimensions.

When unconsciously translated in three dimensions, the line with the outward-feathered ends (C) appears farther away and the brain therefore judges it to be longer. The more time one spends in natural environments, where there are no carpentered corners, the less one sees the illusion.

The obvious point here is that when we think we are seeing the world as it is, we are actually perceiving what we have been conditioned to see. There is no way to escape our radical subjectivity in this regard. It’s not that we are seeing illusions, though we might be; it’s that our basic cognitive structures have been influenced by our cultural environment. Seeing is not a neutral act; our cultures teach us how to see. Excerpt:

The interdependent self—which is more the norm in East Asian countries, including Japan and China—connects itself with others in a social group and favors social harmony over self-expression. The independent self—which is most prominent in America—focuses on individual attributes and preferences and thinks of the self as existing apart from the group.

That we in the West develop brains that are wired to see ourselves as separate from others may also be connected to differences in how we reason, Heine argues. Unlike the vast majority of the world, Westerners (and Americans in particular) tend to reason analytically as opposed to holistically. That is, the American mind strives to figure out the world by taking it apart and examining its pieces. Show a Japanese and an American the same cartoon of an aquarium, and the American will remember details mostly about the moving fish while the Japanese observer will likely later be able to describe the seaweed, the bubbles, and other objects in the background. Shown another way, in a different test analytic Americans will do better on something called the “rod and frame” task, where one has to judge whether a line is vertical even though the frame around it is skewed. Americans see the line as apart from the frame, just as they see themselves as apart from the group.

Heine and others suggest that such differences may be the echoes of cultural activities and trends going back thousands of years. Whether you think of yourself as interdependent or independent may depend on whether your distant ancestors farmed rice (which required a great deal of shared labor and group cooperation) or herded animals (which rewarded individualism and aggression). Heine points to Nisbett at Michigan, who has argued (pdf) that the analytic/holistic dichotomy in reasoning styles can be clearly seen, respectively, in Greek and Chinese philosophical writing dating back 2,500 years. These psychological trends and tendencies may echo down generations, hundreds of years after the activity or situation that brought them into existence has disappeared or fundamentally changed.

And here is the rub: the culturally shaped analytic/individualistic mind-sets may partly explain why Western researchers have so dramatically failed to take into account the interplay between culture and cognition. In the end, the goal of boiling down human psychology to hardwiring is not surprising given the type of mind that has been designing the studies. Taking an object (in this case the human mind) out of its context is, after all, what distinguishes the analytic reasoning style prevalent in the West. Similarly, we may have underestimated the impact of culture because the very ideas of being subject to the will of larger historical currents and of unconsciously mimicking the cognition of those around us challenges our Western conception of the self as independent and self-determined. The historical missteps of Western researchers, in other words, have been the predictable consequences of the WEIRD mind doing the thinking.

Longtime readers may recall a post from my old Beliefnet days, recollected here, about a passage the linguist Daniel Everett wrote in his fascinating book about living and studying in the deepest Amazon, Don’t Sleep, There Are Snakes. Everett recalls a morning in which the Piraha tribe, among whom he and his family were living, became excited because, they said, a jungle spirit was standing on a beach in the river, yelling at them. Everett and his young daughter came running, but could see nothing. The Pirahas couldn’t understand how the Everetts couldn’t see what was plainly there. Everett wrote:

What had I just witnessed? Over the more than two decades since that summer morning, I have tried to come to grips with the significance of how two cultures, my European-based culture and the Pirahas culture, could see reality so differently. I could never have proved to the Pirahas that the beach was empty. Nor could they have convinced me that there was anything, much less a spirit, on it.

As a scientist, objectivity is one of my most deeply held values. If we could just try harder, I once thought, surely we could each see the world as others see it and learn to respect one another’s views more readily. But as I learned from the Pirahas, our expectations, our culture, and our experiences can render even perceptions of the environment nearly incommensurable cross-culturally.

Back to the Watters piece. One of the original WEIRD scholars, Ara Norenzayan, is a Lebanese immigrant, and was startled to come to American academia and find that researchers were oblivious to the power of religion in shaping human culture and perception. More:

Norenzayan became interested in how certain religious beliefs, handed down through generations, may have shaped human psychology to make possible the creation of large-scale societies. He has suggested that there may be a connection between the growth of religions that believe in “morally concerned deities”—that is, a god or gods who care if people are good or bad—and the evolution of large cities and nations. To be cooperative in large groups of relative strangers, in other words, might have required the shared belief that an all-powerful being was forever watching over your shoulder.

If religion was necessary in the development of large-scale societies, can large-scale societies survive without religion? Norenzayan points to parts of Scandinavia with atheist majorities that seem to be doing just fine. They may have climbed the ladder of religion and effectively kicked it away. Or perhaps, after a thousand years of religious belief, the idea of an unseen entity always watching your behavior remains in our culturally shaped thinking even after the belief in God dissipates or disappears.

Why, I asked Norenzayan, if religion might have been so central to human psychology, have researchers not delved into the topic? “Experimental psychologists are the weirdest of the weird,” said Norenzayan. “They are almost the least religious academics, next to biologists. And because academics mostly talk amongst themselves, they could look around and say, ‘No one who is important to me is religious, so this must not be very important.’” Indeed, almost every major theorist on human behavior in the last 100 years predicted that it was just a matter of time before religion was a vestige of the past. But the world persists in being a very religious place.

I am reminded of a paper one of this blog’s readers sent in a couple of weeks back, by the evolutionary psychiatrist Bruce G. Charlton, in which he discusses how animism is the natural state of the human person. More:

With the advent of agriculture, at least some significant part of the natural world becomes separated from the social world, an object instead of a subject. The food producing environment is no longer a parent, it does not share its abundance with the farmer – rather, the farmer toils to hold back the continual encroachments of the natural world, and forces it to yield sustenance by strength of limb and sweat of brow. The balance of nature is actively prevented from returning to equilibrium. Although animism continues to feature in people’s beliefs and practices (after all animism remains the spontaneous mode of thought among all people, in all societies), humans cease to anthropomorphise the whole significant world, with the consequence that the world is no longer experienced as a whole. Only certain bits of the significant world are regarded as sentient agencies. In particular the economy is necessarily objectified and manipulated, because human survival depends on it.

The seamless integration of significant experience as a network of social relationships is lost in the transition from hunting and gathering to agriculture. Life becomes divided, and humans alienated. From this point, not everything means something.

Every individual in every society starts life as a spontaneous animist inhabiting a meaningful world composed of sentient agencies. However, in ‘agricultural’ societies the child is socialised into an instrumental attitude towards those parts of the natural world upon which the economy depends. The child learns to treat as objects things which were previously treated as agents. Animism is regarded as merely a naïve or uneducated belief system.

On the whole, learned objectification clearly ‘works’, in the sense that societies which treat significant aspects of the natural world as objects include all the most powerful societies, the ones that have the greatest productive capacities, and the greatest ability to understand and manipulate. This should not need emphasising.

Alienation is not an accident. Objectification is necessary for economic efficiency hence societal survival. The need to function in the economic realm means that this division – at least – is inculcated into each new generation. There are sanctions. If socialisation fails and the animistic attitude of generalised anthropomorphism is carried through into adult life, the probable outcome for that individual is economic ineffectiveness, consequently low status. But individual experience of the ‘meaning of life’ has – in effect – been sacrificed to group power.

When people do not feel at home in the world this is because their cultures have ‘taught’ them that significant aspects of the world are objects with which there can be no legitimate social or emotional relationship. And this implies that when people in modern culture seek ‘the meaning of life’ they are often deeply mistaken about what kind of a thing they seek.

The questions the Watters article (read the whole thing here) and this related material raise in my mind are these: Are we in the West progressing in our knowledge of the world, or regressing? Has our ability to measure the world as an object rendered it less possible to measure the world as a subject? Have we destroyed our ability to perceive the spiritual dimension of existence as a really existing phenomenon? Have we trained ourselves not to see God, even as we flatter ourselves that we see things as they really are?

Wordsworth was right: We murder to dissect. 

[H/T and thanks to the readers who suggested this post and sent in material.]
