
In Defense of Cultural Appropriation

The good news is that it's here to stay, no matter how many angry Twitter mobs come to kill it.

It takes quite a lot to draw attention away from Washington these days. Only the most sensational events—hurricanes, wildfires, a genocide in Myanmar—seem to do the trick. One such event occurred at a high school in Utah this past spring. No, it wasn’t another mass shooting, but the public outcry could hardly have been louder if it had been. The source of the controversy? A prom dress. A senior named Keziah Daum, wanting to wear “something that would be more unique and bold and had some sort of meaning to it,” decided to don a red cheongsam, a high-collared, form-fitting Chinese dress. Big mistake. Daum, you see, is not Chinese, and that was the problem. “My culture is NOT your goddamn prom dress,” a man named Jeremy Lam tweeted, after seeing photos of Daum on Twitter. Lam’s comment got nearly 40,000 retweets and more than 170,000 likes. A deluge of similar tweets followed (“you just don’t wear it if ur not. Chinese…it’s not something to play dress up with.” [sic]), along with articles in The Washington Post, The Atlantic, and the Independent, some attacking Daum, some defending her.

If you’re wondering what all the hullabaloo was about, it can be summed up in two words: cultural appropriation. That is to say: “the unacknowledged or inappropriate adoption of the customs, practices, ideas, etc. of one people or society by members of another and typically more dominant people or society.” It’s a concept that emerged in the 1980s, born in the brains of sociologists studying indigenous groups harmed by colonization. Amerindian culture, in particular, has long been a goldmine for advertisers, supplying logos for everything from Indian Motorcycles to Red Man chewing tobacco to the Cleveland Indians baseball team, and many Native Americans have, quite understandably, taken offense. No one likes to see their heritage (not least of all their religious practices) caricatured for the sake of selling kitsch.

In the years since, though, the idea has flown the perch, spawning flocks of irate identitarians decrying the appropriation of everything from sombreros to sushi. Angry online mobs have, at various times, lambasted Miley Cyrus for styling her hair in cornrows, Chanel for selling a $1,325 boomerang—a commodification of Aboriginal culture—and a white Brazilian woman, undergoing chemotherapy, for wearing a traditional African head wrap to hide her hair loss. In 2015, an actual mob surrounded Yale professor Nicholas Christakis after his wife, also a professor at Yale, ever so gently suggested that the students might want to reconsider insisting that the university police their Halloween costumes. For two hours, students harangued, insulted, and, at one point, attempted to physically intimidate Christakis, while four Yale deans looked on. Though the incident was caught on video, none of the students were disciplined. Instead, the university doubled the budget for various on-campus cultural centers and mandated racial sensitivity training for faculty and administrators.

It’s worth pointing out that hysteria of this kind is hardly new, even if, of late, it has risen to a piercing pitch. For millennia, the Indian caste system has barred lower-caste members from wearing clothes, eating food, or acquiring jobs that are available to higher-caste members. Non-Muslims, to this day, are forbidden from entering the Islamic holy city of Mecca. Any number of societies, from ancient Athens to feudal Japan, have imposed sartorial and sumptuary strictures on their citizens. During the age of the Roman imperium, only senators were allowed to wear Tyrian purple. In his history of art, E.H. Gombrich tells the story of a European artist who made some drawings of cattle in a remote African village. When the artist got up to leave, the inhabitants were distressed: “If you take them [the drawings] away with you, what are we to live on?”

Gombrich’s story may be apocryphal—he fails to mention either the name of the artist or the name of the tribe—but it illustrates an ongoing misunderstanding about cultural appropriation: the difference between appropriation and theft. If I steal a painting from you, you lose access to it, along with the ability to sell it. But if I steal a style of painting from you—impressionism, for instance—you lose nothing, neither the ability to own an impressionist painting nor the freedom to paint one yourself. Indeed, often the opposite is true: the fact that Monet, Renoir, Pissarro, Degas, and Cézanne were all working in the same style at the same time—exchanging ideas, copying each other’s techniques—only made their paintings more fashionable. No single one of them could have popularized impressionism on his own.

Added to which, if a thief were to be caught stealing a painting from your home, would you determine his guilt or innocence based on his culture or country of origin? It would not reflect particularly well on you if you did. This point is illustrated rather nicely by the case of Lord Elgin and his plunder, during the first years of the 19th century, of the Parthenon marbles, the beautiful bas-relief sculptures that for millennia adorned the Parthenon in Athens. It is often cited as one of history’s most heinous examples of cultural appropriation. Fair enough: it was certainly an egregious act of appropriation, but why bring Elgin’s culture into the equation? The crime would have been just as odious had the earl been a native Greek, moving the marbles from Athens to Corinth. Not only did he tear apart a stunning work of ancient art and architecture, but he shattered several of the friezes in the process of prying them from the Parthenon. The act was wrong in and of itself, not because Elgin was British.

Unfortunately, this type of category error is all too common in discussions of cultural appropriation. You see it in the dispute over the Washington Redskins. Many have suggested that the team’s name disparages Native Americans, and thus should be changed. A reasonable argument. But it is not an argument about cultural appropriation, as The Washington Post suggested. The Arapaho, the Comanche, the Hopi, the Shoshone, the Navajo—none of these tribes refer to themselves as “Redskins.” It is a term coined and popularized by people of European descent. So too the word “nigger,” which, offensive as it may be, actually provides a rather lovely illustration of how appropriation shapes and reshapes language over time. Born in Latin as the word niger, meaning “black,” it was picked up by 16th-century Iberian traders to describe their North African counterparts. In the beginning, it did not have a negative connotation. It was attributive, not pejorative, in the same way that we today talk about black cinema or black studies. That is, until the English picked it up and began using it to disparage the human property they were acquiring in West Africa. By the early 19th century, the word had fully transmogrified into an epithet. In 1837, the abolitionist Hosea Easton wrote, “[nigger] is an opprobrious term, employed to impose contempt upon [blacks] as an inferior race…[I]t flows from the fountain of purpose to injure.” And that’s all before African Americans appropriated the word for themselves, turning it, through a form of linguistic jiu-jitsu, from an insult into an expression of esteem. Indeed, it is no longer one word but two, as the linguist John McWhorter explains:

Nigger is a standard English slur. Nigga is a word in a different dialect, used among black people themselves, usually men, to mean “buddy.” It emerged from a common tendency, especially among men, to use mockery and joshing as an expression of affection…Nigga means “You’re one of us.” Nigger doesn’t.

The question today is whether it’s wrong for non-blacks to appropriate the “buddy” form of the word. Its wide usage in black stand-up comedy, hip hop, and movies has encouraged other ethnic groups—Asians, Latinos, and even some whites—to begin using it amongst themselves, in the same affectionate way that African Americans do. On one side, people will argue that the word, however it is spelled, is simply too radioactive; its history, as spoken by non-blacks, too painful. On the other, people will say that, as it is neither intended as an insult nor directed at blacks, it can’t be wrong. It’s a fascinating topic, on which an entire book could be written.

But it’s not, at its heart, a question about cultural appropriation. After all, Standard American English has appropriated dozens of words from black vernacular without anyone raising an eyebrow. Banjo, yam, gumbo, safari, zebra, zombie, chimpanzee, crib (i.e. a person’s house), homeboy (friend), bogus (fake), hip (well-informed), twerk (a dance), and woke (to be socially and politically enlightened) are all examples of words that have been appropriated from either African languages or African American Vernacular English. Why are people offended by the appropriation of nigga but not these other words? Because of its near relation to nigger, and the horrible history that the latter evokes. Even the similar-sounding word niggardly, meaning “stingy,” has a tendency to raise hackles, though it has no relation to the epithet. Nigga’s toxicity is not the result of cultural appropriation. Quite the contrary: by appropriating the N-word from whites, African Americans have drained it of at least some of its venom, turning it from one of the most vile insults in the English language into a term of endearment.

As these examples illustrate, discussions of cultural appropriation are, invariably, intertwined with discussions of race. Anti-appropriationists often use the words “culture” and “race” interchangeably. Recall the howls of protest that greeted the casting of Emma Stone in Aloha (2015), in which Stone, who is white, played a character of part Chinese, part native Hawaiian descent. “Aloha is a catastrophe of cultural appropriation, tokenization, and erasure of people of color,” wrote one critic. “My hope with this piece,” said another, “was to highlight Hollywood’s long history of appropriation of Hawaii and the culture of its native people…” But culture and race are not the same thing, even if they often overlap. Where were the protesters when Forest Whitaker, an American, played a Ugandan (Idi Amin) or when Daniel Day-Lewis, a Brit, played an American (Abraham Lincoln)? They both won Academy Awards. The trouble with Stone in Aloha isn’t her culture; it’s her race. She doesn’t look either Chinese or native Hawaiian, whereas Whitaker and Day-Lewis, being both excellent actors with a thorough command of accents, can credibly pass as foreigners of their own respective races.

This leads to uncomfortable questions. If Stone looked more Hawaiian, would her casting have been acceptable? What if she’d been made to appear convincingly Hawaiian by using some type of “yellowface” makeup, of the kind that Mickey Rooney wore in Breakfast at Tiffany’s (1961)? Some critics have tried to answer these questions by planting their flags as far as they can in the opposite direction, arguing that all movie roles should be cast “authentically.” In a recent interview about the film Crazy Rich Asians (2018), Los Angeles Times film critic Jen Yamato told Sean Rameswaram of Vox, “If a movie cannot be made with accurate and authentic representation, nowadays, of characters that have been historically underrepresented, then maybe now is not the right time for that project to be made.”

Rameswaram: It just seems so subjective, like hard and fast rules are impossible. Can a Cambodian American who grew up in Houston play a Vietnamese soldier in a movie about the Vietnam War or is that a problem?

Yamato: Well, that’s a really good question that brings it back to Crazy Rich Asians…in the making of Crazy Rich Asians, director Jon M. Chu has talked about facing this problem. Does he have to cast every character authentically to that culture? And he ended up not doing that for all of his characters, and that has been a source of controversy, and I think a valid and legitimate conversation.

However well intentioned Yamato may be, she loses sight of what acting—nay, cinema itself—is all about: make-believe. Of course, race should be considered in movie casting, but only as one of many criteria, including height, weight, age, ability to speak a language or play a sport, and, most important of all, the capacity to plumb the emotional depths of the character. The entire point of acting is to pretend. It’s an art founded upon inauthenticity. Indeed, one of the joys of seeing a great actor perform is knowing that he is not who he pretends to be. That’s the pleasure of watching Jamie Foxx in Ray (2004), Heath Ledger in Brokeback Mountain (2005), Christian Bale in American Hustle (2013), and Julianne Moore in Still Alice (2014). None of them are what they feign for the camera. Foxx isn’t blind, Ledger wasn’t gay, Bale isn’t Jewish, and Moore doesn’t suffer from early-onset Alzheimer’s. Nor, for that matter, are they any of the other things they pretend to be in their respective roles—an R&B musician, a cowboy, a con man, or a linguistics professor. Yet they disappear so far into the parts they’re playing that they fully convince us of the reality they have created. That’s the magic of their craft.

Yamato is simply mistaking means for ends. Casting more Asians in American films is an admirable goal. Asians are notoriously underrepresented in Hollywood. According to a USC Annenberg study of top-grossing films released between 2007 and 2017, only 4.8 percent of speaking characters were Asian. Casting more Asians in American films would be a boon not just to the Asian American community, which would benefit from being seen more often, and to Asian American actors, who would benefit from getting more jobs, but to the American film industry as well, which is always in need of fresh talent. How we get there, though, is important. If we begin casting actors strictly based on inherited or inborn traits—ethnicity, sexuality, gender identity—then what (and how many) traits are we going to screen for? Should only blondes play blondes and left-handed people play left-handed people? One pities the poor actors who are going to have to report their sexual histories to the directors they work for, to prove that they, like the characters they are about to play, are truly homo-, hetero-, or bisexual. The trouble with insisting on authenticity in casting—particularly ethnic authenticity—is that it fosters a belief in biological purity that isn’t all that different from the “one-drop” rule practiced in the antebellum South. How much true native blood is needed for an Iroquois to play an Iroquois or for a Somali to play a Somali? If we define all roles by their race, we will define all actors by their race, rather than their abilities. Is that not quintessentially racist?

Many today would say no. In the past couple of decades, racism has begun to be redefined in terms of power, rather than prejudice or skin color alone. In his 1991 book Dismantling Racism: The Continuing Challenge to White America, Joseph Barndt explains, “Racism goes beyond prejudice. It is backed up by power. Racism is the power to enforce one’s prejudices. More simply stated, racism is prejudice plus power.” This has led to a gradual shift in the moral landscape. In their new book, The Rise of Victimhood Culture: Microaggressions, Safe Spaces, and the New Culture Wars, sociologists Bradley Campbell and Jason Manning posit that Western society, since the Renaissance, has been dominated by three main moral cultures. The first, honor culture, thrived when institutions for resolving interpersonal disputes—courts, school boards, HR departments—were limited or non-existent. Deprived of outside sources of arbitration, individuals jealously defended their own reputations, at times fighting to the death over minor slights. To do otherwise risked inviting predation.

As laws and institutions of authority formed, however, honor culture gradually gave way to a culture of dignity. “Dignity exists independently of what others think,” the authors explain, “so a culture of dignity is one in which public reputation is less important. Insults might provoke offense, but they no longer have the same impact as a way of establishing or destroying a reputation for bravery.” Instead of resolving conflicts with violence, people began to seek mediation through the government or other civic institutions. This has, at times, created new social costs. (There may have been a lot more duels in the 18th century than there are today, but there were a lot fewer personal injury lawsuits.) However, it produced very obvious benefits, as well: black suffrage, women’s suffrage, marriage equality for gays and lesbians. In a dignity culture, every person is said to have inherent value, regardless of race, gender, religion, intelligence, or social status.

As time has passed, though, dignity culture has been coupled with what Campbell and Manning call a victimhood culture, which blends elements of both of its predecessors. Like the adherents of an honor culture, members of a victimhood culture tend to be highly sensitive to slights, but rather than responding with violence they are (usually) inclined, like members of a dignity culture, to seek redress through boycotts and group protest. In a victimhood culture, moral authority is determined by the color of one’s skin. Historically oppressed groups—Native Americans, African Americans, members of the LGBTQ community—are considered to be virtuous, and granted increased levels of respect in proportion to their perceived degree of victimhood. A black man, thus, would have less moral authority than a black woman (because men traditionally oppress women), who in turn would have less moral authority than, say, a black transgender Muslim. An inverse metric is used to determine the guilt of so-called oppressor groups—whites, males, heterosexuals, Westerners—who are considered to be morally and intellectually sullied by their privilege.

Critics of victimhood culture, like Campbell and Manning, see this as its own form of bigotry: a photo negative of older prejudices. Yet without it, the case against cultural appropriation would collapse under the weight of its own absurdity. After all, what is human civilization but a series of cultural exchanges? Unless you live your life in utter isolation, you won’t be able to help appropriating somebody else’s culture, whether it be when you bite into a slice of pizza (an Italian invention), put on your socks (a Greek invention), or read this sentence (written in Latin script). And that’s to say nothing of the imperishable works of art that have resulted from cultural appropriation—everything from Hamlet to Huckleberry Finn, from the frescos in the Sistine Chapel to the films of Akira Kurosawa. Hence the need for opponents of cultural appropriation to smuggle power into the conversation. The exchange of cultures is wrong, they say, only when it goes from the weak to the strong, not the other way around. “A deeper understanding of cultural appropriation,” Maisha Z. Johnson explains, “refers to a particular power dynamic in which members of a dominant culture take elements from a culture of people who have been systematically oppressed by that dominant group [italics hers].”

The trouble with the word “power,” as Johnson and others use it, is that it is not only vague but variable. Does it refer to economic power, electoral power, physical power, persuasive power, or the charismatic power that comes from representation in media, sports, and the academy? Consider again the example of Asian Americans. They are, as I pointed out above, vastly underrepresented in American cinema today relative to their portion of the population. They have also been, to use Johnson’s phrase, systematically oppressed for much of the last century and a half. (Think of the Chinese Exclusion Act, the California Alien Land Law, and the internment of Japanese Americans during World War II.) And yet, according to the Pew Research Center, as of 2016 the median income of Asian Americans was more than $3,000 higher than that of whites, and more than $20,000 higher than that of blacks and Hispanics. Asians are also significantly more likely to hold a college degree than any other racial group in the United States, and less likely to be unemployed. Do they have power or not?

The truth, as the essayist Masha Gessen points out, is that “there are power imbalances in all relationships.” Power waxes and wanes. A person who may be powerful in one situation may not be in another. The same applies to groups of people. Pegging racism to power not only creates confusion where none need exist; it perpetuates the very stereotypes that adherents of victimhood culture seek to eliminate—that minorities are axiomatically weak and that we must define them by their race, gender identity, or sexual orientation rather than by the qualities that make them unique individuals.

And that is perhaps the most disturbing thing about the opposition to cultural appropriation: how much it replicates the racial essentialism of decades and centuries past. Its foundation is a lumpy mixture of collective guilt (often coded as “privilege”), racial and sexual hierarchization (thus the pecking order of who can appropriate from whom), and, of course, segregation. This last, indeed, is the cornerstone of the anti-appropriationist ethos: the desire to exclude entire categories of people from participating in the practices of an in-group. Had Keziah Daum been Chinese, no one would have batted an eye when she wore a cheongsam to prom. It wasn’t her dress that was making people upset; it was her skin. The irony—the obvious irony—is that this is precisely what civil rights leaders of generations past were fighting against. “When my brothers try to draw a circle to exclude me,” the activist Pauli Murray wrote, “I shall draw a larger circle to include them.”

The good news is that cultural appropriation is here to stay, no matter how many angry Twitter mobs come to kill it. Critics of the practice can’t even state their grievances without stealing the artifacts of at least half a dozen cultures. The expression itself is a prime example. The word “culture” comes to us by way of French, while “appropriate,” meaning “to take,” was plundered from Latin by Middle English. This, if nothing else, demonstrates how futile it is to try to stop the tsunami of culture or to build fences around it. There is nothing more human—or, one might equally argue, humane—than the desire to copy, emulate, and learn from people who are different from ourselves. Imitation is the sincerest form of flattery. Happily for Keziah Daum, some people were able to appreciate this. “I am a collector of cheongsams, with Chinese heritage and I think it is ridiculous other people are judging you!” one woman wrote on Twitter. “As Chinese, we are very proud and delighted to share our cultural fashions with anyone around the world.”

Graham Daseler is a film editor, animator, and writer. His articles have appeared in The Times Literary Supplement, The Los Angeles Review of Books, 34th Parallel Magazine, and numerous film periodicals.
