A student is raped by a classmate, goes to the campus center for help, and is grilled about whether she provoked the rape, told she has to confront her attacker personally in order to be taken seriously, and, ultimately, hounded off campus, since her post-traumatic stress makes her “unstable.”
You might recognize all the details from The New Republic‘s story about Patrick Henry College’s alleged mishandling of rape cases, but the above incident is drawn from Angie Epifano’s experience at Amherst. Patrick Henry’s Christian ethos informs the tone in which these students were brushed off (you’d be unlikely to hear concerns about purity at a public or secular private school), but the alleged underlying betrayal is attributable more to its being a university than to its being a Christian one in particular.
Treating Patrick Henry’s crisis as unique because of its singular status as a private, Christian school (one of only four private colleges in the country that decline federal funds and, thus, aren’t regulated under Title IX) masks a broader problem with administrations’ treatment of students in crisis, one that isn’t limited to sexual assault.
When students at my alma mater discussed the mental health or sexual assault resources on offer, it might have sounded like we were cribbing from the “Never Ever Talk to the Police” lecture by Professor James Duane of the Regent University School of Law, which was then touring campuses. If, in their opinion, you present a legal, physical, or reputational risk to the institution, you’re talking to someone whose job is to safeguard the community, not you.
One of my classmates recently went public in the school paper with one of the samizdat stories we’d been passing around. Like the students at Patrick Henry and Angie at Amherst, Rachel Williams was quickly cross-examined and pushed off campus when she came to the administration for help with self-harm and depression.
And so, when I say “yes” to the ‘I admit cutting myself’ part, he nods his head and closes his eyes like someone has just given him a bonbon. …
“Well the question may not be what will you do at Yale, but if you are returning to Yale. It may well be safer for you to go home. We’re not so concerned about your studies as we are your safety,” he says.
“I’m sorry,” I say. “What makes you think I will be safer away from school, away from my support system?” School was my stimulation, my passion and my reason for getting up in the morning.
“Well the truth is,” he says, “we don’t necessarily think you’ll be safer at home. But we just can’t have you here.”
Rachel’s experience was echoed repeatedly by students on campuses across the United States who were profiled in Newsweek (“Colleges Flunk Mental Health”). Whether schools attribute instability to irresponsibility or a neurological quirk, the reaction is the same: get them off campus before there’s any chance they might harm themselves and open the school up to an in loco parentis lawsuit like the one MIT faced after the on-campus suicide of Elizabeth Shin.
Despite Booker T. Washington’s prodigious career in education, his legacy has been tarnished by the charge that he failed to do more for civil rights during his lifetime: Robert J. Norrell, a historian and author of a recent biography of Washington, damned him as a “heroic failure.” In 1895, Washington delivered a speech that would become known as the Atlanta Compromise: a short address intended to allay white fears of a black uprising in the postbellum South. It was delivered to a mostly white audience at the Cotton States and International Exposition in Atlanta and concerned the current state of black men and women in the South, their place in society, and their future as citizens in the country that once held them as property. “…[Y]ou can be sure in the future, as in the past, that you and your families will be surrounded by the most patient, faithful, law-abiding, and unresentful people that the world has seen.” Though the language may have been obsequious, if one reads between the lines and flowery phrases, Washington’s address contained a warning about the protracted denial of the black man’s identity in the South—a warning that came to pass through Washington’s ideological descendants.
A child at the end of the Civil War, Washington understood from a young age the importance of self-sufficiency among the black community. When all the slaves of his plantation were set free, after the initial celebration, the sobering reality of planning a future for themselves and their children set in. Washington wrote in his autobiography Up From Slavery: “The responsibility of being free, of having charge of themselves, of having to think and plan for themselves and their children, seemed to take possession of them.” The elderly slaves in particular felt the most vulnerable, and rightly so. After spending nearly all of their lives working on a plantation, they stood to gain the least from freedom. They had neither the strength to leave the plantation nor the skill to work elsewhere. Their freedom was only nominal, and would never be fully realized.
Washington’s solution for this new crop of citizens, whose only background was plantation work, was practical education. After taking the helm of the Tuskegee Institute in Alabama in 1881, Washington focused his energy on ensuring that blacks had access to both vocational and moral training. Washington believed in the value of blacks being able to provide for themselves by the work of their own two hands, with neither resentment nor entitlement.
His views were not universally welcomed among blacks—especially not among the intelligentsia. One of the most prominent black intellectuals of his time, W.E.B. DuBois, denounced Washington’s views and demanded that blacks be fully reinstated with access to education, health care, jobs, and the right to vote. Eloquent, sophisticated, and respected in social and academic circles, DuBois found Washington’s philosophy about black education anathema to the black American cause. But geography and time separated the two men. DuBois was born and educated in the North, where slavery was abolished first and where Quakers and other religious groups preached equality. Washington’s world was the deep South, where slavery had persisted for hundreds of years, blacks were not yet regarded as fully human, and lynchings were an acceptable way to deal with “uppity” blacks.
American viewers expressed outrage Sunday after NBC reporter Christin Cooper interviewed U.S. skier Bode Miller, driving him to tears with inquiries about his deceased brother. Miller tied for a bronze medal in competition Sunday, and mentioned his brother in connection with the win. But subsequent prodding from Cooper on the subject was too much.
Twitter and other social media exploded indignantly, accusing Cooper of malicious indifference. But Miller defended the reporter Monday: “I’ve known Christin for a long time and she’s a sweetheart of a person,” he said on NBC’s Today Show. “I know she didn’t mean to push. I don’t think she really anticipated what my reaction was going to be. I think by the time she sort of realized, then I think it was too late.”
It’s kind of Miller, and it sheds a compassionate light on the incident. Nonetheless, the episode should prompt journalists (and their audiences) to consider the consequences of sensationalistic reporting.
This sort of journalism is not abnormal. The media’s normal attitude toward celebrities is often just as uncouth and intrusive. Our reality shows regularly exploit the emotions of viewers by drawing on any and every tragedy experienced by their contestants. We’ve become a culture that feeds on sensationalism and misfortune: we love the underdog, the tragic hero. This often becomes a form of emotional exploitation, targeting the audience, the protagonist, or the people in the protagonist’s life. As long as a semblance of fiction or separation remains, we feel okay with this exploitation. But when a man breaks into tears, and the camera hovers in front of his bowed form, we begin to feel guilty. Yes, we—for it is our fault too, not just Christin Cooper’s. The media gives us what we want. It’s a simple case of supply and demand: Americans (usually) eat up the media’s shameless emotional pandering.
It’s also important to note that Cooper was not the only one dwelling on Miller’s pain. The cameraman zoomed in on his face and held the camera close to his grief. He wouldn’t go away. It was an inexcusable breach of propriety. If the cameraman had turned away when Miller began crying, perhaps the incident wouldn’t have been as painful. But this isn’t the only time such shameless recording has taken place in Sochi: other contestants who fall short, when pushed to tears, are immediately dogged by cameras. The networks want to see the tears fall, to capture the moment of grief for viewers at home.
When Cooper and her cameraman zoomed in on Miller’s grief, they detracted from the true story of the moment: his victory, hard work, and resilience. That should have been the focus of the interview. They ought to have adopted a more professional manner. Indeed, beyond professionalism, mere courtesy would suggest that a person, in moments of grief and pain, should be granted greater privacy.
But American audiences should also consider the role they play in shaping the news: until they demand otherwise, this sensationalism will not stop.
I was very moved by Eve Tushnet’s piece on homosexuality and the Catholic church. If you have not yet read it, I urge you to do so stat; it is a rare piece that deserves to be ingested, not just read. Her courageous story raises important questions about how sex and sexuality are approached in the church. Rod Dreher also weighed in on this issue, arguing that both straight and gay people need community and support on the journey of chastity.
My suggestion is this: how about an instructive community for everyone, irrespective of sexual orientation or gender? Everyone who chooses the path of chastity needs both support and instruction on how to present their bodies as a “living sacrifice”. For those of us who have chosen this path, chastity is much more than resisting temptation or forgoing an experience. Remaining abstinent outside of marriage isn’t like summoning the willpower to pass on dessert; it is a state of being that extends into courtship, marriage, and family planning. Even if someone decides on a lifetime of celibacy, there are phases of that lifestyle that need to be addressed. The Church has to implement faith-based teaching for each milestone of life—otherwise, the expectation to remain chaste remains just that: an expectation.
A solid example of faith-based practical teaching is the prosperity message propagated by Evangelicals in the early to mid 1990s. With consumer debt rampant, personal finance became a topic placed front and center in the Evangelical canon. Preachers like John Avanzini devoted their entire ministries to educating Christians about credit cards and interest rates, and to debunking the myth that Jesus was poor. It was an effective campaign that reshaped how Christians viewed their finances, and it was done in a way that incorporated step-by-step teaching with Scripture as a guide. The same down-to-earth teaching should be applied to sex. No matter what form of Christianity you believe in, learning about contraception should be a part of chastity, so each married couple can make the best decision for themselves.
The Atlantic ran a piece this week about a college course on marriage currently offered at Northwestern, arguing that there is no such thing as a “soul mate” and that a good marriage depends on effective communication skills. Romantic love in popular culture is portrayed as a mysterious, elusive force that strikes its victims at random. While this is not the Christian teaching about love, the Church is not immune from falling prey to cultural misconceptions about love and marriage. Single people, as well as those who never plan to marry, should place friendship at the top of their list and explore the other kinds of love available to them. A Church emphasis on all kinds of love could be a way to remind young Christians that chastity means taking an active role in your relationships, not just waiting for “the one” to come along.
Eve’s piece drew attention to a small but devoted minority who wish to commit themselves to the noble cause of a lifetime of celibacy in accordance with the teaching of the Catholic church. But there is a sizeable portion of unmarried Christians, gay and straight, who need guidance as they navigate dating, marriage, family planning, or permanent celibacy. As the average age of marriage rises and the dating and marriage pool continues to change, the Church needs to respond to these needs with solid teaching that provides support and guidance. Otherwise, it risks appearing out of touch and didactic, teaching messages evocative of a time and place that do not jibe with the here and now.
The occasion of the 50th anniversary of the Beatles’ appearance on The Ed Sullivan Show has inspired some obligatory guffawing at those old squares who greeted the band with derision. One putdown stands out for the sheer revulsion it expresses: it came from none other than William F. Buckley, who wrote in the Boston Globe in September 1964:
The Beatles are not merely awful; I would consider it sacrilegious to say anything less than that they are god awful. They are so unbelievably horrible, so appallingly unmusical, so dogmatically insensitive to the magic of the art that they qualify as crowned heads of anti-music, even as the imposter popes went down in history as “anti-popes.”
Without wishing to appear willfully contrarian, I get where these critics were coming from, if only in a roundabout sort of way. I’m an enthusiast of early rock and all its British exponents, from both London and Liverpool; I appreciate and admire the Beatles just fine; etc. Yet at the end of the day I’m a Stones guy—and I can’t help but bristle when Beatlemaniacs diminish the Stones for their comparative lack of technical sophistication or proficiency. There is no end to my puzzlement at those who swear by the Beatles because of their proto-progressivity. Because here’s the thing: rock-and-roll really was retrogressive. Yes, even the Beatles.
Oh, I can just hear you out there. Look at George’s sweet jazz-guitar technique!
To which I can only respond, give me a freakin’ break.
George—a lovely player; in my opinion, the finest of the three Beatles guitarists—could never have hung with the likes of Barney Kessel, Herb Ellis, Les Paul, or Wes Montgomery, all of whose mastery of the guitar (in the 1950s!) far exceeded that of any rock-and-roller of the 1960s. This is to say nothing of Django or Charlie Christian.
For all the magic that the Beatles, with not a little help from the classically trained George Martin, created in the studio; for all their genius at crafting songs, there is not a chord or trope or motif of theirs that Cole Porter and George Gershwin would not have recognized. As Elijah Wald has noted, the Beatles did not so much push musical boundaries forward as they consolidated the earlier advances of other 20th century greats, from Louis Armstrong all the way to Tin Pan Alley. (I’m reminded of a bit of trivia I learned from Terry Teachout: the Beatles had mistakenly thought they were the first ones to end a tune on a 6th chord. Martin informed them that Glenn Miller already had.)
Again, don’t misunderstand: I’m a Beatles fan. I appreciate the unparalleled pop-cultural phenomenon that they were. But if I squint just a little, I find it easy to put myself in the shoes of someone who’d lived through hot jazz and hard bop, and who found the Beatles to be amateurish lightweights. In my own shoes, I would defend the Beatles without denying this fact. The amateurishness of rock music was a feature, not a bug. And it still is. If your passion for the Beatles stems from an outsize opinion of their technical competence, I regret to inform you: you’re doing it wrong.
When we debate affirmative action, it’s hard to get hard data. We can check how often identical resumes bearing white-sounding and black-sounding names get called back for interviews, but it’s hard to check whether a quota system changes those callback rates over time, or whether affirmative action creates a stigma against the people who are hired under the policy.
It’s pretty rare that public policies are ever tested properly (see Obama’s Innovation Centers, which “test” health policy without ever running controlled trials), but, due to a quirk in India’s election law, we have real-world data about the consequences of one case of quotas and affirmative action.
India set quotas for women’s representation in the post of chief councillor on village councils. In other words, in a randomly selected subset of villages, only women were allowed to stand as candidates for the post. A team of researchers conducted surveys before and after villages were governed by women and was able to isolate the effect of the quota system on villagers’ perceptions of women as leaders.
In the villages that had been forced to elect women, villagers were significantly more likely to rate women as competent leaders than in the villages that hadn’t been subject to the quota. The researchers were studying their reactions to women candidates generally, not attitudes about the specific woman leader elected, so it’s possible that individuals were still resented or felt singled out in the manner that Marjorie Romeyn-Sanabria and others describe. However, being exposed to a woman governing was enough to shift voters’ general attitude, even though she hadn’t won in a purely meritocratic election.
One reason the quota system may have produced results is that we never live in a pure meritocracy. There doesn’t need to be any malice aforethought or deliberate discrimination to derail meritocracy. After all, we’re still fumbling to find reliable measures of merit. Even rich, data-driven companies like Google have had to scrap their interview questions and GPA cutoffs when they found their measures of merit weren’t producing results. (A similar process occurs in college admissions, but given the incredible diversity of post-secondary education, and the vagueness of what a “good match” entails, I’ll stick to other domains for examples.)
In the latest issue of Pacific Standard, John Gravois tells a remarkable story about the origins of artisanal toast in San Francisco. Yes, artisanal toast. At four bucks a slice. Yet far from delivering one more jeremiad lamenting Silicon Valley’s casual wastefulness in upcharging the most mundane of foods, Gravois traces the toast craze back to its very humble beginnings in a place called Trouble.
Giulietta Carrelli, the 34-year-old proprietor of Trouble Coffee & Coconut Club, had a hard life on her way from the Cleveland of her childhood to settling down in San Francisco. Plagued by schizoaffective disorder, she bounced around the country, burning her way through friendships and periods of stability at roughly equal rates until she ultimately found a coffee shop in the Bay Area whose owner had a higher tolerance than most.
Eventually encouraged to start her own shop, Carrelli carved out a tiny lot in an obscure neighborhood near the water and began building a community that could ground her. Her cinnamon-sugar toast was a source of comfort carried over from a childhood that couldn’t afford any fancier dessert. Her shop was founded, in the most simple sense, to make friends. Even after years of running Trouble, she still struggles with her own troubles, and can have lapses of hours, days, or longer. As she told Gravois, “I’m wearing the same outfit every day. I take the same routes. I own Trouble Coffee so that people recognize my face—so they can help me.”
Gregarious, tattooed, and certainly memorable, Carrelli runs a coffee shop, with a single toaster on the counter, to continually build a network of support that can catch her and remind her of who she is and where she wants to be. The most moving and remarkable thing about Carrelli’s network of friendships, though, is this feature Gravois describes:
Most of us dedicate the bulk of our attention to a handful of relationships: with a significant other, children, parents, a few close friends. Social scientists call these “strong ties.” But Carrelli can’t rely on such a small set of intimates. Strong ties have a history of failing her, of buckling under the weight of her illness. So she has adapted by forming as many relationships—as many weak ties—as she possibly can. And webs of weak ties are what allow ideas to spread.
Carrelli’s coffee shop exists to make friends, but it trades in a different sort of friendship than Aristotle or Nisbet would commend to us. Carrelli is the most vivid and compelling example I’ve seen of the essential power of weak bonds.
For all the value we usually place on strong bonds, a superfluity of weak bonds can show much the same strength, and may prove more valuable, more resilient in coping with powerful impacts. Moreover, strong bonds can come at a cost: brittleness, as civil engineers are taught. Tremendous strength is often achieved through rigidity, and that rigidity is a fragility any wise engineer accounts for by building flexibility into a material, allowing it to bend with its environment rather than shatter the moment it is pushed past its tolerance.
A useful parallel in the physical world may be another basic staple of everyday consumption: water, whose individual H2O molecules are held together by predominantly weak hydrogen bonds. In our everyday encounters, water is yielding and loose, breaking to flow at any disturbance. But that collection of weak hydrogen bonds can make for powerful surface tension among whichever molecules happen to be on top at any given moment, resisting the force of a sudden, powerful impact with the strength of concrete.
Those of us who put a premium on having an attachment to place, to locality, can often talk as if the only thing that mattered were strengthening the few strong bonds that tie us to those around us, especially in the face of modernity’s anonymizing assault on those strong bonds. And it is right that we should place such an emphasis.
But we shouldn’t lose sight of what Carrelli discovered in her journey to starting Trouble: that the multiplication of weak bonds is also essential to tying a people together, and can be especially vital in the face of overwhelming shocks that could shatter a few vulnerable but ostensibly strong connections. It is the many weak bonds that tie localities together and afford them collective strength when faced with dangers that could wipe any one of them out.
Some children are products of their environments, while others are products of their communities. I was neither. Growing up in Coney Island in the early 1990s, I found that my immediate surroundings held few opportunities for a child whose ambition stretched beyond the boardwalk. By the age of four, I could read and write at a first-grade level, and my collection of Dr. Seuss tales bored me. Determined to give me the best education available, my mother took me to every private school that granted me an interview until one accepted me and awarded me a scholarship sufficient to attend. I stayed at that school all the way through high school, went on to college, and am now in the fledgling stages of a career in journalism.
Here’s the question: what color is my skin?
Better yet—should it matter?
Technically, I am racially mixed: my father is black and my mother is Hispanic. I’ve never seen myself as exclusively one or the other, but as a unique combination of both. My heritage, like my skin color and gender, is an aspect of me that contributes to my identity but does not tell the whole story.
Over the years, I have been trapped by the sense that I was afforded my opportunities because of my race—not in spite of the historical baggage that accompanied it. On a good day, my love of learning and work ethic came second and third, respectively, to my ethnic background. The assumption was that I was an indigent child rescued by a benevolent program that put me on the path to success after polishing my uncouth mannerisms. I was keenly aware that I was a guest in a foreign land: my peers often sought me out to provide a perspective that would bolster an already formed opinion, but rarely engaged with me on the quality of my ideas. That a minority student could matriculate at a private school having skipped the polishing step was apparently so rare that I was perceived as odd, and I never broke free from that mold.
Stephen L. Carter, a Yale Law professor, aptly sums up the feelings of inadequacy in the introduction to his book Reflections of an Affirmative Action Baby: “…labels, too, bedevil the black intellectual, and many of them, as though required by truth-in-advertising law, are in the form of cautions…not least, qualifications for one’s position: ‘Warning! Affirmative Action Baby! Do Not Assume That This Individual Is Qualified!’” I eventually tired of the implicit expectation that I rely on identity politics to advance my education, and, for a variety of reasons, switched political affiliations. On this side of the aisle, my skin color is second to my work ethic, which pleases me. Affirmative action, in granting me access to institutions that fostered my potential, ignored the rest of me because it did not fit the stereotype of a refugee from the inner city.
What happens when an artist finds himself above reproach? Woody Allen, it would seem, finds himself in this rather fortunate position, even as new allegations about his conduct toward his adopted daughter Dylan Farrow emerge. In a New York Times piece published Saturday, she described the sexual abuse Allen inflicted on her as a seven-year-old girl. She says the abuse has haunted her throughout her life: “Each time I saw my abuser’s face—on a poster, on a t-shirt, on television—I could only hide my panic until I found a place to be alone and fall apart.”
If any politician or pundit faced these accusations, one would hope they would lose their position, or at least be subjected to careful scrutiny and investigation. Yet in Allen’s case, many are willing to shrug off the immensity of the crime because of his great artistic talent. Rod Dreher wrote in a blogpost a few days ago, “I completely agree that Allen is a pig—if there were no Dylan Farrow accusations at all, his unrepented-of conduct with Soon-Yi Previn was enough to establish his swinishness—but that does not detract from his accomplishments as a filmmaker.”
Perhaps these allegations do not diminish Allen’s accomplishments. But even so, one must carefully consider the ethics of this position. In 1992, Maureen Orth gathered substantive accusations supporting claims of inappropriate behavior by Allen toward Dylan Farrow when she was little. Allen has denied all the accusations and has never been held to account. Does one continue to watch his movies, to view and enjoy his art? Andrew Sullivan thinks so, though he seems to believe Farrow is telling the truth:
Perhaps with less essential talents, the sins may more adequately define the artist. But that, in many ways, only makes the injustice worse. Those with the greatest gifts can get away with the greatest crimes.
We can and should rail against this, while surely also be realistically resigned to it. It struck me, for example, rather apposite that as the blogosphere is debating whether to boycott Woody Allen’s films in the future because of this horrifying story, exponentially more people are tuning into the Super Bowl to watch a game we now know will render many of its players mentally incapacitated in their middle ages and beyond. We know that this spectacle is based on the premise of brain damage for many of its participants, but we watch anyway.
This argument appears flawed and alarming to me for a couple of simple reasons. First, the Super Bowl comparison does not hold, because tackle football’s dangers and horrors are all known to the athletes involved, and they engage in the sport voluntarily. If Peyton Manning’s concern over his future health and wellbeing ever trumped his desire to play football and make money, he could retire in a heartbeat. He plays football of his own volition, and he bears the consequences of his own actions. How do the choices of a 37-year-old man with power, fame, and athletic prowess compare to the involuntary and frightened oppression of a seven-year-old girl?
China is losing its villages at an exponential rate, according to the New York Times. Reporter Ian Johnson shares the numbers: “In 2000, China had 3.7 million villages, according to research by Tianjin University. By 2010, that figure had dropped to 2.6 million, a loss of about 300 villages a day.”
Johnson reported in the Times last June that the Standing Committee of China’s National People’s Congress was planning to build a city of 260 million people, in an effort to spur economic growth and consumption. This mammoth enterprise would require moving 250 million people from farm to city over the next dozen years, according to Forbes contributor Gordon G. Chang. “Across China, bulldozers are leveling villages that date to long-ago dynasties,” wrote Johnson. “Towers now sprout skyward from dusty plains and verdant hillsides.”
Beyond the crushing loss of countryside and agriculture this rapid urbanization will bring about, Johnson’s new article notes the cultural loss involved: “Across China, cultural traditions like the Lei family’s music are under threat. Rapid urbanization means village life, the bedrock of Chinese culture, is rapidly disappearing, and with it, traditions and history.”
Why are villages so important? What makes them distinct and culturally significant compared to cities? Villages support subsidiarity and diversity, whereas cities usually promote mass movement and centralization. Of course, the village’s specificity has downsides: it can foster clannishness and biases toward “outsiders.” Nonetheless, without the village, we would lack the kaleidoscopic culture that makes art and life so rich.
Without the village, we’d likely forget valuable traditions. Villages tend to have a longer memory than cities, due to their permanence. Landowners and families are generally more rooted, often remaining in the same area for generations. In contrast, cities often inspire new enterprise and “the next big thing.” They foster pop culture, not folk culture. In the small community, neighbors, family, and friends are almost inescapable. Whether one is gathering at the city hall, attending church, or merely visiting the grocery store, familiar faces abound. One must learn to live in communion with others. In cities, it is easier to live alone—and easier to be lost in the clamor and crowd.
The U.S. is voluntarily leaning toward urbanization: city growth outpaced suburban growth from 2011 to 2012, a trend that continued into the spring of 2013. There were rumors of a suburban resurgence in September, and it will be interesting to track those trends. Suburban sprawl, some would argue, is even more culturally toxic than urban life. Of course, urban culture is not evil, nor is it necessarily second to village culture. Cities enable the meeting of great minds and talents. They enable entrepreneurs and artists to find a voice. In moderation, such things are good. But without our folk culture, we lose an important piece of our identity. If the U.S. became a vast amalgam of giant cities, we would lose the federalist impulse that makes us great. The Appalachian communities of Tennessee would no longer differ, in character or culture, from the vast sprawling deserts of Arizona. All would be the same.
“Chinese culture has traditionally been rural-based,” author and scholar Feng Jicai told Johnson. “Once the villages are gone, the culture is gone.” This is not only true of China. Without rural culture, we lose our grounding, history, and diversity. These things not only foster artistic wellbeing: they foster intellectual and political independence. Perhaps the Chinese government will ignore the debilitating effect this centralization has on their culture. But one hopes that other countries, the U.S. included, will consider the consequences.