Rarely do opinion pieces in college newspapers emerge as subjects of national controversy, but a recent essay by Harvard student Sandra Y.L. Korn has generated widespread denunciation among conservatives. Her essay—entitled “The Doctrine of Academic Freedom”—argues for dispensing with longstanding commitments to “academic freedom” in favor of what she calls “academic justice.” Academic freedom permits the airing and defense of any and all views, but she rightly notes that some views have come to be largely unacceptable in academia today. Since such views are not only socially unacceptable, but often discouraged or even prohibited as a matter of university policy, why should they not also be banned when they are articulated as findings of faculty research?
If our university community opposes racism, sexism, and heterosexism, why should we put up with research that counters our goals simply in the name of ‘academic freedom’? Instead, I would like to propose a more rigorous standard: one of ‘academic justice.’ When an academic community observes research promoting or justifying oppression, it should ensure that this research does not continue.
As might be expected, Ms. Korn’s essay has provoked strenuous criticism, including accusations of “academic totalitarianism,” investigations into her personal background aimed at exposing her as a limousine liberal, and criticism from at least one highly visible blusterer in the conservative media.
The default position of these conservatives is that Ms. Korn is attacking the holy of holies of the academic enterprise—academic freedom. In other words, mainstream conservatives have adopted the view of … John Stuart Mill, the lion of liberalism. The same John Stuart Mill who stated that most stupid people are likely to be conservative. And perhaps he had a point, because academic freedom is not a particularly conservative principle. Academic freedom has been the vehicle by which the universities have been transformed into the liberal bastions they are today, yet it is now the inviolable principle that conservatives are rallying around in their denunciation of a Harvard undergraduate. True to form, and as I have argued in a previous column, American conservatives tend to be subject to drift, almost inevitably ending up occupying the territory that liberals held before moving further leftward. Their rallying to an apparently contentless “academic freedom” is a particularly vivid case in point.
I agree with Ms. Korn—academic institutions are inevitably dedicated to certain substantive commitments, and (often with difficulty) attempt to “patrol” those boundaries, if not with sticks, then more often by populating their institutions with people who generally share those commitments. “Academic freedom” was the means by which the substantive commitments once held mainly by religious institutions were first destabilized and eventually rejected, and it provided the cover for their replacement with a new set of commitments. “Academic freedom” purports to be an openness to all views and opinions, but it contains an implicit philosophy of its own, one that eventually becomes manifest in, well, Sandra Korn (who is entitled to the confident assertion of the victory that her teachers have won on today’s campuses).
Mary Jo Anderson writes about the false compassion of euthanasia in Crisis Magazine:
The truth that lies underneath the “rights” rhetoric is the question of who will decide what constitutes quality of life, and at what cost. Theodore Dalrymple is an English doctor, psychiatrist, and author of Our Culture, What’s Left of It. Dalrymple wrote, “Euthanasia has a tendency to slide from the voluntary to the compulsory, as people increasingly make judgments on behalf of others as to what makes a human life worth living.”
This is a temptation that many human rights advocates can be susceptible to: in desiring to help others, we often choose to reform them into our own image. America’s compassionate conservatives often fall prey to this tendency: out of a sincere desire to help people, they seek to re-shape another person’s life or will into their own.
Anderson’s post highlights a new bill passed in Belgium on February 13, one that permits euthanasia for young children. “We can no longer pin a wig over the bald truth of the culture of death,” she writes. For pro-life advocates, the move smacks of the same imperious judgments that characterize so many pro-choice arguments (i.e., that the mother, specifically, is allowed to decide what’s best for the child and her family).
This bill is an example of what Anderson calls “a right to die”: it’s “thought to be a compassionate, advanced policy,” giving people the ability to decide just how much pain and hardship they are willing to suffer. Of course, making such a decision necessitates that we assume a sort of omniscient posture: if we are to decide whether life is still worth living, we must have a very specific and (we hope) accurate understanding of what life is about, what it’s worth, and how much pain we can handle.
This is how much of the right handles non-life matters. Of course, conservatives may gasp in horror at the thought of aborting or euthanizing babies. But conservatives may also look at homeless people on the street, and tell them, “You should go get a job.” Isn’t this also a case of gross assumption—of putting someone else in your shoes, and commanding them to walk according to your path, willpower, and context? When we decide one country “deserves” democracy, or another should be “liberated” from its oppressors—often such words contain grains of truth. But what we really mean is, “Let us (our military, more often than not) come over and fix your problems, so that you can be just like us.”
Religious thought contains an important truth on this subject: most traditions hold that humans are not omniscient. We were created by Another, a Being with infinite knowledge and discernment. We did not determine our birth. For many, this truth implies that we should not determine our end: our times are in God’s hands, as the Psalmist writes in Psalm 31.
This means we cannot look at our brothers and assume we know their best life (or death) course. Jesus told the story of a tax collector and Pharisee in Luke 18:9-14—the Pharisee thanked God that he was not “like this tax collector, this sinner.” But in reality, it was the tax collector, humble and contrite, who went home justified.
When Jesus had mercy and compassion on an individual, interestingly enough, He often gave them what they asked for. There were cases in which He told people, instead, of sins they needed to leave or truths they needed to confront. But in each and every case, He loved first.
Compassion: it’s difficult to turn such an emotion into right action. So often, we are separated in thought and soul from truly knowing others. We think we know what is right or good, but this lack of omniscience makes it difficult to rightly run people’s lives. Perhaps we should not try—instead, we should just love first.
Why is it that today’s liberals have become the most ardent cheerleaders of arbitrary monarchy? Wasn’t liberalism born of the effort to limit arbitrary rule of a single, unelected ruler?
No, I’m not suggesting that the Left has suddenly decided that they regret the American Revolution. But, in nearly every leading liberal magazine, newspaper and blog, there is a growing excitement and hope that Pope Francis will change the Roman Catholic Church’s “policies” on birth control, male celibate priesthood, homosexuality, gay marriage, divorce and (some, at least, though far fewer) abortion. They have celebrated the appointment of Pope Francis as a sign that the Church is finally going to join the modern world, and fervently hope that he will simply declare that those teachings are no longer valid and embrace today’s accepted orthodoxies. They yearn for executive fiat.
It is striking to witness this palpable longing in juxtaposition with the absence of any real concern on the Left about possible abrogations of the rule of law arising from President Obama’s decision to suspend the “employer mandate” until 2015, and with the general support for the President’s assertion that, in the face of Congressional opposition, he has recourse to the “Pen and the Phone.” And, after a season of accusatory lamentation about Pope Benedict’s authoritarian treatment of the “Nuns on the Bus,” there has been deafening silence from the Left over the Obama administration’s decision to go to court to force the Little Sisters of the Poor to violate their consciences by accepting provision of contraception, abortifacients, and sterilization.
Liberalism was born, the story goes, as a reaction against arbitrary and unlimited rule by monarchs. Yet, today’s liberals seem to adore executive power when it’s used to effect their preferred ends, even hoping that one of the only remaining “monarchs”—the Pope—will single-handedly change the “rules” of the Church. They wish to exchange “fiat” in the sense of “let it be done” for “fiat” in the sense of “do as I say.”
(Of course, “conservatives” don’t escape from this general inclination—they tend also to be ardent supporters of expansive executive power when one of their own is in office, and it is generally conservative intellectuals who have been most interested in developing theories about active executive power.)
What happened to limited government, you might ask? I answer: exactly what liberalism promised. For liberalism was never about “limited” government, but about the pursuit and exercise of potentially limitless power toward the seemingly “limited” ends of securing Rights.
Few things annoy academics more than being told that their work is irrelevant. So there’s nothing surprising about the backlash against Nicholas Kristof’s column in Sunday’s New York Times. Kristof contended that America’s professors, especially political scientists, have “marginalized themselves” by focusing on technical debates at the expense of real problems, relying too heavily on quantitative methods, and preferring theoretical jargon to clear prose. An outraged chorus of responses (round-up here) rejected Kristof’s generalization as a reflection of the very anti-intellectualism that he intended to criticize.
Some of the responses to Kristof reflect the expectation of public recognition for every contribution to the debate that makes so much academic writing a chore to read. Even so, Kristof ignores fairly successful efforts to make scholarship more accessible—even if you don’t count every blog by every holder of a Ph.D. The Tufts professor and Foreign Policy contributor Daniel Drezner has more than 25,000 Twitter followers—partly due to the success of a book that uses a zombie-apocalypse scenario to compare theories of international relations. The Washington Post recently picked up The Monkey Cage (full disclosure: several of my colleagues at George Washington University are contributors) and The Volokh Conspiracy, which are populated by political scientists and legal academics, respectively. Not to mention Kristof’s own employer. Just the day before Kristof’s piece ran, the Times hired Lynn Vavreck to contribute to a new site concentrating on social science and public policy.
At least when it comes to political science, then, it’s just not plausible that “there are fewer public intellectuals on American university campuses today than a generation ago.” On the contrary, there are probably more academics who try to communicate with non-specialist audiences than there were in 1994. One difference is that the public intellectuals of past decades were more likely to engage directly with normative and historical Big Questions. That change reflects the declining influence of political theory in comparison with causal analysis, as well as the weakening of the Cold War imperative of justifying liberal democracy.
Of course, writing for non-specialist readers isn’t encouraged in graduate programs, and doesn’t often align with the requirements for hiring and promotion. But there are anecdotal reasons to think that expectations are slowly changing, as departments struggle to prove their ‘relevance’ in a period of financial retrenchment. If Kristof’s piece promotes these changes, it will have served a valuable function whether or not its argument is compelling.
In fairness to Kristof, however, none of these observations refutes his basic claim. That’s because he isn’t actually talking about “public intellectuals.” Rather, he means old-fashioned mandarins, who move easily between Harvard Yard and Washington, usually without encountering many members of the public along the way. In a follow-up on Facebook, Kristof observes that “Mac Bundy was appointed professor of government at Harvard and then dean of the faculty with only a B.A.—impossible to imagine now.” After nearly a decade as dean, Bundy joined the Kennedy administration as national security adviser, where his vast intellectual firepower led him to promote and defend the Vietnam War.
What Kristof really offers, then, is less an argument for public engagement by scholars than a plea for another crop of Wise Men who lend conventional wisdom the authority of the academy. Not coincidentally, he presents as an exception to the trend toward academic self-marginalization the former Princeton professor and State Department official Anne-Marie Slaughter, whose resume is as perfect a reflection of the meritocratic elite that Bundy helped create as Bundy’s own pedigree was of the old Establishment. More professors should learn to participate in public debate—including political theorists frustrated by the increasingly technical orientation of political science. But if ‘relevance’ means becoming mouthpieces of our new ruling class, then Kristof can keep it.
“Use every man after his desert and who shall ‘scape whipping?”
There’s a narrative that comes up whenever addiction is discussed publicly nowadays: the narrative in which the disease of addiction essentially replaces a person’s free will.
The barroom-wisdom version of it is the old line, “First the man takes a drink. Then the drink takes a drink. Then the drink takes the man.” A fairly heartbreaking version of it comes in this interview with author (and father of an addict) David Sheff: “Once I started realizing that my son was not making choices, I went from being angry and judgmental to being able to look at him with compassion: He’s sick; he needs help. It also allowed me to figure out what I needed to do: He’s sick; he needs to be treated.”
This narrative is, to put it mildly, not uncontested. Lots of addicts don’t agree with this description of our problem. Lots of non-addicts also disagree, typically more virulently (but I guess I’m biased), and insist that addicts are doing it on purpose: That’s a narrative I seem to see more on the right than on the left, though these are personal enough matters that ideological categories get blurry. The whole debate over how fully you want to reify the disease metaphor (which is what it is—that’s not bad, metaphors are a normal and poetic part of human understanding) gets tangled up in related debates about Alcoholics Anonymous, metaphors of surrender, treatment and/as/vs. punishment, and the promotion of personal responsibility.
There are five things wrong with an overemphasis on disease at the expense of free will—but the reaction against the disease metaphor often merely serves to strengthen it. First, the five things:
If you couldn’t understand what your family was saying, would you understand them better or worse?
Nina Raine’s “Tribes” opens with four Britons hurling abuse at each other around the kitchen table. I think it’s supposed to be funny, but it’s mostly just crass and painful: Mom, Dad, brother, and sister describing one another’s passions, hopes, beliefs, and sex lives in the most contemptuous terms possible. The fifth member of the family is deaf and, yeah, you do feel that perhaps he’s the lucky one.
As the play moves forward, the younger characters get shades and nuance. (The parents, and especially the cartoonishly self-centered father, remain pretty much the same.) Daniel (Richard Gallagher), the hearing son, shows flashes of haunted vulnerability which reveal a gulf of misery under cover of vituperation. The entire family has raised Billy (James Caverly, who starts off with a beatific smile which is clearly at least partly a mask or role) to read lips rather than to sign. They’ve developed an ideological resistance to anything which smacks of Deaf culture.
They genuinely believe they’re protecting Billy, but they’re also terrified of losing their beloved son and brother to a culture which can promise him a kind of belonging they can’t offer. When that masky smile finally slips and Billy says that they view him as the family mascot, the audience can tell that it’s not true: If anything, he’s the family conscience, the only one they allow to be good, the only one they’ll openly love. Of course, he’s also the only one they never need to listen to.
When the play begins, the family is all trapped together. The hearing children, Daniel and Ruth, have retreated to the family home after a series of romantic and professional defeats in the outside world. (“I feel like a bonsai tree!” Ruth yells, in a line which got big, empathetic laughs.) Billy never left, has never had a job or a girlfriend. One of the major themes of the play is the fact that belonging is rarely chosen; you don’t get to pick the elements which make up your identity, the ties which bind. You can try to leave—and seriously, Daniel at least should do everything in his power to get out of his parents’ house, because they’re actively damaging his psyche; this isn’t a play about the comforts of home—but you will eventually have to return, if only to give an account of yourself.
There are some terrific little moments (the play’s humor eventually does become actually funny), often involving how much impromptu “sign” this resolutely anti-sign-language family uses. There are tough, basically unanswerable questions about how language shapes us and separates us from others: As Billy’s new girlfriend goes deaf, she wonders if she’s losing the ability to understand nuances and ambiguities which can’t be expressed in sign.
For most casual observers, whether Catholic or not, the main battle lines within American Catholicism today seem self-evident. The cleavage maps neatly onto the divide between the political parties, leading to the frequently used labels “liberal” and “conservative” Catholics. We have Nancy Pelosi and Andrew Cuomo representing the Left, and Rick Santorum and Sam Brownback aligned with the Right. Mainstream opinion has classified Popes John Paul II and Benedict XVI as honorary Republicans, and Pope Francis as a Democrat (hence his appearances on the covers of Time and Rolling Stone).
This division does indeed capture real battle lines, but more than anything, the divide is merely an extension of our politics, and—while manned by real actors—does not capture where the real action is to be found today in American Catholic circles.
The real action does not involve liberal “Catholics” at all. Liberal Catholicism, while well-represented in elite circles of the Democratic Party, qua Catholicism is finished. Liberal Catholicism has no future—like liberal Protestantism, it is fated to become liberalism simpliciter within a generation. The children of liberal Catholics will either want their liberalism unvarnished by incense and holy water, or they will rebel and ask if there’s something more challenging, disobeying their parents by “reverting” to Catholicism. While “liberal” Catholicism will appear to be a force because it will continue to have political representation, as a “project” and a theology, like liberal Protestantism it is doomed to oblivion.
The real battle is taking place beyond the purview of the pages of Time Magazine and the New York Times. The battle pits against each other two camps of “conservative” Catholicism (let’s dispense with that label immediately and permanently—as my argument suggests, and as others have said better, our political labels are inadequate to the task).
On the one side one finds an older American tradition of orthodox Catholicism as it has developed in the nation since the mid-twentieth century. It is closely aligned to the work of the Jesuit theologian John Courtney Murray, and its most visible proponent today is George Weigel, who has inherited the mantle from Richard John Neuhaus and Michael Novak. Its intellectual home remains the journal founded by Neuhaus, First Things. Among its number can be counted thinkers like Robert George, Hadley Arkes, Robert Royal, and—if somewhat quirkier than these others—Peter Lawler.
Its basic positions align closely with the arguments developed by John Courtney Murray and others. Essentially, there is no fundamental contradiction between liberal democracy and Catholicism. Liberal democracy is, or at its best can be, a tolerant home for Catholics, one that acknowledges the contributions of the Catholic tradition and is leavened by its moral commitments. While liberalism alone can be brittle and thin—its stated neutrality can leave it awash in relativism and indifferentism—it is deepened and rendered more sustainable by the Catholic presence. Murray went so far as to argue that America is in fact more Catholic than even its Protestant founders realized—that they availed themselves unknowingly of a longer and deeper tradition of natural law that undergirded the thinner liberal commitments of the American founding. The Founders “built better than they knew,” and so it is Catholics like Orestes Brownson and Murray, and not liberal lions like John Locke or Thomas Jefferson, who better articulated, and who today best defend, the American project.
Today marks the 41st anniversary of the national legalization of abortion in the United States. Thousands will march on the Mall and in the streets of Washington, D.C., in protest of this decision, braving frigid temperatures and a blanket of snow to express their profound moral objection to Roe v. Wade and to lament the estimated 55 million young lives that have been legally extinguished since January 22, 1973.
The March for Life has become a rallying point for the pro-life movement, an annual pilgrimage of sorts, especially for young people who gather to affirm a bedrock belief: the sanctity of human life from conception to natural death. Even amid the overwhelming sense of tragedy and loss that draws them to D.C. in the hope of effecting change, there is also a sense of affirmation and even celebration: the marchers stand in the company of many others who are just as firmly committed, who gather to defend a belief that is today dismissed and mocked by cultural elites and cognoscenti (including the governor of New York, purportedly a Catholic), and who find joy in the fellowship of so many companions who stand for life. As one friend posted on Facebook, “the Tribe is together.”
Amid the widespread sense of shared purpose, there is perhaps little time or inclination to reflect on a question: why gather, as Marchers do, in Washington D.C.? It is perhaps a question whose answer is self-evident: the March ends outside the Supreme Court, which continues to affirm Roe v. Wade as controlling precedent. It is the location of the president and the Senate, which ultimately has the power to make or confirm appointments to the Court. It is the nation’s media center, where such a protest has the best chance of being amplified to the nation. It is physically laid out to accommodate large protests, with its Mall almost seeming to have been designed for that purpose. It is the nation’s capital, where our elites congregate to make policy and steer the nation. Naturally, if people from all parts of the nation gather in protest of a national issue, it is not only the best place, but the only place.
However, the March’s annual presence in D.C. obscures a number of issues, above all, whether abortion is ultimately a political and even legal matter. On one level, inescapably so: it has been a political matter for decades, even a “wedge” issue that has become a defining difference between the two political parties. It is obviously a legal issue, generating countless pages of legal theory and philosophical argument, as well as scores of subsequent High Court and even more lower court decisions that have responded to ongoing challenges and debates over the issue. So perhaps no further thought is necessary—destination D.C.
However, by other considerations, treating it exclusively as a political and legal matter obscures the extent to which it is most fully a question of culture. And, if conservatives would generally tend to agree on one thing—aside from the immorality of abortion—it is that culture does not originate in Washington, D.C., or at least that it shouldn’t.
When considering the future of fiction, many speak somewhat disdainfully of “fan fiction” and other less original or artistic works. But Slate contributor Hugh Howey believes fan fiction is actually a classically respected literary genre:
Lovers of the new and immutable novel may fear the end times, but ironically the end times themselves were a work of fan fiction. The four Gospels were written well after the times they describe, and each has its own take on similar events. (It used to bother me that the Gospels disagreed on so much. But then I discovered Batman comics and saw how often the Caped Crusader’s origins and backstory also changed over time.) Shakespeare made a career out of fan fiction. Wealthy patrons would request a new stab at a familiar story, and the Bard would comply. Or he would draw upon historical facts and people to make fiction from the real.
Thus, those who believe fan fiction is discreditable to the arts must accept that it is a time-honored form. Howey argues that fan fiction books are not easy stories—though the ideas that shape such books may be borrowed, he writes, “Ideas are cheap. Stories are dear … We are all telling the same story with slight variations.”
There’s another intersecting perspective one might bring to bear on the increasing popularity of fan fiction. Although we all aspire to literary greatness, it is not easy to write brilliant material. Few of us will ever write something of considerable originality. For most of us, our best work will be written “on the shoulders of giants,” to borrow a phrase from Russell Kirk. Some of us must content ourselves with mediocrity, if that mediocrity will enable true genius to shine through our weak yet willing hands.
Front Porch Republic author James Matthew Wilson explains this idea of aspirational mediocrity well in a Sunday post:
To keep alive a tradition, to continue to produce mildly good poems, and reasonably memorable concertos, capable of rousing the ear and the mind to attention, thought, and pleasure — these things are good in themselves. To perpetuate a traditional practice enriches the storehouse of being while also stitching together the eternal society of the dead, living, and those still unborn in such a way that past, present, and future remain habitable places, where human voice can still hear and answer human voice. It keeps words, habits, and techniques in common, it cultivates, tempers, and preserves a climate of opinion, whatever the other storms of history.
Few authors, writers, and journalists will admit their work is mediocre (whether fan fiction or otherwise). At root, we want to write classics. But perhaps our mediocrity will help transmit a tradition, as Wilson writes: “To live within and participate in a tradition is, again, to keep something alive and to draw things and persons together, across time, in a community of knowledge and love.” Does fan fiction accomplish this? Not always; but within its diverse and sundry works, nuggets of a valuable literary tradition can flourish and grow.
Was Dostoyevsky right when he said, “Beauty will save the world”? One of my favorite pieces on the subject came from Jeffrey Bilbro at Front Porch Republic this fall. Bilbro referenced Solzhenitsyn’s opinion on the subject, as stated in a Nobel Lecture:
…Perhaps that ancient trinity of Truth, Goodness and Beauty is not simply an empty, faded formula as we thought in the days of our self-confident, materialistic youth? If the tops of these three trees converge, as the scholars maintained, but the too blatant, too direct stems of Truth and Goodness are crushed, cut down, not allowed through – then perhaps the fantastic, unpredictable, unexpected stems of Beauty will push through and soar TO THAT VERY SAME PLACE, and in so doing will fulfill the work of all three?
There is a problem—partly addressed by Bilbro, and more fully addressed by Robert Royal in a post at The Catholic Thing last Monday—that must be noted: namely, beauty often serves more as an illusory temptress than as a guide to the divine. How do we distinguish between illuminating beauty and a false, hollow sort? Royal explains with a passage from Dante’s Purgatorio, in which Dante dreams of a beautiful woman:
She ‘gan to sing so, that with difficulty
Could I have turned my thoughts away from her.
“I am,” she sang, “I am the Siren sweet
Who mariners amid the main unman,
So full am I of pleasantness to hear.
I drew Ulysses from his wandering way
Unto my song, and he who dwells with me
Seldom departs; so wholly I content him.”
Her mouth was not yet closed again, before
Appeared a Lady saintly and alert
Close at my side to put her to confusion.
“Virgilius, O Virgilius! who is this?”
Sternly she said; and he was drawing near
With eyes still fixed upon that modest one.
She seized the other and in front laid open,
Rending her garments, and her belly showed me;
This waked me with the stench that issued from it.
It’s a poignant passage, and leads us to Royal’s vital question: “When is what appears beautiful a reflection of the divine—and when is it a Siren’s song?”
This reminded me of Oscar Wilde’s The Picture of Dorian Gray. The novel’s protagonist is a handsome young man who makes “beauty” his goal and end in life—and ultimately destroys his life through this search. The novel is full of rich pictorial imagery, but Wilde’s descriptions of nature and people seem purposefully transient. The beauty depicted is of a temporal sort, never referencing a higher, deeper, or truer meaning. Beauty is more “pleasing” than “good.” One notable exception would be the young, tragic Sibyl Vane, whose goodness and beauty are inextricably linked (and in the end, constitute her downfall). Gray could not tell the difference between a pleasurable beauty and “good” beauty. How can we differentiate between the two?
Both Bilbro and Royal believe our society has a picture of beauty that’s either reductive or deceptive. Bilbro writes, “Currently, our cultural aesthetic is, in Solzhenitsyn’s terms, sickly and pale: we too often confuse the pretty, the mere appearance, for true beauty, hence our acceptance of lush green lawns that cause water pollution … We have to be able to see the whole to perceive beauty (again, note the connection between beauty and health). Analysis of the beautiful, if it does not begin with a vision of the whole and keep this vision constantly in mind, quickly devolves into an abstract rummaging through dead parts.” This idea of reduced beauty gives us the sense that something has been lost in our searching—that whatever beauty constitutes, it is something richer and deeper than most modern definitions.