After perusing some old bookshelves and boxes, I discovered an old nature diary. These excerpts were written by my 11-year-old self:
On Monday, July 23, my family and I went huckleberry picking in the forest. As we were looking for a good place to start picking, Mom cried, “Look at the baby deer!” Dad stopped the car, and we saw a doe dash behind a bush. The baby deer stayed for about five minutes, and then disappeared.
After picking at least a gallon of huckleberrys [sic], we went back to where we had had a picnic, and washed our hands in the creek. Then we looked for rocks, and I found a piece of petrified wood.
On the way home, Katie looked for more deer, and soon called out, “A buck! A buck!” Dad turned around the car, and we saw the buck run away! After continuing our drive Katie called, “A doe! A doe!” I barely saw her, and I didn’t have enough room to draw her, and so I didn’t include her.
This bed is all a-bloom. I investigated it thouroughly [sic] and not a flower does not have intricate purples, blues, whites, pinks, and vivid strawberry. This bed also, is filled with frilly purple, triumphant yellow, soft strawberry pink, luscious green, and a few hints of snowy white.
I must say spring has come swiftly and sufficiently. When I enter the house, it looks so dark and gloomy compared to the fresh, yellow-white sunlight that streams through branches outside.
It is terrible writing. But it’s also interesting. I see a little girl who loved the outdoors, and loved beauty—and she was trying, desperately, to capture these moments before they faded away. I didn’t want to forget the first time I saw a piece of petrified wood or baby deer. I wanted to capture the beauty of a garden rose. The words are an interesting juxtaposition of almost terse journalistic chronicling, and an exaggerated romantic style.
I didn’t know it then, but I was writing to an audience of selves. Years pass, and a new self visits the journal—someone with greater experience, skepticism, knowledge. This new person reads with fresh eyes and perhaps amusement—but aloof as they may seem, they still feel kinship with the writer of the past.
This is the horror and wonder of writing: we know we will return to ourselves, five or ten years down the road, and see a face we’ve forgotten. We will look closely at a soul whose face we know, but whose expression and turn of phrase is now old-fashioned (and sometimes cringe-worthy) to our eyes.
There are two ways to “read where you’re from,” and I think both are valuable exercises. The first is to revisit old diary entries, scribbles, photographs, and personal collections, such as the above. Not everyone is a writer, but almost every child collects memories from their childhood in one form or another. Revisiting such materials gives us a window into who we were, and the ways we’ve grown—or stayed the same. Revisiting old notebooks and diaries from childhood requires a sense of humor, perspective, and candor—to see oneself clearly, then and now.
Andrew Leonard has a fascinating, if discursive, profile in Salon of the novelist Richard Powers. In it, Powers registers a note of discontent with the proliferation of both pop-musical content—the “insane torrent,” as he puts it in his latest novel Orfeo—and all the digital mechanisms that deliver it today.
“Suppose you were born in 1962 and you are coming into your own and music starts to become essential to you,” says Powers. “You are right on the tail end of that sort of folk rock thing, but you are aware historically of how these guys were revolting in a way against the previous generation. And because of the nature of the distribution mechanisms that you talk about, where it’s two radio stations and one record store, there’s a saturation effect for whatever is in vogue and there has to be a countervailing cultural move just to refresh our ears. And that’s the start of punk.
“So you can see these revolutions and counter-revolutions and you can see a historical motion to popular music and it’s thrilling and you want to know what happens next. The state that you just described of permanent wonderful eclectic ubiquitous interchangeable availability — there’s no sense of historical thrust.”
Leonard sums up the point thus: “Universal ubiquity has blown up the narrative.”
There’s something to this, I think. Granted, I’m pushing 40; I’m not as hip to trends as I was when I was a kid, and certainly not as I was when I was covering them for a daily newspaper. But the point stands: it was once fairly easy for a casual music fan to keep abreast of the latest thing. Right up until the early aughts and the passel of “The [monosyllabic plural-noun]” bands playing neo-garage rock: that’s the last time I can remember there being an identifiable “movement” in rock music.
Now there is no one thing.
As a caveat, I will admit it’s somewhat rockist to think along these lines. Just because Pharrell Williams or Bruno Mars or Robin Thicke aren’t trad-rock frontmen, why can’t they be seen as the vanguard of an eclectic dance-pop revival?
And part of the fragmentation that Powers laments isn’t even technologically driven; it’s generational and racial. Much of what we would’ve called rock music 30 years ago now falls under the rubric of country music. It’s Top 40 for white people. It’s blues-based rock tricked out with splashes of fiddle and pedal steel. (Bruce Springsteen astutely noticed this recently: “country music is kind of where rock music has gone, really, at this point … It’s basically kind of pop-rock music … It’s where rock music continues to have a certain currency.”)
So a good chunk of the energy in rock-ish songwriting is devoted to the Nashville machine, which critics don’t take terribly seriously and hence don’t bother to scan for trends that might fit on the continuum of Powers’s reaction/counter-reaction “historical thrust.”
All that said, I suspect Powers is right. We’ve probably seen the last of swing-turns-to-bop or prog-is-wiped-out-by-punk cultural shifts in popular music. Powers seems to quite openly admit that these shifts had been abetted by self-styled tastemakers, who have seen a drastic decline in influence over the last 15 years. He argues unabashedly that the culture desperately requires those media filters.
I don’t know if I’d go quite that far. Then again, I say that having come of age in a culture that had filters. I say I can do without them. But I was shaped by them. What about my kids’ generation: how will they know what’s good, what’s worth listening to, what (as a Clash fan would have put it) matters?
They’ll be free of a “narrative” imposed from above. Instead they will have the “insane torrent.”
Everything will be at their fingertips.
But perhaps they will miss greatness right under their noses.
Rarely do opinion pieces in college newspapers emerge as subjects of national controversy, but a recent essay by Harvard student Sandra Y.L. Korn has generated widespread denunciation among conservatives. Her essay—entitled “The Doctrine of Academic Freedom”—argues for dispensing with longstanding commitments to “academic freedom” in favor of what she calls “academic justice.” Academic freedom permits the airing and defense of any and all views, but she rightly notes that some views have come to be largely unacceptable in academia today. Since such views are not only socially unacceptable, but often discouraged or even prohibited as a matter of university policy, why should they not also be banned when they are articulated as findings of faculty research?
If our university community opposes racism, sexism, and heterosexism, why should we put up with research that counters our goals simply in the name of ‘academic freedom’? Instead, I would like to propose a more rigorous standard: one of ‘academic justice.’ When an academic community observes research promoting or justifying oppression, it should ensure that this research does not continue.
As might be expected, Ms. Korn’s essay has provoked strenuous criticism, including accusations of “academic totalitarianism,” investigations into her personal background aimed at exposing her as a limousine liberal, and criticism from at least one highly visible blusterer in the conservative media.
The default position of these conservatives is that Ms. Korn is attacking the holy of holies of the academic enterprise—academic freedom. In other words, mainstream conservatives have adopted the view of … John Stuart Mill, the lion of liberalism. The same John Stuart Mill who stated that most stupid people are likely to be conservative. And perhaps he had a point, because academic freedom is not a particularly conservative principle. Academic freedom was the vehicle by which the universities were transformed into the liberal bastions they are today, yet it is now the inviolable principle that conservatives are rallying around in their denunciation of a Harvard undergraduate. True to form, and as I have argued in a previous column, American conservatives tend to be subject to drift, almost inevitably ending up on territory once held by liberals as liberals themselves move further leftward. Their rallying around an apparently contentless “academic freedom” is a particularly vivid case in point.
I agree with Ms. Korn—academic institutions inevitably are dedicated to some substantial commitments, and (often with difficulty) attempt to “patrol” those boundaries, if not with sticks, then more often by populating their institutions with people who generally share those commitments. “Academic freedom” was the means by which the substantial commitments once held mainly by religious institutions were initially destabilized and eventually rejected, and it provided the cover for their replacement with a new set of commitments. “Academic freedom” purports to be an openness to all views and opinions, but it contains an implicit philosophy of its own, one that eventually becomes manifest in, well, Sandra Korn (who is entitled to confidently assert the victory her teachers have won on today’s campuses).
Mary Jo Anderson writes about the false compassion of euthanasia in Crisis Magazine:
The truth that lies underneath the “rights” rhetoric is who will decide what constitutes a quality of life and at what cost. Theodore Dalrymple is an English doctor, psychiatrist and author of Our Culture—What’s Left of It. Dalrymple wrote, “Euthanasia has a tendency to slide from the voluntary to the compulsory, as people increasingly make judgments on behalf of others as to what is a human life worth living.”
This is a temptation that many human rights advocates can be susceptible to: in desiring to help others, we often choose to reform them into our own image. America’s compassionate conservatives often fall prey to this tendency: out of a sincere desire to help people, they seek to re-shape another person’s life or will into their own.
Anderson’s post highlights a new bill, passed in Belgium on February 13, that permits euthanasia for young children. “We can no longer pin a wig over the bald truth of the culture of death,” she writes. For pro-life advocates, the move smacks of the same imperative judgments that characterize so many pro-choice arguments (i.e., that the mother, specifically, is allowed to decide what’s best for the child and her family).
This bill is an example of what Anderson calls “a right to die”: it’s “thought to be a compassionate, advanced policy,” giving people the ability to decide just how much pain and hardship they are willing to suffer. Of course, making such a decision necessitates that we assume a sort of omniscient posture: if we decide whether life is still worth living, we must have a very specific and (we hope) accurate understanding of what life is about, what it’s worth, and how much pain we can handle.
This is how much of the right handles non-life matters. Of course, conservatives may gasp in horror at the thought of aborting or euthanizing babies. But conservatives may also look at homeless people on the street, and tell them, “You should go get a job.” Isn’t this also a case of gross assumption—of putting someone else in your shoes, and commanding them to walk according to your path, willpower, and context? When we decide one country “deserves” democracy, or another should be “liberated” from its oppressors—often such words contain grains of truth. But what we really mean is, “Let us (our military, more often than not) come over and fix your problems, so that you can be just like us.”
Religious thought contains an important truth on this subject: most traditions hold that humans are not omniscient. We were created by Another, a Being with infinite knowledge and discernment. We did not determine our birth. For many, this truth implies that we should not determine our end: our times are in God’s hands, as the Psalmist writes in Psalm 31.
This means we cannot look at our brothers and assume we know their best life (or death) course. Jesus told the story of a tax collector and Pharisee in Luke 18:9-14—the Pharisee thanked God that he was not “like this tax collector, this sinner.” But in reality, it was the tax collector, humble and contrite, who went home justified.
When Jesus had mercy and compassion on an individual, interestingly enough, He often gave them what they asked for. There were cases in which He told people, instead, of sins they needed to leave or truths they needed to confront. But in each and every case, He loved first.
Compassion: it’s difficult to turn such an emotion into right action. So often, we are separated in thought and soul from truly knowing others. We think we know what is right or good, but this lack of omniscience makes it difficult to rightly run people’s lives. Perhaps we should not try—instead, we should just love first.
Why is it that today’s liberals have become the most ardent cheerleaders of arbitrary monarchy? Wasn’t liberalism born of the effort to limit arbitrary rule of a single, unelected ruler?
No, I’m not suggesting that the Left has suddenly decided that they regret the American Revolution. But, in nearly every leading liberal magazine, newspaper and blog, there is a growing excitement and hope that Pope Francis will change the Roman Catholic Church’s “policies” on birth control, male celibate priesthood, homosexuality, gay marriage, divorce and (some, at least, though far fewer) abortion. They have celebrated the appointment of Pope Francis as a sign that the Church is finally going to join the modern world, and fervently hope that he will simply declare that those teachings are no longer valid and embrace today’s accepted orthodoxies. They yearn for executive fiat.
It is striking to witness this palpable longing in juxtaposition with the absence of any real concern on the Left about possible abrogations of the rule of law arising from President Obama’s decision to suspend the “employer mandate” until 2015, and with the general support of the President’s assertion that, in the face of Congressional opposition, he has recourse to the “pen and the phone.” And, after a season of accusatory lamentation about Pope Benedict’s authoritarian treatment of the “Nuns on the Bus,” there has been deafening silence from the Left over the Obama administration’s decision to go to court to force the Little Sisters of the Poor to violate their conscience by accepting provision of contraception, abortifacients, and sterilization.
Liberalism was born, the story goes, as a reaction against arbitrary and unlimited rule by monarchs. Yet today’s liberals seem to adore executive power when it’s used to effect their preferred ends, even hoping that one of the only remaining “monarchs”—the Pope—will single-handedly change the “rules” of the Church. They wish to exchange “fiat” in the sense of “let it be done” for “fiat” in the sense of “do as I say.”
(Of course, “conservatives” don’t escape from this general inclination—they tend also to be ardent supporters of expansive executive power when one of their own is in office, and it is generally conservative intellectuals who have been most interested in developing theories about active executive power.)
What happened to limited government, you might ask? I answer: exactly what liberalism promised. For liberalism was never about “limited” government, but about the pursuit and exercise of potentially limitless power toward the seemingly “limited” ends of securing Rights.
Few things annoy academics more than being told that their work is irrelevant. So there’s nothing surprising about the backlash against Nicholas Kristof’s column in Sunday’s New York Times. Kristof contended that America’s professors, especially political scientists, have “marginalized themselves” by focusing on technical debates at the expense of real problems, relying too heavily on quantitative methods, and preferring theoretical jargon to clear prose. An outraged chorus of responses (round-up here) rejected Kristof’s generalization as a reflection of the very anti-intellectualism that he intended to criticize.
Some of the responses to Kristof reflect the expectation of public recognition for every contribution to the debate, an expectation that makes so much academic writing a chore to read. Even so, Kristof ignores fairly successful efforts to make scholarship more accessible—even if you don’t count every blog by every holder of a Ph.D. The Tufts professor and Foreign Policy contributor Daniel Drezner has more than 25,000 Twitter followers—partly due to the success of a book that uses a zombie apocalypse scenario to compare theories of international relations. The Washington Post recently picked up The Monkey Cage (full disclosure: several of my colleagues at George Washington University are contributors) and The Volokh Conspiracy, which are populated by political scientists and legal academics, respectively. Not to mention Kristof’s own employer: just the day before Kristof’s piece ran, the Times hired Lynn Vavreck to contribute to a new site concentrating on social science and public policy.
At least when it comes to political science, then, it’s just not plausible that “there are fewer public intellectuals on American university campuses today than a generation ago.” On the contrary, there are probably more academics who try to communicate with non-specialist audiences than there were in 1994. One difference is that the public intellectuals of past decades were more likely to engage directly with normative and historical Big Questions. That change reflects the declining influence of political theory in comparison with causal analysis, as well as the weakening of the Cold War imperative of justifying liberal democracy.
Of course, writing for non-specialist readers isn’t encouraged in graduate programs, and doesn’t often align with the requirements for hiring and promotion. But there are anecdotal reasons to think that expectations are slowly changing, as departments struggle to prove their ‘relevance’ in a period of financial retrenchment. If Kristof’s piece promotes these changes, it will have served a valuable function whether or not its argument is compelling.
In fairness to Kristof, however, none of these observations refutes his basic claim. That’s because he isn’t actually talking about “public intellectuals.” Rather, he means old-fashioned mandarins, who move easily between Harvard Yard and Washington, usually without encountering many members of the public along the way. In a follow-up on Facebook, Kristof observes that “Mac Bundy was appointed professor of government at Harvard and then dean of the faculty with only a B.A.—impossible to imagine now.” After nearly a decade as dean, Bundy joined the Kennedy administration as national security advisor, where his vast intellectual firepower led him to promote and defend the Vietnam War.
What Kristof really offers, then, is less an argument for public engagement by scholars than a plea for another crop of Wise Men who lend conventional wisdom the authority of the academy. Not coincidentally, he presents as an exception to the trend toward academic self-marginalization the former Princeton professor and State Department official Anne-Marie Slaughter, whose resume is as perfect a reflection of the meritocratic elite that Bundy helped create as Bundy’s own pedigree was of the old Establishment. More professors should learn to participate in public debate—including political theorists frustrated by the increasingly technical orientation of political science. But if ‘relevance’ means becoming mouthpieces of our new ruling class, then Kristof can keep it.
“Use every man after his desert and who shall ‘scape whipping?”
There’s a narrative that comes up whenever addiction is discussed publicly nowadays: the narrative in which the disease of addiction essentially replaces a person’s free will.
The barroom-wisdom version of it is the old line, “First the man takes a drink. Then the drink takes a drink. Then the drink takes the man.” A fairly heartbreaking version of it comes in this interview with author (and father of an addict) David Sheff: “Once I started realizing that my son was not making choices, I went from being angry and judgmental to being able to look at him with compassion: He’s sick; he needs help. It also allowed me to figure out what I needed to do: He’s sick; he needs to be treated.”
This narrative is, to put it mildly, not uncontested. Lots of addicts don’t agree with this description of our problem. Lots of non-addicts also disagree, typically more virulently (but I guess I’m biased), and insist that addicts are doing it on purpose: That’s a narrative I seem to see more on the right than on the left, though these are personal enough matters that ideological categories get blurry. The whole debate over how fully you want to reify the disease metaphor (which is what it is—that’s not bad, metaphors are a normal and poetic part of human understanding) gets tangled up in related debates about Alcoholics Anonymous, metaphors of surrender, treatment and/as/vs. punishment, and the promotion of personal responsibility.
There are five things wrong with an overemphasis on disease at the expense of free will—but the reaction against the disease metaphor often merely serves to strengthen it. First, the five things:
If you couldn’t understand what your family was saying, would you understand them better or worse?
Nina Raine’s “Tribes” opens with four Britons hurling abuse at each other around the kitchen table. I think it’s supposed to be funny, but it’s mostly just crass and painful: Mom, Dad, brother and sister describing one another’s passions, hopes, beliefs, and sex lives in the most contemptuous terms possible. The fifth member of the family is deaf and yeah, you do feel that perhaps he’s the lucky one.
As the play moves forward, the younger characters get shades and nuance. (The parents, and especially the cartoonishly self-centered father, remain pretty much the same.) Daniel (Richard Gallagher), the hearing son, shows flashes of haunted vulnerability which reveal a gulf of misery under cover of vituperation. The entire family has raised Billy (James Caverly, who starts off with a beatific smile which is clearly at least partly a mask or role) to read lips rather than to sign. They’ve developed an ideological resistance to anything which smacks of Deaf culture.
They genuinely believe they’re protecting Billy, but they’re also terrified of losing their beloved son and brother to a culture which can promise him a kind of belonging they can’t offer. When that masky smile finally slips and Billy says that they view him as the family mascot, the audience can tell that it’s not true: If anything, he’s the family conscience, the only one they allow to be good, the only one they’ll openly love. Of course, he’s also the only one they never need to listen to.
When the play begins, the family is all trapped together. The hearing children, Daniel and Ruth, have retreated to the family home after a series of romantic and professional defeats in the outside world. (“I feel like a bonsai tree!” Ruth yells, in a line which got big, empathetic laughs.) Billy never left, has never had a job or a girlfriend. One of the major themes of the play is the fact that belonging is rarely chosen; you don’t get to pick the elements which make up your identity, the ties which bind. You can try to leave—and seriously, Daniel at least should do everything in his power to get out of his parents’ house, because they’re actively damaging his psyche; this isn’t a play about the comforts of home—but you will eventually have to return, if only to give an account of yourself.
There are some terrific little moments (the play’s humor eventually does become actually funny), often involving how much impromptu “sign” this resolutely anti-sign-language family uses. There are tough, basically unanswerable questions about how language shapes us and separates us from others: As Billy’s new girlfriend goes deaf, she wonders if she’s losing the ability to understand nuances and ambiguities which can’t be expressed in sign.
For most casual observers, whether Catholic or not, the main battle lines within American Catholicism today seem self-evident. The cleavage overlaps perfectly with the divide between the political parties, leading to the frequently used labels “liberal” and “conservative” Catholics. We have Nancy Pelosi and Andrew Cuomo representing the Left, and Rick Santorum and Sam Brownback aligned with the Right. Mainstream opinion has classified Popes John Paul II and Benedict XVI as honorary Republicans, and Pope Francis as a Democrat (hence his appearances on the covers of Time and Rolling Stone).
This division does indeed capture real battle lines, but more than anything, the divide is merely an extension of our politics, and—while manned by real actors—does not capture where the real action is to be found today in American Catholic circles.
The real action does not involve liberal “Catholics” at all. Liberal Catholicism, while well-represented in elite circles of the Democratic Party, qua Catholicism is finished. Liberal Catholicism has no future—like liberal Protestantism, it is fated to become liberalism simpliciter within a generation. The children of liberal Catholics will either want their liberalism unvarnished by incense and holy water, or they will rebel and ask if there’s something more challenging, disobeying their parents by “reverting” to Catholicism. While “liberal” Catholicism will appear to be a force because it will continue to have political representation, as a “project” and a theology, like liberal Protestantism it is doomed to oblivion.
The real battle is taking place beyond the purview of the pages of Time magazine and the New York Times. The battle pits against each other two camps of “conservative” Catholicism (let’s dispense with that label immediately and permanently—as my argument suggests, and as others have said better, our political labels are inadequate to the task).
On the one side one finds an older American tradition of orthodox Catholicism as it has developed in the nation since the mid-twentieth century. It is closely aligned to the work of the Jesuit theologian John Courtney Murray, and its most visible proponent today is George Weigel, who has inherited the mantle from Richard John Neuhaus and Michael Novak. Its intellectual home remains the journal founded by Neuhaus, First Things. Among its number can be counted thinkers like Robert George, Hadley Arkes, and Robert Royal.
Its basic positions align closely with the arguments developed by John Courtney Murray and others. Essentially, there is no fundamental contradiction between liberal democracy and Catholicism. Liberal democracy is, or at its best can be, a tolerant home for Catholics, one that acknowledges contributions of the Catholic tradition and is leavened by its moral commitments. While liberalism alone can be brittle and thin—its stated neutrality can leave it awash in relativism and indifferentism—it is deepened and rendered more sustainable by the Catholic presence. Murray went so far as to argue that America is in fact more Catholic than even its Protestant founders realized—that they availed themselves unknowingly of a longer and deeper tradition of natural law that undergirded the thinner liberal commitments of the American founding. The Founders “built better than they knew,” and so it is Catholics like Orestes Brownson and Murray, and not liberal lions like John Locke or Thomas Jefferson, who have best articulated, and who today best defend, the American project.
Today marks the 41st anniversary of the national legalization of abortion in the United States. Thousands will march on the Mall and in the streets of Washington, D.C. in protest of that decision, braving frigid temperatures and a blanket of snow to express their profound moral objection to Roe v. Wade and to lament the estimated 55 million young lives that have been legally extinguished since January 22, 1973.
The March for Life has become a rallying point for the pro-life movement, an annual pilgrimage of sorts, especially for young people who gather together to affirm a bedrock belief: the sanctity of human life from conception to natural death. Even amid the overwhelming sense of tragedy and loss that draws them to D.C. in the hope of effecting a change, there is also a sense of affirmation and even celebration in the company of so many others who are just as firmly committed: people who gather to defend a belief that is today dismissed and mocked by cultural elites and cognoscenti (including the governor of New York, purportedly a Catholic), and who find joy in the fellowship of companions who stand for life. As one friend posted on Facebook, “the Tribe is together.”
Amid the widespread sense of shared purpose, there is perhaps little time or inclination to reflect on a question: why gather, as Marchers do, in Washington D.C.? It is perhaps a question whose answer is self-evident: the March ends outside the Supreme Court, which continues to affirm Roe v. Wade as controlling precedent. It is the location of the president and the Senate, which ultimately has the power to make or confirm appointments to the Court. It is the nation’s media center, where such a protest has the best chance of being amplified to the nation. It is physically laid out to accommodate large protests, with its Mall almost seeming to have been designed for that purpose. It is the nation’s capital, where our elites congregate to make policy and steer the nation. Naturally, if people from all parts of the nation gather in protest of a national issue, it is not only the best place, but the only place.
However, the March’s annual presence in D.C. obscures a number of issues, above all, whether abortion is ultimately a political and even legal matter. On one level, inescapably so: it has been a political matter for decades, even a “wedge” issue that has become a defining difference between the two political parties. It is obviously a legal issue, generating countless pages of legal theory and philosophical argument, as well as scores of subsequent High Court and even more lower court decisions that have responded to ongoing challenges and debates over the issue. So perhaps no further thought is necessary—destination D.C.
However, by other considerations, treating it exclusively as a political and legal matter obscures the extent to which it is most fully a question of culture. And, if conservatives would generally tend to agree on one thing—aside from the immorality of abortion—it is that culture does not originate in Washington, D.C., or at least that it shouldn’t.