Alan Jacobs

The “Doomsday Clock”: An Idea Whose Time Has Gone

Lawrence M. Krauss, a physicist at Arizona State University, writes in the New Yorker about the Doomsday Clock, which was created in 1947 by people associated with the Bulletin of the Atomic Scientists:

I am privileged to chair the Bulletin’s Board of Sponsors, a group of scientists, including sixteen Nobel laureates, that was created by Albert Einstein and Robert Oppenheimer after the Second World War to advise the Bulletin. As a result, I also work with the Bulletin’s Science and Security Board, which, each year, decides on the position of the Doomsday Clock. It’s a difficult task. Many disparate, worldwide factors must be judged in order to realistically assess the total existential risk facing humanity. This task has become even more complex in the past decade because the Bulletin has begun to explore issues beyond nuclear weapons, including climate change, bioterrorism, and cyber threats. Last year, in January, 2015, the Bulletin set the Doomsday Clock at three minutes to midnight. In a statement, we wrote that “Unchecked climate change, global nuclear weapons modernizations, and outsized nuclear weapons arsenals pose extraordinary and undeniable threats to the continued existence of humanity.”

This year, we’ve decided not to move the clock either forward or backward. It will remain set at 11:57 — three minutes to midnight. The fact that the clock’s hands aren’t moving isn’t good news. It’s an expression of grave concern about how the global situation remains largely the same. The last time the clock was this close to midnight was in 1983 — the height of the Cold War.

The “Doomsday Clock” is an odd thing — or non-thing, because of course there really isn’t any such clock. The idea, in 1947, was to use a ticking clock as an image of approaching nuclear war. Eventually someone decided to make a fake clock with moveable hands to make the occasional Doomsday Press Conferences a little more dramatic, but that’s not a timepiece; rather, it’s an image of an image of an emotion: fear.

I say “image of an emotion” because no actual science goes into the decision of where to place the hands of the clock. The scientists who make the decision have no particular expertise in geopolitical strategy, military and political risk assessment, or even climatology (relevant since they incorporate climate change into their assessment). They just read a bunch of stuff and take their own emotional temperature.

Moreover, now that climate change has entered in a major way into their thinking, the “ticking clock” metaphor has lost its fit to the circumstances. It was a good, strong image in the days of the Cold War, when the perceived danger was a nearly simultaneous firing of nuclear weapons that could destroy a large part of human civilization in just a few hours. But when you’re trying to think about the consequences of anthropogenic climate change, the idea of a clock ticking down to midnight is meaningless. What would “midnight” be? The effects of such alterations to the ecosphere may indeed be vast, but “vaster than empires and more slow,” as the poet says, unfolding over centuries and millennia.

Still, you can understand why Krauss and his fellow members of the Science and Security Board would want to hang on to it. The fake clock, with its ominous name, seems more real than the guesses and anxieties that establish the position of its hands. And though Krauss in no way hides the Board’s responsibility for its decisions, it’s interesting how his language — and the language of many journalists who write about the clock — veers towards objective description: “The fact that the clock’s hands aren’t moving isn’t good news…. The last time the clock was this close to midnight was in 1983 — the height of the Cold War.” When described in this way — “the clock’s hands aren’t moving” — the thing seems to assume volition, like the planchette of a Ouija board. “The last time the clock was this close to midnight was in 1983” gives the appearance of being a more substantial and objective statement than “The last time members of this Board decided to place the hands this close to midnight was 1983.” (Which, incidentally, wasn’t “the height of the Cold War” — that would have been 1962.)

If indeed anthropogenic climate change has become a greater danger than nuclear weapons — and I’m inclined to think that may well be true — then the “ticking clock” metaphor needs to be retired. But what would replace it? I’m thinking “the spreading (or contracting) slime mold.” Pretty catchy, no?


The Public Intellectual as Modernist Poet

Having recently disagreed with Damon Linker, I’m pleased to find agreement on something. In his most recent column, he describes an essay by Corey Robin thus:

Is this indifference to political reality a defect or a plus? Left-wing commentator Corey Robin clearly thinks it’s a virtue — and not just when it comes to presidential politics. In a provocative essay for The Chronicle of Higher Education, “How Intellectuals Create a Public,” Robin argues that “the problem with our public intellectuals today is that they are writing for readers who already exist, as they exist,” as opposed to “summoning” a new world, a new public, a new reality, into being.

I had semi-drafted a post on the Robin essay before Damon’s post, so let me take this ball and run with it.

When I read Robin’s essay, what immediately came to my mind is Wallace Stevens’s great poem “The Idea of Order at Key West”:

It was her voice that made
The sky acutest at its vanishing.
She measured to the hour its solitude.
She was the single artificer of the world
In which she sang. And when she sang, the sea,
Whatever self it had, became the self
That was her song, for she was the maker. Then we,
As we beheld her striding there alone,
Knew that there never was a world for her
Except the one she sang and, singing, made.

For Robin, this is the vision: the public intellectual as modernist poet, driven by a “Blessed rage for order … The maker’s rage to order words of the sea….” And all of the rest of us are the rapt, passively obedient audience “summoned” into being by the Maker’s brave new word.

Robin talks a lot about summoning.

“That’s also how public intellectuals work. By virtue of the demands they make upon the reader, they force a reckoning. They summon a public into being.”

“It is precisely that sense of a public — summoned into being by a writer’s demands; divided, forced to take sides — that Sunstein’s writing is in flight from.”

“the politics of division and summoning that is the public intellectual’s stock in trade”

“A world where it is difficult to imagine the summoning of a public, beyond the intermittent, ever-more-fleeting summons we’ve seen these past 20 years”

The word has two major overtones: in the first, a lord summons a servant; in the second, a magician summons a spirit. (Maybe three, if you add a court of law issuing a summons.) These are tropes of mastery, in which the public intellectual assumes dominion over a servile public. To be sure, Robin graciously allows that the public so summoned might disagree with what the public intellectual says; but he does not acknowledge the possibility of ignoring the summons.

A position like this is a recipe for political ineffectuality, because people know when they’re being condescended to, especially when the condescension is this flagrant, and simply will not agree to be summoned. For good and for ill, powerful political rhetoric acknowledges the meaningful agency of the audience, and typically positions the politician as the people’s emissary. Thus Churchill: “It was the nation and the race dwelling all round the globe that had the lion’s heart. I had the luck to be called upon to give the roar.”

We might argue about whether Churchill really believed this — I think he did — but even in the most cynical reading, he was a great politician in part because he knew it was important to say it. Public intellectuals rarely become politicians, but if they want political influence, they need to take a lesson from Churchill. If you don’t actually have respect for the people you’re writing for, fake it.

The Trade-In Society

David Blatt, formerly head coach of the Cleveland Cavaliers (Erik Drost/Flickr)

Let’s ask a question: Why was David Blatt fired as coach of the Cleveland Cavaliers? The man who fired him said it was a matter of “a lack of fit with our personnel and our vision.” Possibly true. But it would be more useful to say this: David Blatt got fired because Chip Kelly got fired before him, and Jose Mourinho before him, and Kevin McHale before him, and so on nearly ad infinitum.

That is to say: firing coaches is how professional sports franchises deal with conflict. And athletes know that this is how professional sports franchises deal with conflict: so when a team hits a bad patch, and the players are underperforming, and the coach is getting angry with them, and relationships are fraying… why bother stitching them up? Why bother salving the wounds? If everyone knows where the situation is headed — sacking the manager — then isn’t there rather a strong incentive to make things worse, in order to hasten the inevitable, put an end to the frustrations, start afresh, get a do-over? Of course there is.

And precisely the same tendencies are at work in many of the key institutions of American social life. This is one of the chief reasons why so many marriages end quickly; this is why so many Christians church-hop, to the point that pastors will tell you that church discipline is simply impossible: if you challenge or rebuke a church member for bad behavior, he or she will simply be at another church the next week, or at no church at all.

It seems that we — and I’m using “we” advisedly here, as you’ll see in a moment — are becoming habituated to making the nuclear option the first option, or very close to the first option, when we can. Trying to come to terms with a difficult person, or a difficult situation, is an endeavor fraught with uncertainty: it might work, but it might not, and even if it does work, I could end up paying a big emotional price. Why not just bail out and start over?

I know at least some of these temptations well. Not all of them: I am deeply grateful that I went into my marriage, 35 years ago, sharing with my beloved the bone-deep conviction that, except in the most tragic circumstances, Christian marriage is indissoluble. Bailing out has never been an option for either of us, and since we are very different people with very different responses to the world, that’s been invaluable for us. We’ve had a lot of work to do, but it has been good work, rewarding work.

But in the three decades that I lived in Wheaton, Illinois, I was a member of three different churches, and I often wonder what I might have learned — what wisdom I might have gained, what benefits of character I might have reaped, what good I might have done for others, what I might have been taught by fellow parishioners — if I had never left the first one. I can’t manage to wish I had stayed, but that may be because all I know is what went wrong there, what made me frustrated and unhappy. Any benefits I (or others) might have received through persistent faithfulness are unknown to me, a matter of speculation.

Looking back on my decision to leave that first church, I realize that I did so because I was confident that, whatever good things might have come to me at that church, those good things, or very similar ones, would be available to me elsewhere. It seems to me that if there’s one thing that our current version of advertising-based capitalism teaches us all it’s that everything is replaceable: everything can be reproduced, or traded in for a new and improved model. And that applies to coaches, to churches, to spouses. We live in a trade-in society.

This belief breeds impatience with everything, and that impatience in turn breeds immense frustration with any situation that doesn’t lend itself to the discard-and-replace approach. I think even our recent university-campus controversies can be explained in these terms. Students don’t want to deal with administrators who don’t see things their way, or speakers who say things they find offensive, but they realize that an immediate opt-out isn’t possible. You can’t walk away from Oberlin on a Friday and show up for class at Carleton on Monday morning. At least for a time, you’re stuck. But what if you’re stuck in a situation and have never been taught how to negotiate, how to work things out, how to be patient in the midst of conflict? Well, then, you make demands. You are very insistent that “These are demands and not suggestions”. And often those demands are that administrators or faculty be fired — like football coaches who haven’t won enough, basketball coaches who manifest “a lack of fit with our personnel and our vision” — because that, they think, can be done right now.

What most troubles me about these pathologies is that I don’t see any way back from the current level of impatience and the inability — indeed, refusal — to persist through difficulties. You can always point to marriages that have survived struggles and come to thrive; or workplace enemies who became mutually-valued collaborators; or sports franchises, like the San Antonio Spurs, that have succeeded through a commitment to continuity. But in a trade-in society, those situations look like black swans: unpredictable, inexplicable. (And will the Spurs continue to prize continuity when Tim Duncan, one of the best players ever, and Gregg Popovich, one of the best coaches ever, retire? Or will the pressure towards immediate action prove too much for their institutional culture to resist?)

The president of Oberlin, Marvin Krislov, has published an open letter in response to the protestors in which he says “I will not respond directly to any document that explicitly rejects the notion of collaborative engagement,” in part because “many of its demands contravene principles of shared governance.” That is, the students are demanding that the college president dictate changes that he doesn’t actually have the power to make, according to the by-laws and written procedures of the college. Similarly, when students at public universities demand the punishment or prohibition of “hate speech,” they can be reminded that the First Amendment makes no exception for hate speech. So in increasing numbers Americans, especially younger Americans, support the repeal of the First Amendment. It turns out that many people are profoundly unhappy with social and political structures that prevent the immediate implementation of their desires, and are willing to discard them — without pausing to reflect that the people who share their desires may not always be in the majority. You can’t remove those brakes for yourself without simultaneously removing them for your political and social enemies. (Though Lord knows people try.)

The impatience that people feel with manifest injustice is understandable, and more than understandable. In perhaps the most powerful passage in his “Letter from the Birmingham Jail,” Martin Luther King, Jr. answers white moderates’ counsel of patience with a long litany of everyday abuse and affliction, and concludes: “There comes a time when the cup of endurance runs over and men are no longer willing to be plunged into an abyss of injustice where they experience the bleakness of corroding despair. I hope, sirs, you can understand our legitimate and unavoidable impatience.” And all God’s people say Amen.

But it’s worth noting that those white moderates wanted King and his fellow protestors to do nothing: simply to wait and trust that time would somehow naturally bring about justice. And it is also worth noting that it’s socially unhealthy when people exhibit impatience far beyond Dr. King’s when confronted by injustices that are far less massive — or when faced by mere inconveniences or strictly personal discomforts. People want to be able to trade in old models of anything and everything, and profoundly resent any social or political structures that inhibit instantaneous action.

In such an environment, it’s no wonder that a great many people applaud a Presidential candidate who believes that he can “see Bill Gates” about “closing up that internet.” (The old internet is messed up — let’s trade it in for another one.) I suspect they overlap pretty significantly with the folks who demand, after every losing streak, that their favorite team’s coach be fired; and with the more aggressive of the student protestors. Trump supporters may not seem to have much in common with people demanding that racially insensitive university administrators be fired, but there’s a deep temperamental affinity. They’re all enthusiastic adherents of the trade-in society.


Football Coaches and Rational Choices

Benjamin Morris at FiveThirtyEight:

Thus, our best (and perhaps slightly conservative) estimate is that the Packers cost themselves about 7.9 percent of a win by kicking rather than going for two, and this whole thing could have been avoided if NFL coaches took the time to sit down and learn some basic percentages….

Another year, another year with NFL coaches not doing their jobs and not being taken to task for it. By now, coaches have no excuse for not having mastered basic decisions like these.

People say coaches are afraid of media criticism. But they’re professionals, among the handful of elite who are capable of doing what they do. If a coach cares what the media thinks, let him explain his logic.

There’s so much that is touchingly naïve about this, but more than anything, the idea that a football coach — or anyone else — could save his job when under fierce criticism by “explaining his logic.”

Morris thinks coaches, when they make these decisions, are being irrational. They are in fact being perfectly rational. Why does Morris think they are irrational? Because he thinks that the only relevant factor in evaluating decisions is what will increase the likelihood of winning a game. But this is obviously false, because every coach or manager knows that many coaches and managers, across the spectrum of sports, who have been very good at winning games have also, with alarming frequency and without rational justification, been fired. And no coach is selfless enough to factor his own job security out of his calculations.
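For what it’s worth, the “basic percentages” Morris refers to are simple expected-value arithmetic. Here is a minimal sketch, using illustrative probabilities (the numbers are assumptions for the sake of the example, not Morris’s actual figures): trailing by one after a late touchdown, a coach compares the chance of winning outright on a two-point try against the chance of making the kick and then winning in overtime.

```python
# Illustrative win-probability comparison for a late touchdown that
# leaves a team one point behind. All probabilities are assumed values
# for the sake of the example, not actual NFL figures.

P_TWO_POINT = 0.48    # assumed two-point conversion success rate
P_EXTRA_POINT = 0.94  # assumed extra-point success rate
P_WIN_OT = 0.45       # assumed chance of winning in overtime

# Go for two: convert and win outright; fail and (almost surely) lose.
win_if_go = P_TWO_POINT * 1.0

# Kick: make the extra point to force overtime, then win the OT game.
win_if_kick = P_EXTRA_POINT * P_WIN_OT

print(f"go for two: {win_if_go:.3f}")
print(f"kick:       {win_if_kick:.3f}")
print(f"edge:       {win_if_go - win_if_kick:+.3f}")
```

Under these assumed numbers the two-point try comes out ahead; the point of the sketch is only that the comparison is a one-line calculation, which is exactly Morris’s complaint.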

Here’s what Mike McCarthy may have been thinking at the crucial moment in that playoff game:

  1. If we go for two and make it, I will be praised as a “riverboat gambler,” because hardly anyone in the press or among Packers fandom understands the percentages involved.
  2. If we go for two and fail, millions of people will scream “What the hell was McCarthy doing??” Thousands of people will call into radio shows to demand my ouster. Hundreds of columnists will write stories about my recklessness and thoughtlessness. And the sum total of all these interventions will put pressure on my bosses to fire me, pressure that they will very likely succumb to, especially since we haven’t been all that great the past couple of years.
  3. If I just kick the extra point I will be, generally speaking, neither praised nor blamed.

Ergo, and given that coaches will inevitably be concerned not just with the chances of winning a given game but also with the chances of keeping their jobs, McCarthy’s decision to kick the extra point was perfectly rational, as long as we have a proper understanding of what “reason” is in a given case; which is to say, as long as we factor in the variables that are immensely significant but not in any obvious way mathematically calculable.

The more general lesson to be applied here — one that the analysts at FiveThirtyEight need to reflect on — is this: It is not rational to act perfectly “rationally” when surrounded by irrational people whose actions have influence over your life.

On Suicide and Marriage

Marriage rests upon the immutable givens that compose it: words, bodies, characters, histories, places. Some wishes cannot succeed; some victories cannot be won; some loneliness is incorrigible. But there is relief and freedom in knowing what is real; these givens come to us out of the perennial reality of the world, like the terrain we live on. One does not care for this ground to make it a different place, or to make it perfect, but to make it inhabitable and to make it better. To flee from its realities is only to arrive at them unprepared.

Because the condition of marriage is worldly and its meaning communal, no one party to it can be solely in charge. What you alone think it ought to be, it is not going to be. Where you alone think you want it to go, it is not going to go. It is going where the two of you – and marriage, time, life, history, and the world – will take it. You do not know the road; you have committed your life to a way.

That’s Wendell Berry, from his great essay “Poetry and Marriage.”

Damon Linker makes an interesting argument here, in which he responds to this post by Daniel Payne, which in turn responds to this post by Kevin Drum. Payne writes that if Drum kills himself “he will have cheated [his wife] out of something that is hers by right: the chance to realize her wedding vows and her matrimonial commitment to the fullest possible degree by conferring upon her husband the last and most important measures of care and comfort she can give him.” Linker replies, “Message: It would be selfish for Drum to end his own suffering and thereby deprive his family members of having to endure that suffering with him.”

Damon Linker is a good thinker and a fine writer, but I have complained more than once over the years that he has a habit of in-other-wordsing. That is, he quotes or cites someone, adds “in other words” (or in this case “Message:”), attributes to the author something he or she did not say, and then refutes that. Perhaps Payne does indeed believe that “suffering is not only necessary, but even in some respects good,” but that’s not what he says in the passage Linker quotes. What Payne says is that it is good when a person gives “care and comfort” to someone he or she loves in that person’s time of suffering. Good for the person giving the care, and good for the person receiving it. And having cared for my wife through a long and difficult (though not mortal, thanks be to God!) illness, I can testify — and she can testify — that this is true, as long as the care is both given and received graciously.

Linker connects this belief to the claim that our lives are not our own, but rather belong to God, who gives us stewardship over them, and concludes: “Without these theological assumptions, the opposition to assisted suicide makes no sense.” But is the claim that the bonds of marriage are fulfilled when we care for one another in suffering actually a theological claim? Linker clearly thinks so. Perhaps without a Christian doctrinal foundation we cannot make such strong claims on one another, even in marriage. Even when a vision of mutual care and comfort like Payne’s is not articulated in theological terms, it only makes sense when supported and justified by a model of marriage as strong as the Christian one.

Hey, he said it, I didn’t. (Unless I’m in-other-wordsing him.)


On Arguments Against Religious Education

Academic freedom is something I’ve written about a good deal over the years—and quite recently—which I suppose has been inevitable, since if you teach at a religiously-based institution you always hear that the problem with such places is that they constrain academic freedom.

To that claim, I have always responded that I have taught at religious institutions because in them I have academic freedom. (And if you follow up the links in that post I just mentioned you’ll find other people making the same point.) There is just no logically coherent, evidence-based way to claim that religious institutions have less academic freedom than secular ones. Every community of learning has limits, though different limits, articulated with different degrees of explicitness. Academic freedom is a concept relative to the norms of communities, institutions, and disciplines — and often of society as a whole. Academic freedom is therefore bounded freedom (as freedom always is), and when people don’t recognize that they make incoherent arguments.

Often the claim that religious institutions offer less freedom is based simply on personal feeling: the feeling that you would be constrained there, and therefore it is a place of greater constraint than your current institutional location, where you feel quite free to do and say what you want. But what if I would feel more constrained in your location than I do in my own? Moreover, the legal history of academia is littered with the remains of scholars, sometimes tenured scholars, who thought that their academic freedom was absolute, only to be “terminated for cause.”

In light of all this, I thought it might be a public service to offer some logically consistent and coherent arguments against religious education (some of which people make openly, some of which are only implicit views that people who hold them don’t know that they hold). So here goes:

  1. Education is properly a function of the state. All private educational institutions, including religious ones but others as well, should be abolished. If people want to get together to learn and teach, that’s fine, but they may not formally constitute or incorporate themselves as a school. If this requires a Constitutional amendment, so be it. (N.B.: Anyone who actually wanted this to happen would probably say that it doesn’t violate the Constitution, but quite clearly control of education is not one of the powers allocated to Congress in that document—though some think that it should be—so the only honest and coherent version of this argument would have to accept the necessity of amending the Constitution.)
  2. Private educational institutions may exist, but teachers and students at them should not be eligible for federal or state funding: no NIH or NEH or NSF grants for faculty, no federally guaranteed loans for students. Taxpayer support should go only to those institutions that are constituted and governed by the people.
  3. While the government of the United States may not discriminate against an institution on the grounds of religion, the educational establishment as a whole may—as long as the relevant accrediting agencies cease to seek and receive approval from the Secretary of Education. By cutting all ties to the government, accrediting agencies would free themselves to declare that religious schools violate the core principles of higher education and therefore may not be accredited. To de-accredit such institutions might not kill them, but could damage them seriously.
  4. Religion does more harm than good, and interferes with the state’s ability to care properly for its citizens, so the First Amendment should be repealed and the Constitution amended in order to prohibit—or at least place strict controls on—religious organizations. All Christian schools, at every level, will be abolished. Private but non-religious institutions may remain as they are.

Notice that anyone holding the first argument may not have anything against private education as such, but is willing to sacrifice non-religious private institutions in order to get rid of religious ones, without radically altering the Constitution. The same may be said of the rather milder plan embodied in the second argument. The fourth argument grasps the nettle that the first won’t grasp. The third one … well, something like that may actually happen.


Parting the Veil of Ignorance

Sometimes it seems that this is John Rawls’s world, we’re just living in it: conducting our political and social debates as though we were behind a veil of ignorance, as though we can only trust the judgments made in perfect abstraction from any actual lived contexts. The debate about the future of Larycia Hawkins at Wheaton College is a classic example, where a complex, embodied, richly personal situation gets translated into the terms of disembodied theological debate.

It’s not that such debates are fruitless or useless: they matter. But they aren’t all that matters. And they can distract us from more complex considerations.

Consider the experience of my friend Matt Milliner, an art historian at Wheaton:

In my field work in Turkey, Egypt and Cyprus I had some negative and some very positive contacts with Muslims, and spent lots of mornings waking to the prayer call of the minaret. But interestingly, the best relationships with Muslims I have had have been in this town. Believe it or not, they (and by “they” I mean real local people like Abraham and Zahra) are well aware that despite this media firestorm, Wheaton College is here for the long haul, and so are they. Accordingly, my son is not even one year old, and he has been held by more Muslims than Christians. The reason for this is that at the Islamic Center of Wheaton, the gracious, hijab-donning women joyfully pass him around while I talk theology with my new friends. From the beginning we have been clear about our differences. I actually believe, for example, you could have passed Allah (the pre-Islamic Arabic term for God which is still used by Arabic-speaking Christians) around in the same way as my son. And sentimental as it may sound, because God freely gave his son, I can freely give mine.

Wheaton College does not exist in a free-standing abstract theological space. It is a college in a town of about 50,000 people, at the north end of which stands the Islamic Center of Wheaton, which just a few years ago moved into a former church building there. Perhaps that accident of real estate — Muslims occupying a church! — has contributed to the cold welcome, or less than a welcome, that those Muslims have experienced from some locals. But my friends at Wheaton College, most of whom also live in the town of Wheaton, or in other nearby communities that also have a visible Muslim presence, know that the people who worship at the Islamic Center of Wheaton are their neighbors, and Christians are supposed to love their neighbors, so … so they show up. They visit. They talk. They bring their children.

Someone doesn’t like that — doesn’t like the very idea that Muslims can be peacefully incorporated into the fabric of a community like Wheaton. So they have created a fake website for the Islamic Center of Wheaton that presents its people as advocates for jihad — and implicates local Christians as well, as another friend and former colleague of mine, Noah Toly, has recently discovered. Christians from the Wheaton community who have befriended their Muslim neighbors are being smeared, along with those neighbors, as advocates of murder and terror.

That’s the context in which, at Wheaton College and in the town of Wheaton, people are discussing and debating what it might mean to say that Muslims and Christians worship the same God.

I’m not saying anything here about whether Larycia Hawkins should or should not be fired. You are free to make your own judgments. My own inclination fits that of Mark Galli, who wrote at Christianity Today that there might be a better way to handle this conflict than dismissal.

But whatever your opinion, it’s important to avoid the temptation to abstract this debate from its human situation — as though we could live behind a veil of ignorance and consider the matter in a purely theoretical sense. It’s on the ground, at the corner of President Street and Geneva Road in Wheaton, Illinois, where those life-challenging and life-transforming questions are being asked: Who are my neighbors? And how might I love them?


Anti-Socialist elements

Long, long ago, when the world was still young and Communism still ruled Eastern Europe, a labor movement arose in Poland and called itself Solidarność. My wife Teri and I, early-marrieds in those days, were serious politics junkies, and one day she was watching one of Ben Wattenberg’s shows on TV, or maybe listening to him on the radio — the details are fuzzy after all these years. But in any event, Wattenberg was reporting on the anxiety Solidarity was generating in Poland’s communist government, which, when the union called a great strike, made a point of referring to members of the movement as “anti-socialist elements.” This phrase was used so often that the union members came to have affection for it, Wattenberg said, and had started wearing shirts featuring the phrase.

Teri thought this was very cool and very funny, and wished that she had an “anti-socialist element” shirt. So she tracked down Ben Wattenberg’s phone number, managed somehow to get him on the phone, and asked him if he had any idea how she might be able to obtain one of those shirts. He indicated that he might actually be able to procure one for her, for twenty bucks — a good bit of money in 1981. But she was all over it.

A few days later it showed up and she wore it with pride for some time (at first greatly confusing the Polish custodian of the apartment building we lived in). The other day she was cleaning out one of our closets and look at what she found:

[Photo: the “anti-socialist element” shirt]

Still a thing of beauty. Long live the spirit of Solidarność!

I’m Thinking It Over

Just before Christmas, I posted to my personal blog an account of my year in tech; it’s largely the story of how I strove to simplify my life in 2015. That post had only been up for a few hours when I got a long email from someone I don’t know telling me (among other things) that I was wrong to leave public Twitter, that without Twitter the only people who can give me intellectual feedback will be my students, that I should consider whether it’s un-Christian of me to “stop engaging,” and that if people who want to respond to my ideas have to go to the trouble of writing emails they might not respond at all.

My first reaction to this was sheer bemusement: I simply can’t imagine writing to someone I don’t know to tell him (in detail!) how he should and should not use social media. Is that any of my business? But before I could think further, another email showed up. This one was also from a stranger, and was also in response to a post on my personal blog, a brief one in which I explained why I had not commented on the current controversy at Wheaton College. This person told me that my plea of ignorance was unconvincing and my failure to respond was “weak and timid.”

This second email, coming so soon after the first, clarified some things for me.

There’s a famous Jack Benny routine, one he used many times over the course of decades, in which he’s confronted by a mugger. Now, you need to be aware that the chief personality trait of “Jack Benny” — the character Jack Benny played on radio and TV — is a compulsive miserliness. So:

Mugger [pointing a gun at Jack]: Your money or your life.

Jack: [silence]

Mugger: I said, Your money or your life!

Jack [exasperated]: I’m thinking it over!

The internet is also a mugger, but what it demands is not my money but my attention and my reaction, and it wants them right now. And “I’m thinking it over” isn’t an acceptable response.

When the leadership of Wheaton College placed Professor Larycia Hawkins on leave, it was not clear to me precisely why. Several weeks later, and after considerably more communication from the college, it’s still not clear to me precisely why, though everyone agrees that it had nothing to do with Hawkins’ wearing of a hijab. Moreover, even her statement that “Muslims and Christians worship the same God” seems not to have been definitive: other faculty have made similar or identical statements, but (says the college administration on the webpage just cited) “In those instances, the individuals rapidly and emphatically explained their opinions and affirmed their full consistency with the theological identity of Wheaton College.” This lack of clarity has not stopped some people from demanding that Hawkins be fired, nor others from confidently declaring that Wheaton’s leaders are bigots motivated by religious intolerance and Islamophobia. How the latter are able to read the minds and hearts of people they don’t know, I can’t tell you; maybe you could ask them.

Anyway, a great many people are going off half-cocked on this issue; and what those emails I got remind me is that going off half-cocked is now widely perceived as a virtue, and the disinclination to do so as a vice. Moreover, that poorly informed and probably inflammatory statement of Your Incontrovertibly Correct Position must be on the internet — and according to my first protestor either directly on or accessible to Twitter — or it doesn’t count towards your treasury of merit.

I want to suggest some alternative ways of thinking about these matters, and related ones:

  • I don’t have to say something just because everyone around me is.
  • I don’t have to speak about things I know little or nothing about.
  • I don’t have to speak about issues that will be totally forgotten in a few weeks or months by the people who at this moment are most strenuously demanding a response.
  • I don’t have to spend my time in environments that press me to speak without knowledge.
  • If I can bring to an issue heat, but no light, it is probably best that I remain silent.
  • Private communication can be more valuable than public.
  • Delayed communication, made when people have had time to think and to calm their emotions, is almost always more valuable than immediate reaction.
  • Some conversations are more meaningful and effective in living rooms, or at dinner tables, than in the middle of Main Street.

In short, peer pressure is always terrible, and social media are a megaphone for peer pressure. And when you use that megaphone all the time you tend to forget that it’s possible to speak at a normal volume: thus my first protestor’s apparently genuinely-held view that if you’re not talking to peers on Twitter you can’t possibly be talking to peers at all. (We must all have been trapped in our silos of silence before 2006.) But the more general view of both of those who wrote to me — that rapidity of response is a virtue, and therefore that technologies that enable rapid response are superior to ones that enforce slowness — is the really pernicious one, I’ve come to believe.

I keep thinking about my first protestor’s complaint that if people can’t respond to me via Twitter or blog comments they might not respond at all. Given my experience of both public Twitter and blog comment threads, my thought is: feature, not bug. Indeed, I should probably have an auto-reply on my email featuring my postal address and encouraging people to write me letters. That might enliven the daily mail deliveries, which have for me, as for all of us, grown so gray and wan over the years. And maybe I would be doing my correspondents a favor also: if typing, printing, and mailing a letter is too much trouble for you, then it could be that the things you have to say aren’t that important, even to you, and you’d be better off using your time in a different way.

Well, that may be too extreme. I’m not a big fan of printing and mailing either, though I use those technologies when I need to. So perhaps email is slow enough, provides enough of a buffer between people and their immediate impulses. Still, I can’t help thinking of the great computer scientist Donald Knuth, who famously doesn’t have an email address. He explains why: “Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things. What I do takes long hours of studying and uninterruptible concentration.” My work may not require the same intensity of concentration that Knuth’s requires, but it requires more than I have been accustomed to give it for the past few social-media years.

It won’t be that long before I turn 60, though I struggle to keep that inescapable (and highly unpleasant) fact clear in my mind. I have ideas I want to pursue, stories I want to tell, and friends and colleagues I want to interact with. Things are happening in the world and on the pages of books that I want to meditate on.

I spent about seven years reading replies to my tweets, and more than a decade reading comments on my blog posts. I have considered the costs and benefits, and I have firmly decided that I’m not going to be held hostage to that stuff any more. The chief reason is not that people are ill-tempered or dim-witted — though Lord knows one of those descriptors is accurate for a distressingly large number of social-media communications — but that so many of them are blown about by every wind of social-media doctrine, their attention swamped by the tsunamis of the moment, their wills captive to the felt need to respond now to what everyone else is responding to now.

Not gonna do it. Wouldn’t be prudent. I’m trying to turn my mind towards the longer term, striving to get a little closer to the bottom of things. I’m thinking it over.

Happy New Year, everybody! And Festina Lente!

Powers

[Editor’s note: this post contains plot spoilers for Star Wars: The Force Awakens]

Screenwriters love superpowers — a category in which I include magical powers — because they can make those powers wax or wane according to the needs of the plot at any given moment. This point is more interesting than it might at first seem.

A classic example: in The Return of the King, when the Witch-King is able to shatter Gandalf’s staff with a mere thought. Leaving aside the question of why he wouldn’t just kill Gandalf on the spot if it were that easy, I’ll just note that this is the same guy who with four other Ringwraiths was no match for Aragorn on Weathertop. Now, his power had been increasing since then, but that’s a pretty darn rapid and pretty darn massive increase in power; and hadn’t Gandalf himself also passed through death and returned with renewed strength of his own? No, the whole thing makes absolutely no sense: it looks like Peter Jackson deciding that it would be really cool at that point if Gandalf were helpless before the Witch-King and just about to be killed but then something else happens and the Witch-King flies away without doing anything to Gandalf and wow, wasn’t that a close shave for the old wizard? Making absolute nonsense of the whole concept of “powers” was just a price he had to pay for that cool moment.

Similarly, in The Force Awakens we have in Kylo Ren a massively capable and ruthless controller of the Force who was trained as a Jedi by Luke Skywalker himself and therefore is clearly one of the world’s great masters of the lightsaber — but here he can just barely hold his own against two people who have never held such a weapon before. Again, the simple exigencies of plot and action are at work here: we wouldn’t have much of a story if such scenes unfolded plausibly.

The same storytelling logic plays out, as Brad Bird brilliantly shows in The Incredibles, when villains start “monologuing.” Lucius (AKA Frozone) explains:

Lucius: So now I’m in deep trouble. I mean, one more jolt of this death ray and I’m an epitaph. Somehow I manage to find cover and what does Baron von Ruthless do?
Bob: He starts monologuing.
Lucius: He starts monologuing! He starts, like, this prepared speech about how feeble I am compared to him, how inevitable my defeat is, how the world will soon be his, yadda yadda yadda. Yammering! I mean, the guy has me on a platter and he won’t shut up!

This becomes a theme throughout the movie, which (of course!) indulges in the very narrative tic it so brilliantly makes fun of. Because deploying stuff like this — putting your beloved characters in impossible spots before you rescue them — is just how you sustain tension, how you keep viewers on the edge of their seats.

But maybe there’s something else at work here too. Beyond a need to sustain narrative tension, what do all these moments have in common? It seems to me that they all say, in their varying ways, that power as such, power as we typically define it, isn’t the whole story.

“The race is not [always] to the swift, nor the battle to the strong,” saith the Preacher — to which Damon Runyon famously replied, and with some justification, yeah, but that’s the way to bet. It is, if you’re a betting person. Nobody smart would have bet on David against Goliath. But sometimes — not often, but sometimes — longshots come in. And if you hang around long enough, or read enough history, you start to notice that they tend to do so at curiously opportune times. Not often, mind you, not often enough that everyone will see a pattern, but … sometimes the battle is not to the strong; sometimes overwhelming force is defeated. Occasionally it plays a role in its own defeat, by trusting too much in itself — counting on, calculating by, force only — never suspecting that there may be powers at work other than those of strength, skill, numbers.

And that’s what we find buried in so many of our popular stories, stories that arise from the general human sense of things, even if they get taken up by individual authors or managed by vast rapacious corporations: the suspicion that there’s a reason why the race is not always to the swift nor the battle to the strong. Maybe even a reason that the cynical old Preacher didn’t imagine, since he credited the defeat of the powerful to “time and chance.”

In Tolkien’s Lord of the Rings — setting aside Peter Jackson’s — Gandalf tells the history of the Rings of Power, and especially of the Great Ring made “to rule them all,” and explains that that Ring was constantly striving to get back to its maker, Sauron. And yet it did not make its way to him. Instead it came to Bilbo. And behind that curious event “there was something else at work, beyond any design of the Ring-maker. I can put it no plainer than by saying that Bilbo was meant to find the Ring, and not by its maker.” This is not putting the matter very plainly at all: Gandalf’s use of the passive voice is telling. It is perhaps, even for him, just a suspicion — little more than the reading of hints, in the long historical record he knows so well, that Sauron’s bet on the inevitable victory of Power just might not pay off; that perhaps there is something more moving in the world, something that does not work through the Great but through the small, the weak, the unknown, the neglected, the utterly marginal.

Merry Christmas, everyone.


Christian Education and ‘Intellectual Compromise’

Paradise Picture / Shutterstock.com

I have very mixed reactions to the accounts of Christianity and higher education given by Rod’s recent correspondents, but the anonymous Christian professor makes a really vital point about the dangers of the “narrative of Christian oppression.” When I directed the faculty Faith and Learning Program at Wheaton College, years ago, I regularly told younger faculty, “If you submit an article or book for publication and it gets rejected, always, always, always assume that it’s because your work wasn’t good enough.” The assumption of anti-Christian prejudice gets you off the hook for improving your scholarship; ultimately, it makes you listless and lazy and increasingly prone to playing the victim card. Even when you have good reason to suspect such animosity, pretend you don’t, and get back to work.

So props to the Anonymous Christian Prof for noting the dangers of that way of thinking. But not for this:

But there is a second reason that the abandonment of secular institutions by Christians is a shame, and I will be blunt. Contemporary Christians — taken as a group — do not have the intellectual heft of their secular counterparts. The best scholarship, the highest quality thinking, still goes on at secular institutions. To go to a Christian institution involves — by and large – an intellectual compromise.

First of all, the claim is phrased imprecisely: by “contemporary Christians” I assume she means individual academics? But she then goes on in the next sentence to speak of institutions. So I don’t know whether the charge is that individual Christian academics are not as smart as other academics or that Christian colleges and universities aren’t as academically rigorous as their secular counterparts. And it’s hard to answer the charge, given that there aren’t very many Christian colleges and universities, and that their institutional missions tend to be quite different from those of secular schools. Also, many faculty at Christian institutions either were (like me) educated wholly at secular universities or were educated in some mix of Christian and secular schools.

But I will say this: I believe the kind of education students receive in the Honors College at Baylor, and at Wheaton, is in most respects far superior to what they would receive at secular schools of greater academic reputation and social prestige. Indeed, in my years at Wheaton I often heard comments to this effect from visitors. I think for instance of a professor at one of America’s top ten universities who said to me, after spending some time with one of my classes, “Your students are better informed and ask more incisive questions than mine do.”

I could adduce more examples, and explore more comparisons, but let me conclude with this: If this professor’s commendation of Wheaton’s students has at least some validity, how might we account for this state of affairs? I would point to three factors:

1) At Christian colleges, students and faculty alike tend to think of learning as a project in which the whole person is involved. Information is not typically separated from knowledge, nor knowledge from wisdom. The quest for education is less performative, more earnest than at many secular institutions. People are more likely to think and speak of education as something that leads to eudaimonia, flourishing.

2) A closely related point: Christian institutions tend to think quite consciously that their task involves Bildung, the formation of young people’s characters as well as their minds. So in hiring and retention they place a greater emphasis on teaching and mentoring than is common in secular institutions. (There are exceptions, of course, but even the most student-centered secular institutions cannot, because of their intrinsic pluralism, specify what good personal formation looks like.)

3) Perhaps the most important feature: Christian teachers and students alike can never forget that their views are not widely shared in the culture as a whole. We read a great many books written by people who don’t believe what we believe; we are always aware of being different. This is a tremendous boon to true learning, because it discourages people from deploying rote pieties as a substitute for genuine thought. No Christian student or professor can ever forget the possibility of alternative beliefs or unbeliefs. Most students who graduate from Christian colleges have a sharp, clear awareness of alternative ways of being in the world; yet students at secular universities can go from their first undergraduate year all the way to a PhD without ever having a serious encounter with religious thought and experience — with any view of the world other than that of their own social class.

Working in Christian institutions, especially because of this alertness to possible alternative beliefs, has been essential to my own intellectual development. Among other things, it has helped me learn to write for multiple audiences, while always keeping near the forefront of my mind the intense relevance of learning for life. I think it is largely because of, and not in spite of, my workplaces that I have been able to write for some of the most academically rigorous presses in the world, and to do so from an explicitly Christian point of view. Does that sound like “intellectual compromise”?

AnonProf says, with what appears to be satisfaction, “At my school we don’t talk about our religious or political identities in class.” At my school we talk about them all the time; we work through them, struggle with them, see them altered by experience and reflection — we put them on the line. In that sense our classrooms aren’t “safe spaces” at all. “Of course they’re not safe,” I’m tempted to say, “but they’re good.” What happens at secular institutions like the one AnonProf teaches in may also be good, in its own way, and a better choice for some students, including some Christian students. (In education, one shape definitely does not fit all.) But that it’s intrinsically academically superior is an unsustainable claim.


Dialogue on Democracy, Part 7

(Previous installments: One, Two, Three, Four, Five, and Six.)

A: I’mma let you finish, but can I just interject a thought here?

B: Sure.

A: Just think for a minute about what’s been happening on social media in the past few weeks, in the aftermath of the Colorado Springs and San Bernardino shootings. One group lines up and shouts that these things would never happen if we had stricter gun laws, as though terrorists give a rat’s ass about gun laws; another group lines up opposite them and shouts that these things would never happen if we had a properly armed citizenry, as though mutually assured destruction were equal to social peace. None of these people are considering, reflecting, thinking; they’re just reacting and emoting. Some of them can’t even grasp the moral difference between those who murder and those who are murdered. The very thought of allowing these people a vote — a voice in the shaping of society — appalls me. They’re simply incapable of rational thought.

B. Your description of the behavior of many people — most people? — is depressingly accurate. A few years ago there was an active Tumblr called These Tragic Events Only Prove My Politics, and I expect that the person who made it stopped updating it because it would have been beating a dead horse. There’s more evidence of the thesis every single day.

That said, we may have reached the point in this conversation where you and I most fundamentally differ. While I agree with your description of each side’s position, and while I agree that both sides are being thoroughly and depressingly irrational, I don’t agree that these people are incapable of rationality. After all, the great majority of them navigate their way through life, through work and marriage and child-rearing and all sorts of other things, without catastrophic error.

A. Well, I don’t —

B. Before you say something like Look at the divorce rate or Look at the anti-vaccination weirdos, stop and ask yourself whether the meritocratic elite you want to put in charge do any better in these areas. Remember, it’s among that elite that Soylent is a thing.

A. You may have a point there.

B. Yeah. But even if I don’t, it’s still my turn. And here’s my chief point: we have a paradox to explain. That paradox is that it’s easy to find people who have long personal histories of behaving fairly rationally – not perfectly so, but fairly so – who are nevertheless bizarrely irrational in their political opinions and their voting. So, as I read the evidence, the problem cannot be that “these people are simply incapable of rational thought.” The problem must lie within the structures of our political system.

Now, you and I agree that the current political system is messed up, but you attribute that mess to a failing or a limitation in the nature of most human beings, who in your view amount to little more than the higher cattle – as Nietzsche says, who envy the cow his unreflective “unhistorical” existence. But –

A. I really don’t want to say that. Perhaps my frustration with the pointlessness of recent political discourse, especially on social media, led me to formulate my thoughts a little too crassly. Let me, briefly, try again: Most people are not irrational per se, but their lack of interest in the actual problems and issues of governance, their preference for other activities, a preference that they disguise from themselves by uttering strong political opinions, disqualifies them from governing themselves. They will be happier and the world will run better if –

B. If running the world is left to the experts?

A. Yes. Treating expertise as though it’s frightening or absurd or impossible is one of the pathologies that we need to get over.

B. Okay. Your amendment is a good one. But it doesn’t alter my case, the making of which I shall now resume. You’re not saying that most people are irrational, but you are saying that they are congenitally unlikely to take sufficient interest in actual politics to learn what they need to know to vote wisely and well; and therefore will, perhaps not at first but eventually, be relieved to have political decisions taken out of their hands and given over to people who are qualified (by temperament, ability, and training) to make them and for whom the making of such decisions will be a full-time job.

A. Close enough.

B. What I would say, by contrast, is that people are interested in and even knowledgeable about politics – but politics only on a human scale, a scale appropriate to the range of their experience and interests. Many, perhaps most, of the pathologies of our current political order are products of inhuman scale, what one of our best poets called “the long numbers that rocket the mind.” Consider this: people in Boston will consider themselves invested in what happens in San Bernardino in ways they absolutely are not in what happens in, say, Dublin – and yet San Bernardino and Dublin are pretty much equidistant from Boston. People in Boston can’t alter events in San Bernardino any more than they can alter events in Dublin, and yet the ideology of the modern nation-state makes them feel that they can and should have some say in what happens 3,000 miles away, as long as that distant place is in the same country. In one sense, of course, as Terence reminds us, Homo sum, humani nihil a me alienum puto; but in another sense 3,000 miles of separation stretches the possibilities of genuine understanding beyond their natural capacity, and social media and video don’t change that situation.

And this takes me back to my earlier comment about your emphasis on “exit not voice”: the exit-only option might make sense if I dislike life in Waco and decide to try a more congenial environment in Austin; it will make less sense if I live in Topeka and decide that the whole American social order is so flawed that I need to move to Toronto. Exit-not-voice is great if you have either (a) tons of money or (b) a human scale. But then, if you consider matters on a genuinely human scale, the whole question of voice looks very different. Because just leaving Topeka may not seem such a rational option, even though I despise the whole American social order, if my mom lives there.

Francis Bacon, High Noon, and the Student Protestors

Gary Cooper and Grace Kelly in "High Noon"

This is a kind of scholarly-geeky follow-up to my earlier post on the ways that humanistic study can — doesn’t always or necessarily, but can, if the good will is there — promote compassion, fellow-feeling, mutual recognition. But if the good will is not there, and there is no curriculum of study in place to promote it, encourage students to consider its value … well, then you get protests and demands.

This is not to say protests and demands have no place in social life — but it’s pretty clear here that on many campuses today demands, and stringent demands at that, are the first recourse: we may take as a fairly representative example Dean Mary Spellman of Claremont McKenna College, whose obviously sympathetic email to a troubled student merely led to her forced resignation. That’s one side of this demand culture — zero tolerance, single sanction (expulsion from the community) — and the other side is the attempt to herd faculty as well as students into cultural/racial/sexual sensitivity courses, in order, ideally, to make such errors impossible.

A lot that can be said about this has already been said: that it’s an obvious repudiation of free speech, that it’s reminiscent of Maoist and Stalinist re-education programs, that it’s an error-has-no-rights model, and so on. And I don’t strongly disagree with any of these arguments, though I think I have more sympathy with at least some of the protestors than many of my fellow conservatives. But what strikes me about the whole approach is how … well, Baconian it is. Sir Francis Bacon, not Francis Bacon the painter or Kevin Bacon. Bacon’s early attempts at inaugurating what we would now call the scientific method, and extending it into the whole of philosophical reflection, seem to me to prefigure rather eerily the thoughts of the student protestors.

As I said, this is a geeky sort of take on the whole business. But please bear with me.

In his New Organon of 1620, Bacon lays out his method for inquiry. He begins by stating, “I propose to establish progressive stages of certainty,” and agrees that many of the medieval scholastics who emphasized the power of logic wanted to do the same thing. He doesn’t think they went about it the right way, but he and they have this in common: “they were in search of helps for the understanding, and had no confidence in the native and spontaneous process of the mind.” That’s true of our modern protestors as well: “the native and spontaneous process of the mind” — the white male mind anyway — is notoriously unreliable.

With that agreement in mind, let’s turn to how Bacon differentiates himself from the logicians:

But this remedy comes too late to do any good, when the mind is already, through the daily intercourse and conversation of life, occupied with unsound doctrines and beset on all sides by vain imaginations…. There remains but one course for the recovery of a sound and healthy condition — namely, that the entire work of the understanding be commenced afresh, and the mind itself be from the very outset not left to take its own course, but guided at every step; and the business be done as if by machinery.

Emphasis mine. The mind of someone like Mary Spellman is “occupied with unsound doctrines and beset on all sides by vain imaginations”: she can only be cast out, so the community can turn to the more promising work of educating younger people. They will “be guided at every step”; and since the lessons they have to learn are fixed and invariant, “the business [may] be done as if by machinery.”

In a vitally important essay that I referred to in an earlier post on this constellation of issues, the philosopher Charles Taylor calls this Baconian habit of mind “code fetishism” or “normolatry.” And in summarizing the thought of Ivan Illich, Taylor develops his point:

Even the best codes can become idolatrous traps that tempt us to complicity in violence. Illich reminds us not to become totally invested in the code — even the best code of a peace-loving, egalitarian variety — of liberalism. We should find the centre of our spiritual lives beyond the code, deeper than the code, in networks of living concern, which are not to be sacrificed to the code, which must even from time to time subvert it.

This probably won’t be the last time I revisit these ideas by Taylor, because they are so vital for understanding multiple pathologies of our current public square. This Baconian disciplinary “machinery,” executing the normolaters’ preferred codes, simply eliminates the possibility of strengthening our “networks of living concern.” The code-fetishist model must be resisted at every turn, and the people best placed to do that are people who have been formed within a deeply humanistic model of inquiry and debate. Nothing is more needful in the current campus environment than a renewal and re-empowerment of the humanities.

There’s a great moment in the movie High Noon — a moment that keeps recurring to my mind as I think about the current conflicts on campus. Despite having promised his fiancée that he will leave town with her, Marshal Will Kane decides he has to stay to fight the Miller gang, who are about to terrorize the town. He makes a strong case, but she — vitally, a Quaker — says, “I don’t care who’s right and who’s wrong, there has to be a better way for people to live.”

Those words keep echoing in my head (I even hinted at them in my previous post on these matters): I don’t care who’s right and who’s wrong, there has to be a better way for people to live. Alas, the way things play out in the movie doesn’t give me a lot of encouragement.

Dialogue on Democracy, Part 6

Claude, View of Carthage with Dido and Aeneas

(Previous installments: One, Two, Three, Four, and Five.)

A. Whoa, that was quite an interruption. Where were we?

B. You were talking about voice versus exit.

A. Ah yes, thanks. So: one of the ways we might define democracy is as an irrational obsession with voice — with “having a say.” As I have tried to argue, most people don’t know enough, and don’t care enough, to deserve a say; but even those who do have some knowledge and some interest typically disagree with one another often enough and seriously enough that the work of ordering a society is compromised, disrupted, sometimes paralyzed – as, for instance, when the U.S. government shuts down because Congress can’t agree on a budget. This is, as the saying goes, no way to run an airline. What we need to do is overcome this irrational obsession with having a say, and instead place the emphasis on the power of exit. I have a good bit of sympathy with those who argue that exit is the only truly universal human right: the ability to say, “I’ve had enough of this crap, I’m getting out of here.” People who have the right of exit have a very powerful right.

B. But what if you live in Kansas?

A. Huh?

B. Let’s be realistic about this, shall we? The right of exit is one that doesn’t make a whole lot of sense if the only door out is a couple of thousand miles away. You call it a universal human right, but how can it be a right if only those who have a sizable pile of cash are able to take advantage of it? It’s interesting that this argument – the argument that exit is the fundamental human right – tends to come from the seasteaders, from people who have enough money (or hope to get it) that they can actually imagine creating an offshore community, a city floating in the Pacific, that they can live on. And of course that would also mean having enough of a savings account to buy whatever goods can’t be produced out there. It’s ridiculous. It’s Fantasy Island for rich people.

A. No, it doesn’t have —

B. Don’t you think it might be my turn now?

A. Fair enough. Do your worst.

B. Thanks. Okay, so, furthermore — and to me this is a major problem with your whole way of thinking — you’re assuming completely deracinated human beings. The people you have making decisions, whether those decisions involve how to run this imaginary neo-reactionary world or how to leave it, are all just fleshy calculating machines, running the numbers to achieve maximum efficiency according to some totally abstract utilitarian calculus. In your world it wouldn’t even make any sense for someone to say “I don’t like the government that I have right now, but I love this place and don’t want to leave it.” What if we have, as Simone Weil argued, a need for roots? What if, as Roger Scruton argues, piety is fundamental to human flourishing?

A. Piety?? Haven’t heard that term in a while.

B. Piety in the sense that Virgil refers to his hero as “pious Aeneas”: pietas, devotion — devotion to something bigger than my own preferences. You can’t build Rome without piety, or sustain it once it’s built; you can’t even build your Fantasy Island in the Pacific without some degree of that virtue.

A. Actually, I think you can build that Fantasy Island in the Pacific – if that’s your thing; I’m not saying it’s my thing – without piety as you have described it. But never mind. Let’s think about Virgil’s Aeneas. He didn’t really have a lot of choice about his piety, did he? Every time he was in danger of forgetting it one of the gods came down to kick his ass and get him back on the right track. Pietas is all well and good in a world monitored and disciplined by the gods. We don’t live in that world, and I think all too often what we call “attachment to place” is really just nostalgia for a world we don’t live in any longer, if indeed we ever really did.

B. Oh, I think we live in the same world our ancestors did, overseen by the same deity or deities that have always been here. Isn’t it interesting that our conversation has taken us into the realm of metaphysics? It turns out that if you argue about politics long enough you get to something deeper than politics. But I think we can keep the metaphysical arguments at arm’s length at least for a while longer. It’s sufficient to ask whether human beings can flourish in the kind of moral and political world you are imagining, without pressing the question of why they need certain conditions in order to flourish. (It would be possible to give purely naturalistic, sociological, evolutionary reasons why people need roots and are rightly driven by piety.) It’s enough, I think, for me to insist that in your utopia people would be miserable. And they would be miserable because they would lack a voice, though not, perhaps, in the sense in which the neo-reactionaries typically talk about voice.

Compassion and the Humanities

Flickr Commons

The word “compassion” means to “suffer with.” It presumes fellow feeling, based on a shared experience of what Hamlet called “the thousand natural shocks that flesh is heir to” — and also, alas, of the unnatural shocks that we humans inflict on one another. That this is a reasonable, indeed an essential, way of thinking and feeling is intrinsic to the idea of the humanities, the study of what is human. It is within this moral frame that Terence’s line, clichéd though it has become over the centuries, becomes so vital: Homo sum, humani nihil a me alienum puto — I am human, and nothing human is alien to me.

To be sure, few sentences can be more wounding to the one who suffers than the casual “I know just how you feel.” No, you damned well don’t is the instinctive response, and often rightly so. But here Terence comes to our aid: he doesn’t say “Everything human is fully comprehensible to me,” but rather, “Nothing human is alien to me”: nothing that my fellow humans experience is beyond my ability to recognize, to have some understanding of — to have compassion for.

In The Doors of Perception, Aldous Huxley writes,

By its very nature every embodied spirit is doomed to suffer and enjoy in solitude. Sensations, feelings, insights, fancies—all these are private and, except through symbols and at second hand, incommunicable. We can pool information about experiences, but never the experiences themselves. From family to nation, every human group is a society of island universes.

Most island universes are sufficiently like one another to permit of inferential understanding or even of mutual empathy or “feeling into.” Thus, remembering our own bereavements and humiliations, we can condole with others in analogous circumstances, can put ourselves (always, of course, in a slightly Pickwickian sense) in their places. But in certain cases communication between universes is incomplete or even nonexistent.

Genuine compassion is therefore an achievement, something gained only by disciplined attentiveness, not to “the other,” that empty abstraction, but to some particular other. To a neighbor. (“My neighbor,” Kierkegaard dryly commented, “is what philosophers call The Other.”) And the greater the suffering of that neighbor, the more rigorous must our attention be if we are to reach some understanding. This is one of the great themes of Simone Weil’s writing, especially in her powerful, spiritually intimidating meditation on “The Love of God and Affliction.” But — and Weil makes this clear too — as great as the challenge is, we must always hold the possibility of compassion before us, because the personal and social costs of neglecting or refusing it are catastrophic.

For the last 25 years or more, what used to be the humanistic disciplines have ignored the vital Terentian claim that nothing fully alienates one human from another. It might be better to say that they have betrayed it, which is why I insist that they “used to be” the humanistic disciplines. I don’t have a name for what they are now. Not by acknowledging the power and shaping force of race and gender and sexual orientation and culture, but by treating them as a series of hermetically sealed boxes, they have made it increasingly difficult, if not impossible, for their students to see themselves as sharing common experiences and common pursuits. (No one who has even the slightest understanding of how cultures actually operate in relation to one another — a ceaseless interchange of ideas, visions, experiences, and techniques — could take the notion of opprobrious “cultural appropriation” seriously.)

If you cannot see your fellow students, or colleagues, as engaged in a common and intrinsically human search for knowledge — maybe even wisdom — then you will have no incentive to cross the boundaries of race, gender, or culture. You will in the end have no incentive to cross the boundaries of your narrow little self. You may occasionally speak the language of “alliance,” but you will define your allies as those who obey your demands. And, in times of conflict, it will be characteristic of your stance towards the world to make demands.

It’s a way to live. But it’s not a good way to live, and it exacts a heavy toll on everyone involved. If our young people are going to see that there are less confrontational alternatives, something other than zero-sum games, they’ll need instruction in the humanities. Some of us are prepared to give it.


A Farewell to Roger

In descending order of size: Roger, me, Roger's son Tom, my son (and Roger's godson) Wesley. Christmas Day 1996.

When I came to teach at Wheaton College, in the fall of 1984, on a one-year contract, I had been a Christian for about five years, knew almost nothing about the evangelical subculture, and was a theological novice. My one virtue was that I had some inkling of how little I knew. So I sought out Roger Lundin.

That year, it should be remembered, was the apogee of the Age of Theory, or near enough, and I had just completed my training in at least some of its intricacies. But when I arrived on Wheaton’s campus I quickly learned that this was not perceived as an accomplishment, but rather a cause for suspicion. Only Roger — who had himself been trained in a long-familiar makeshift blending of historical and New-Critical practices, but had been seeking, without much help or encouragement, to educate himself in the newer directions of our discipline — was a possible conversation partner. I read a few articles he had written and realized that, like me, he believed that within the Christian traditions there were resources for listening and then responding to the voices of critical theory; but, unlike me, he was already prepared by theological training to do this work. So I sought him out.

I would have sought him out anyway; his warmth and kindness drew me, as they drew so many others, towards him. We began talking our way through these issues, and through so much else. He introduced me to his longtime friend Mark Noll, whose office was then four floors up from ours, later a more convenient one floor down; Mark became my second theological teacher and instructor in the ways of the Christian mind. Roger and I prayed together and laughed a great deal. We created an organization called Slow White Men of America, of which we were the charter and only members. Once we spent a morning at Borders where I came across a book on a display table called Hollywood Priest: A Spiritual Struggle. I held it up before Roger solemnly, and he said, “Big Al” — he always called me Big Al, a somewhat comical thing given the disparity in our sizes — “I understand how people can be horrified at living in this country, but I do not understand how anyone could ever be bored.”

I am strolling down memory lane here, but this is appropriate. Anyone who knew Roger at all well noted how profoundly memorial, as well as historical, his imagination was. I may have had a very slightly better verbal memory than he did, which means that I could sometimes remind him what someone had said in a given long-ago conversation — at which point he would tell me precisely when that conversation had happened. Were he reading this, he would cheerfully rattle off the year, month, and day on which we paid that visit to Borders, what the weather had been, what headlines had been featured in the Trib, and how Chicago’s professional sports teams had performed.

This was a kind of parlor trick, and an invariably impressive one, but for Roger the cultivation of memory, both personal and cultural, was an essential spiritual discipline, and one which Americans, in our haste always to fare forward, tend very much to neglect. We strain into the future, but, as Fitzgerald reminds us, “we beat on, boats against the current, borne back ceaselessly into the past.” It was Roger’s distinctive calling as a teacher and scholar to encourage us to embrace that backwards pull, to use it to help us understand where we have come from, and to do honor to those who went before us. As we read in Ecclesiasticus: “Leaders of the people by their counsels, and by their knowledge of learning meet for the people, wise and eloquent are their instructions: … All these were honoured in their generations, and were the glory of their times.” Roger, who died suddenly last week, has now joined this great company.

The last communication I received from Roger, several weeks ago, concluded with these words:

Thanks, deep thanks, for your kindness, which is rooted in a friendship that now reaches back 31 years!

Rog

A Farewell to Brett


My beloved friend Brett Foster never wanted to leave anything behind. My wife Teri took this picture of him on the streets of Durham, I believe, in the summer of 2011, when we spent the summer in England leading a study tour. You can see the bag hanging from his shoulder, and the scarf, and the top of the coffee cup, but not the enormous backpack that’s sitting just outside the frame. He was always encumbered in this way, because he never wanted to leave anything behind that he might need.

Now he has left us all behind.

Wheaton College’s Pierce Chapel was not made for funerals, especially Anglican ones, but it served, and hundreds of Brett’s friends arrived there on Saturday to commend his soul to God. November in Chicagoland is typically an unkind month, but the day was sunny and clear and warm enough that someone had opened windows in the chapel. Father Martin Johnson, the rector of All Souls’ Church, where Brett and I and our families were members for a decade, had brought the church’s butcher-block altar over and set it up in front of the stage. That altar had been made for the church by David Hooker, who became one of Brett’s dearest friends. Martin was grieving hard and perhaps working even harder — to make the day right for Brett, and for us.

After the tributes had been made — beautiful words, by some of the many who loved Brett — it was time for Communion. Brad Cathey had made the loaves and laid crosses upon them before baking, and Father Martin and Father Paul stood before the long lines and gave the bread of life, with the words of life.


But then some small chunks of bread fell to the ground at Martin’s feet. One piece was stepped on. I noticed that the poet Scott Cairns, who was sitting right in front of me, saw this, and was looking on in some discomfort. Then, when the last communicant had been served, Scott and I at the same moment hunched forward, knelt, picked up every piece, and ate — I quickly and in some embarrassment, he, back in his seat, slowly and with reverence. This is my body, broken for you.

It was time for the final commendations. Martin circled Brett’s coffin with the censer. The sun was low enough now that its light slanted through the open windows, and I could see by that light the smoke of the incense caressing the embroidered pall that covered the coffin. A quiet breeze wafted it away as Martin, his censing completed, stood directly before the coffin. Then he bowed and, with infinite gentleness, kissed it.

We moved then to the cemetery, to commend Brett’s body to the earth, in the hope of the resurrection. As Brett’s wife Anise, and their children Avery and Gus, and Brett’s mother Suzanne, and Anise’s mother Sharie, and all the rest of the family and the great circle of friends looked on, Martin spoke the ancient and beautiful words, and, according to custom, cast the first handfuls of earth onto the coffin.

Though I love those words, I did not, this time, hear them. I looked around and saw that Brett’s corner of the cemetery was surrounded mostly by pines, though one old oak reached out a limb over the grave. It still had its leaves; soon it will shed them over the broken earth there. I remembered the words that Scott Cairns had read earlier, in the time of tributes at the chapel, words that he had written years ago for his father but that could not be more right for this grievous occasion, full of its own strange hope.

And this is the consolation — that the world
doesn’t end, that the world one day
opens up into something better, and then we
one day open up into something far better.

Maybe like this: one morning you finally wake
to a light you recognize as the light you’ve wanted
every morning that has come before. And the air
itself has some light thing in it that you’ve always

hoped the air might have. And One is there
to welcome you whose face you’ve looked for during all
the best and worst times of your life. He takes you to himself
and holds you close until you fully wake.

And it seems you’ve only just awakened, but you turn
and there we are, the rest of us, arriving just behind you.

We’ll go the rest of the way together.

Fear, Safety, and Education


The books above are most of the ones I’ve assigned for my Great Texts of the Twentieth Century course (missing are Art Spiegelman’s Maus and the daily poems I’ll be reading to my students). It’s a pretty heterogeneous group, but taken together these books touch on a great many of the key issues of our time, and those of all times: not just racism, sexism, and colonialism, but also the rise of biological science as First Philosophy, the various ways cultures constitute identity, the furthest reaches of human barbarity, the transformation of culture by electronic media, and the miraculous power of the writings of a first-century Jew to illuminate and interpret modern consciousness. Something to offend everyone, you might say.

And this is the point, or at least one of the chief points. Here at Baylor, I want my students—most but not all of whom are Christians; some are simply unbelievers, some are uncertain and struggling—to encounter the texts, and through those texts the experiences, that served to undermine Christian faith and practice in the twentieth century. But I also want them to encounter Christians (Barth, Merton, Bonhoeffer, Eliot, Auden, Simone Weil in her unique way) for whom the twentieth century’s challenges provided an impetus to re-think and re-live Christianity in fresh ways.

I could, of course, work to protect them from this violent clash of powerful and contradictory ideas; I could—I am free to do this—build a syllabus that focused on Christian writers and perhaps other religious believers and presented anti-religious writers whose work is cartoonish or in other ways simplistic. And perhaps if I did that some of my students would feel safer. But that, I am convinced, would be a false sense of safety, and would leave them underprepared for an adult world in which their ideas and beliefs will receive daily challenges. What kind of teacher would I be if I let that happen?

Frank Furedi writes:

There is one point on which the crusade for the imposition of trigger warnings is absolutely right. It is not for nothing that reading was always feared throughout history. It is indeed a risky activity: reading possesses the power to capture the imagination, create emotional upheaval and force people towards an existential crisis. Indeed, for many it is the excitement of embarking on a journey into the unknown that leads them to pick up a book in the first place.

Can one read Proust’s In Search of Lost Time or Tolstoy’s Anna Karenina ‘without experiencing a new infirmity or occasion in the very core of one’s sexual feelings?’ asked the literary critic George Steiner in Language and Silence: Essays 1958–1966. It is precisely because reading catches us unaware and offers an experience that is rarely under our full control that it has played, and continues to play, such an important role in humanity’s search for meaning. That is also why it is so often feared.

And reading should be feared; particular books and authors should be feared. But it is not always—indeed, it is not often—best to flee from what we fear. Better to master the fear, to approach what scares us, but to do so with care and preparation and in an environment where those around you wish you well.

My dear friend Brett Foster, whose death earlier this week hangs over me heavily, wrote a poem that I have been thinking about a great deal. It’s called “Back-to-School Rondeau”, and I think it beautifully describes the fears and excitements of genuine education, the ways the pursuit of knowledge takes up and involves the whole of our being. I’ll leave you with it.

It’s almost time to set aside the waning
distractions of first youth, the life contained
for years at home. What’s home? The place you grow
out of, everything receding slowly,
fading like a chalked sidewalk in the rain.
Leave childish things behind, said a certain
fellow. (Others afterward.) Don’t remain:
the friends gone late in summer let you know
it’s almost time.

Don’t leave behind new clothes, impromptu plans —
they’ll match surroundings well, remind again
of shining coming: new homes to let go
of, too; the best things said; mind’s overflow;
surprising callings; time for love, and pain.
It’s almost time.


Safe Homes and Public Individuality at Yale

Silliman College at Yale University Sage Ross/Flickr

Richard Rodriguez, in his great memoir Hunger of Memory, movingly recounts the day when a nun, one of his teachers at his parochial school in Sacramento, asks his Mexican parents to speak English at home in order to encourage Richard to improve his English, to be more confident speaking it in school. He was in first grade.

Not a request that would be made today, I suspect. His parents agreed, of course—a nun had asked them! And while young Richard missed very much the sounds of Spanish at home, his English did get better. He became more comfortable at school; indeed, eventually his public identity came to be closely associated with his academic success. And he became strangely grateful for that nun’s request. It set him on the road to manhood: “I became a man by becoming a public man.”

This experience (and others like it) led eventually to Rodriguez, as a graduate student, becoming notorious for his opposition to bilingual education programs. He may or may not have been right in that opposition—there should be, and there is, serious debate about when young people need to make that essential transition from the private comforts of home to the sometimes challenging but also rewarding demands of public life—but no one has ever articulated more precisely the essential principle at stake here:

While one suffers a diminished sense of private individuality by becoming assimilated into public society, such assimilation makes possible the achievement of public individuality.

I think Rodriguez’s point is essential for understanding the current kerfuffle at Yale University, where students and alumni are—track with me here—demanding the resignation of the master of a college and the assistant master because the master is refusing to apologize for not having exercised dictatorial authority over other students’ Halloween costumes. To most outside observers this will seem pretty silly. But let’s ask where the kind of reaction the students and alumni are having comes from.

The key may be found in this op-ed in the Yale Herald, significantly titled “Hurt at Home.” Jencey Paz writes,

As a Silimander, I feel that my home is being threatened. Last week, Erika Christakis, the associate master of Silliman College, sent an email to the Silliman community that called an earlier entreaty for Yalies to be more sensitive about culturally appropriating Halloween costumes a threat to free speech. In the aftermath of the email, I saw my community divide. She did not just start a political discourse as she intended. She marginalized many students of color in what is supposed to be their home.

But Silliman College is not “supposed to be their home.” It is a residential college in a university, a place where people from all over the world, from a wide range of social backgrounds, and with a wide range of interests and abilities, come to live together temporarily, for about 30 weeks a year, before moving on to their careers. It is an essentially public space, though with controls on ingress and egress to prevent chaos and foster friendship and fellowship.

It is possible, of course, that Yale sells its residential college system to students as a kind of “home”; I don’t know. The official description seems to me to strike an appropriate note without over-promising: “The residential colleges allow students to experience the cohesiveness and intimacy of a small school while still enjoying the cultural and scholarly resources of a large university; the residential colleges do much to foster spirit, allegiance, and a sense of community at Yale.”

Now, to be sure, this “cohesiveness and intimacy” can for some students be very powerful—their college can even be a better and healthier environment for them than their actual home. The great theater critic Kenneth Tynan loved Magdalen College, Oxford (where C.S. Lewis was his tutor) so much that he wanted his ashes to be interred there. But it was not his home, and could not have been, because there were other people there who didn’t even know him, or who knew him but didn’t like him, or whose preferences were radically different than his, and who had no long-term bond with him to force them to come to some mutually agreeable terms beyond basic tolerance for three years or so.

Residential colleges have long been defended as transitional spaces between the world of home and a fully independent adult life, and it would be a great mistake to think of them as merely continuing the ethos of home. That would leave young people totally unprepared for that “adult life,” which I think we might, for the purposes of this discussion, define as that period of one’s existence during which there is no one to run to to demand control over other people’s Halloween costumes. When one only has, to return to Rodriguez’s terms, “private individuality,” it is quite natural, if not altogether admirable, to seek out an authority figure when someone’s holiday costume offends you. But by the time one gets to college one’s “public individuality” should be sufficiently developed that the wearing of costumes should be seen as an essentially trivial matter that students can deal with among themselves. If they can’t, then the university needs to acknowledge that they’re dealing with some serious cases of arrested development.

Let me wrap this up by simply repeating a passage from a post I wrote some months ago: In a fascinating article called “The Japanese Preschool’s Pedagogy of Peripheral Participation,” Akiko Hayashi and Joseph Tobin describe a twofold strategy commonly deployed in Japan to deal with preschoolers’ conflicts: machi no hoiku and mimamoru. The former means “caring by waiting”; the second means “standing guard.” When children come into conflict, the teacher makes sure the students know that she is present, that she is watching—she may even add, kamisama datte miterun, daiyo (the gods too are watching)—but she does not intervene unless absolutely necessary. Even if the children start to fight she may not intervene; that will depend on whether a child is genuinely attempting to hurt another or the two are halfheartedly “play-fighting.”

The idea is to give children every possible opportunity to resolve their own conflicts—even past the point at which it might, to an American observer, seem that a conflict is irresolvable. This requires patient waiting; and of course one can wait too long—just as one can intervene too quickly. The mimamoru strategy is meant to reassure children that their authorities will not allow anything really bad to happen to them, though perhaps some unpleasant moments may arise. But those unpleasant moments must be tolerated, else how will the children learn to respond constructively and effectively to conflict—conflict which is, after all, inevitable in any social environment? And if children don’t begin to learn such responses in preschool when will they learn it?

Imagine if at university they had developed no such abilities and were constantly dependent on authorities to ease every instance of social friction. What a mess that would be.

Alan Jacobs is a Distinguished Professor of the Humanities in the Honors Program at Baylor University in Waco, Texas, and the author most recently of The Book of Common Prayer: A Biography.

On Conservatives in the Academy, Once More

hxdbzxy/Shutterstock

Damon Linker, with whom I seem to be debating a lot these days:

Professors in the humanities and social sciences engage in highly specialized research, attempting to push knowledge into new areas — and many view this effort as a project that involves and requires liberating individuals from the dead weight of received prejudices.

The result is that academics usually end up pursuing scholarly agendas that are the furthest thing from anything that could be described as “conservative.” The imperative to advance knowledge demands that research contributes something new. Meanwhile, the tendency to relegate all received truth claims to the category of prejudice leads to suspicion even of the established findings of the previous generation of scholars.

This would be a more convincing argument if “the established findings of the previous generation of scholars” were never oriented towards or tended to reinforce political liberalism; and if liberals had a tendency to relegate their own preferred truth claims, the ones they have received from previous generations of liberals, to the category of prejudice. (I’m using “liberalism” in a loose and conventional sense here; one could of course ask whether liberalism is really liberal, and so on.)

Linker wrote in a recent column devoted to defining contemporary conservatism that “If you ask conservatives what this comprehensive moral outlook consists of, they’ll likely say one of several things: Devotion to individual freedom. Constitutionalism. A concern with limited government. Fear of tyranny.” In many different academic disciplines, there’s an enormous body of scholarly work that, in the view of the conservatives Linker describes, neglects or attacks these values. Wouldn’t conservatives then have an inclination to critique that scholarship and defend those values? Would they be piously reverent towards scholarship that endorses, and often enforces, liberalism simply because it’s “established”?

On the other side of the fence: as Linker acknowledges, an essentially if vaguely liberal outlook on the world simply is the default position of the majority of professors in the majority of academic disciplines, and has been for several professorial generations; so where among these liberals is the suspicion of the past that Linker identifies with liberalism? It turns out that such suspicion is highly circumscribed: it’s limited to some findings of previous generations of scholars, not extended to those scholars’ overall political outlook.

Linker wants to argue that conservatives have, because of their core values, selected themselves out of the academy. And no doubt that happens sometimes. But a great many more who would like to make a contribution are ruled out — quietly and behind the scenes — for exactly the same reason that genuinely radical leftists are ruled out: they would rock our gently swaying boat, here on our calm, calm lake.
