To understand why I think disciplinary originalism is valuable, let’s take a look at one of the most widely condemned of SCOTUS decisions, Korematsu v. United States. In Korematsu the court allowed the practice of evicting United States citizens, often native-born citizens, from their homes and moving them away from the West Coast simply because they were of Japanese descent. The vote was 6-3, and each of the justices who voted in favor of the executive order that mandated the evictions was appointed by President Roosevelt, the man who issued that order. (In a separate but closely related case, the Court ruled that such citizens could not be “detained,” thus depriving the internment camps for Japanese-Americans of legal sanction.)
The chief interest of Korematsu, for today’s reader of the history, is the dissent by Justice Robert Jackson, later to become the Chief Prosecutor at the Nuremberg Trials. In the first stage of his dissent — which you may see in full by going here and scrolling about three-fourths of the way down — Jackson points out that Fred Korematsu was a natural-born citizen of the United States whose loyalty to his country had never been questioned by anyone. He was merely living and working in the place of his birth, but was by the Executive Order obliged to turn himself in to military authorities — an obligation that he would not have faced had he been even “a German alien enemy, an Italian alien enemy, [or] a citizen of American-born ancestors, convicted of treason but out on parole.” Yet he was different from those others “only in that he was born of different racial stock.” Jackson continues:
Now, if any fundamental assumption underlies our system, it is that guilt is personal and not inheritable. Even if all of one’s antecedents had been convicted of treason, the Constitution forbids its penalties to be visited upon him, for it provides that ‘no Attainder of Treason shall work Corruption of Blood, or Forfeiture except during the Life of the Person attainted.’ Article III, § 3, cl. 2. But here is an attempt to make an otherwise innocent act a crime merely because this prisoner is the son of parents as to whom he had no choice, and belongs to a race from which there is no way to resign.
This point would have been sufficient in itself to declare Roosevelt’s order unconstitutional, but Jackson discerned a larger and greater issue at stake:
Much is said of the danger to liberty from the Army program for deporting and detaining these citizens of Japanese extraction. But a judicial construction of the due process clause that will sustain this order is a far more subtle blow to liberty than the promulgation of the order itself. A military order, however unconstitutional, is not apt to last longer than the military emergency. Even during that period a succeeding commander may revoke it all. But once a judicial opinion rationalizes such an order to show that it conforms to the Constitution, or rather rationalizes the Constitution to show that the Constitution sanctions such an order, the Court for all time has validated the principle of racial discrimination in criminal procedure and of transplanting American citizens.
Jackson’s point here is exceptionally acute: this is not a matter of “rationalizing” — giving an implausible intellectual account of — the order, but rationalizing the Constitution. Which is a far more dangerous move.
The principle then lies about like a loaded weapon ready for the hand of any authority that can bring forward a plausible claim of an urgent need. Every repetition imbeds that principle more deeply in our law and thinking and expands it to new purposes. All who observe the work of courts are familiar with what Judge Cardozo described as ‘the tendency of a principle to expand itself to the limit of its logic.’ A military commander may overstep the bounds of constitutionality, and it is an incident. But if we review and approve, that passing incident becomes the doctrine of the Constitution. There it has a generative power of its own, and all that it creates will be in its own image. Nothing better illustrates this danger than does the Court’s opinion in this case.
People are often automatically dismissive of “slippery-slope” arguments, as though no slopes are ever slippery; but once a metaphor is dead it’s dead. Justice Cardozo’s phrasing may be more useful: “the tendency of a principle to expand itself to the limit of its logic.” This tendency is almost inevitable in SCOTUS decisions, because of the power of precedent: only rarely is a decision walked back; rather, a “passing incident” very easily and naturally “becomes the doctrine of the Constitution” when justices see different situations in which it can be applied. All the pressure is on one side, towards expansion rather than contraction of the principle.
Such expansion of a principle is all the more likely to happen when popular opinion, especially elite popular opinion, is also strongly on one side. FDR’s decision to move Japanese-Americans from their homes was quite popular (as were the internment camps), and eight of the Justices had the further pressure of owing their positions on the Court to Roosevelt. What they needed — but what only three of them had — was a jurisprudential principle substantial enough to make a counterweight to those pressures. All three of the dissenting justices had that principle, but it was most fully developed and articulated by Jackson.
Just a few months ago Justice Antonin Scalia was asked, by law students at Santa Clara University, which Supreme Court opinion he most admired. He named Jackson’s dissent in Korematsu.
There are two very different ways to think of Antonin Scalia’s preferred method of interpreting the Constitution, often called originalism—and by the way, that Wikipedia page is unusually accurate and useful, though perhaps skewed a bit towards critics of originalism.
One might think of originalism as a method, or one might think of it as a discipline. If you conceive of it in the former way, you will run into some serious problems; if you conceive of it in the latter way, it is profoundly salutary. In a recent critique of Scalia, Laurence Tribe uses the words “method” and “methods” seventeen times — he can’t conceive of originalism in any other way. But there is another way.
Originalism as a method is unmanageable for several reasons, all of which stem from the essential and unavoidable condition of Constitutional interpretation, which is to apply to legal situations today a Constitution that was written more than two centuries ago. For one thing, it often requires justices, even after thorough and detailed research, to guess (infer, intuit) what the Framers might have thought about what’s happening today. But how do you do that? In the strictest sense, the Framers imagined almost nothing that we are dealing with today, since our world is so different. So rigorous methodological originalism would make the Constitution irrelevant to the law today.
It is tempting to say that, since strict (or what we might call methodological) originalism removes the Constitution from current legal disputes, we must follow the living Constitution model instead. In that just-linked (and now-archaic, though still-relevant) article, Jack Balkin of Yale Law School writes, “We are all living constitutionalists now. But only some of us are willing to admit it.” But that’s true only if the “living Constitution” model is the only alternative to methodological originalism.
If the problem with methodological originalism is that it renders the Constitution effectively nugatory in current legal disputes … well, that’s the problem with the living Constitution model too. Because in practice what makes the Constitution “living” is that it says what we want it to say. Scholars like Balkin more-or-less explicitly endorse this stance: “There’s something deeply wrong with a theory of constitutional interpretation that treats some of the key civil rights decisions of the 20th century as mistakes that we are stuck with.” That is, those “key civil rights decisions” produce immensely valuable and just results — a point I absolutely agree with — and therefore must be good decisions.
As I commented in an earlier post on Scalia, Balkin’s essential jurisprudential principle might be summarized thus:
If a law produces, or seems likely to produce, an outcome that right-thinking people deem socially desirable, then that law is ipso facto constitutional; by contrast, if that law produces, or seems likely to produce, an outcome that right-thinking people deem socially undesirable, then that law is ipso facto unconstitutional.
But it’s hard to see how a “living Constitution” that is alive in this way is anything more than a re-animated corpse controlled by a console in the hands of SCOTUS. Balkin has tried to square this circle, but in a way that it seems to me makes virtually no concessions to the originalist view it claims to be taking seriously. And Tribe simply grasps the nettle: “I see [Scalia], with great respect, as a worthy adversary—but an adversary all the same—of the just and inclusive society that our Constitution and laws should be interpreted to advance rather than impede.” First you decide what you think a “just and inclusive” society is, and then you interpret the Constitution so that it endorses your views. In such a scheme the Constitution, and therefore our own national history, is rendered incapable of speaking back to us — of having its own voice rather than a dim echo of our own.
I confess to much ambivalence on this score. In a very important sense it would have been far, far better for the key social and legal decisions of the Civil Rights era to have been made by the legislative rather than the judicial branch. But our legislators, especially on the state level, were moving very slowly or not at all. And when I think about those who in those days counseled patience, I always hear the voice of Martin Luther King, Jr.:
We have waited for more than three hundred and forty years for our God-given and constitutional rights. The nations of Asia and Africa are moving with jetlike speed toward the goal of political independence, and we still creep at horse-and-buggy pace toward the gaining of a cup of coffee at a lunch counter. I guess it is easy for those who have never felt the stinging darts of segregation to say “wait.” But when you have seen vicious mobs lynch your mothers and fathers at will and drown your sisters and brothers at whim; when you have seen hate-filled policemen curse, kick, brutalize, and even kill your black brothers and sisters with impunity; when you see the vast majority of your twenty million Negro brothers smothering in an airtight cage of poverty in the midst of an affluent society; when you suddenly find your tongue twisted and your speech stammering as you seek to explain to your six-year-old daughter why she cannot go to the public amusement park that has just been advertised on television, and see tears welling up in her little eyes when she is told that Funtown is closed to colored children, and see the depressing clouds of inferiority begin to form in her little mental sky, and see her begin to distort her little personality by unconsciously developing a bitterness toward white people; when you have to concoct an answer for a five-year-old son asking in agonizing pathos, “Daddy, why do white people treat colored people so mean?”; when you take a cross-country drive and find it necessary to sleep night after night in the uncomfortable corners of your automobile because no motel will accept you; when you are humiliated day in and day out by nagging signs reading “white” and “colored”; when your first name becomes “nigger” and your middle name becomes “boy” (however old you are) and your last name becomes “John,” and when your wife and mother are never given the respected title “Mrs.”; when you are harried by day and haunted by night by the fact that you 
are a Negro, living constantly at tiptoe stance, never quite knowing what to expect next, and plagued with inner fears and outer resentments; when you are forever fighting a degenerating sense of “nobodyness” — then you will understand why we find it difficult to wait. There comes a time when the cup of endurance runs over and men are no longer willing to be plunged into an abyss of injustice where they experience the bleakness of corroding despair. I hope, sirs, you can understand our legitimate and unavoidable impatience.
A long passage, but one that can’t be reflected on too deeply or too often. The extended, throbbing sentence in the middle of that paragraph is as powerful an embodiment as I know of the pain of waiting, waiting, waiting, for a remediation of the grossest of injustices.
So I get what Balkin is saying when he notes that few of us, on the Left or on the Right, would want to undo “the key civil rights decisions” of the 20th century. But not all the decisions, not most of the decisions, made under the living-Constitution model have been as just or as commendable. Indeed, some of them have made a mockery of the Constitution and cannot possibly be defended in terms of legal reasoning, however desirable one might think the outcome.
So this is why disciplinary originalism matters. Disciplinary originalism understands that methodological originalism is unworkable because it makes the Constitution useless. But it also wants to allow the Constitution to speak to us, and to force us, when we are departing in some significant way from its principles, to go back to our legislators and change the laws — and amend the Constitution itself when necessary. Disciplinary originalism keeps us honest. It forces us to know what we’re doing, and not to console ourselves with the pretense that we are somehow in the Great Tradition of the Framers when we are in fact repudiating much of what they believed. It doesn’t tell us we can’t or shouldn’t dissent from the beliefs of the Framers; it just asks us to admit it openly when we do so.
Critics of Justice Scalia often accused him of inconsistency. And insofar as he was a methodological originalist he sometimes was inconsistent. But I think the heart of his jurisprudence was disciplinary originalism, and with his death the most powerful embodiment of that vital principle was lost. I do not think we shall look upon his like again. And that means that our Supreme Court will continue to make the kinds of decisions it has been making for decades, but will have no one on its bench to remind it of what it’s really doing. Antonin Scalia was the conscience of SCOTUS, and I don’t see how it’s going to get another one.
Here’s a fascinating post by Martin Filler about the history of staging Wagner’s operas:
The Wagner siblings’ [Wieland and Wolfgang, grandsons of the composer] starkly minimalist productions — in several instances the stage was left completely bare, and lighting alone defined some indoor or outdoor space — reflected two harsh realities: a lack of funding, and the need to avoid any representational feature that could be interpreted as political, given how compromised the Wagner legacy was. Significantly, these stripped-down stagings, which owed much to the reductive modernist concepts of the Swiss stage designer Adolphe Appia (1862–1928), had a purifying effect that was quite intentional in shifting audience attention toward the Ring’s penetrating psychological aspects and away from the impossible stage business — swimming mermaids, flying horses, a fire-breathing dragon, and a talkative bird, among other fantasies — that has always bedeviled directors.
Like the best abstract art, the New Bayreuth Style, as it was dubbed, allowed viewers to project their own interpretations onto a nearly blank canvas, and to draw illuminating conclusions of their own about the deeper meaning of what they saw before them. The pendulum theory of culture — which holds that action and reaction reflexively animate stylistic swings as broad as those from Rococo to Neoclassical and Pop to Minimalism — might indicate that after every last anachronistic dystopia has been exploited as a stand-in for Valhalla we might finally see a return to undistracting theatrical values more closely aligned with Wagner’s gloriously transcendent music of the spheres.
I appreciate Wagner’s music more than I enjoy it, but I resonate with Filler’s argument because I have a long-standing and oddly passionate interest in the relationship between constraint and creativity. I often think of Miles Davis, who developed his distinctively “cool” style of trumpet-playing — slow, no vibrato, muted more often than not — because he realized that he could never compete with the technical virtuosity of Dizzy Gillespie and refused to be a second-rate version of Diz.
Such creativity-from-constraint doesn’t always emerge from limitations of technique: there has rarely if ever been a more technically masterful artist than Picasso, but all he needed to make wonderful art was two pieces of an old bicycle — which I suspect was harder to get just right than one might at first think: when you only have two items to work with, their juxtaposition must be exact to create the desired effect. In a similar vein, I once heard a set designer comment that spare sets are far more expensive than elaborate ones, because in a spare set every item has to be perfect.
Many years ago I attended a performance, at the Chicago Lyric Opera, of Handel’s Samson — which is not an opera but rather an oratorio. However, the CLO had money in hand and decided to operify the performance, which they did primarily by placing a big wheeled cart on the stage, to which Samson was chained for most of the production. From time to time extras would come out and wheel the thing from one side of the stage to the other. That was all the dynamism the director could manage to generate, and the effect was to distract the audience’s attention from Handel’s music and focus it instead on the futility of the cart’s movements — which managed to drain every last ounce of energy from the room.
All this was especially sad because the singers were excellent and the music often inspired. If the CLO leadership had allowed the oratorio to be an oratorio — if the singers had just stood at the front of the stage and sung to us — I suspect it would have been a memorably wonderful evening, instead of a memorably dumb one. But the director did not trust the singers or the music. There’s a lesson to be learned from this, one that, to judge from Filler’s post, few contemporary directors of Wagner’s operas have yet learned.
Leader: Donald Trump used to be a Democrat.
Congregation: Trump voters don’t care.
Leader: Donald Trump is the spoiled child of a rich man.
Congregation: Trump voters don’t care.
Leader: Donald Trump has failed in business many, many times.
Congregation: Trump voters don’t care.
Leader: Donald Trump viciously exploits those who work for him.
Congregation: Trump voters don’t care.
Leader: Donald Trump created a “university” that was a massive scam.
Congregation: Trump voters don’t care.
Leader: Donald Trump won’t denounce the Ku Klux Klan.
Congregation: Trump voters don’t care.
Leader: Donald Trump verbally abuses women.
Congregation: Trump voters don’t care.
Leader: Donald Trump verbally abuses everyone else.
Congregation: Trump voters don’t care.
Leader: Donald Trump has almost no discernible policy stances.
Congregation: Trump voters don’t care.
Leader: Donald Trump says he will do things he can’t possibly do.
Congregation: Trump voters don’t care.
Leader: Donald Trump is a serial liar.
Congregation: Trump voters don’t care.
Leader: Donald Trump has no discernible religious or moral commitments.
Congregation: Trump voters don’t care.
Leader: Donald Trump has the emotional stability of a toddler.
Congregation: Trump voters don’t care.
Leader: Donald Trump knows nothing about how the government works.
Congregation: Trump voters don’t care.
Leader: Donald Trump is the least qualified man ever to run for President.
Congregation: Trump voters don’t care.
What I find fascinating about the punditocracy’s constant commentary on the Trump Phenomenon is this: Nothing about that phenomenon is difficult to understand. In fact, it seems to me that thoughtful commenters have reached a remarkable level of consensus about Trumpery. Almost everyone now understands that Trump’s constituency is composed largely of people who have been ignored by the Republican establishment; that they are angry about how peripheral they have become to professional politicians and to American society in general; that Trump effectively channels that anger; and that the more outrageous and offensive his rhetoric is the more convinced they become that he will speak for them and their concerns when no one else will.
But that doesn’t mean that no serious questions remain. Here are a few that I’m thinking about:
First, why did so few people see this phenomenon coming? Was it unpredictable, or was it perfectly predictable for anyone (on the political left, right, or center) who bothered to pay attention to that obscure-but-fairly-sizable constituency?
Second, how might the little society populated by politicians, political operatives, lobbyists, donors, and journalists—who, even when they disagree politically, share a Lebenswelt—develop a better understanding of the people who support Trump?
Third, is the GOP going to make a serious effort to win those voters back, or will they assume that Trump is a one-off and that they won’t have to worry about another similar figure arriving on the scene to siphon off voters they think are their rightful property? (Not incidentally, a similar question may be asked about how the Democrats will deal with Sanders supporters.)
And most important of all: to what extent is Trump a one-off, and to what extent does he represent a constituency that will remain relatively coherent for at least the next couple of election cycles?
That last question breaks down into several others. That constituency can only remain coherent if it can raise up a prominent figure to speak on its behalf: is that possible? After all, who else has Trump’s combination of celebrity and financial independence? It’s hard to imagine that even Trump’s wealth would have been sufficient to keep him in the public eye had he not already been famous; and it’s hard to imagine how a famous person who decided to run for President could sustain a truly renegade campaign if he (or she?) were dependent on donors.
If the GOP establishment is thinking along these lines, then they just might decide to ignore Trump’s voters, assuming that from now on those weirdos won’t have a representative and will simply stay home on voting day. I would prefer them not to reach that conclusion, but I can’t say with confidence that they’d be irrational to reach it.
Similarly, I can’t say that the Democratic leadership would be irrational to ignore Sanders supporters, in the belief that once this brief insurrection is put down they’ll fall back into line. Even though another Bernie would be far easier to conjure than another Donald, the Dems strike me as a more unified party, one with a better chance of rallying the troops in time of need.
So while it’s tempting to see this election as portending big changes in the American political landscape — and Lord knows I long for big changes in the American political landscape — I suspect that it’s still more than likely that we’ll be faced with cookie-cutter candidates on both sides for the foreseeable future. To paraphrase Damon Runyon, the race is not always to the rich, nor the battle to the deeply-entrenched, but that’s the way to bet. Even with The Donald around.
The judicial reasoning of most recent and current Supreme Court Justices emerges from one central principle:
If a law produces, or seems likely to produce, an outcome that right-thinking people deem socially desirable, then that law is ipso facto constitutional; by contrast, if that law produces, or seems likely to produce, an outcome that right-thinking people deem socially undesirable, then that law is ipso facto unconstitutional.
Antonin Scalia was the most intellectually powerful and eloquent opponent—ever—of that principle, which he believed to be inconsistent with the appointed role of the judiciary and insufficiently respectful of the other branches of government, especially the legislative branch.
Your acceptance or rejection of that principle will probably determine whether you rejoice that Scalia is dead, or lament his departure.
Requiem æternam dona ei, Domine; et lux perpetua luceat ei. Requiescat in pace. Amen.
Lawrence M. Krauss, a physicist at Arizona State University, writes in the New Yorker about the Doomsday Clock, which was created in 1947 by people associated with the Bulletin of the Atomic Scientists:
I am privileged to chair the Bulletin’s Board of Sponsors, a group of scientists, including sixteen Nobel laureates, that was created by Albert Einstein and Robert Oppenheimer after the Second World War to advise the Bulletin. As a result, I also work with the Bulletin’s Science and Security Board, which, each year, decides on the position of the Doomsday Clock. It’s a difficult task. Many disparate, worldwide factors must be judged in order to realistically assess the total existential risk facing humanity. This task has become even more complex in the past decade because the Bulletin has begun to explore issues beyond nuclear weapons, including climate change, bioterrorism, and cyber threats. Last year, in January, 2015, the Bulletin set the Doomsday Clock at three minutes to midnight. In a statement, we wrote that “Unchecked climate change, global nuclear weapons modernizations, and outsized nuclear weapons arsenals pose extraordinary and undeniable threats to the continued existence of humanity.”
This year, we’ve decided not to move the clock either forward or backward. It will remain set at 11:57 — three minutes to midnight. The fact that the clock’s hands aren’t moving isn’t good news. It’s an expression of grave concern about how the global situation remains largely the same. The last time the clock was this close to midnight was in 1983 — the height of the Cold War.
The “Doomsday Clock” is an odd thing — or non-thing, because of course there really isn’t any such clock. The idea, in 1947, was to use a ticking clock as an image of approaching nuclear war. Eventually someone decided to make a fake clock with moveable hands to make the occasional Doomsday Press Conferences a little more dramatic, but that’s not a timepiece; rather, it’s an image of an image of an emotion: fear.
I say “image of an emotion” because no actual science goes into the decision of where to place the hands of the clock. The scientists who make the decision have no particular expertise in geopolitical strategy, military and political risk assessment, or even climatology (relevant since they incorporate climate change into their assessment). They just read a bunch of stuff and take their own emotional temperature.
Moreover, now that climate change has entered in a major way into their thinking, the “ticking clock” metaphor has lost its fit to the circumstances. It was a good, strong image in the days of the Cold War, when the perceived danger was a nearly-simultaneous firing of nuclear weapons that could destroy a large part of human civilization in just a few hours. But when you’re trying to think about the consequences of anthropogenic climate change, the idea of a clock ticking down to midnight is meaningless. What would “midnight” be? The effects of such alterations to the ecosphere may indeed be vast, but “vaster than empires and more slow,” as the poet says, unfolding over centuries and millennia.
Still, you can understand why Krauss and his fellow members of the Science and Security Board would want to hang on to it. The fake clock, with its ominous name, seems more real than the guesses and anxieties that establish the position of its hands. And though Krauss in no way hides the Board’s responsibility for its decisions, it’s interesting how his language — and the language of many journalists who write about the clock — veers towards objective description: “The fact that the clock’s hands aren’t moving isn’t good news…. The last time the clock was this close to midnight was in 1983 — the height of the Cold War.” When described in this way — “the clock’s hands aren’t moving” — the thing seems to assume volition, like the planchette of a Ouija board. “The last time the clock was this close to midnight was in 1983” gives the appearance of being a more substantial and objective statement than “The last time members of this Board decided to place the hands this close to midnight was 1983.” (Which, incidentally, wasn’t “the height of the Cold War” — that would have been 1962.)
If indeed anthropogenic climate change has become a greater danger than nuclear weapons — and I’m inclined to think this may well be true — then the “ticking clock” metaphor needs to be retired. But what would replace it? I’m thinking “the spreading (or contracting) slime mold.” Pretty catchy, no?
Having recently disagreed with Damon Linker, it’s nice to find agreement on something. In his most recent column, he describes a recent essay by Corey Robin thus:
Is this indifference to political reality a defect or a plus? Left-wing commentator Corey Robin clearly thinks it’s a virtue — and not just when it comes to presidential politics. In a provocative essay for The Chronicle of Higher Education, “How Intellectuals Create a Public,” Robin argues that “the problem with our public intellectuals today is that they are writing for readers who already exist, as they exist,” as opposed to “summoning” a new world, a new public, a new reality, into being.
I had semi-drafted a post on the Robin essay before Damon’s post, so let me take this ball and run with it.
When I read Robin’s essay, what immediately came to my mind is Wallace Stevens’s great poem “The Idea of Order at Key West”:
It was her voice that made
The sky acutest at its vanishing.
She measured to the hour its solitude.
She was the single artificer of the world
In which she sang. And when she sang, the sea,
Whatever self it had, became the self
That was her song, for she was the maker. Then we,
As we beheld her striding there alone,
Knew that there never was a world for her
Except the one she sang and, singing, made.
For Robin, this is the vision: the public intellectual as modernist poet, driven by a “Blessed rage for order … The maker’s rage to order words of the sea….” And all of the rest of us are the rapt, passively obedient audience “summoned” into being by the Maker’s brave new word.
Robin talks a lot about summoning.
“That’s also how public intellectuals work. By virtue of the demands they make upon the reader, they force a reckoning. They summon a public into being.”
“It is precisely that sense of a public — summoned into being by a writer’s demands; divided, forced to take sides — that Sunstein’s writing is in flight from.”
“the politics of division and summoning that is the public intellectual’s stock in trade”
“A world where it is difficult to imagine the summoning of a public, beyond the intermittent, ever-more-fleeting summons we’ve seen these past 20 years”
The word has two major overtones: in the first, a lord summons a servant; in the second, a magician summons a spirit. (Maybe three, if you add a court of law issuing a summons.) These are tropes of mastery, in which the public intellectual assumes dominion over a servile public. To be sure, Robin graciously allows that the public so summoned might disagree with what the public intellectual says; but he does not acknowledge the possibility of ignoring the summons.
A position like this is a recipe for political ineffectuality, because people know when they’re being condescended to, especially when the condescension is this flagrant, and simply will not agree to be summoned. For good and for ill, powerful political rhetoric acknowledges the meaningful agency of the audience, and typically positions the politician as the people’s emissary. Thus Churchill: “It was the nation and the race dwelling all round the globe that had the lion’s heart. I had the luck to be called upon to give the roar.”
We might argue about whether Churchill really believed this — I think he did — but even in the most cynical reading, he was a great politician in part because he knew it was important to say it. Public intellectuals rarely become politicians, but if they want political influence, they need to take a lesson from Churchill. If you don’t actually have respect for the people you’re writing for, fake it.
Let’s ask a question: Why was David Blatt fired as coach of the Cleveland Cavaliers? The man who fired him said it was a matter of “a lack of fit with our personnel and our vision.” Possibly true. But it would be more useful to say this: David Blatt got fired because Chip Kelly got fired before him, and Jose Mourinho before him, and Kevin McHale before him, and so on nearly ad infinitum.
That is to say: firing coaches is how professional sports franchises deal with conflict. And athletes know that this is how professional sports franchises deal with conflict: so when a team hits a bad patch, and the players are underperforming, and the coach is getting angry with them, and relationships are fraying… why bother stitching them up? Why bother salving the wounds? If everyone knows where the situation is headed — sacking the manager — then isn’t there rather a strong incentive to make things worse, in order to hasten the inevitable, put an end to the frustrations, start afresh, get a do-over? Of course there is.
And precisely the same tendencies are at work in many of the key institutions of American social life. This is one of the chief reasons why so many marriages end quickly; this is why so many Christians church-hop, to the point that pastors will tell you that church discipline is simply impossible: if you challenge or rebuke a church member for bad behavior, he or she will simply be at another church the next week, or at no church at all.
It seems that we — and I’m using “we” advisedly here, as you’ll see in a moment — are becoming habituated to making the nuclear option the first option, or very close to the first option, when we can. Trying to come to terms with a difficult person, or a difficult situation, is an endeavor fraught with uncertainty: it might work, but it might not, and even if it does work, I could end up paying a big emotional price. Why not just bail out and start over?
I know at least some of these temptations well. Not all of them: I am deeply grateful that I went into my marriage, 35 years ago, sharing with my beloved the bone-deep conviction that, except in the most tragic circumstances, Christian marriage is indissoluble. Bailing out has never been an option for either of us, and since we are very different people with very different responses to the world, that’s been invaluable for us. We’ve had a lot of work to do, but it has been good work, rewarding work.
But in the three decades that I lived in Wheaton, Illinois, I was a member of three different churches, and I often wonder what I might have learned — what wisdom I might have gained, what benefits of character I might have reaped, what good I might have done for others, what I might have been taught by fellow parishioners — if I had never left the first one. I can’t manage to wish I had stayed, but that may be because all I know is what went wrong there, what made me frustrated and unhappy. Any benefits I (or others) might have received through persistent faithfulness are unknown to me, a matter of speculation.
Looking back on my decision to leave that first church, I realize that I did so because I was confident that, whatever good things might have come to me at that church, those good things, or very similar ones, would be available to me elsewhere. It seems to me that if there’s one thing that our current version of advertising-based capitalism teaches us all it’s that everything is replaceable: everything can be reproduced, or traded in for a new and improved model. And that applies to coaches, to churches, to spouses. We live in a trade-in society.
This belief breeds impatience with everything, and that impatience in turn breeds immense frustration with any situation that doesn’t lend itself to the discard-and-replace approach. I think even our recent university-campus controversies can be explained in these terms. Students don’t want to deal with administrators who don’t see things their way, or speakers who say things they find offensive, but they realize that an immediate opt-out isn’t possible. You can’t walk away from Oberlin on a Friday and show up for class at Carleton on Monday morning. At least for a time, you’re stuck. But what if you’re stuck in a situation and have never been taught how to negotiate, how to work things out, how to be patient in the midst of conflict? Well, then, you make demands. You are very insistent that “These are demands and not suggestions.” And often those demands are that administrators or faculty be fired — like football coaches who haven’t won enough, basketball coaches who manifest “a lack of fit with our personnel and our vision” — because that, they think, can be done right now.
What most troubles me about these pathologies is that I don’t see any way back from the current level of impatience and the inability — indeed, refusal — to persist through difficulties. You can always point to marriages that have survived struggles and come to thrive; or workplace enemies who became mutually valued collaborators; or sports franchises, like the San Antonio Spurs, that have succeeded through a commitment to continuity. But in a trade-in society, those situations look like black swans: unpredictable, inexplicable. (And will the Spurs continue to prize continuity when Tim Duncan, one of the best players ever, and Gregg Popovich, one of the best coaches ever, retire? Or will the pressure towards immediate action prove too much for their institutional culture to resist?)
The president of Oberlin, Marvin Krislov, has published an open letter in response to the protestors in which he says “I will not respond directly to any document that explicitly rejects the notion of collaborative engagement,” in part because “many of its demands contravene principles of shared governance.” That is, the students are demanding that the college president dictate changes that he doesn’t actually have the power to make, according to the by-laws and written procedures of the college. Similarly, when students at public universities demand the punishment or prohibition of “hate speech,” they can be reminded that the First Amendment makes no exception for hate speech. So in increasing numbers Americans, especially younger Americans, support the repeal of the First Amendment. It turns out that many people are profoundly unhappy with social and political structures that prevent the immediate implementation of their desires, and are willing to discard them — without pausing to reflect that the people who share their desires may not always be in the majority. You can’t remove those brakes for yourself without simultaneously removing them for your political and social enemies. (Though Lord knows people try.)
The impatience that people feel with manifest injustice is understandable, and more than understandable. In perhaps the most powerful passage in his “Letter from the Birmingham Jail,” Martin Luther King, Jr. answers white moderates’ counsel of patience with a long litany of everyday abuse and affliction, and concludes: “There comes a time when the cup of endurance runs over and men are no longer willing to be plunged into an abyss of injustice where they experience the bleakness of corroding despair. I hope, sirs, you can understand our legitimate and unavoidable impatience.” And all God’s people say Amen.
But it’s worth noting that those white moderates wanted King and his fellow protestors to do nothing: simply to wait and trust that time would somehow naturally bring about justice. And it is also worth noting that it’s socially unhealthy when people exhibit impatience far beyond Dr. King’s when confronted by injustices that are far less massive — or when faced by mere inconveniences or strictly personal discomforts. People want to be able to trade in old models of anything and everything, and profoundly resent any social or political structures that inhibit instantaneous action.
In such an environment, it’s no wonder that a great many people applaud a Presidential candidate who believes that he can “see Bill Gates” about “closing up that internet.” (The old internet is messed up — let’s trade it in for another one.) I suspect they overlap pretty significantly with the folks who demand, after every losing streak, that their favorite team’s coach be fired; and with the more aggressive of the student protestors. Trump supporters may not seem to have much in common with people demanding that racially insensitive university administrators be fired, but there’s a deep temperamental affinity. They’re all enthusiastic adherents of the trade-in society.
Thus, our best (and perhaps slightly conservative) estimate is that the Packers cost themselves about 7.9 percent of a win by kicking rather than going for two, and this whole thing could have been avoided if NFL coaches took the time to sit down and learn some basic percentages….
Another year, another year with NFL coaches not doing their jobs and not being taken to task for it. By now, coaches have no excuse for not having mastered basic decisions like these.
People say coaches are afraid of media criticism. But they’re professionals, among the handful of elite who are capable of doing what they do. If a coach cares what the media thinks, let him explain his logic.
There’s so much that is touchingly naïve about this, but more than anything, the idea that a football coach — or anyone else — could save his job when under fierce criticism by “explaining his logic.”
Morris thinks coaches, when they make these decisions, are being irrational. They are in fact being perfectly rational. Why does Morris think they are irrational? Because he thinks that the only relevant factor in evaluating decisions is what will increase the likelihood of winning a game. But this is obviously false, because every coach or manager knows that many coaches and managers, across the spectrum of sports, who have been very good at winning games have also, with alarming frequency and without rational justification, been fired. And no coach is selfless enough to factor his own job security out of his calculations.
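What Morris’s kind of analysis boils down to is a simple expected-value comparison. Here’s a minimal sketch of that comparison in Python — all of the probabilities below are illustrative assumptions of mine, not the figures from Morris’s actual model (which put the gap at 7.9 percent), but they show the shape of the calculation coaches are supposedly failing to do:

```python
# Expected win probability for the late-game choice Morris describes:
# trailing by one after a touchdown, kick the extra point or go for two?
# Every number here is an assumption chosen for illustration.

P_TWO = 0.47       # assumed two-point conversion rate
P_XP = 0.94        # assumed extra-point success rate
P_OT = 0.50        # assumed chance of winning in overtime after a tie
P_MISS_WIN = 0.05  # assumed slim chance of winning anyway after a miss

def win_prob_go_for_two():
    # Convert -> take the lead with almost no time left; miss -> long shot.
    return P_TWO * 1.0 + (1 - P_TWO) * P_MISS_WIN

def win_prob_kick():
    # Make the kick -> tie and play overtime; miss -> same long shot.
    return P_XP * P_OT + (1 - P_XP) * P_MISS_WIN

print(f"go for two: {win_prob_go_for_two():.3f}")  # 0.497 (given above)
print(f"kick:       {win_prob_kick():.3f}")        # 0.473 (given above)
```

Under these made-up numbers, going for two is worth a couple of points of win probability — and notice that nowhere in the model is there a term for “probability the coach gets fired,” which is exactly the variable the argument below says coaches cannot ignore.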
Here’s what Mike McCarthy may have been thinking at the crucial moment in that playoff game:
- If we go for two and make it, I will be praised as a “riverboat gambler,” because hardly anyone in the press or among Packers fandom understands the percentages involved.
- If we go for two and fail, millions of people will scream “What the hell was McCarthy doing??” Thousands of people will call into radio shows to demand my ouster. Hundreds of columnists will write stories about my recklessness and thoughtlessness. And the sum total of all these interventions will put pressure on my bosses to fire me, pressure that they will very likely succumb to, especially since we haven’t been all that great the past couple of years.
- If I just kick the extra point I will be, generally speaking, neither praised nor blamed.
Ergo, and given that coaches will inevitably be concerned not just with the chances of winning a given game but also with the chances of keeping their jobs, McCarthy’s decision to kick the extra point was perfectly rational, as long as we have a proper understanding of what “reason” is in a given case; which is to say, as long as we factor in the variables that are immensely significant but not in any obvious way mathematically calculable.
The more general lesson to be applied here — one that the analysts at FiveThirtyEight need to reflect on — is this: It is not rational to act perfectly “rationally” when surrounded by irrational people whose actions have influence over your life.
Marriage rests upon the immutable givens that compose it: words, bodies, characters, histories, places. Some wishes cannot succeed; some victories cannot be won; some loneliness is incorrigible. But there is relief and freedom in knowing what is real; these givens come to us out of the perennial reality of the world, like the terrain we live on. One does not care for this ground to make it a different place, or to make it perfect, but to make it inhabitable and to make it better. To flee from its realities is only to arrive at them unprepared.
Because the condition of marriage is worldly and its meaning communal, no one party to it can be solely in charge. What you alone think it ought to be, it is not going to be. Where you alone think you want it to go, it is not going to go. It is going where the two of you – and marriage, time, life, history, and the world – will take it. You do not know the road; you have committed your life to a way.
That’s Wendell Berry, from his great essay “Poetry and Marriage.”
Damon Linker makes an interesting argument here, in which he responds to this post by Daniel Payne, which in turn responds to this post by Kevin Drum.
Payne writes that if Drum kills himself “he will have cheated [his wife] out of something that is hers by right: the chance to realize her wedding vows and her matrimonial commitment to the fullest possible degree by conferring upon her husband the last and most important measures of care and comfort she can give him.” Linker replies, “Message: It would be selfish for Drum to end his own suffering and thereby deprive his family members of having to endure that suffering with him.”
Damon Linker is a good thinker and a fine writer, but I have complained more than once over the years that he has a habit of in-other-wordsing. That is, he quotes or cites someone, adds “in other words” (or in this case “Message:”), attributes to the author something he or she did not say, and then refutes that. Perhaps Payne does indeed believe that “suffering is not only necessary, but even in some respects good,” but that’s not what he says in the passage Linker quotes. What Payne says is that it is good when a person gives “care and comfort” to someone he or she loves in that person’s time of suffering. Good for the person giving the care, and good for the person receiving it. And having cared for my wife through a long and difficult (though not mortal, thanks be to God!) illness, I can testify — and she can testify — that this is true, as long as the care is both given and received graciously.
Linker connects this belief to the claim that our lives are not our own, but rather belong to God, who gives us stewardship over them, and concludes: “Without these theological assumptions, the opposition to assisted suicide makes no sense.” But is the claim that the bonds of marriage are fulfilled when we care for one another in suffering actually a theological claim? Linker clearly thinks so. Perhaps without a Christian doctrinal foundation we cannot make such strong claims on one another, even in marriage. Even when a vision of mutual care and comfort like Payne’s is not articulated in theological terms, it only makes sense when supported and justified by a model of marriage as strong as the Christian one.
Hey, he said it, I didn’t. (Unless I’m in-other-wordsing him.)
Academic freedom is something I’ve written about a good deal over the years—and quite recently—which I suppose has been inevitable, since if you teach at a religiously-based institution you always hear that the problem with such places is that they constrain academic freedom.
To that claim, I have always responded that I have taught at religious institutions because in them I have academic freedom. (And if you follow up the links in that post I just mentioned you’ll find other people making the same point.) There is just no logically coherent, evidence-based way to claim that religious institutions have less academic freedom than secular ones. Every community of learning has limits, though different limits, articulated with different degrees of explicitness. Academic freedom is a concept relative to the norms of communities, institutions, and disciplines — and often of society as a whole. Academic freedom is therefore bounded freedom (as freedom always is), and when people don’t recognize that they make incoherent arguments.
Often the claim that religious institutions offer less freedom is based simply on personal feeling: the feeling that you would be constrained there, and therefore it is a place of greater constraint than your current institutional location, where you feel quite free to do and say what you want. But what if I would feel more constrained in your location than I do in my own? Moreover, the legal history of academia is littered with the remains of scholars, sometimes tenured scholars, who thought that their academic freedom was absolute, only to be “terminated for cause.”
In light of all this, I thought it might be a public service to offer some logically consistent and coherent arguments against religious education (some of which people make openly, some of which are only implicit views that people who hold them don’t know that they hold). So here goes:
- Education is properly a function of the state. All private educational institutions, including religious ones but others as well, should be abolished. If people want to get together to learn and teach, that’s fine, but they may not formally constitute or incorporate themselves as a school. If this requires a Constitutional amendment, so be it. (N.B.: Anyone who actually wanted this to happen would probably say that it doesn’t violate the Constitution, but quite clearly control of education is not one of the powers allocated to Congress in that document—though some think that it should be—so the only honest and coherent version of this argument would have to accept the necessity of amending the Constitution.)
- Private educational institutions may exist, but teachers and students at them should not be eligible for federal or state funding: no NIH or NEH or NSF grants for faculty, no federally guaranteed loans for students. Taxpayer support should go only to those institutions that are constituted and governed by the people.
- While the government of the United States may not discriminate against an institution on the grounds of religion, the educational establishment as a whole may—as long as the relevant accrediting agencies cease to seek and receive approval from the Secretary of Education. By cutting all ties to the government, accrediting agencies would free themselves to declare that religious schools violate the core principles of higher education and therefore may not be accredited. To de-accredit such institutions might not kill them, but could damage them seriously.
- Religion does more harm than good, and interferes with the state’s ability to care properly for its citizens, so the First Amendment should be repealed and the Constitution amended in order to prohibit—or at least place strict controls on—religious organizations. All Christian schools, at every level, will be abolished. Private but non-religious institutions may remain as they are.
Notice that anyone holding the first argument may not have anything against private education as such, but is willing to sacrifice non-religious private institutions in order to get rid of religious ones, without radically altering the Constitution. The same may be said of the rather milder plan embodied in the second argument. The fourth argument grasps the nettle that the first won’t grasp. The third one … well, something like that may actually happen.
Sometimes it seems that this is John Rawls’ world, we’re just living in it: conducting our political and social debates as though we were behind a veil of ignorance, as though we can only trust the judgments made in perfect abstraction from any actual lived contexts. The debate about the future of Larycia Hawkins at Wheaton College is a classic example, where a complex, embodied, richly personal situation gets translated into the terms of disembodied theological debate.
It’s not that such debates are fruitless or useless: they matter. But they aren’t all that matters. And they can distract us from more complex considerations.
Consider the experience of my friend Matt Milliner, an art historian at Wheaton:
In my field work in Turkey, Egypt and Cyprus I had some negative and some very positive contacts with Muslims, and spent lots of mornings waking to the prayer call of the minaret. But interestingly, the best relationships with Muslims I have had have been in this town. Believe it or not, they (and by “they” I mean real local people like Abraham and Zahra) are well aware that despite this media firestorm, Wheaton College is here for the long haul, and so are they. Accordingly, my son is not even one year old, and he has been held by more Muslims than Christians. The reason for this is that at the Islamic Center of Wheaton, the gracious, hijab-donning women joyfully pass him around while I talk theology with my new friends. From the beginning we have been clear about our differences. I actually believe, for example, you could have passed Allah (the pre-Islamic Arabic term for God which is still used by Arabic-speaking Christians) around in the same way as my son. And sentimental as it may sound, because God freely gave his son, I can freely give mine.
Wheaton College does not exist in a free-standing abstract theological space. It is a college in a town of about 50,000 people, at the north end of which stands the Islamic Center of Wheaton, which just a few years ago moved into a former church building there. Perhaps that accident of real estate — Muslims occupying a church! — has contributed to the cold welcome, or less than a welcome, that those Muslims have experienced from some locals. But my friends at Wheaton College, most of whom also live in the town of Wheaton, or in other nearby communities that also have a visible Muslim presence, know that the people who worship at the Islamic Center of Wheaton are their neighbors, and Christians are supposed to love their neighbors, so … so they show up. They visit. They talk. They bring their children.
Someone doesn’t like that — doesn’t like the very idea that Muslims can be peacefully incorporated into the fabric of a community like Wheaton. So they have created a fake website for the Islamic Center of Wheaton that presents its people as advocates for jihad — and implicates local Christians as well, as another friend and former colleague of mine, Noah Toly, has recently discovered. Christians from the Wheaton community who have befriended their Muslim neighbors are being smeared, along with those neighbors, as advocates of murder and terror.
That’s the context in which, at Wheaton College and in the town of Wheaton, people are discussing and debating what it might mean to say that Muslims and Christians worship the same God.
I’m not saying anything here about whether Larycia Hawkins should or should not be fired. You are free to make your own judgments. My own inclination fits that of Mark Galli, who wrote at Christianity Today that there might be a better way to handle this conflict than dismissal.
But whatever your opinion, it’s important to avoid the temptation to abstract this debate from its human situation — as though we could live behind a veil of ignorance and consider the matter in a purely theoretical sense. It’s on the ground, at the corner of President Street and Geneva Road in Wheaton, Illinois, where those life-challenging and life-transforming questions are being asked: Who are my neighbors? And how might I love them?
Long, long ago, when the world was still young and Communism still ruled Eastern Europe, a labor movement arose in Poland and called itself Solidarność. My wife Teri and I, early-marrieds in those days, were serious politics junkies, and one day she was watching one of Ben Wattenberg’s shows on TV, or maybe listening to him on the radio — the details are fuzzy after all these years. But in any event, Wattenberg was reporting on the anxiety Solidarity was generating in Poland’s communist government, which, when the union called a great strike, made a point of referring to members of the movement as “anti-socialist elements.” This phrase was used so often that the union members came to have affection for it, Wattenberg said, and had started wearing shirts featuring the phrase.
Teri thought this was very cool and very funny, and wished that she had an “anti-socialist element” shirt. So she tracked down Ben Wattenberg’s phone number, managed somehow to get him on the phone, and asked him if he had any idea how she might be able to obtain one of those shirts. He indicated that he might actually be able to procure one for her, for twenty bucks — a good bit of money in 1981. But she was all over it.
A few days later it showed up and she wore it with pride for some time (at first greatly confusing the Polish custodian of the apartment building we lived in). The other day she was cleaning out one of our closets and look at what she found:
Still a thing of beauty. Long live the spirit of Solidarność!
Just before Christmas, I posted to my personal blog an account of my year in tech; it’s largely the story of how I strove to simplify my life in 2015. That post had only been up for a few hours when I got a long email from someone I don’t know telling me (among other things) that I was wrong to leave public Twitter, that without Twitter the only people who can give me intellectual feedback will be my students, that I should consider whether it’s un-Christian of me to “stop engaging,” and that if people who want to respond to my ideas have to go to the trouble of writing emails they might not respond at all.
My first reaction to this was sheer bemusement: I simply can’t imagine writing to someone I don’t know to tell him (in detail!) how he should and should not use social media. Is that any of my business? But before I could think further, another email showed up. This one was also from a stranger, and was also in response to a post on my personal blog, a brief one in which I explained why I had not commented on the current controversy at Wheaton College. This person told me that my plea of ignorance was unconvincing and my failure to respond was “weak and timid.”
This second email, coming so soon after the first, clarified some things for me.
There’s a famous Jack Benny routine, one he used many times over the course of decades, in which he’s confronted by a mugger. Now, you need to be aware that the chief personality trait of “Jack Benny” — the character Jack Benny played on radio and TV — is a compulsive miserliness. So:
Mugger [pointing a gun at Jack]: Your money or your life.
Mugger: I said, Your money or your life!
Jack [exasperated]: I’m thinking it over!
The internet is also a mugger, but what it demands is not my money but my attention and my reaction, and it wants them right now. And “I’m thinking it over” isn’t an acceptable response.
When the leadership of Wheaton College placed Professor Larycia Hawkins on leave, it was not clear to me precisely why. Several weeks later, and after considerably more communication from the college, it’s still not clear to me precisely why, though everyone agrees that it had nothing to do with Hawkins’ wearing of a hijab. Moreover, even her statement that “Muslims and Christians worship the same God” seems not to have been definitive: other faculty have made similar or identical statements, but (says the college administration on the webpage just cited) “In those instances, the individuals rapidly and emphatically explained their opinions and affirmed their full consistency with the theological identity of Wheaton College.” This lack of clarity has not stopped some people from demanding that Hawkins be fired, nor others from confidently declaring that Wheaton’s leaders are bigots motivated by religious intolerance and Islamophobia. How the latter are able to read the minds and hearts of people they don’t know, I can’t tell you; maybe you could ask them.
Anyway, a great many people are going off half-cocked on this issue; and what those emails I got remind me is that going off half-cocked is now widely perceived as a virtue, and the disinclination to do so as a vice. Moreover, that poorly informed and probably inflammatory statement of Your Incontrovertibly Correct Position must be on the internet — and according to my first protestor either directly on or accessible to Twitter — or it doesn’t count towards your treasury of merit.
I want to suggest some alternative ways of thinking about these matters, and related ones:
- I don’t have to say something just because everyone around me is.
- I don’t have to speak about things I know little or nothing about.
- I don’t have to speak about issues that will be totally forgotten in a few weeks or months by the people who at this moment are most strenuously demanding a response.
- I don’t have to spend my time in environments that press me to speak without knowledge.
- If I can bring to an issue heat, but no light, it is probably best that I remain silent.
- Private communication can be more valuable than public.
- Delayed communication, made when people have had time to think and to calm their emotions, is almost always more valuable than immediate reaction.
- Some conversations are more meaningful and effective in living rooms, or at dinner tables, than in the middle of Main Street.
In short, peer pressure is always terrible, and social media are a megaphone for peer pressure. And when you use that megaphone all the time you tend to forget that it’s possible to speak at a normal volume: thus my first protestor’s apparently genuinely-held view that if you’re not talking to peers on Twitter you can’t possibly be talking to peers at all. (We must all have been trapped in our silos of silence before 2006.) But the more general view of both of those who wrote to me — that rapidity of response is a virtue, and therefore that technologies that enable rapid response are superior to ones that enforce slowness — is the really pernicious one, I’ve come to believe.
I keep thinking about my first protestor’s complaint that if people can’t respond to me via Twitter or blog comments they might not respond at all. Given my experience of both public Twitter and blog comment threads, my thought is: feature, not bug. Indeed, I should probably have an auto-reply on my email featuring my postal address and encouraging people to write me letters. That might enliven the daily mail deliveries, which have for me, as for all of us, grown so gray and wan over the years. And maybe I would be doing my correspondents a favor also: if typing, printing, and mailing a letter is too much trouble for you, then it could be that the things you have to say aren’t that important, even to you, and you’d be better off using your time in a different way.
Well, that may be too extreme. I’m not a big fan of printing and mailing either, though I use those technologies when I need to. So perhaps email is slow enough, provides enough of a buffer between people and their immediate impulses. Still, I can’t help thinking of the great computer scientist Donald Knuth, who famously doesn’t have an email address. He explains why: “Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things. What I do takes long hours of studying and uninterruptible concentration.” My work may not require the same intensity of concentration that Knuth’s requires, but it requires more than I have been accustomed to give it for the past few social-media years.
It won’t be that long before I turn 60, though I struggle to keep that inescapable (and highly unpleasant) fact clear in my mind. I have ideas I want to pursue, stories I want to tell, and friends and colleagues I want to interact with. Things are happening in the world and on the pages of books that I want to meditate on.
I spent about seven years reading replies to my tweets, and more than a decade reading comments on my blog posts. I have considered the costs and benefits, and I have firmly decided that I’m not going to be held hostage to that stuff any more. The chief reason is not that people are ill-tempered or dim-witted — though Lord knows one of those descriptors is accurate for a distressingly large number of social-media communications — but that so many of them are blown about by every wind of social-media doctrine, their attention swamped by the tsunamis of the moment, their wills captive to the felt need to respond now to what everyone else is responding to now.
Not gonna do it. Wouldn’t be prudent. I’m trying to turn my mind towards the longer term, striving to get a little closer to the bottom of things. I’m thinking it over.
Happy New Year, everybody! And Festina Lente!
[Editor’s note: this post contains plot spoilers for Star Wars: The Force Awakens]
Screenwriters love superpowers — a category in which I include magical powers — because they can make those powers wax or wane according to the needs of the plot at any given moment. This point is more interesting than it might at first seem.
A classic example: in The Return of the King, when the Witch-King is able to shatter Gandalf’s staff with a mere thought. Leaving aside the question of why he wouldn’t just kill Gandalf on the spot if it were that easy, I’ll just note that this is the same guy who with four other Ringwraiths was no match for Aragorn on Weathertop. Now, his power had been increasing since then, but that’s a pretty darn rapid and pretty darn massive increase in power; and hadn’t Gandalf himself also passed through death and returned with renewed strength of his own? No, the whole thing makes absolutely no sense: it looks like Peter Jackson deciding that it would be really cool at that point if Gandalf were helpless before the Witch-King and just about to be killed but then something else happens and the Witch-King flies away without doing anything to Gandalf and wow, wasn’t that a close shave for the old wizard? Making absolute nonsense of the whole concept of “powers” was just a price he had to pay for that cool moment.
Similarly, in The Force Awakens we have in Kylo Ren a massively capable and ruthless controller of the Force who was trained as a Jedi by Luke Skywalker himself and therefore is clearly one of the world’s great masters of the lightsaber — but here he can just barely hold his own against two people who have never held such a weapon before. Again, the simple exigencies of plot and action are at work here: we wouldn’t have much of a story if such scenes unfolded plausibly.
The same storytelling logic plays out, as Brad Bird brilliantly shows in The Incredibles, when villains start “monologuing.” Lucius (AKA Frozone) explains:
Lucius: So now I’m in deep trouble. I mean, one more jolt of this death ray and I’m an epitaph. Somehow I manage to find cover and what does Baron von Ruthless do?
Bob: He starts monologuing.
Lucius: He starts monologuing! He starts, like, this prepared speech about how feeble I am compared to him, how inevitable my defeat is, how the world will soon be his, yadda yadda yadda. Yammering! I mean, the guy has me on a platter and he won’t shut up!
This becomes a theme throughout the movie, which (of course!) indulges in the very narrative tic it so brilliantly makes fun of. Because deploying stuff like this — putting your beloved characters in impossible spots before you rescue them — is just how you sustain tension, how you keep viewers on the edge of their seats.
But maybe there’s something else at work here too. Beyond a need to sustain narrative tension, what do all these moments have in common? It seems to me that they all say, in their varying ways, that power as such, power as we typically define it, isn’t the whole story.
“The race is not [always] to the swift, nor the battle to the strong,” saith the Preacher — to which Damon Runyon famously replied, and with some justification, yeah, but that’s the way to bet. It is, if you’re a betting person. Nobody smart would have bet on David against Goliath. But sometimes — not often, but sometimes — longshots come in. And if you hang around long enough, or read enough history, you start to notice that they tend to do so at curiously opportune times. Not often, mind you, not often enough that everyone will see a pattern, but … sometimes the battle is not to the strong; sometimes overwhelming force is defeated. Occasionally it plays a role in its own defeat, by trusting too much in itself — counting on, calculating by, force only — never suspecting that there may be powers at work other than those of strength, skill, numbers.
And that’s what we find buried in so many of our popular stories, stories that arise from the general human sense of things, even if they get taken up by individual authors or managed by vast rapacious corporations: the suspicion that there’s a reason why the race is not always to the swift nor the battle to the strong. Maybe even a reason that the cynical old Preacher didn’t imagine, since he credited the defeat of the powerful to “time and chance.”
In Tolkien’s Lord of the Rings — setting aside Peter Jackson’s — Gandalf tells the history of the Rings of Power, and especially of the Great Ring made “to rule them all,” and explains that that Ring was constantly striving to get back to its maker, Sauron. And yet it did not make its way to him. Instead it came to Bilbo. And behind that curious event “there was something else at work, beyond any design of the Ring-maker. I can put it no plainer than by saying that Bilbo was meant to find the Ring, and not by its maker.” This is not putting the matter very plainly at all: Gandalf’s use of the passive voice is telling. It is perhaps, even for him, just a suspicion — little more than the reading of hints, in the long historical record he knows so well, that Sauron’s bet on the inevitable victory of Power just might not pay off; that perhaps there is something more moving in the world, something that does not work through the Great but through the small, the weak, the unknown, the neglected, the utterly marginal.
Merry Christmas, everyone.
I have very mixed reactions to the accounts of Christianity and higher education given by Rod’s recent correspondents, but the anonymous Christian professor makes a really vital point about the dangers of the “narrative of Christian oppression.” When I directed the faculty Faith and Learning Program at Wheaton College, years ago, I regularly told younger faculty, “If you submit an article or book for publication and it gets rejected, always, always, always assume that it’s because your work wasn’t good enough.” The assumption of anti-Christian prejudice gets you off the hook for improving your scholarship; ultimately, it makes you listless and lazy and increasingly prone to playing the victim card. Even when you have good reason to suspect such animosity, pretend you don’t, and get back to work.
So props to the Anonymous Christian Prof for noting the dangers of that way of thinking. But not for this:
But there is a second reason that the abandonment of secular institutions by Christians is a shame, and I will be blunt. Contemporary Christians — taken as a group — do not have the intellectual heft of their secular counterparts. The best scholarship, the highest quality thinking, still goes on at secular institutions. To go to a Christian institution involves — by and large – an intellectual compromise.
First of all, the claim is phrased imprecisely: by “contemporary Christians” I assume she means individual academics? But she then goes on in the next sentence to speak of institutions. So I don’t know whether the charge is that individual Christian academics are not as smart as other academics or that Christian colleges and universities aren’t as academically rigorous as their secular counterparts. And it’s hard to answer the charge, given that there aren’t very many Christian colleges and universities, and that their institutional missions tend to be quite different from those of secular schools. Also, many faculty at Christian institutions either were (like me) educated wholly at secular universities or in some mix of Christian and secular schools.
But I will say this: I believe the kind of education students receive in the Honors College at Baylor, and at Wheaton, is in most respects far superior to what they would receive at secular schools of greater academic reputation and social prestige. Indeed, in my years at Wheaton I often heard comments to this effect from visitors. I think for instance of a professor at one of America’s top ten universities who said to me, after spending some time with one of my classes, “Your students are better informed and ask more incisive questions than mine do.”
I could adduce more examples, and explore more comparisons, but let me conclude with this: If this professor’s commendation of Wheaton’s students has at least some validity, how might we account for this state of affairs? I would point to three factors:
1) At Christian colleges, students and faculty alike tend to think of learning as a project in which the whole person is involved. Information is not typically separated from knowledge, nor knowledge from wisdom. The quest for education is less performative, more earnest than at many secular institutions. People are more likely to think and speak of education as something that leads to eudaimonia, flourishing.
2) A closely related point: Christian institutions tend to think quite consciously that their task involves Bildung, the formation of young people’s characters as well as their minds. So in hiring and retention they place a greater emphasis on teaching and mentoring than is common in secular institutions. (There are exceptions, of course, but even the most student-centered secular institutions cannot, because of their intrinsic pluralism, specify what good personal formation looks like.)
3) Perhaps the most important feature: Christian teachers and students alike can never forget that their views are not widely shared in the culture as a whole. We read a great many books written by people who don’t believe what we believe; we are always aware of being different. This is a tremendous boon to true learning, because it discourages people from deploying rote pieties as a substitute for genuine thought. No Christian student or professor can ever forget the possibility of alternative beliefs or unbeliefs. Most students who graduate from Christian colleges have a sharp, clear awareness of alternative ways of being in the world; yet students at secular universities can go from their first undergraduate year all the way to a PhD without ever having a serious encounter with religious thought and experience — with any view of the world other than that of their own social class.
Working in Christian institutions, especially because of this alertness to possible alternative beliefs, has been essential to my own intellectual development. Among other things, it has helped me learn to write for multiple audiences, while always keeping near the forefront of my mind the intense relevance of learning for life. I think it is largely because of, and not in spite of, my workplaces that I have been able to write for some of the most academically rigorous presses in the world, and to do so from an explicitly Christian point of view. Does that sound like “intellectual compromise”?
AnonProf says, with what appears to be satisfaction, “At my school we don’t talk about our religious or political identities in class.” At my school we talk about them all the time; we work through them, struggle with them, see them altered by experience and reflection — we put them on the line. In that sense our classrooms aren’t “safe spaces” at all. “Of course they’re not safe,” I’m tempted to say, “but they’re good.” What happens at secular institutions like the one AnonProf teaches in may also be good, in its own way, and a better choice for some students, including some Christian students. (In education, one shape definitely does not fit all.) But that it’s intrinsically academically superior is an unsustainable claim.
A. I’mma let you finish, but can I just interject a thought here?
A. Just think for a minute about what’s been happening on social media in the past few weeks, in the aftermath of the Colorado Springs and San Bernardino shootings. One group lines up and shouts that these things would never happen if we had stricter gun laws, as though terrorists give a rat’s ass about gun laws; another group lines up opposite them and shouts that these things would never happen if we had a properly armed citizenry, as though mutually assured destruction were equal to social peace. None of these people are considering, reflecting, thinking; they’re just reacting and emoting. Some of them can’t even grasp the moral difference between those who murder and those who are murdered. The very thought of allowing these people a vote — a voice in the shaping of society — appalls me. They’re simply incapable of rational thought.
B. Your description of the behavior of many people — most people? — is depressingly accurate. A few years ago there was an active Tumblr called These Tragic Events Only Prove My Politics, and I expect that the person who made it stopped updating it because it would have been beating a dead horse. There’s more evidence of the thesis every single day.
That said, we may have reached the point in this conversation where you and I most fundamentally differ. While I agree with your description of each side’s position, and while I agree that both sides are being thoroughly and depressingly irrational, I don’t agree that these people are incapable of rationality. After all, the great majority of them navigate their way through life, through work and marriage and child-rearing and all sorts of other things, without catastrophic error.
A. Well, I don’t —
B. Before you say something like Look at the divorce rate or Look at the anti-vaccination weirdos, stop and ask yourself whether the meritocratic elite you want to put in charge do any better in these areas. Remember, it’s among that elite that Soylent is a thing.
A. You may have a point there.
B. Yeah. But even if I don’t, it’s still my turn. And here’s my chief point: we have a paradox to explain. That paradox is that it’s easy to find people who have long personal histories of behaving fairly rationally – not perfectly so, but fairly so – who are nevertheless bizarrely irrational in their political opinions and their voting. So, as I read the evidence, the problem cannot be that “these people are simply incapable of rational thought.” The problem must lie within the structures of our political system.
Now, you and I agree that the current political system is messed up, but you attribute that mess to a failing or a limitation in the nature of most human beings, who in your view amount to little more than the higher cattle – creatures who, as Nietzsche says, envy the cow its unreflective “unhistorical” existence. But –
A. I really don’t want to say that. Perhaps my frustration with the pointlessness of recent political discourse, especially on social media, led me to formulate my thoughts a little too crassly. Let me, briefly, try again: Most people are not irrational per se, but their lack of interest in the actual problems and issues of governance, their preference for other activities, a preference that they disguise from themselves by uttering strong political opinions, disqualifies them from governing themselves. They will be happier and the world will run better if –
B. If running the world is left to the experts?
A. Yes. Treating expertise as though it’s frightening or absurd or impossible is one of the pathologies that we need to get over.
B. Okay. Your amendment is a good one. But it doesn’t alter my case, the making of which I shall now resume. You’re not saying that most people are irrational, but you are saying that they are congenitally unlikely to take sufficient interest in actual politics to learn what they need to know to vote wisely and well; and therefore will, perhaps not at first but eventually, be relieved to have political decisions taken out of their hands and given over to people who are qualified (by temperament, ability, and training) to make them and for whom the making of such decisions will be a full-time job.
A. Close enough.
B. What I would say, by contrast, is that people are interested in and even knowledgeable about politics – but politics only on a human scale, a scale appropriate to the range of their experience and interests. Many, perhaps most, of the pathologies of our current political order are products of inhuman scale, what one of our best poets called “the long numbers that rocket the mind.” Consider this: people in Boston will consider themselves invested in what happens in San Bernardino in ways they absolutely are not in what happens in, say, Dublin – and yet San Bernardino and Dublin are pretty much equidistant from Boston. People in Boston can’t alter events in San Bernardino any more than they can alter events in Dublin, and yet the ideology of the modern nation-state makes them feel that they can and should have some say in what happens 3,000 miles away, as long as that distant place is in the same country. In one sense, of course, as Terence reminds us, Homo sum, humani nihil a me alienum puto; but in another sense 3,000 miles of separation stretches the possibilities of genuine understanding beyond their natural capacity, and social media and video don’t change that situation.
And this takes me back to my earlier comment about your emphasis on “exit not voice”: the exit-only option might make sense if I dislike life in Waco and decide to try a more congenial environment in Austin; it will make less sense if I live in Topeka and decide that the whole American social order is so flawed that I need to move to Toronto. Exit-not-voice is great if you have either (a) tons of money or (b) a human scale. But then, if you consider matters on a genuinely human scale, the whole question of voice looks very different. Because just leaving Topeka may not seem such a rational option, even though I despise the whole American social order, if my mom lives there.
This is a kind of scholarly-geeky follow-up to my earlier post on the ways that humanistic study can — doesn’t always or necessarily, but can, if the good will is there — promote compassion, fellow-feeling, mutual recognition. But if the good will is not there, and there is no curriculum of study in place to promote it, encourage students to consider its value … well, then you get protests and demands.
This is not to say protests and demands have no place in social life — but it’s pretty clear here that on many campuses today demands, and stringent demands at that, are the first recourse: we may take as a fairly representative example Dean Mary Spellman of Claremont McKenna College, whose obviously sympathetic email to a troubled student merely led to her forced resignation. That’s one side of this demand culture — zero tolerance, single sanction (expulsion from the community) — and the other side is the attempt to herd faculty as well as students into cultural/racial/sexual sensitivity courses, in order, ideally, to make such errors impossible.
A lot that can be said about this has already been said: that it’s an obvious repudiation of free speech, that it’s reminiscent of Maoist and Stalinist re-education programs, that it’s an error-has-no-rights model, and so on. And I don’t strongly disagree with any of these arguments, though I think I have more sympathy with at least some of the protestors than many of my fellow conservatives. But what strikes me about the whole approach is how … well, Baconian it is. Sir Francis Bacon, not Francis Bacon the painter or Kevin Bacon. Bacon’s early attempts at inaugurating what we would now call the scientific method, and extending it into the whole of philosophical reflection, seem to me to prefigure rather eerily the thoughts of the student protestors.
As I said, this is a geeky sort of take on the whole business. But please bear with me.
In his New Organon of 1620, Bacon lays out his method for inquiry. He begins by stating, “I propose to establish progressive stages of certainty,” and agrees that many of the medieval scholastics who emphasized the power of logic wanted to do the same thing. He doesn’t think they went about it the right way, but he and they have this in common: “they were in search of helps for the understanding, and had no confidence in the native and spontaneous process of the mind.” That’s true of our modern protestors as well: “the native and spontaneous process of the mind” — the white male mind anyway — is notoriously unreliable.
With that agreement in mind, let’s turn to how Bacon differentiates himself from the logicians:
But this remedy comes too late to do any good, when the mind is already, through the daily intercourse and conversation of life, occupied with unsound doctrines and beset on all sides by vain imaginations…. There remains but one course for the recovery of a sound and healthy condition — namely, that the entire work of the understanding be commenced afresh, and the mind itself be from the very outset not left to take its own course, but guided at every step; and the business be done as if by machinery.
Emphasis mine. The mind of someone like Mary Spellman is “occupied with unsound doctrines and beset on all sides by vain imaginations”: she can only be cast out, so the community can turn to the more promising work of educating younger people. They will “be guided at every step”; and since the lessons they have to learn are fixed and invariant, “the business [may] be done as if by machinery.”
In a vitally important essay that I referred to in an earlier post on this constellation of issues, the philosopher Charles Taylor calls this Baconian habit of mind “code fetishism” or “normolatry.” And in summarizing the thought of Ivan Illich, Taylor develops his point:
Even the best codes can become idolatrous traps that tempt us to complicity in violence. Illich reminds us not to become totally invested in the code — even the best code of a peace-loving, egalitarian variety — of liberalism. We should find the centre of our spiritual lives beyond the code, deeper than the code, in networks of living concern, which are not to be sacrificed to the code, which must even from time to time subvert it.
This probably won’t be the last time I revisit these ideas by Taylor, because they are so vital for understanding multiple pathologies of our current public square. This Baconian disciplinary “machinery,” executing the normolaters’ preferred codes, simply eliminates the possibility of strengthening our “networks of living concern.” The code-fetishist model must be resisted at every turn, and the people best placed to do that are people who have been formed within a deeply humanistic model of inquiry and debate. Nothing is more needful in the current campus environment than a renewal and re-empowerment of the humanities.
There’s a great moment in the movie High Noon — a moment that keeps recurring to my mind as I think about the current conflicts on campus. Despite having promised his fiancée that he will leave town with her, Marshal Will Kane decides he has to stay to fight the Miller gang, who are about to terrorize the town. He makes a strong case, but she — vitally, a Quaker — says, “I don’t care who’s right and who’s wrong, there has to be a better way for people to live.”
Those words keep echoing in my head (I even hinted at them in my previous post on these matters): I don’t care who’s right and who’s wrong, there has to be a better way for people to live. Alas, the way things play out in the movie doesn’t give me a lot of encouragement.
A. Whoa, that was quite an interruption. Where were we?
B. You were talking about voice versus exit.
A. Ah yes, thanks. So: one of the ways we might define democracy is as an irrational obsession with voice — with “having a say.” As I have tried to argue, most people don’t know enough, and don’t care enough, to deserve a say; but even those who do have some knowledge and some interest typically disagree with one another often enough and seriously enough that the work of ordering a society is compromised, disrupted, sometimes paralyzed – as, for instance, when the U.S. government shuts down because Congress can’t agree on a budget. This is, as the saying goes, no way to run an airline. What we need to do is overcome this irrational obsession with having a say, and instead place the emphasis on the power of exit. I have a good bit of sympathy with those who argue that exit is the only truly universal human right: the ability to say, “I’ve had enough of this crap, I’m getting out of here.” People who have the right of exit have a very powerful right.
B. But what if you live in Kansas?
B. Let’s be realistic about this, shall we? The right of exit is one that doesn’t make a whole lot of sense if the only door out is a couple of thousand miles away. You call it a universal human right, but how can it be a right if only those who have a sizable pile of cash are able to take advantage of it? It’s interesting that this argument – the argument that exit is the fundamental human right – tends to come from the seasteaders, from people who have enough money (or hope to get it) that they can actually imagine creating an offshore community, a city floating in the Pacific, that they can live on. And of course that would also mean having enough of a savings account to buy whatever goods can’t be produced out there. It’s ridiculous. It’s Fantasy Island for rich people.
A. No, it doesn’t have —
B. Don’t you think it might be my turn now?
A. Fair enough. Do your worst.
B. Thanks. Okay, so, furthermore — and to me this is a major problem with your whole way of thinking — you’re assuming completely deracinated human beings. The people you have making decisions, whether those decisions involve how to run this imaginary neo-reactionary world or how to leave it, are all just fleshy calculating machines, running the numbers to achieve maximum efficiency according to some totally abstract utilitarian calculus. In your world it wouldn’t even make any sense for someone to say “I don’t like the government that I have right now, but I love this place and don’t want to leave it.” What if we have, as Simone Weil argued, a need for roots? What if, as Roger Scruton argues, piety is fundamental to human flourishing?
A. Piety?? Haven’t heard that term in a while.
B. Piety in the sense that Virgil refers to his hero as “pious Aeneas”: pietas, devotion — devotion to something bigger than my own preferences. You can’t build Rome without piety, or sustain it once it’s built; you can’t even build your Fantasy Island in the Pacific without some degree of that virtue.
A. Actually, I think you can build that Fantasy Island in the Pacific – if that’s your thing; I’m not saying it’s my thing – without piety as you have described it. But never mind. Let’s think about Virgil’s Aeneas. He didn’t really have a lot of choice about his piety, did he? Every time he was in danger of forgetting it one of the gods came down to kick his ass and get him back on the right track. Pietas is all well and good in a world monitored and disciplined by the gods. We don’t live in that world, and I think all too often what we call “attachment to place” is really just nostalgia for a world we don’t live in any longer, if indeed we ever really did.
B. Oh, I think we live in the same world our ancestors did, overseen by the same deity or deities that have always been here. Isn’t it interesting that our conversation has taken us into the realm of metaphysics? It turns out that if you argue about politics long enough you get to something deeper than politics. But I think we can keep the metaphysical arguments at arm’s length at least for a while longer. It’s sufficient to ask whether human beings can flourish in the kind of moral and political world you are imagining, without pressing the question of why they need certain conditions in order to flourish. (It would be possible to give purely naturalistic, sociological, evolutionary reasons why people need roots and are rightly driven by piety.) It’s enough, I think, for me to insist that in your utopia people would be miserable. And they would be miserable because they would lack a voice, though not, perhaps, in the sense in which the neo-reactionaries typically talk about voice.