When I came to teach at Wheaton College, in the fall of 1984, on a one-year contract, I had been a Christian for about five years, knew almost nothing about the evangelical subculture, and was a theological novice. My one virtue was that I had some inkling of how little I knew. So I sought out Roger Lundin.
That year, it should be remembered, was the apogee of the Age of Theory, or near enough, and I had just completed my training in at least some of its intricacies. But when I arrived on Wheaton’s campus I quickly learned that this was not perceived as an accomplishment, but rather a cause for suspicion. Only Roger — who had himself been trained in a long-familiar makeshift blending of historical and New-Critical practices, but had been seeking, without much help or encouragement, to educate himself in the newer directions of our discipline — was a possible conversation partner. I read a few articles he had written and realized that, like me, he believed that within the Christian traditions there were resources for listening and then responding to the voices of critical theory; but, unlike me, he was already prepared by theological training to do this work. So I sought him out.
I would have sought him out anyway; his warmth and kindness drew me, as they drew so many others, towards him. We began talking our way through these issues, and through so much else. He introduced me to his longtime friend Mark Noll, whose office was then four floors up from ours, later a more convenient one floor down; Mark became my second theological teacher and instructor in the ways of the Christian mind. Roger and I prayed together and laughed a great deal. We created an organization called Slow White Men of America, of which we were the charter and only members. Once we spent a morning at Borders where I came across a book on a display table called Hollywood Priest: A Spiritual Struggle. I held it up before Roger solemnly, and he said, “Big Al” — he always called me Big Al, a somewhat comical thing given the disparity in our sizes — “I understand how people can be horrified at living in this country, but I do not understand how anyone could ever be bored.”
I am strolling down memory lane here, but this is appropriate. Anyone who knew Roger at all well noted how profoundly memorial, as well as historical, his imagination was. I may have had a very slightly better verbal memory than he did, which means that I could sometimes remind him what someone had said in a given long-ago conversation — at which point he would tell me precisely when that conversation had happened. Were he reading this, he would cheerfully rattle off the year, month, and day on which we paid that visit to Borders, what the weather had been, what headlines had been featured in the Trib, and how Chicago’s professional sports teams had performed.
This was a kind of parlor trick, and an invariably impressive one, but for Roger the cultivation of memory, both personal and cultural, was an essential spiritual discipline, and one which Americans, in our haste always to fare forward, tend very much to neglect. We strain into the future, but, as Fitzgerald reminds us, “we beat on, boats against the current, borne back ceaselessly into the past.” It was Roger’s distinctive calling as a teacher and scholar to encourage us to embrace that backwards pull, to use it to help us understand where we have come from, and to do honor to those who went before us. As we read in Ecclesiasticus: “Leaders of the people by their counsels, and by their knowledge of learning meet for the people, wise and eloquent are their instructions: … All these were honoured in their generations, and were the glory of their times.” Roger, who died suddenly last week, has now joined this great company.
The last communication I received from Roger, several weeks ago, concluded with these words:
Thanks, deep thanks, for your kindness, which is rooted in a friendship that now reaches back 31 years!
My beloved friend Brett Foster never wanted to leave anything behind. My wife Teri took this picture of him on the streets of Durham, I believe, in the summer of 2011, when we were in England leading a study tour. You can see the bag hanging from his shoulder, and the scarf, and the top of the coffee cup, but not the enormous backpack that’s sitting just outside the frame. He was always encumbered in this way, because he never wanted to leave anything behind that he might need.
Now he has left us all behind.
Wheaton College’s Pierce Chapel was not made for funerals, especially Anglican ones, but it served, and hundreds of Brett’s friends arrived there on Saturday to commend his soul to God. November in Chicagoland is typically an unkind month, but the day was sunny and clear and warm enough that someone had opened windows in the chapel. Father Martin Johnson, the rector of All Souls’ Church, where Brett and I and our families were members for a decade, had brought the church’s butcher-block altar over and set it up in front of the stage. That altar had been made for the church by David Hooker, who became one of Brett’s dearest friends. Martin was grieving hard and perhaps working even harder — to make the day right for Brett, and for us.
After the tributes had been made — beautiful words, by some of the many who loved Brett — it was time for Communion. Brad Cathey had made the loaves and laid crosses upon them before baking, and Father Martin and Father Paul stood before the long lines and gave the bread of life, with the words of life.
But then some small chunks of bread fell to the ground at Martin’s feet. One piece was stepped on. I noticed that the poet Scott Cairns, who was sitting right in front of me, saw this, and was looking on in some discomfort. Then, when the last communicant had been served, Scott and I at the same moment hunched forward, knelt, picked up every piece, and ate — I quickly and in some embarrassment, he, back in his seat, slowly and with reverence. This is my body, broken for you.
It was time for the final commendations. Martin circled Brett’s coffin with the censer. The sun was low enough now that its light slanted through the open windows, and I could see by that light the smoke of the incense caressing the embroidered pall that covered the coffin. A quiet breeze wafted it away as Martin, his censing completed, stood directly before the coffin. Then he bowed and, with infinite gentleness, kissed it.
We moved then to the cemetery, to commend Brett’s body to the earth, in the hope of the resurrection. As Brett’s wife Anise, and their children Avery and Gus, and Brett’s mother Suzanne, and Anise’s mother Sharie, and all the rest of the family and the great circle of friends looked on, Martin spoke the ancient and beautiful words, and, according to custom, cast the first handfuls of earth onto the coffin.
Though I love those words, I did not, this time, hear them. I looked around and saw that Brett’s corner of the cemetery was surrounded mostly by pines, though one old oak reached out a limb over the grave. It still had its leaves; soon it would shed them over the broken earth there. I remembered the words that Scott Cairns had read earlier, in the time of tributes at the chapel, words that he had written years ago for his father but that could not be more right for this grievous occasion, full of its own strange hope.
And this is the consolation — that the world
doesn’t end, that the world one day
opens up into something better, and then we
one day open up into something far better.
Maybe like this: one morning you finally wake
to a light you recognize as the light you’ve wanted
every morning that has come before. And the air
itself has some light thing in it that you’ve always
hoped the air might have. And One is there
to welcome you whose face you’ve looked for during all
the best and worst times of your life. He takes you to himself
and holds you close until you fully wake.
And it seems you’ve only just awakened, but you turn
and there we are, the rest of us, arriving just behind you.
We’ll go the rest of the way together.
The books above are most of the ones I’ve assigned for my Great Texts of the Twentieth Century course (missing are Art Spiegelman’s Maus and the daily poems I’ll be reading to my students). It’s a pretty heterogeneous group, but taken together these books touch on a great many of the key issues of our time, and those of all times: not just racism, sexism, and colonialism, but also the rise of biological science as First Philosophy, the various ways cultures constitute identity, the furthest reaches of human barbarity, the transformation of culture by electronic media, and the miraculous power of the writings of a first-century Jew to illuminate and interpret modern consciousness. Something to offend everyone, you might say.
And this is the point, or at least one of the chief points. Here at Baylor, I want my students—most but not all of whom are Christians; some are simply unbelievers, some are uncertain and struggling—to encounter the texts, and through those texts the experiences, that served to undermine Christian faith and practice in the twentieth century. But I also want them to encounter Christians (Barth, Merton, Bonhoeffer, Eliot, Auden, Simone Weil in her unique way) for whom the twentieth century’s challenges provided an impetus to re-think and re-live Christianity in fresh ways.
I could, of course, work to protect them from this violent clash of powerful and contradictory ideas; I could—I am free to do this—build a syllabus that focused on Christian writers and perhaps other religious believers and presented anti-religious writers whose work is cartoonish or in other ways simplistic. And perhaps if I did that some of my students would feel safer. But that, I am convinced, would be a false sense of safety, and would leave them underprepared for an adult world in which their ideas and beliefs will receive daily challenges. What kind of teacher would I be if I let that happen?
There is one point on which the crusade for the imposition of trigger warnings is absolutely right. It is not for nothing that reading has been feared throughout history. It is indeed a risky activity: reading possesses the power to capture the imagination, create emotional upheaval, and force people towards an existential crisis. Indeed, for many it is the excitement of embarking on a journey into the unknown that leads them to pick up a book in the first place.
Can one read Proust’s In Search of Lost Time or Tolstoy’s Anna Karenina ‘without experiencing a new infirmity or occasion in the very core of one’s sexual feelings?’ asked the literary critic George Steiner in Language and Silence: Essays 1958–1966. It is precisely because reading catches us unaware and offers an experience that is rarely under our full control that it has played, and continues to play, such an important role in humanity’s search for meaning. That is also why it is so often feared.
And reading should be feared; particular books and authors should be feared. But it is not always—indeed, it is not often—best to flee from what we fear. Better to master the fear, to approach what scares us, but to do so with care and preparation and in an environment where those around you wish you well.
My dear friend Brett Foster, whose death earlier this week hangs over me heavily, wrote a poem that I have been thinking about a great deal. It’s called “Back-to-School Rondeau”, and I think it beautifully describes the fears and excitements of genuine education, the ways the pursuit of knowledge takes up and involves the whole of our being. I’ll leave you with it.
It’s almost time to set aside the waning
distractions of first youth, the life contained
for years at home. What’s home? The place you grow
out of, everything receding slowly,
fading like a chalked sidewalk in the rain.
Leave childish things behind, said a certain
fellow. (Others afterward.) Don’t remain:
the friends gone late in summer let you know
it’s almost time.
Don’t leave behind new clothes, impromptu plans —
they’ll match surroundings well, remind again
of shining coming: new homes to let go
of, too; the best things said; mind’s overflow;
surprising callings; time for love, and pain.
It’s almost time.
Richard Rodriguez, in his great memoir Hunger of Memory, movingly recounts the day when a nun, one of his teachers at his parochial school in Sacramento, asks his Mexican parents to speak English at home in order to encourage Richard to improve his English, to be more confident speaking it in school. He was in first grade.
Not a request that would be made today, I suspect. His parents agreed, of course—a nun had asked them! And while young Richard missed very much the sounds of Spanish at home, his English did get better. He became more comfortable at school; indeed, eventually his public identity came to be closely associated with his academic success. And he became strangely grateful for that nun’s request. It set him on the road to manhood: “I became a man by becoming a public man.”
This experience (and others like it) led eventually to Rodriguez, as a graduate student, becoming notorious for his opposition to bilingual education programs. He may or may not have been right in that opposition—there should be, and there is, serious debate about when young people need to make that essential transition from the private comforts of home to the sometimes challenging but also rewarding demands of public life—but no one has ever articulated more precisely the essential principle at stake here:
While one suffers a diminished sense of private individuality by becoming assimilated into public society, such assimilation makes possible the achievement of public individuality.
I think Rodriguez’s point is essential for understanding the current kerfuffle at Yale University, where students and alumni are—track with me here—demanding the resignation of the master of a college and the assistant master because the master is refusing to apologize for not having exercised dictatorial authority over other students’ Halloween costumes. To most outside observers this will seem pretty silly. But let’s ask where the kind of reaction the students and alumni are having comes from.
The key may be found in this op-ed in the Yale Herald, significantly titled “Hurt at Home.” Jencey Paz writes,
As a Silimander, I feel that my home is being threatened. Last week, Erika Christakis, the associate master of Silliman College, sent an email to the Silliman community that called an earlier entreaty for Yalies to be more sensitive about culturally appropriating Halloween costumes a threat to free speech. In the aftermath of the email, I saw my community divide. She did not just start a political discourse as she intended. She marginalized many students of color in what is supposed to be their home.
But Silliman College is not “supposed to be their home.” It is a residential college in a university, a place where people from all over the world, from a wide range of social backgrounds, and with a wide range of interests and abilities, come to live together temporarily, for about 30 weeks a year, before moving on to their careers. It is an essentially public space, though with controls on ingress and egress to prevent chaos and foster friendship and fellowship.
It is possible, of course, that Yale sells their residential college system to students as a kind of “home”; I don’t know. The official description seems to me to strike an appropriate note without over-promising: “The residential colleges allow students to experience the cohesiveness and intimacy of a small school while still enjoying the cultural and scholarly resources of a large university; the residential colleges do much to foster spirit, allegiance, and a sense of community at Yale.”
Now, to be sure, this “cohesiveness and intimacy” can for some students be very powerful—their college can even be a better and healthier environment for them than their actual home. The great theater critic Kenneth Tynan loved Magdalen College, Oxford (where C.S. Lewis was his tutor) so much that he wanted his ashes to be interred there. But it was not his home, and could not have been, because there were other people there who didn’t even know him, or who knew him but didn’t like him, or whose preferences were radically different than his, and who had no long-term bond with him to force them to come to some mutually agreeable terms beyond basic tolerance for three years or so.
Residential colleges have long been defended as transitional spaces between the world of home and a fully independent adult life, and it would be a great mistake to think of them as merely continuing the ethos of home. That would leave young people totally unprepared for that “adult life,” which I think we might, for the purposes of this discussion, define as that period of one’s existence during which there is no one to run to to demand control over other people’s Halloween costumes. When one only has, to return to Rodriguez’s terms, “private individuality,” it is quite natural, if not altogether admirable, to seek out an authority figure when someone’s holiday costume offends you. But by the time one gets to college one’s “public individuality” should be sufficiently developed that the wearing of costumes should be seen as an essentially trivial matter that students can deal with among themselves. If they can’t, then the university needs to acknowledge that they’re dealing with some serious cases of arrested development.
Let me wrap this up by simply repeating a passage from a post I wrote some months ago: In a fascinating article called “The Japanese Preschool’s Pedagogy of Peripheral Participation,” Akiko Hayashi and Joseph Tobin describe a twofold strategy commonly deployed in Japan to deal with preschoolers’ conflicts: machi no hoiku and mimamoru. The former means “caring by waiting”; the second means “standing guard.” When children come into conflict, the teacher makes sure the students know that she is present, that she is watching—she may even add, kamisama datte miterun, daiyo (the gods too are watching)—but she does not intervene unless absolutely necessary. Even if the children start to fight she may not intervene; that will depend on whether a child is genuinely attempting to hurt another or the two are halfheartedly “play-fighting.”
The idea is to give children every possible opportunity to resolve their own conflicts—even past the point at which it might, to an American observer, seem that a conflict is irresolvable. This requires patient waiting; and of course one can wait too long—just as one can intervene too quickly. The mimamoru strategy is meant to reassure children that their authorities will not allow anything really bad to happen to them, though perhaps some unpleasant moments may arise. But those unpleasant moments must be tolerated, else how will the children learn to respond constructively and effectively to conflict—conflict which is, after all, inevitable in any social environment? And if children don’t begin to learn such responses in preschool, when will they learn them?
Imagine if at university they had developed no such abilities and were constantly dependent on authorities to ease every instance of social friction. What a mess that would be.
Damon Linker, with whom I seem to be debating a lot these days:
Professors in the humanities and social sciences engage in highly specialized research, attempting to push knowledge into new areas — and many view this effort as a project that involves and requires liberating individuals from the dead weight of received prejudices.
The result is that academics usually end up pursuing scholarly agendas that are the furthest thing from anything that could be described as “conservative.” The imperative to advance knowledge demands that research contributes something new. Meanwhile, the tendency to relegate all received truth claims to the category of prejudice leads to suspicion even of the established findings of the previous generation of scholars.
This would be a more convincing argument if “the established findings of the previous generation of scholars” were never oriented towards or tended to reinforce political liberalism; and if liberals had a tendency to relegate their own preferred truth claims, the ones they have received from previous generations of liberals, to the category of prejudice. (I’m using “liberalism” in a loose and conventional sense here; one could of course ask whether liberalism is really liberal, and so on.)
Linker wrote in a recent column devoted to defining contemporary conservatism that “If you ask conservatives what this comprehensive moral outlook consists of, they’ll likely say one of several things: Devotion to individual freedom. Constitutionalism. A concern with limited government. Fear of tyranny.” In many different academic disciplines, there’s an enormous body of scholarly work that, in the view of the conservatives Linker describes, neglects or attacks these values. Wouldn’t conservatives then have an inclination to critique that scholarship and defend those values? Would they be piously reverent towards scholarship that endorses, and often enforces, liberalism simply because it’s “established”?
On the other side of the fence: as Linker acknowledges, an essentially if vaguely liberal outlook on the world simply is the default position of the majority of professors in the majority of academic disciplines, and has been for several professorial generations; so where among these liberals is the suspicion of the past that Linker identifies with liberalism? It turns out that such suspicion is highly circumscribed: it’s limited to some findings of previous generations of scholars, not extended to those scholars’ overall political outlook.
Linker wants to argue that conservatives have, because of their core values, selected themselves out of the academy. And no doubt that happens sometimes. But a great many more who would like to make a contribution are ruled out — quietly and behind the scenes — for exactly the same reason that genuinely radical leftists are ruled out: they would rock our gently swaying boat, here on our calm, calm lake.
A. Well … Loki’s quite right, you know — at least in terms of what he says, as opposed to what he means. He means that we want to be ruled by him, a claim that I would firmly though politely (his being a god and all) reject. But do we want to be ruled? Of course we do. That’s why human societies so strenuously avoid direct democracy. Rule is tedious; it’s boring — almost no one actually wants to do it — we have a thousand other things we’d rather pursue, including, as a high priority, announcing to everyone who’ll listen how much better off the world would be if we ran it. For every person who votes there are at least four who want to tell you how they have all our political questions sorted out. Of course we want to be ruled. The only questions are who will rule us and how they will do so. And I’m making a proposal concerning those questions.
B. You’re confusing delegation and abdication. Those of us who through electing representatives delegate certain civic responsibilities aren’t abandoning self-rule! Note that, for one thing, we reserve the power to recall our representatives if we think they are abusing the trust we have placed in them, at the next election or, in desperate circumstances, earlier.
A. “If we think they are abusing the trust,” indeed. In a democracy hoi polloi are notoriously incompetent at figuring out whether they are being abused, having a strong tendency to re-elect their abusers while rejecting with alacrity people who are telling them sober and necessary truths. Didn’t Burke tell us this long ago?
When the leaders choose to make themselves bidders at an auction of popularity, their talents, in the construction of the state, will be of no service. They will become flatterers instead of legislators… If any of them should happen to propose a scheme of liberty, soberly limited, and defined with proper qualifications, he will be immediately outbid by his competitors, who will produce something more splendidly popular. Suspicions will be raised of his fidelity to his cause. Moderation will be stigmatized as the virtue of cowards; and compromise as the prudence of traitors; until, in hopes of preserving the credit which may enable him to temper, and moderate, on some occasions, the popular leader is obliged to become active in propagating doctrines, and establishing powers, that will afterwards defeat any sober purpose at which he ultimately might have aimed.
The best description imaginable of President Trump. Again, I think the people have proven both their incapacity to rule themselves and their fundamental disinclination to do so.
B. You would win me over with this citation of Burke if I didn’t know that Burke would have been horrified by the proposal you’re making.
A. Would he have? Is Burke the enemy of aristocracy?
B. Of your kind of aristocracy, I believe he is. He wrote in his famous letter to the Duke of Richmond, speaking of the hereditary aristocracy, “You, if you are what you ought to be, are in my eye the great oaks that shade a country, and perpetuate your benefits from generation to generation. The immediate power of a Duke of Richmond, or a Marquis of Rockingham, is not so much of moment; but if their conduct and example hand down their principles to their successors, then their houses become the public repositories and offices of record for the constitution.” This kind of long-term care for the good of a dear local place, or even a fatherland, is unlikely to be in the minds of your New Meritocrats. Indeed, I suspect that you’ll want to have such sentimentality bred out of them.
A. The most interesting and important phrase in that quotation is “if you are what you ought to be” — he knew perfectly well that the British aristocracy rarely were what they ought to be, and on occasion let them know his opinion of them with considerable asperity. He preferred that aristocracy to the rule of the demos, and with good reason. But I think that if we could bring Burke back now, I would at least try to convince him that there is a better model of aristocracy than the one he knew.
B. Yeah, well, good luck with that. But let’s not waste time debating counterfactuals or imagining alternative histories. In your imagined world, the demos will have no voice. You say that’s fine, because they don’t want one. But of course some of us will want one. And — let me guess here — you’re not planning to give us one. You’re going to offer no voice, but the possibility of exit. Right?
A. You are correct, sir.
Okay, Ron, you ask whether you’re allowed to sneeze. A tendentious way of presenting the issue. Of course you’re allowed to sneeze — it’s not as though anyone can stop you. If you’re in a closet with your family hiding from intruders bearing pistols and daggers, you’re allowed to sneeze, but I wouldn’t recommend it.
What’s that? Watching deer eat from a feeder isn’t like hiding from violent criminals? You’re right, Ron, it isn’t. Not very much, anyway. But again you’re missing the key point that you should be focusing on. We’re trying to establish a principle here, Ron, and the principle is that you can suppress a sneeze if you want to. But — and here we’re approaching the crux of the matter — you didn’t want to.
Go back and watch that video again, Ron. Look at those beautiful creatures: their delicate faces, their gentle demeanor, their polite interest in the contents of the feeder. And then the white snow in the background. There’s serenity there, a peaceful interlude in our lives that are so full of conflict; a chance for the deer to forage a bit — always more difficult for them in the winter — and for us to have a moment’s communion with the natural world.
And that’s when you decide to let one rip, Ron. Great.
You are what’s wrong with America, Ron, did you know that? You could have suppressed your impulse to sneeze, suppressed it for the greater good, for the good of the deer and your wife (if that’s your wife) and for all the good people of YouTube; but you chose not to do that. You didn’t even turn aside, or sneeze into your sleeve. You thought indulging your impulse was the most important thing in the world, and you got positively angry when someone suggested to you that it just might not be. A total lack of impulse control is what’s sending our country into what may be a permanent moral decline, and you’re the poster child for that vice.
Thanks, Ron. Thanks a lot.
One of the most regular running jokes in my family, for many years now, is that I don’t play Wii Boxing because I think it’s too violent. We make a joke of my tender conscience, but I really do wince when a little Mii’s head snaps back. I can’t play for more than a couple of minutes. I pause the game; I switch to golf, or tennis, or frisbee. My discomfort is genuine, and deeper than any reasonable standard would deem appropriate, and (to me, anyway) not funny at all. The roots of it sink deep into my life; follow those roots 40 years deep — give or take a few days — and eventually you’ll find yourself in front of a little black-and-white television set, in Birmingham, Alabama, on the first day of October 1975. Three days earlier I had turned seventeen.
Until then, I had been for most of my young life a very serious boxing fan. Boxing was common on network TV in those days, which was good, because network TV was all we had. Muhammad Ali was of course the dominant figure of the era, the one you couldn’t escape even if you wanted to, and a few years earlier, in my local library, I had picked up Sting Like A Bee: The Muhammad Ali Story, written by the admirable light-heavyweight Jose Torres in collaboration with Bert Randolph Sugar. I read it, read it again, went back to the library to renew it, and read it one more time.
What fascinated me wasn’t the biographical narrative, but Torres’ account of what life in the ring was really like. I have never forgotten his words in praise of body-punching:
I’ve hit fighters in their bodies with so much force that they couldn’t help but let out an involuntary groan like a wounded wolf. Usually the man who connects will jump at the hurt fighter with more punches. I never attacked after such a punch. I used to step back and let my rival savor every second of pain. I was not only a sadist but a technician; I knew how discouraging those punches were to the body. I became world’s champion by throwing one. A left hook to the liver.
You can see the left hook Torres is talking about at the beginning of this clip: Willie Pastrano is the victim, and it takes Willie about two seconds after the punch lands to feel its effect. When he does, he crumples. He gets back up, God bless him, and finishes the round, but that’s as far as he can go. The ref stops the fight, and Torres takes the title.
It was Pastrano’s last fight. He retired, age twenty-nine.
I didn’t fight, myself, aside from a handful of schoolyard flailings; I was small for my age, already a lover of words, and Torres wrote vividly; I became a literary boxing fan long before I knew that that was a tradition. By the time Ali fought Joe Frazier for the third time I considered myself a connoisseur. I had never heard of A. J. Liebling and his “sweet science of bruising” but I would have loved it if I had known.
I don’t think I had watched the first Ali-Frazier fight live, though I had seen replays on Wide World of Sports. Until that fight it was commonly said of Ali that he wouldn’t be able to take a punch, but in the fifteenth round of that fight Frazier hit Ali with as perfect a left hook to the jaw as has ever been thrown … and Ali got right back up. No one has ever had a more devastating left hook than Frazier, and no matter how many times I watch that clip I still cannot understand how that punch didn’t knock Ali cold. In slow motion you can see Ali just beginning to turn his head away a millisecond before the punch lands, though it doesn’t seem likely that that small motion could have made a difference. But in any case, no one ever — ever — again said anything about Ali being unable to take a punch.
I didn’t see the second fight either, and all I remember from it is the controversy about Tony Perez, the referee, who let Ali repeatedly grab the back of Frazier’s head and pull it down in their clinches. But by the time the third fight rolled around I was fully alert to the drama of it. I understood the contrast in styles — after all, there has never been a more obvious one: Frazier moving relentlessly, maliciously forward, head low, throwing hook after hook after hook to head and body, with both hands; Ali upright and bouncing, circling always to his left, disdaining body punches and hooks in favor of rapid-fire straight lefts and rights.
I understood also that these men were not rivals but rather actual enemies, that they truly hated each other. Having lived all my life in Alabama, where the world was neatly and simply divided between white people and black people, in that order, I don’t think I then grasped the racial dimensions of that hatred. I knew that Ali called Frazier a “gorilla,” but I never imagined the significance of a light-skinned Negro man saying that to a dark-skinned one. I might have been awakened to that dynamic if I had known that a few days before the fight, at the Marcoses’ palace in Manila, Frazier had leaned over to Ali and quietly said, “I’m gonna whup your half-breed ass.” In turn, Ali would say to his corner just before the fight, “I’m gonna put a whuppin’ on this n*r’s head.” But I didn’t learn about any of that until later; I just knew that I had never anticipated anything in my short life as passionately as I anticipated what Ali had already called the Thrilla in Manila.
The classic account of what happened in that ring — and what happened before, and after — was written for Sports Illustrated by Mark Kram, and it remains the finest essay in sportswriting I have ever read. It captures with uncanny faithfulness the single fundamental fact about that fight, which is its ceaseless and horrifying brutality. By the third round Ali had pummeled Frazier so relentlessly that I was embarrassed for Joe, and I didn’t want to watch any more; I also knew that I would watch until the end, which I expected to come any moment. Then Frazier started to fight back.
As the advantage shifted back and forth between the two boxers, I watched in a state of ongoing incredulity. It was like seeing that Frazier punch that dropped Ali in their first fight, but a hundred times — a thousand. Ten thousand, it seemed. After a while I simply could not understand how either man remained standing, yet stand they did. And they punched — though “punch” is a pathetic word: the only adequate words are the ones that seem hyperbolic, like “bludgeon.”
It went on. For a time, for several rounds in the middle of the fight, Frazier got inside Ali’s guard and planted the top of his head under Ali’s chin and smashed Ali’s flanks and jaw again and again until I couldn’t imagine anything else happening, ever; but eventually, as the number of rounds (the number of years, I almost said) mounted, he grew exhausted and couldn’t get in there any more. And Ali, freed from that terrible pressure, found room to move; and then those long guns fired, repeatedly finding Frazier’s face and turning it gradually to pulp.
Frazier wouldn’t have quit, of course, under any circumstances less severe than death, but his trainer Eddie Futch couldn’t bear it any more and stopped the fight. The day after, Ali talked to Kram about what it had been like to be in that ring: “It was like death,” he said. He praised Frazier: “I’m gonna tell ya, that’s one helluva man, and God bless him” — but then, there was no reason for him to stint the praise. He had won; and Frazier had never had words to hurt him the way his contempt had slashed Frazier. The really remarkable thing was Frazier’s response, uttered just hours after his long war with Ali drew to its terrible close. “Man, I hit him with punches that’d bring down the walls of a city. Lawdy, Lawdy, he’s a great champion.”
As for me, I sat there for a while, once it was over, in my little bedroom in Alabama, staring at my little black-and-white TV. I could have watched elsewhere in the house on a larger screen, and in color, but I would never have risked being distracted by my uncomprehending family. So I sat there alone and in silence. I didn’t know it, but boxing was over for me; I would never watch another bout with interest and attention, and my tolerance for boxing’s aggression would shrink and shrink until I found myself avoiding Wii Boxing. And I still remember that night, when sleep took long to come; and for days afterward, a haze hung over my mind.
1. Torres beat Pastrano thanks in part to the instructions in combination-punching that Cus d’Amato — later Mike Tyson’s trainer — gave him. Torres knew what it was like to be on the receiving end of such punches as well: four months after winning his crown he took on a non-title bout with a journeyman heavyweight named Tom McNeely, and though he won the fight he took such a beating to the body that some observers thought he was never again the same fighter.↩
B. Ah, the famous Imperial Examination! Ideal of meritocrats everywhere! But it’s not as though everyone in China had an equal shot at passing it — or even taking it. The rich who could afford tutors and bribes had a massive advantage over poor families whose sons had to rely on their wits and hard work. There was always some social group that was excluded from taking the exams — and of course women were never allowed to take them — and there was massive cheating —
A. Of course, of course. There is no possible system of politics or anything else that can’t be gamed, and in which the rich do not have advantages that the poor lack. To raise that as an objection to any scheme for social improvement is to allow the perfect — the impossible, the unrealizable perfect — to be the enemy of the good.
There will certainly be inequality at the beginning, but since money and discipline can only partially compensate for a lack of brains, and poverty can only partially impede the extravagantly intelligent, there would in such a system, over time, arise greater and greater equality both of opportunity and achievement. If you care about that kind of thing. I do, sort of, but not as much as I care about creating a political system in which the very best actually rule.
B. And it’s your view that China in the Imperial era actually achieved this genuine meritocracy?
A. Glad you asked. The answer is a firm No, in part because of the cheating and gaming we talked about a moment ago, but also because in the Imperial system the best were allowed to advise — but not to rule. The cult of the Emperor and the imperial family remained in place. China had created an enormously powerful system for funding and training the most gifted young men — and yes, it’s a shame that it was men only — ever devised, but restricted the ability of those men to set the course of Empire. So what I am arguing for is the next and obvious step: putting the aristoi — the genuine aristoi, not those of the dominant social class — in charge.
B. I wonder if you’d get “the genuine aristoi.” I recall that one Chinese philosopher, Ye Shi, commented that “A healthy society cannot come about when people study not for the purpose of gaining wisdom and knowledge but for the purpose of becoming government officials.”
A. I think Ye Shi may have been a little too concerned about people’s motives. If we can create examinations that accurately test for the skills that our rulers really need to have, and we select as our leaders the people who have those skills, who cares if their motives aren’t pure?
B. Hm. If you tell me that you can produce the best medical researchers, or particle physicists, by means of an examination, I might — might — take the notion seriously. But political rule? I don’t think so. Political leadership requires a whole host of skills and virtues — people skills, as we like to say, prudence, discernment, judgment of character — all traits that can’t possibly be tested for, but only developed through practice, experience. And some of those traits are virtues — so the character of the person in leadership actually matters. Politics isn’t a matter of A/B testing, of choosing the best option from a group of four!
A. Isn’t it? I’m not so sure. But I’ll grant that under democracy what you say may well be true — let’s say it is true. But democracy is what I’m trying to get rid of here, and one of the chief reasons I want to get rid of it is its tendency to generate just this kind of leader: someone who doesn’t know anything about anything but can somehow generate trust ex nihilo. I want — society needs — to ground our leadership choices in more objective terms of excellence, and relieving ourselves of the burden of democracy will give us a chance to do that. If instead of choosing leaders who can please hoi polloi we choose leaders with demonstrable expertise in the issues we face — poverty, poor health, inefficient energy usage, upheavals due to foreign conflicts, uncertainty because of irrational foreign governments —
B. Some of which are democracies. And even the ones that aren’t often have governments that stand because of their ability to “please hoi polloi.” Do you think your exam-crushing experts are going to have what it takes to deal with such retrograde social orders?
A. I think they’ll have a much better chance than the pols we send around the world today, many of whom have amateurish knowledge of the cultures within which they’re placed — and those are the good ones. I’d rather choose people with some of those social virtues you were lauding from within a pool of the demonstrably knowledgeable than from within a pool produced by our current patronage system.
B. You know America has a foreign service exam, right?
A. Sure. And many of the people who aced it are working and suffering under inept direction from higher-ups who have no business making decisions. We’re like imperial China in that respect.
B. So you want to put the people who ace the exam in charge? And then extend a similar model into the rest of the governmental system?
A. Right — though of course people will need to gain experience over time — I wouldn’t suggest putting a 22-year-old in immediate charge of an embassy because she had the highest test scores.
B. Based on what you’ve said so far, I’m not sure why not. But let’s drop that — I have a different question for you. You’re creating a system in which almost everyone will be deprived of self-government. Do you think people in general will accept such a deprivation?
A. It’ll be a hard sell at first, because most people like to think of themselves as not just worthy of self-determination but positively inclined towards it. But they’re not — not either: not worthy and not so inclined. As I argued from the outset, the demos has made an absolute mess of things, implementing (through their chosen leaders) a vastly long series of selfish and stupid decisions, which they have also tried with considerable desperation to avoid facing the consequences of. But I also think on some level they know this — they understand that they are not suited for self-governance. And when someone comes forward with the ability to explain this to them in non-threatening terms, and to show them that democracy is not inevitable and that there really may be a better way, then I think they’ll be glad to be relieved of the burden of self-rule.
B. So if people are going to be persuaded to relinquish a system in which they choose leaders solely on the basis of trust-inducing capacity, they’re going to need one or more people they trust to do that persuading.
A. Yes. Ironic, isn’t it. But the history of politics is full of ironies. Only Nixon could go to China, etc.
B. You remind me of Loki.
A. Loki?
B. Loki. In The Avengers. Telling people that they were made to be ruled.
(To be continued…)
There is no better journalist in America than Andrew Ferguson, and his brilliant takedown of bad behavioral science provides yet more evidence for that claim. A passage on Stanley Milgram’s famous obedience-to-authority experiment especially caught my eye:
The results were an instant sensation. The New York Times headline told the story: “Sixty-five Percent in Test Blindly Obey Order to Inflict Pain.” Two out of three of his subjects, Milgram reported, had cranked the dial all the way up when the lab-coat guy insisted they do so. Milgram explained the moral, or lack thereof: The “chief finding” of his study, he wrote, was “the extreme willingness of adults to go to almost any lengths on the command of an authority.” Milgram, his admirers believed, had unmasked the Nazi within us all.
Did he? A formidable sample of more than 600 subjects took part in his original study, Milgram said. As the psychologist Gina Perry pointed out in a devastating account, Beyond the Shock Machine, the number was misleading. The 65 percent figure came from a “baseline” experiment; the 600 were spread out across more than a dozen other experiments that were variations of the baseline. A large majority of the 600 did not increase the voltage to inflict severe pain. As for the participants in the baseline experiment who did inflict the worst shocks, they were 65 percent of a group of only 40 subjects. They were all male, most of them college students, who had been recruited through a newspaper advertisement and paid $4.50 to participate.
The famous 65 percent thus comprised 26 men. How we get from the 26 Yalies in a New Haven psych lab to the antisemitic psychosis of Nazi Germany has never been explained.
I’m interested in this because in my book on original sin I referred to Milgram’s experiments quite positively — and moreover, I never did any reading to find out whether they had been subjected to critique. I just assumed that they were universally accepted as valid. And why did I make that assumption? Because Milgram’s experiments confirmed the story I was telling about the return, in the twentieth century, of a widespread belief in human depravity.
Now, to be sure, the book by Gina Perry that Ferguson cites as authoritative on this matter has itself come under some criticism for one-sidedness; Milgram’s famous experiment may indeed hold up, at least in large part. But the point I want to make here is that I didn’t do anything to check it out — for me, the story Milgram told was too good to be false.