I have very mixed reactions to the accounts of Christianity and higher education given by Rod’s recent correspondents, but the anonymous Christian professor makes a really vital point about the dangers of the “narrative of Christian oppression.” When I directed the faculty Faith and Learning Program at Wheaton College, years ago, I regularly told younger faculty, “If you submit an article or book for publication and it gets rejected, always, always, always assume that it’s because your work wasn’t good enough.” The assumption of anti-Christian prejudice gets you off the hook for improving your scholarship; ultimately, it makes you listless and lazy and increasingly prone to playing the victim card. Even when you have good reason to suspect such animosity, pretend you don’t, and get back to work.
So props to the Anonymous Christian Prof for noting the dangers of that way of thinking. But not for this:
But there is a second reason that the abandonment of secular institutions by Christians is a shame, and I will be blunt. Contemporary Christians — taken as a group — do not have the intellectual heft of their secular counterparts. The best scholarship, the highest quality thinking, still goes on at secular institutions. To go to a Christian institution involves — by and large — an intellectual compromise.
First of all, the claim is phrased imprecisely: by “contemporary Christians” I assume she means individual academics? But she then goes on in the next sentence to speak of institutions. So I don’t know whether the charge is that individual Christian academics are not as smart as other academics or that Christian colleges and universities aren’t as academically rigorous as their secular counterparts. And it’s hard to answer the charge, given that there aren’t very many Christian colleges and universities, and that their institutional missions tend to be quite different from those of secular schools. Also, many faculty at Christian institutions either were (like me) educated wholly at secular universities or in some mix of Christian and secular schools.
But I will say this: I believe the kind of education students receive in the Honors College at Baylor, and at Wheaton, is in most respects far superior to what they would receive at secular schools of greater academic reputation and social prestige. Indeed, in my years at Wheaton I often heard comments to this effect from visitors. I think for instance of a professor at one of America’s top ten universities who said to me, after spending some time with one of my classes, “Your students are better informed and ask more incisive questions than mine do.”
I could adduce more examples, and explore more comparisons, but let me conclude with this: If this professor’s commendation of Wheaton’s students has at least some validity, how might we account for this state of affairs? I would point to three factors:
1) At Christian colleges, students and faculty alike tend to think of learning as a project in which the whole person is involved. Information is not typically separated from knowledge, nor knowledge from wisdom. The quest for education is less performative, more earnest than at many secular institutions. People are more likely to think and speak of education as something that leads to eudaimonia, flourishing.
2) A closely related point: Christian institutions tend to think quite consciously that their task involves Bildung, the formation of young people’s characters as well as their minds. So in hiring and retention they place a greater emphasis on teaching and mentoring than is common in secular institutions. (There are exceptions, of course, but even the most student-centered secular institutions cannot, because of their intrinsic pluralism, specify what good personal formation looks like.)
3) Perhaps the most important feature: Christian teachers and students alike can never forget that their views are not widely shared in the culture as a whole. We read a great many books written by people who don’t believe what we believe; we are always aware of being different. This is a tremendous boon to true learning, because it discourages people from deploying rote pieties as a substitute for genuine thought. No Christian student or professor can ever forget the possibility of alternative beliefs or unbeliefs. Most students who graduate from Christian colleges have a sharp, clear awareness of alternative ways of being in the world; yet students at secular universities can go from their first undergraduate year all the way to a PhD without ever having a serious encounter with religious thought and experience — with any view of the world other than that of their own social class.
Working in Christian institutions, especially because of this alertness to possible alternative beliefs, has been essential to my own intellectual development. Among other things, it has helped me learn to write for multiple audiences, while always keeping near the forefront of my mind the intense relevance of learning for life. I think it is largely because of, and not in spite of, my workplaces that I have been able to write for some of the most academically rigorous presses in the world, and to do so from an explicitly Christian point of view. Does that sound like “intellectual compromise”?
AnonProf says, with what appears to be satisfaction, “At my school we don’t talk about our religious or political identities in class.” At my school we talk about them all the time; we work through them, struggle with them, see them altered by experience and reflection — we put them on the line. In that sense our classrooms aren’t “safe spaces” at all. “Of course they’re not safe,” I’m tempted to say, “but they’re good.” What happens at secular institutions like the one AnonProf teaches in may also be good, in its own way, and a better choice for some students, including some Christian students. (In education, one shape definitely does not fit all.) But that it’s intrinsically academically superior is an unsustainable claim.
A. I’mma let you finish, but can I just interject a thought here?
A. Just think for a minute about what’s been happening on social media in the past few weeks, in the aftermath of the Colorado Springs and San Bernardino shootings. One group lines up and shouts that these things would never happen if we had stricter gun laws, as though terrorists give a rat’s ass about gun laws; another group lines up opposite them and shouts that these things would never happen if we had a properly armed citizenry, as though mutually assured destruction were equal to social peace. None of these people are considering, reflecting, thinking; they’re just reacting and emoting. Some of them can’t even grasp the moral difference between those who murder and those who are murdered. The very thought of allowing these people a vote — a voice in the shaping of society — appalls me. They’re simply incapable of rational thought.
B. Your description of the behavior of many people — most people? — is depressingly accurate. A few years ago there was an active Tumblr called These Tragic Events Only Prove My Politics, and I expect that the person who made it stopped updating it because it would have been beating a dead horse. There’s more evidence of the thesis every single day.
That said, we may have reached the point in this conversation where you and I most fundamentally differ. While I agree with your description of each side’s position, and while I agree that both sides are being thoroughly and depressingly irrational, I don’t agree that these people are incapable of rationality. After all, the great majority of them navigate their way through life, through work and marriage and child-rearing and all sorts of other things, without catastrophic error.
A. Well, I don’t —
B. Before you say something like Look at the divorce rate or Look at the anti-vaccination weirdos, stop and ask yourself whether the meritocratic elite you want to put in charge do any better in these areas. Remember, it’s among that elite that Soylent is a thing.
A. You may have a point there.
B. Yeah. But even if I don’t, it’s still my turn. And here’s my chief point: we have a paradox to explain. That paradox is that it’s easy to find people who have long personal histories of behaving fairly rationally – not perfectly so, but fairly so – who are nevertheless bizarrely irrational in their political opinions and their voting. So, as I read the evidence, the problem cannot be that “these people are simply incapable of rational thought.” The problem must lie within the structures of our political system.
Now, you and I agree that the current political system is messed up, but you attribute that mess to a failing or a limitation in the nature of most human beings, who in your view amount to little more than the higher cattle – creatures who, as Nietzsche says, envy the cow its unreflective “unhistorical” existence. But –
A. I really don’t want to say that. Perhaps my frustration with the pointlessness of recent political discourse, especially on social media, led me to formulate my thoughts a little too crassly. Let me, briefly, try again: Most people are not irrational per se, but their lack of interest in the actual problems and issues of governance, their preference for other activities, a preference that they disguise from themselves by uttering strong political opinions, disqualify them from governing themselves. They will be happier and the world will run better if –
B. If running the world is left to the experts?
A. Yes. Treating expertise as though it’s frightening or absurd or impossible is one of the pathologies that we need to get over.
B. Okay. Your amendment is a good one. But it doesn’t alter my case, the making of which I shall now resume. You’re not saying that most people are irrational, but you are saying that they are congenitally unlikely to take sufficient interest in actual politics to learn what they need to know to vote wisely and well; and therefore will, perhaps not at first but eventually, be relieved to have political decisions taken out of their hands and given over to people who are qualified (by temperament, ability, and training) to make them and for whom the making of such decisions will be a full-time job.
A. Close enough.
B. What I would say, by contrast, is that people are interested in and even knowledgeable about politics – but politics only on a human scale, a scale appropriate to the range of their experience and interests. Many, perhaps most, of the pathologies of our current political order are products of inhuman scale, what one of our best poets called “the long numbers that rocket the mind.” Consider this: people in Boston will consider themselves invested in what happens in San Bernardino in ways they absolutely are not in what happens in, say, Dublin – and yet San Bernardino and Dublin are pretty much equidistant from Boston. People in Boston can’t alter events in San Bernardino any more than they can alter events in Dublin, and yet the ideology of the modern nation-state makes them feel that they can and should have some say in what happens 3,000 miles away, as long as that distant place is in the same country. In one sense, of course, as Terence reminds us, Homo sum, humani nihil a me alienum puto; but in another sense 3,000 miles of separation stretches the possibilities of genuine understanding beyond their natural capacity, and social media and video don’t change that situation.
And this takes me back to my earlier comment about your emphasis on “exit not voice”: the exit-only option might make sense if I dislike life in Waco and decide to try a more congenial environment in Austin; it will make less sense if I live in Topeka and decide that the whole American social order is so flawed that I need to move to Toronto. Exit-not-voice is great if you have either (a) tons of money or (b) a human scale. But then, if you consider matters on a genuinely human scale, the whole question of voice looks very different. Because just leaving Topeka may not seem such a rational option, even though I despise the whole American social order, if my mom lives there.
This is a kind of scholarly-geeky follow-up to my earlier post on the ways that humanistic study can — doesn’t always or necessarily, but can, if the good will is there — promote compassion, fellow-feeling, mutual recognition. But if the good will is not there, and there is no curriculum of study in place to promote it, encourage students to consider its value … well, then you get protests and demands.
This is not to say protests and demands have no place in social life — but it’s pretty clear here that on many campuses today demands, and stringent demands at that, are the first recourse: we may take as a fairly representative example Dean Mary Spellman of Claremont McKenna College, whose obviously sympathetic email to a troubled student nonetheless led to her forced resignation. That’s one side of this demand culture — zero tolerance, single sanction (expulsion from the community) — and the other side is the attempt to herd faculty as well as students into cultural/racial/sexual sensitivity courses, in order, ideally, to make such errors impossible.
A lot that can be said about this has already been said: that it’s an obvious repudiation of free speech, that it’s reminiscent of Maoist and Stalinist re-education programs, that it’s an error-has-no-rights model, and so on. And I don’t strongly disagree with any of these arguments, though I think I have more sympathy with at least some of the protestors than many of my fellow conservatives. But what strikes me about the whole approach is how … well, Baconian it is. Sir Francis Bacon, not Francis Bacon the painter or Kevin Bacon. Bacon’s early attempts at inaugurating what we would now call the scientific method, and extending it into the whole of philosophical reflection, seem to me to prefigure rather eerily the thoughts of the student protestors.
As I said, this is a geeky sort of take on the whole business. But please bear with me.
In his New Organon of 1620, Bacon lays out his method for inquiry. He begins by stating, “I propose to establish progressive stages of certainty,” and agrees that many of the medieval scholastics who emphasized the power of logic wanted to do the same thing. He doesn’t think they went about it the right way, but he and they have this in common: “they were in search of helps for the understanding, and had no confidence in the native and spontaneous process of the mind.” That’s true of our modern protestors as well: “the native and spontaneous process of the mind” — the white male mind anyway — is notoriously unreliable.
With that agreement in mind, let’s turn to how Bacon differentiates himself from the logicians:
But this remedy comes too late to do any good, when the mind is already, through the daily intercourse and conversation of life, occupied with unsound doctrines and beset on all sides by vain imaginations…. There remains but one course for the recovery of a sound and healthy condition — namely, that the entire work of the understanding be commenced afresh, and the mind itself be from the very outset not left to take its own course, but guided at every step; and the business be done as if by machinery.
Emphasis mine. The mind of someone like Mary Spellman is “occupied with unsound doctrines and beset on all sides by vain imaginations”: she can only be cast out, so the community can turn to the more promising work of educating younger people. They will “be guided at every step”; and since the lessons they have to learn are fixed and invariant, “the business [may] be done as if by machinery.”
In a vitally important essay that I referred to in an earlier post on this constellation of issues, the philosopher Charles Taylor calls this Baconian habit of mind “code fetishism” or “normolatry.” And in summarizing the thought of Ivan Illich, Taylor develops his point:
Even the best codes can become idolatrous traps that tempt us to complicity in violence. Illich reminds us not to become totally invested in the code — even the best code of a peace-loving, egalitarian variety — of liberalism. We should find the centre of our spiritual lives beyond the code, deeper than the code, in networks of living concern, which are not to be sacrificed to the code, which must even from time to time subvert it.
This probably won’t be the last time I revisit these ideas by Taylor, because they are so vital for understanding multiple pathologies of our current public square. This Baconian disciplinary “machinery,” executing the normolaters’ preferred codes, simply eliminates the possibility of strengthening our “networks of living concern.” The code-fetishist model must be resisted at every turn, and the people best placed to do that are people who have been formed within a deeply humanistic model of inquiry and debate. Nothing is more needful in the current campus environment than a renewal and re-empowerment of the humanities.
There’s a great moment in the movie High Noon — a moment that keeps recurring to my mind as I think about the current conflicts on campus. Despite having promised his new bride that he will leave town with her, Marshal Will Kane decides he has to stay to fight the Miller gang, who are about to terrorize the town. He makes a strong case, but she — vitally, a Quaker — says, “I don’t care who’s right and who’s wrong, there has to be a better way for people to live.”
Those words keep echoing in my head (I even hinted at them in my previous post on these matters): I don’t care who’s right and who’s wrong, there has to be a better way for people to live. Alas, the way things play out in the movie doesn’t give me a lot of encouragement.
A. Whoa, that was quite an interruption. Where were we?
B. You were talking about voice versus exit.
A. Ah yes, thanks. So: one of the ways we might define democracy is as an irrational obsession with voice — with “having a say.” As I have tried to argue, most people don’t know enough, and don’t care enough, to deserve a say; but even those who do have some knowledge and some interest typically disagree with one another often enough and seriously enough that the work of ordering a society is compromised, disrupted, sometimes paralyzed – as, for instance, when the U.S. government shuts down because Congress can’t agree on a budget. This is, as the saying goes, no way to run an airline. What we need to do is overcome this irrational obsession with having a say, and instead place the emphasis on the power of exit. I have a good bit of sympathy with those who argue that exit is the only truly universal human right: the ability to say, “I’ve had enough of this crap, I’m getting out of here.” People who have the right of exit have a very powerful right.
B. But what if you live in Kansas?
B. Let’s be realistic about this, shall we? The right of exit is one that doesn’t make a whole lot of sense if the only door out is a couple of thousand miles away. You call it a universal human right, but how can it be a right if only those who have a sizable pile of cash are able to take advantage of it? It’s interesting that this argument – the argument that exit is the fundamental human right – tends to come from the seasteaders, from people who have enough money (or hope to get it) that they can actually imagine creating an offshore community, a city floating in the Pacific that they can live on. And of course that would mean having enough of a savings account to buy whatever goods can’t be produced out there. It’s ridiculous. It’s Fantasy Island for rich people.
A. No, it doesn’t have —
B. Don’t you think it might be my turn now?
A. Fair enough. Do your worst.
B. Thanks. Okay, so, furthermore — and to me this is a major problem with your whole way of thinking — you’re assuming completely deracinated human beings. The people you have making decisions, whether those decisions involve how to run this imaginary neo-reactionary world or how to leave it, are all just fleshy calculating machines, running the numbers to achieve maximum efficiency according to some totally abstract utilitarian calculus. In your world it wouldn’t even make any sense for someone to say “I don’t like the government that I have right now, but I love this place and don’t want to leave it.” What if we have, as Simone Weil argued, a need for roots? What if, as Roger Scruton argues, piety is fundamental to human flourishing?
A. Piety?? Haven’t heard that term in a while.
B. Piety in the sense that Virgil refers to his hero as “pious Aeneas”: pietas, devotion — devotion to something bigger than my own preferences. You can’t build Rome without piety, or sustain it once it’s built; you can’t even build your Fantasy Island in the Pacific without some degree of that virtue.
A. Actually, I think you can build that Fantasy Island in the Pacific – if that’s your thing; I’m not saying it’s my thing – without piety as you have described it. But never mind. Let’s think about Virgil’s Aeneas. He didn’t really have a lot of choice about his piety, did he? Every time he was in danger of forgetting it one of the gods came down to kick his ass and get him back on the right track. Pietas is all well and good in a world monitored and disciplined by the gods. We don’t live in that world, and I think all too often what we call “attachment to place” is really just nostalgia for a world we don’t live in any longer, if indeed we ever really did.
B. Oh, I think we live in the same world our ancestors did, overseen by the same deity or deities that have always been here. Isn’t it interesting that our conversation has taken us into the realm of metaphysics? It turns out that if you argue about politics long enough you get to something deeper than politics. But I think we can keep the metaphysical arguments at arm’s length at least for a while longer. It’s sufficient to ask whether human beings can flourish in the kind of moral and political world you are imagining, without pressing the question of why they need certain conditions in order to flourish. (It would be possible to give purely naturalistic, sociological, evolutionary reasons why people need roots and are rightly driven by piety.) It’s enough, I think, for me to insist that in your utopia people would be miserable. And they would be miserable because they would lack a voice, though not, perhaps, in the sense in which the neo-reactionaries typically talk about voice.
The word “compassion” means to “suffer with.” It presumes fellow feeling, based on a shared experience of what Hamlet called “the thousand natural shocks that flesh is heir to” — and also, alas, of the unnatural shocks that we humans inflict on one another. That this is a reasonable, indeed an essential, way of thinking and feeling is intrinsic to the idea of the humanities, the study of what is human. It is within this moral frame that Terence’s line, clichéd though it has become over the centuries, becomes so vital: Homo sum, humani nihil a me alienum puto — I am human, and nothing human is alien to me.
To be sure, few sentences can be more wounding to the one who suffers than the casual “I know just how you feel.” No, you damned well don’t is the instinctive response, and often rightly so. But here Terence comes to our aid: he doesn’t say “Everything human is fully comprehensible to me,” but rather, “Nothing human is alien to me”: nothing that my fellow humans experience is beyond my ability to recognize, to have some understanding of — to have compassion for.
In The Doors of Perception, Aldous Huxley writes,
By its very nature every embodied spirit is doomed to suffer and enjoy in solitude. Sensations, feelings, insights, fancies—all these are private and, except through symbols and at second hand, incommunicable. We can pool information about experiences, but never the experiences themselves. From family to nation, every human group is a society of island universes.
Most island universes are sufficiently like one another to permit of inferential understanding or even of mutual empathy or “feeling into.” Thus, remembering our own bereavements and humiliations, we can condole with others in analogous circumstances, can put ourselves (always, of course, in a slightly Pickwickian sense) in their places. But in certain cases communication between universes is incomplete or even nonexistent.
Genuine compassion is therefore an achievement, something gained only by disciplined attentiveness, not to “the other,” that empty abstraction, but to some particular other. To a neighbor. (“My neighbor,” Kierkegaard dryly commented, “is what philosophers call The Other.”) And the greater the suffering of that neighbor, the more rigorous must our attention be if we are to reach some understanding. This is one of the great themes of Simone Weil’s writing, especially in her powerful, spiritually intimidating meditation on “The Love of God and Affliction.” But — and Weil makes this clear too — as great as the challenge is, we must always hold the possibility of compassion before us, because the personal and social costs of neglecting or refusing it are catastrophic.
For the last 25 years or more what used to be the humanistic disciplines have ignored the vital Terentian claim that nothing fully alienates one human from another. It might be better to say that they have betrayed it, which is why I insist that they “used to be” the humanistic disciplines. I don’t have a name for what they are now. Not by acknowledging the power and shaping force of race and gender and sexual orientation and culture, but by treating them as a series of hermetically sealed boxes, they have made it increasingly difficult, if not impossible, for their students to see themselves as sharing common experiences and common pursuits. (No one who has even the slightest understanding of how cultures actually operate in relation to one another — a ceaseless interchange of ideas, visions, experiences, and techniques — could take the notion of opprobrious “cultural appropriation” seriously.)
If you cannot see your fellow students, or colleagues, as engaged in a common and intrinsically human search for knowledge — maybe even wisdom — then you will have no incentive to cross the boundaries of race, gender, or culture. You will in the end have no incentive to cross the boundaries of your narrow little self. You may occasionally speak the language of “alliance,” but you will define your allies as those who obey your demands. And, in times of conflict, it will be characteristic of your stance towards the world to make demands.
It’s a way to live. But it’s not a good way to live, and it exacts a heavy toll on everyone involved. If our young people are going to see that there are less confrontational alternatives, something other than zero-sum games, they’ll need instruction in the humanities. Some of us are prepared to give it.
When I came to teach at Wheaton College, in the fall of 1984, on a one-year contract, I had been a Christian for about five years, knew almost nothing about the evangelical subculture, and was a theological novice. My one virtue was that I had some inkling of how little I knew. So I sought out Roger Lundin.
That year, it should be remembered, was the apogee of the Age of Theory, or near enough, and I had just completed my training in at least some of its intricacies. But when I arrived on Wheaton’s campus I quickly learned that this was not perceived as an accomplishment, but rather a cause for suspicion. Only Roger — who had himself been trained in a long-familiar makeshift blending of historical and New-Critical practices, but had been seeking, without much help or encouragement, to educate himself in the newer directions of our discipline — was a possible conversation partner. I read a few articles he had written and realized that, like me, he believed that within the Christian traditions there were resources for listening and then responding to the voices of critical theory; but, unlike me, he was already prepared by theological training to do this work. So I sought him out.
I would have sought him out anyway; his warmth and kindness drew me, as they drew so many others, towards him. We began talking our way through these issues, and through so much else. He introduced me to his longtime friend Mark Noll, whose office was then four floors up from ours, later a more convenient one floor down; Mark became my second theological teacher and instructor in the ways of the Christian mind. Roger and I prayed together and laughed a great deal. We created an organization called Slow White Men of America, of which we were the charter and only members. Once we spent a morning at Borders where I came across a book on a display table called Hollywood Priest: A Spiritual Struggle. I held it up before Roger solemnly, and he said, “Big Al” — he always called me Big Al, a somewhat comical thing given the disparity in our sizes — “I understand how people can be horrified at living in this country, but I do not understand how anyone could ever be bored.”
I am strolling down memory lane here, but this is appropriate. Anyone who knew Roger at all well noted how profoundly memorial, as well as historical, his imagination was. I may have had a very slightly better verbal memory than he did, which meant that I could sometimes remind him what someone had said in a given long-ago conversation — at which point he would tell me precisely when that conversation had happened. Were he reading this, he would cheerfully rattle off the year, month, and day on which we paid that visit to Borders, what the weather had been, what headlines had been featured in the Trib, and how Chicago’s professional sports teams had performed.
This was a kind of parlor trick, and an invariably impressive one, but for Roger the cultivation of memory, both personal and cultural, was an essential spiritual discipline, and one which Americans, in our haste always to fare forward, tend very much to neglect. We strain into the future, but, as Fitzgerald reminds us, “we beat on, boats against the current, borne back ceaselessly into the past.” It was Roger’s distinctive calling as a teacher and scholar to encourage us to embrace that backwards pull, to use it to help us understand where we have come from, and to do honor to those who went before us. As we read in Ecclesiasticus: “Leaders of the people by their counsels, and by their knowledge of learning meet for the people, wise and eloquent are their instructions: … All these were honoured in their generations, and were the glory of their times.” Roger, who died suddenly last week, has now joined this great company.
The last communication I received from Roger, several weeks ago, concluded with these words:
Thanks, deep thanks, for your kindness, which is rooted in a friendship that now reaches back 31 years!
My beloved friend Brett Foster never wanted to leave anything behind. My wife Teri took this picture of him on the streets of Durham, I believe, in the summer of 2011, when we were in England leading a study tour. You can see the bag hanging from his shoulder, and the scarf, and the top of the coffee cup, but not the enormous backpack that’s sitting just outside the frame. He was always encumbered in this way, because he never wanted to leave anything behind that he might need.
Now he has left us all behind.
Wheaton College’s Pierce Chapel was not made for funerals, especially Anglican ones, but it served, and hundreds of Brett’s friends arrived there on Saturday to commend his soul to God. November in Chicagoland is typically an unkind month, but the day was sunny and clear and warm enough that someone had opened windows in the chapel. Father Martin Johnson, the rector of All Souls’ Church, where Brett and I and our families were members for a decade, had brought the church’s butcher-block altar over and set it up in front of the stage. That altar had been made for the church by David Hooker, who became one of Brett’s dearest friends. Martin was grieving hard and perhaps working even harder — to make the day right for Brett, and for us.
After the tributes had been made — beautiful words, by some of the many who loved Brett — it was time for Communion. Brad Cathey had made the loaves and laid crosses upon them before baking, and Father Martin and Father Paul stood before the long lines and gave the bread of life, with the words of life.
But then some small chunks of bread fell to the ground at Martin’s feet. One piece was stepped on. I noticed that the poet Scott Cairns, who was sitting right in front of me, saw this, and was looking on in some discomfort. Then, when the last communicant had been served, Scott and I at the same moment hunched forward, knelt, picked up every piece, and ate — I quickly and in some embarrassment, he, back in his seat, slowly and with reverence. This is my body, broken for you.
It was time for the final commendations. Martin circled Brett’s coffin with the censer. The sun was low enough now that its light slanted through the open windows, and I could see by that light the smoke of the incense caressing the embroidered pall that covered the coffin. A quiet breeze wafted it away as Martin, his censing completed, stood directly before the coffin. Then he bowed and, with infinite gentleness, kissed it.
We moved then to the cemetery, to commend Brett’s body to the earth, in the hope of the resurrection. As Brett’s wife Anise, and their children Avery and Gus, and Brett’s mother Suzanne, and Anise’s mother Sharie, and all the rest of the family and the great circle of friends looked on, Martin spoke the ancient and beautiful words, and, according to custom, cast the first handfuls of earth onto the coffin.
Though I love those words, I did not, this time, hear them. I looked around and saw that Brett’s corner of the cemetery was surrounded mostly by pines, though one old oak reached out a limb over the grave. It still had its leaves; soon it will shed them over the broken earth there. I remembered the words that Scott Cairns had read earlier, in the time of tributes at the chapel, words that he had written years ago for his father but that could not be more right for this grievous occasion, full of its own strange hope.
And this is the consolation — that the world
doesn’t end, that the world one day
opens up into something better, and then we
one day open up into something far better.
Maybe like this: one morning you finally wake
to a light you recognize as the light you’ve wanted
every morning that has come before. And the air
itself has some light thing in it that you’ve always
hoped the air might have. And One is there
to welcome you whose face you’ve looked for during all
the best and worst times of your life. He takes you to himself
and holds you close until you fully wake.
And it seems you’ve only just awakened, but you turn
and there we are, the rest of us, arriving just behind you.
We’ll go the rest of the way together.
The books above are most of the ones I’ve assigned for my Great Texts of the Twentieth Century course (missing are Art Spiegelman’s Maus and the daily poems I’ll be reading to my students). It’s a pretty heterogeneous group, but taken together these books touch on a great many of the key issues of our time, and those of all times: not just racism, sexism, and colonialism, but also the rise of biological science as First Philosophy, the various ways cultures constitute identity, the furthest reaches of human barbarity, the transformation of culture by electronic media, and the miraculous power of the writings of a first-century Jew to illuminate and interpret modern consciousness. Something to offend everyone, you might say.
And this is the point, or at least one of the chief points. Here at Baylor, I want my students—most but not all of whom are Christians; some are simply unbelievers, some are uncertain and struggling—to encounter the texts, and through those texts the experiences, that served to undermine Christian faith and practice in the twentieth century. But I also want them to encounter Christians (Barth, Merton, Bonhoeffer, Eliot, Auden, Simone Weil in her unique way) for whom the twentieth century’s challenges provided an impetus to re-think and re-live Christianity in fresh ways.
I could, of course, work to protect them from this violent clash of powerful and contradictory ideas; I could—I am free to do this—build a syllabus that focused on Christian writers and perhaps other religious believers and presented anti-religious writers whose work is cartoonish or in other ways simplistic. And perhaps if I did that some of my students would feel safer. But that, I am convinced, would be a false sense of safety, and would leave them underprepared for an adult world in which their ideas and beliefs will receive daily challenges. What kind of teacher would I be if I let that happen?
There is one point on which the crusade for the imposition of trigger warnings is absolutely right. It is not for nothing that reading was always feared throughout history. It is indeed a risky activity: reading possesses the power to capture the imagination, create emotional upheaval and force people towards an existential crisis. Indeed, for many it is the excitement of embarking on a journey into the unknown that leads them to pick up a book in the first place.
Can one read Proust’s In Search of Lost Time or Tolstoy’s Anna Karenina ‘without experiencing a new infirmity or occasion in the very core of one’s sexual feelings?’ asked the literary critic George Steiner in Language and Silence: Essays 1958–1966. It is precisely because reading catches us unaware and offers an experience that is rarely under our full control that it has played, and continues to play, such an important role in humanity’s search for meaning. That is also why it is so often feared.
And reading should be feared; particular books and authors should be feared. But it is not always—indeed, it is not often—best to flee from what we fear. Better to master the fear, to approach what scares us, but to do so with care and preparation and in an environment where those around you wish you well.
My dear friend Brett Foster, whose death earlier this week hangs over me heavily, wrote a poem that I have been thinking about a great deal. It’s called “Back-to-School Rondeau”, and I think it beautifully describes the fears and excitements of genuine education, the ways the pursuit of knowledge takes up and involves the whole of our being. I’ll leave you with it.
It’s almost time to set aside the waning
distractions of first youth, the life contained
for years at home. What’s home? The place you grow
out of, everything receding slowly,
fading like a chalked sidewalk in the rain.
Leave childish things behind, said a certain
fellow. (Others afterward.) Don’t remain:
the friends gone late in summer let you know
it’s almost time.
Don’t leave behind new clothes, impromptu plans —
they’ll match surroundings well, remind again
of shining coming: new homes to let go
of, too; the best things said; mind’s overflow;
surprising callings; time for love, and pain.
It’s almost time.
Richard Rodriguez, in his great memoir Hunger of Memory, movingly recounts the day when a nun, one of his teachers at his parochial school in Sacramento, asks his Mexican parents to speak English at home in order to encourage Richard to improve his English, to be more confident speaking it in school. He was in first grade.
Not a request that would be made today, I suspect. His parents agreed, of course—a nun had asked them! And while young Richard missed very much the sounds of Spanish at home, his English did get better. He became more comfortable at school; indeed, eventually his public identity came to be closely associated with his academic success. And he became strangely grateful for that nun’s request. It set him on the road to manhood: “I became a man by becoming a public man.”
This experience (and others like it) led eventually to Rodriguez, as a graduate student, becoming notorious for his opposition to bilingual education programs. He may or may not have been right in that opposition—there should be, and there is, serious debate about when young people need to make that essential transition from the private comforts of home to the sometimes challenging but also rewarding demands of public life—but no one has ever articulated more precisely the essential principle at stake here:
While one suffers a diminished sense of private individuality by becoming assimilated into public society, such assimilation makes possible the achievement of public individuality.
I think Rodriguez’s point is essential for understanding the current kerfuffle at Yale University, where students and alumni are — track with me here — demanding the resignation of the master of a college and the assistant master because the master is refusing to apologize for not having exercised dictatorial authority over other students’ Halloween costumes. To most outside observers this will seem pretty silly. But let’s ask where the kind of reaction the students and alumni are having comes from.
The key may be found in this op-ed in the Yale Herald, significantly titled “Hurt at Home.” Jencey Paz writes,
As a Silimander, I feel that my home is being threatened. Last week, Erika Christakis, the associate master of Silliman College, sent an email to the Silliman community that called an earlier entreaty for Yalies to be more sensitive about culturally appropriating Halloween costumes a threat to free speech. In the aftermath of the email, I saw my community divide. She did not just start a political discourse as she intended. She marginalized many students of color in what is supposed to be their home.
But Silliman College is not “supposed to be their home.” It is a residential college in a university, a place where people from all over the world, from a wide range of social backgrounds, and with a wide range of interests and abilities, come to live together temporarily, for about 30 weeks a year, before moving on to their careers. It is an essentially public space, though with controls on ingress and egress to prevent chaos and foster friendship and fellowship.
It is possible, of course, that Yale sells their residential college system to students as a kind of “home”; I don’t know. The official description seems to me to strike an appropriate note without over-promising: “The residential colleges allow students to experience the cohesiveness and intimacy of a small school while still enjoying the cultural and scholarly resources of a large university; the residential colleges do much to foster spirit, allegiance, and a sense of community at Yale.”
Now, to be sure, this “cohesiveness and intimacy” can for some students be very powerful—their college can even be a better and healthier environment for them than their actual home. The great theater critic Kenneth Tynan loved Magdalen College, Oxford (where C.S. Lewis was his tutor) so much that he wanted his ashes to be interred there. But it was not his home, and could not have been, because there were other people there who didn’t even know him, or who knew him but didn’t like him, or whose preferences were radically different than his, and who had no long-term bond with him to force them to come to some mutually agreeable terms beyond basic tolerance for three years or so.
Residential colleges have long been defended as transitional spaces between the world of home and a fully independent adult life, and it would be a great mistake to think of them as merely continuing the ethos of home. That would leave young people totally unprepared for that “adult life,” which I think we might, for the purposes of this discussion, define as that period of one’s existence during which there is no one to run to to demand control over other people’s Halloween costumes. When one only has, to return to Rodriguez’s terms, “private individuality,” it is quite natural, if not altogether admirable, to seek out an authority figure when someone’s holiday costume offends you. But by the time one gets to college one’s “public individuality” should be sufficiently developed that the wearing of costumes should be seen as an essentially trivial matter that students can deal with among themselves. If they can’t, then the university needs to acknowledge that they’re dealing with some serious cases of arrested development.
Let me wrap this up by simply repeating a passage from a post I wrote some months ago: In a fascinating article called “The Japanese Preschool’s Pedagogy of Peripheral Participation,” Akiko Hayashi and Joseph Tobin describe a twofold strategy commonly deployed in Japan to deal with preschoolers’ conflicts: machi no hoiku and mimamoru. The former means “caring by waiting”; the latter means “standing guard.” When children come into conflict, the teacher makes sure the students know that she is present, that she is watching — she may even add, kamisama datte miterun, daiyo (the gods too are watching) — but she does not intervene unless absolutely necessary. Even if the children start to fight she may not intervene; that will depend on whether a child is genuinely attempting to hurt another or the two are halfheartedly “play-fighting.”
The idea is to give children every possible opportunity to resolve their own conflicts—even past the point at which it might, to an American observer, seem that a conflict is irresolvable. This requires patient waiting; and of course one can wait too long—just as one can intervene too quickly. The mimamoru strategy is meant to reassure children that their authorities will not allow anything really bad to happen to them, though perhaps some unpleasant moments may arise. But those unpleasant moments must be tolerated, else how will the children learn to respond constructively and effectively to conflict—conflict which is, after all, inevitable in any social environment? And if children don’t begin to learn such responses in preschool when will they learn it?
Imagine if at university they had developed no such abilities and were constantly dependent on authorities to ease every instance of social friction. What a mess that would be.
Damon Linker, with whom I seem to be debating a lot these days:
Professors in the humanities and social sciences engage in highly specialized research, attempting to push knowledge into new areas — and many view this effort as a project that involves and requires liberating individuals from the dead weight of received prejudices.
The result is that academics usually end up pursuing scholarly agendas that are the furthest thing from anything that could be described as “conservative.” The imperative to advance knowledge demands that research contribute something new. Meanwhile, the tendency to relegate all received truth claims to the category of prejudice leads to suspicion even of the established findings of the previous generation of scholars.
This would be a more convincing argument if “the established findings of the previous generation of scholars” were never oriented towards or tended to reinforce political liberalism; and if liberals had a tendency to relegate their own preferred truth claims, the ones they have received from previous generations of liberals, to the category of prejudice. (I’m using “liberalism” in a loose and conventional sense here; one could of course ask whether liberalism is really liberal, and so on.)
Linker wrote in a recent column devoted to defining contemporary conservatism that “If you ask conservatives what this comprehensive moral outlook consists of, they’ll likely say one of several things: Devotion to individual freedom. Constitutionalism. A concern with limited government. Fear of tyranny.” In many different academic disciplines, there’s an enormous body of scholarly work that, in the view of the conservatives Linker describes, neglects or attacks these values. Wouldn’t conservatives then have an inclination to critique that scholarship and defend those values? Would they be piously reverent towards scholarship that endorses, and often enforces, liberalism simply because it’s “established”?
On the other side of the fence: as Linker acknowledges, an essentially if vaguely liberal outlook on the world simply is the default position of the majority of professors in the majority of academic disciplines, and has been for several professorial generations; so where among these liberals is the suspicion of the past that Linker identifies with liberalism? It turns out that such suspicion is highly circumscribed: it’s limited to some findings of previous generations of scholars, not extended to those scholars’ overall political outlook.
Linker wants to argue that conservatives have, because of their core values, selected themselves out of the academy. And no doubt that happens sometimes. But a great many more who would like to make a contribution are ruled out — quietly and behind the scenes — for exactly the same reason that genuinely radical leftists are ruled out: they would rock our gently swaying boat, here on our calm, calm lake.