In John Steinbeck’s novel The Winter of Our Discontent, businessman Ethan Hawley is haunted by the presence of the town drunkard—his childhood friend, Danny Taylor. Once like brothers, the two men’s paths slowly diverged over time. Now, Hawley watches his old friend succumb to squalor and self-abuse. He cannot forget his guilt.
It reminded me of old Scrooge: Dickens’s “squeezing, wrenching, grasping, scraping, clutching, covetous, old sinner,” who lived with squalor and illness on his doorstep, yet didn’t care. He couldn’t see over his insurmountable mountain of greed.
Our current lives are filled with Ethans and Dannys, Scrooges and Tiny Tims. Take Mike Remboulis: he and his half-brother grew up in the same Queens apartment, but their fortunes have diverged drastically over the years. New York Times reporter Rachel Swarns shared their story last Sunday:
Even amid lighthearted banter and savory plates, reality almost always comes rushing back: Beyond those restaurant doors, his big brother is teetering on the edge, barely hanging on. Mr. Remboulis, 51, is an aerospace engineer with a graduate degree. His half-brother, Glenn Yuzzi, 59, is a carpenter with a high school diploma. They grew up in the same Queens apartment, but today they live on opposite sides of the economic divide. It is a twist of fortune that haunts Mr. Remboulis. “I don’t want him to scrape by,” he said. “I don’t want my brother to drown.”
It seems these unfair disparities become more prominent in the American mind around Christmastide. After all, it’s freezing cold outside in most of America. Most of us are huddled in coats and scarves, drinking coffee, thinking about presents, food, and parties. In the midst of this happy holiday reverie, we see a man slumped on the street corner, McDonald’s cup sitting empty at his side. Even if one isn’t surrounded by the homeless, it’s almost impossible to live in a town or city without knowing at least one family that has fallen on hard times. We grow up hearing so much about “Christmas spirit”—the spirit of giving that supposedly animates and transports us during the Christmas season. We read A Christmas Carol and How the Grinch Stole Christmas, and frown disapprovingly at their money-grubbing fiends.
Steinbeck opens his novel on Good Friday. There’s death in the air. Over Friday and Saturday, change stirs in Ethan Hawley. But it isn’t a change for the good. His money lust is awakening. From Saturday evening to Sunday morning, the morning of resurrection, Hawley begins to see himself as a snake that has just shed its skin. He’s transformed.
It reminded me of old Scrooge: he awakens on Christmas morning after a haunted nighttime vigil with ghosts. He is also transformed—but by losing his old self, wholly reborn: “I don’t know what day of the month it is!” said Scrooge. “I don’t know how long I’ve been among the Spirits. I don’t know anything. I’m quite a baby. Never mind. I don’t care. I’d rather be a baby.” Scrooge scatters his money like dust to the wind, overflowing with generosity and joy. To Tiny Tim, “he was a second father.”
Who will we be this Christmas? I scurry down cold city streets and see the pleading eyes, listen to stories of families “getting by.” Part of me wants to bark out like Cain, “Am I my brother’s keeper?”
But that’s what the Christmas story is about: the One who had everything became nothing, to save the lost brother. He bore poverty, sorrow, and shame in the winter of our discontent. And He promised us that, like Scrooge, we could be “born again.” He wouldn’t let us drown.
He truly was “better than his word. He did it all, and infinitely more.”
Reports that A- is the median grade in Harvard College have reopened the debate about grade inflation. Many of the arguments offered in response to the news are familiar. The venerable grade hawk Harvey “C-” Mansfield, who brought the figures to public attention, describes the situation as an “indefensible” relaxation of standards.
More provocative are defenses of grade inflation as the natural result of increased competition for admission to selective colleges and universities. A new breed of grade doves points out that standards have actually been tightened in recent years. But the change has been made to admissions standards rather than to expectations for achievement in class.
According to the editorial board of the Harvard Crimson, “high grades could be an indicator of the rising quality of undergraduate work in the last few decades, due in part to the rising quality of the undergraduates themselves and a greater access to the tools and resources of academic work as a result of technological advances, rather than unwarranted grade inflation.” Matt Yglesias, ’03, agrees, arguing that “it is entirely plausible that the median Harvard student today is as smart as an A-minus Harvard student from a generation ago. After all, the C-minus student of a generation ago would have very little chance of being admitted today.”
There’s a certain amount of self-congratulation here. It’s not surprising that Harvard students, past and present, think they’re smarter than their predecessors—or anyone else. But they also make an important point. The students who earned the proverbial gentleman’s Cs are rarely found at Harvard or its peers. Dimwitted aristocrats are no longer admitted. And even the brighter scions of prominent families can’t take their future success for granted. Even with plenty of money and strong connections, they still need good grades to win places in graduate school, prestigious internships, and so on.
The result is a situation in which the majority of students really are very smart and very ambitious. Coursework is not always their first priority. But they are usually willing to do what’s necessary to meet their professors’ expectations. The decline of core curricula has also made it easier for students to pick courses that play to their strengths while avoiding subjects that are tough for them. It’s less common to find chemistry students struggling through Shakespeare than it was in the old days.
According to the Harvard College Handbook for Students, an A- reflects “full mastery of the subject” without “extraordinary distinction.” In several classes I taught as an instructor and teaching fellow at Harvard and Princeton, particularly electives, I found that around half the students produced work on this level. As a result, I gave a lot of A-range grades.
Perhaps my understanding of “mastery” reflects historically lower demands. For example, I don’t expect students writing about Aristotle to understand Greek. Yet it’s not my impression that standards in my own field of political theory have changed a lot in the last fifty years or so. In the absence of specific evidence of lowered standards, then, there’s reason to think that grade inflation at first-tier universities has some objective basis.
But that doesn’t mean grade inflation isn’t a problem. It is: just not quite the way some critics think. At least at Harvard and similar institutions, grades are a reasonably accurate reflection of what students know or can do. But they are a poor reflection of how a given student compares to others in the same course. In particular, grade inflation makes it difficult to distinguish truly excellent students, who are by definition few, from the potentially much larger number who are merely very good.
Here’s my proposal for resolving that problem. In place of the traditional system, students should receive two grades. One would reflect their mastery of specific content or skills. The other would compare their performance to the rest of the class.
Raise your hand if you’re a conservative who has cited Edmund Burke without actually having read him closely.
Really—you’re all scholars of the Irish-born MP and oft-celebrated “father of modern conservatism”?
Okay, what did Burke mean by the phrase “the little platoon”?
Yuval Levin explains in his wonderful new book The Great Debate: Edmund Burke, Thomas Paine, and the Birth of Right and Left:
The division of citizens into distinct groups and classes, Burke writes, “composes a strong barrier against the excesses of despotism,” by establishing habits and obligations of restraint in ruler and ruled alike grounded in the relations of groups or classes in society. To remove these traditional restraints, which hold in check both the individual and the state, would mean empowering only the state to restrain the individual, and in turn restraining the state with only principles and rules, or parchment barriers. Neither, Burke thought, could be stronger or more effective than the restraints of habit and custom that grow out of group identity and loyalty. Burke’s famous reference to the little platoon—“To be attached to the subdivision, to love the little platoon we belong to in society, is the first principle (the germ as it were) of public affections”—is often cited as an example of a case for local government or allegiance to place, but in its context in the Reflections, the passage is very clearly a reference to social class.
Still feeling Burkean? Ready to go the pipe-and-slippers, Brideshead cultist route and declare yourself a loyal subject of the queen?
Levin reminds us that the context in which Burke wrote those words was a long-running intellectual dispute with a European-born radical, a man who was cheering on the secular revolution in France—and, oh, by the way, also one of the forefathers of our own revolution, favored by none other than Ronald Reagan himself—the Common Sense and The Crisis pamphleteer Thomas Paine.
That the rivalry between Burke and Paine cuts both ways through our hearts—this is precisely the kind of dialectic, if you will, that Levin hopes to provoke in the reader.
Make no mistake, though; Levin is a Burkean. In fact, he is the most eloquent exponent of Burkean conservatism, properly understood, since George Will’s 1983 Statecraft as Soulcraft.
While scholarly and measured in tone, The Great Debate is a readable intellectual history that fairly crackles with contemporary relevance.
Indeed, The Great Debate is the must-read book of the year for conservatives—especially those conservatives who are profoundly and genuinely baffled by the declining popularity of the GOP as a national party. How, these conservatives ask, can America—the land of the rugged individual, the conquerors of the frontier—choose statism and collectivism over freedom and liberty?
Levin’s book provides the answer: You’re looking at the Democratic Party all wrong. It’s just as individualist as you are—maybe more so.
And that is the problem!
If you are literate today, it does not mean you can write — not even close to it in many cases. But if you were literate in 1863, even if you could not spell, you often could write descriptively and meaningfully. In the century and a half since, we have evolved from word to image creatures, devaluing the power of the written word and turning ourselves into a species of short gazers, focused on the emotions of the moment rather than the contemplative thoughts about consequences and meaning of our actions. Many everyday writers in the mid-19th century were far more contemplative, far more likely to contextualize the long-term meaning of their actions. They meticulously observed and carefully described because, although photography was the hot new medium during the Civil War, words remained the dominant way of communicating thought, memory, aspiration, hope.
Raasch’s theory is not a new one. Back in the 1980s, when the Internet was still in its primordial days and television was king, Neil Postman wrote Amusing Ourselves to Death. His book cautioned against the developing “Age of Show Business,” fed by television’s sensory, visual medium.
Postman identified three “ages” in the history of information: first, ancient oral cultures encouraged the preservation of information through spoken records and stories. When the printing press and writing became more prominent, oral cultures dissolved into the “Age of Exposition”: a time when written records were perceived as holding the greatest truth. Then, as photography and videography developed, media began to change again—for the worse. Postman believed we would lose more than writing ability in the wake of the entertainment era: he warned of a depletion of mental and emotional capacity. He believed we would become as obsessed with pleasure as the humans in Aldous Huxley’s Brave New World. Postman’s descriptions of distracted, sensationalistic consumers relate well to Raasch’s “species of short gazers.”
We can only imagine what Postman would have said about the Internet; perhaps Raasch gives us a taste when he describes it as “the Great Din”: “Today, throwing barbs and brickbats into the Great Din of the Internet has become as second nature as breathing … The Great Din requires no forethought, no real calculation of purpose or result, no contemplative brake, no need to seek angles or views beyond those that reaffirm or reassure what we think right now.”
Is this truly the future of media? Will we lose any true, deep, thoughtful communication in its havoc of pixels and pictures?
One interesting counter-opinion comes from former Daily Beast editor Tina Brown. Having recently left the world of journalism for event production, Brown has told reporters that she no longer reads magazines herself—in fact, she thinks “the whole writing fad is so twentieth century” (in the words of New York Magazine). But rather than warning of impending havoc and din, Brown calls people back to oral communication: “I think you can have more satisfaction from live conversations,” she said, adding that we are “going back to oral culture where the written word will be less relevant.”
If we experience the “death of writing,” as Raasch puts it, could we come full-circle and return to the age of oral communication? Will grandfathers sit down with their grandchildren and tell them stories, like our ancestors so long ago? One can only hope; but if such an experience were truly to flower from “The Great Din,” it would be rather surprising.
Our modern world is obsessed with specialization. Yet this specialization is often unhealthy, both culturally and personally. Aeon Magazine contributor Robert Twigger suggests we need a new area of study—“polymathics”—to counter this monopathic obsession:
Polymathics might focus on rapid methods of learning that allow you to master multiple fields. It might also work to develop transferable learning methods. A large part of it would naturally be concerned with creativity — crossing unrelated things to invent something new. But polymathics would not just be another name for innovation. It would, I believe, help build better judgment in all areas.
The study methods Twigger prescribes have actually existed for quite some time, although in a slightly altered form, as the classical liberal arts. First developed in ancient Greece, the liberal arts encompassed those skills necessary for civic and personal freedom. The Greeks even emphasized the importance of physical athleticism, as Twigger suggests in his article. The aretē they sought fostered excellence of mind, soul, and body.
In modern academia, the liberal arts usually include a core curriculum that enables students to “master multiple fields.” The liberal arts classically emphasized a progression from rudimentary learning (the “grammar”) to practical application in communication and circumstance (similar to Twigger’s idea of developing “transferable learning methods,” “crossing unrelated things to invent something new,” and ultimately building “better judgment”).
Twigger’s emphasis on learning many fields is good. Specialization, while useful in many job settings, can obscure the interconnected and complementary nature of learning. Twigger reminds us that the humanities do advance educational growth in multiple subjects:
An intriguing study funded by the Dana foundation and summarised by Dr. Michael Gazzaniga of the University of California, Santa Barbara, suggests that studying the performing arts—dance, music and acting—actually improves one’s ability to learn anything else. Collating several studies, the researchers found that performing arts generated much higher levels of motivation than other subjects.
But note that Twigger is arguing from a claim of practicality: “polymathics will make you smarter, more creative, more humorous,” etc. This is a common attitude toward the humanities: if learning isn’t practical, it shouldn’t be practiced. Twigger points out the quantifiable practicalities of the humanities in his article, but there is no mention of the purpose the Greeks sought: their vision of aretē and freedom.
In a Wednesday New Yorker article, Lee Siegel argues that recent studies on literature—the ones claiming literature makes you more empathetic—soil the beauty of reading literature for its own sake. “Fiction’s lack of practical usefulness is what gives it its special freedom,” Siegel writes. “When Auden wrote that ‘poetry makes nothing happen,’ he wasn’t complaining; he was exulting. Fiction might make people more empathetic—though I’m willing to bet that the people who respond most intensely to fiction possess a higher degree of empathy to begin with. But what it does best is to do nothing particular or specialized or easily formulable at all.”
Perhaps the same is true for Twigger’s “polymathics,” the humanities, and the liberal arts. Despite their quantifiable benefits, one shouldn’t denigrate the “special freedom” of learning for its own sake.
No one would deny that American philanthropy is grounded in good motives. But as The New Atlantis contributor William Schambra points out in a detailed article, philanthropy can become as poisoned as any other human venture:
America’s first general-purpose philanthropic foundations — Russell Sage (founded 1907), Carnegie (1911), and Rockefeller (1913) — backed eugenics precisely because they considered themselves to be progressive. After all, eugenics had begun to point the way to a bold, hopeful human future through the application of the rapidly advancing natural sciences and the newly forming social sciences to human problems. By investing in the progress and application of these fields, foundations boasted that they could delve down to the very roots of social problems, rather than merely treating their symptoms … According to the perspective of philanthropic eugenics, the old practice of charity — that is, simply alleviating human suffering — was not only inefficient and unenlightened; it was downright harmful and immoral. It tended to interfere with the salutary operations of the biological laws of nature, which would weed out the unfit, if only charity, reflecting the antiquated notion of the God-given dignity of each individual, wouldn’t make such a fuss about attending to the “least of these.” Birth-control activist Margaret Sanger, a Rockefeller grantee, included a chapter called “The Cruelty of Charity” in her 1922 book The Pivot of Civilization, arguing that America’s charitable institutions are the “surest signs that our civilization has bred, is breeding and is perpetuating constantly increasing numbers of defectives, delinquents and dependents.” Organizations that treat symptoms permit and even encourage social ills instead of curing them.
Schambra traces the history of “philanthropic” eugenics through the years, along with its roots and causes. “Philanthropy’s involvement in eugenics should forever remind us that, for all our excellent intentions and formidable powers, we are unable to eradicate our flaws once and for all by some grand, scientific intervention,” he writes.
The article highlights some inherent flaws in philanthropy that conservatives, and isolationists specifically, are very sensitive to: namely, the extent to which our “compassion” is motivated by a desire to control, fix, and regulate things not our business. It’s the “nanny state” paradigm, manifested in individuals like Mayor Bloomberg. It’s typically an accusation leveled at “compassionate conservatives.”
One commenter on a recent human trafficking article said he feared “a strong sense of self righteous, neo colonialist domination inherent in this kind of ‘cause’.”
However, in our fear of becoming meddlesome welfare statists, conservatives run the danger of becoming heartless. Paul Krugman accused Republicans of such sentiments in a Thursday column—he writes, “Republican hostility toward the poor and unfortunate has now reached such a fever pitch that the party doesn’t really stand for anything else — and only willfully blind observers can fail to see that reality.” Patrick Deneen ably defined the problem with our attitudes in this regard in a recent TAC blog post:
The motivation of charity is deeply suspect by both the Right and the Left. The Right—the heirs of the early modern liberal tradition—regards the only legitimate motivation to be self-interest and the profit motive. They favor a profit-based health-care system (one explored to devastating effect in this recent article on health care in the New Yorker), and a utilitarian university (the “polytechnic utiliversity” ably explored by Reinhard Huetter in the most recent issue of First Things). The Left—while seemingly friends of charity and “social justice”—are deeply suspicious of motivations based on personal choice and religious belief. They desire rather the simulacrum of charity in the form of enforced standardization, homogeneity, and equality, based on the motivation of abstract and depersonalized national devotions and personal fear of government punishment.
I do not think most conservatives (unless they really are Randian to the core) want to forsake true “compassionate conservatism”—just its current manifestation in political circles. How, then, does one exercise philanthropic sentiment properly?
The key, according to Schambra, is personal caritas (love). He writes,
loving personal concern is at the heart of charity traditionally understood. It can only be practiced immediately and concretely, within the small, face-to-face communities that Tocqueville understood to be essential to American self-government. There, the seemingly minor and parochial concerns of everyday citizens are taken seriously and treated with respect, rather than being dismissed as insufficiently self-conscious emanations of deeper problems that only the philanthropic experts can grasp.
One could say this is the localism movement’s compassionate conservatism. It is based in present needs, rather than remote philanthropic endeavors. It seeks to love one’s neighbor, and not to “fix” him.
Does this mean true conservatives shouldn’t get involved in global crises? It might depend on the person and situation. The concerns of Coptic Christians in Egypt are of immense and immediate concern to me, because I consider them my brothers and sisters in Christ. Do I believe all secular Americans should intervene on their behalf? No—it is not their responsibility as it is mine. But Schambra is right: perhaps we should first focus where we are planted, and then slowly, thoughtfully spread from there.
It is sad that the Republican Party, a political group filled with religious folk, often shows callousness toward the impoverished—either by refusing to help them, or by offering only conditional charity. The Christian faith is filled with instructions to unconditionally love and help the unfortunate. Consider this passage from Isaiah 58, in which God rebukes the nation of Israel for being “religious” by fasting, but neglecting the poor:
Is this not the fast that I have chosen: to loose the bonds of wickedness, to undo the heavy burdens, to let the oppressed go free, and that you break every yoke? Is it not to share your bread with the hungry, and that you bring to your house the poor who are cast out; when you see the naked, that you cover him, and not hide yourself from your own flesh? Then your light shall break forth like the morning, your healing shall spring forth speedily, and your righteousness shall go before you; the glory of the Lord shall be your rear guard. Then you shall call, and the Lord will answer; you shall cry, and He will say, ‘Here I am.’ … If you extend your soul to the hungry and satisfy the afflicted soul, then your light shall dawn in the darkness, and your darkness shall be as the noonday.
Notice that it doesn’t say, “Only cover the naked or feed the hungry if they aren’t being lazy.” God doesn’t say, “Undo the heavy burdens—unless they really need to work harder.” Neither does he say, “Make sure they show a change of heart or get converted before you help them.” This isn’t a gospel of cutting food stamps. Perhaps it is a gospel in which food stamps never should have been necessary in the first place. If conservatives—especially Christian conservatives—want to say “government should mind its own business,” then perhaps it is time they start minding theirs.
A friend once described conservatives as people who agreed about one important thing—that at some point in the past, something went terribly wrong. After that, conservatives splinter into untold numbers of camps, since they disagree ferociously about the date of the catastrophe.
Most conservatives today agree that America has taken a terrible turn—that something went wrong at some point in the past. Most believe that America was well-founded by the Framers of the Constitution, but that something bad happened that corrupted the sound basis of the Founding. A few—generally unpopular—believe that Lincoln is to blame, that he introduced the beginnings of the centralized State and the imperial Presidency. Many point to the catastrophe of the 1960s as the main source of current woes (a striking number of these constitute the neoconservative faction). But, at least in the circles in which I travel, an increasing number have settled on the Progressive era at the turn of the 20th century as the source of today’s troubles, and see President Obama as the direct inheritor of this philosophical and political movement that was born in the late-19th and early-20th centuries.
The dominant narrative about the rise of Progressivism—both in the halls of academe and in its popular-media distillation by figures such as Glenn Beck—is that Progressivism was a virus incubated in a foreign (particularly German) laboratory and transported to America by intellectual elites, often educated at German universities and influenced by thinkers such as Kant and Hegel (such intellectuals include the likes of Herbert Croly, Woodrow Wilson, and John Dewey). These Progressives despised the classical liberal philosophy of the Founding, and sought either an explicit rejection of the Constitution or an effective change by re-defining it as a “living” document.
This is a plausible case, and in fact major progressive figures often turned to German and other foreign sources in developing their intellectual critique of the classical liberal philosophy of the Founding. By attributing the rise of Progressivism to a foreign contagion, one can comfortably maintain that the Founding was good and true and was corrupted by a fifth column.
However, what this argument overlooks is that the greatest analysis of American democracy—Alexis de Tocqueville’s Democracy in America, published in two volumes in 1835 and 1840, a full half-century before the flowering of Progressivism—already perceived the seeds of Progressivism’s major tenets embedded in the basic features and attributes of liberal democracy as established at the Founding. Of particular note, while the major figures of Progressivism would directly attack classical liberalism, Tocqueville discerned that Progressivism arose not in spite of the classical liberal tradition, but because of its main emphasis upon, and cultivation of, individualism.
Individualism is a distinctive phenomenon arising in liberal democracy, notes Tocqueville. The idea of the individual is at least as old as Christianity, but individualism is a new experience of self that arises with the passing of the experience of embeddedness in a familial, social, religious, generational, and cultural setting that is largely fixed and unchanging—the basic features of an aristocratic society. The rise of liberal democracy, by contrast, is premised upon a view of the individual deriving from the social contract tradition, which conceives of human beings in their natural state as defined, above all, by the total absence of such constitutive bonds, inherited roles and given identities (a philosophy that Bertrand de Jouvenel said was developed by “childless men who forgot their childhood”). Instead, as Tocqueville describes, in a democracy, “the chain” that once bound a peasant all the way to a king is “shattered,” throwing each individual in their freedom and equality finally “into the solitude of their own hearts.”
Yesterday morning, Micah Mattix asked whether it is impossible for the humanities to thrive in a secular society, one dominated by a philosophical materialism. In one way, yes; yet there is nevertheless hope.
Insofar as our secular society has become infatuated with materialism, denying the possibility of anything (except subatomic particles) which we cannot calculate or mathematically describe, the humanities will cease to matter—we will only care about the knowledge and corollary power to be sapped from the world by scientific and mathematical dominance. At this point, humans themselves will cease to matter, as our humanity is degraded into a chemical and molecular equation, much like the code and circuit boards of a computer.
The Canadian philosopher Charles De Koninck describes the tension between the sciences and humanities in this way:
The problems of philosophy, when distinguished from those which Bertrand Russell calls scientific, will remain forever in debate. Should the day ever come when Leibniz has his way: when, to settle their problems, philosophers will merely have ‘to take their pens in their hands, to sit down to their desks and to say to each other (with a friend as witness, if they liked), “Let us calculate” ‘, there will be no more problems, for there will be no one to raise them. Meantime, the calculators have their use, while philosophers are forever in need of being debunked, a thing no one knew better than Socrates. Nevertheless, as Aristotle suggested, no one can deny philosophy without at least implying a philosophy of his own, and his own may prove to be a very foolish one.
What did we know of man before we found out that he is a throng of electric charges? And that he is composed of multitudinous cells? And that the circulation of his blood is an exquisite piece of chemistry and mechanics? Is it possible that, having learned all this, we may remain far more ignorant of him than Sophocles, or Shakespeare? Or the people who believe they know what these writers meant? [Bold added.-M.O.]
Biology and chemistry can offer insights into the chemical reactions of anger—why the face flushes red and the heart beats faster—but science cannot offer insight into “the anger of Achilles” as Homer does, nor into the words of rage Shakespeare places in Hotspur’s mouth: “an if the devil come and roar for them,/I will not send them: I will after straight/And tell him so; for I will ease my heart,/Albeit I make a hazard of my head.”
The academy sets aside the humanities for the sciences because of a philosophical approach imposed in advance. To embrace materialism, as Dr. Ronald McArthur noted in a speech at Thomas Aquinas College, is to embrace “nihilism. That means nobody knows anything. It doesn’t matter whether you affirm something or deny something….Education then turns to the practical…There is hardly any education that is ordered, institutionally, to anything that is sound intellectually.”
In a post for Prospect, Christopher Fear asks why academic political theory is so remote from political practice. He concludes that it’s because political theorists devote themselves to eternal riddles that he dubs “Wonderland questions” rather than today’s problems. Consider justice, perhaps the original topic of political theorizing:
One of the central questions of academic political philosophy, the supposedly universal question “What is justice?” is a Wonderland question. That is why only academics answer it. Its counterpart outside the rabbit-hole is something like “Which of the injustices among us can we no longer tolerate, and what shall we now do to rectify them?” A political thinker must decide whether to take the supposedly academic question, and have his answers ignored by politicians, or to answer the practically pressing question and win an extramural audience.
Fear is right about the choice that political theorists face between philosophical abstraction and making an impact on public affairs. But he doesn’t understand why they usually pick the former. The reason is simple. Academic political theorists ask academic questions because…they’re academics.
In other words, political theorists are members of a closed guild in which professional success depends on analytic ingenuity, methodological refinement, and payment of one’s intellectual debts through copious footnoting. They devote their attention to questions that reward these qualities. Winning an extramural audience for political argument requires different talents, including a lively writing style, an ear for the public discourse, and the ability to make concrete policy suggestions. But few professors have ever won tenure on the basis of those accomplishments.
Another reason academic political theorists avoid the kind of engagement Fear counsels is that they have little experience of practical politics. Most have spent their lives in and around universities, where they’ve learned much about writing a syllabus, giving a lecture, or editing a manuscript—but virtually nothing about governing or convincing lay readers. How does expertise on theories of distributive justice, say, prepare one to make useful suggestions about improving the healthcare system? Better to stick with matters that can be contemplated from the comfort of one’s desk.
In this respect, political theorists are at a considerable disadvantage compared to professors of law or economics. Even when their main work is academic, lawyers and economists have regular chances to practice in the fields they study. Within political science, many scholars of international relations pass through a smoothly revolving door that connects the university with the policy community. Political theorists have few such opportunities.
Fear points out that it wasn't always this way. Before the 20th century, many great political theorists enjoyed extensive political influence. But Fear forgets the main difference between figures like Machiavelli, Locke, Montesquieu, Madison, Burke, Hume, Mill, or Marx and their modern epigones. The former were not professors. Although all were devoted to philosophical truth as they understood it, they were also men of affairs with long experience of practical politics.
The “brilliant and surreal tragedy of academic political theory,” then, is not that political theorists have been diverted into the wrong questions. It’s that political theory is an uncomfortable fit with the university. Academic political theorists gravitate toward the kind of questions that career scholars are in a position to answer.
Earlier this week I began a series of lectures in one of my classes on the thought of the Anti-Federalists. I opened by echoing some of the conclusions of the great compiler and interpreter of the Anti-Federalist writings, Herbert Storing, whose summation of their thought is found in his compact introductory volume, What the Anti-Federalists Were For. The first main conclusion of that book is that, in the context of the debate over the Constitution, the Anti-Federalists were the original American conservatives. I then related a series of positions held by the Anti-Federalist opponents of the proposed Constitution. To wit:
They insisted on the importance of a small political scale, particularly because a large expanse of diverse citizens makes it difficult to arrive at a shared conception of the common good, and because an overly large scale makes direct participation in political rule impracticable if not impossible.

They believed that laws were and ought to be educative, and insisted upon the centrality of virtue in the citizenry. Among the virtues most prized was frugality. They opposed an expansive, commercial economy that would draw the various parts of the Union into overly close relations, thereby encouraging avarice, and they particularly opposed trade with foreign nations, which they believed would lead the nation to compromise its independence for lucre.

They were strongly in favor of "diversity," particularly relatively bounded communities of relatively homogeneous people, whose views could then be represented (that is, "re-presented") at the national scale in very numerous (and presumably boisterous) assemblies.

They believed that laws were likely to be followed only when more or less directly assented to by the citizenry, and feared that as the distance between legislators and citizens increased, laws would require increasing force of arms to achieve compliance. For that reason, along with their fears of the attractions of international commerce and of imperial expansion, they strongly opposed the creation of a standing army and insisted instead upon state-based civilian militias.

They demanded the inclusion of a Bill of Rights, among which was the Second Amendment, whose stress was not on individual rights of gun ownership but on the collective right of civilian self-defense, born of fear of a standing army and of the temptation to "outsource" civic virtue to paid mercenaries.
As I related the positions of the Anti-Federalists, I could see puzzlement growing on the faces of a number of students, until one finally exclaimed, "This doesn't sound like conservatism at all!" Conservatism, for these 18-to-22-year-olds, has always been associated with George W. Bush: a combination of cowboy, crony capitalism, and foreign adventurism in search of eradicating evil from the world. To hear the views of the Anti-Federalists described as "conservative" was the source of severe cognitive dissonance, a deep confusion about what, exactly, is meant by conservatism.
So I took a step back and discussed several ways by which we might understand what is meant by conservatism—first, as a set of dispositions, then as a response to the perceived threats emanating from a revolutionary (or even merely reformist) left, and then as a set of contested substantive positions. And, I suggested, only by connecting the first and third, and understanding the instability of the second, could one properly arrive at a conclusion such as that of Storing, who would describe the positions of the Anti-Federalists as “conservative.”
First, there is the conservative disposition, one articulated perhaps most brilliantly by Russell Kirk, who described conservatism above all not as a set of policy positions, but as a general view toward the world. That disposition especially finds expression in a "piety toward the wisdom of one's ancestors," a respect for the ancestral that only with great caution, hesitancy, and forbearance seeks to introduce or accept change into society. It is supremely wary of the only iron law of politics—the law of unintended consequences (e.g., a few conservatives predicted that the introduction of the direct primary in the early 1900s would lead to increasingly extreme ideological divides and the increased influence of money in politics; in the zeal for reform, no one listened). It also tends toward a pessimistic view of history, more concerned to prevent the introduction of corruption into a decent regime than driven to pursue change out of a belief in progress toward a better future.