In one of the most ingenious interviews of “The Colbert Report” (at 5:00 below), Rep. Lynn Westmoreland (R-Ga.) tells Stephen Colbert about legislation he co-sponsored to have the Ten Commandments displayed in both houses of Congress.
“The Ten Commandments is not a bad thing for people to understand and respect,” Westmoreland explains.
Colbert deadpans: “What are the Ten Commandments?”
“What are all of them?” Westmoreland asks, dread creeping into his eyes. “You want me to name them all?”
It’s so embarrassing it’s hard to watch: Westmoreland ums and ahs through a few before admitting, “I can’t name them all.” (Quick quiz: can you?)
New York Times columnist David Brooks recently had his own Westmoreland moment. Lamenting the neglect of biblical principles in American life, Brooks wrote,
In Corinthians, Jesus tells the crowds, “Not many of you were wise by worldly standards; not many were influential; not many were of noble birth. …”
That this sentence didn’t catch the eye of Brooks or his fact-checkers betrays a basic ignorance of the New Testament: Jesus did not travel to Greece. (The author of the quote is Paul, who did.)
Alan Jacobs and friends had some fun with Brooks’ error this week:
“How surprised his disciples must have been when Moses walked on the Red Sea.” #nytimesbible
— Alan Jacobs (@ayjay) June 24, 2013
The last time the Times flubbed a Bible quote, Eric Metaxas went into full-on culture wars mode:
In the world of Manhattan cultural elites, the Bible is mostly thought of as a quaint and useless artifact… Is the secular bias at the Times so pervasive that it has affected not just the writers but the fact-checkers too?
The Times is “out of touch with middle America,” he wrote.
It’s easy to beat up on Brooks, but there is in fact little indication that “middle Americans” crack open the Good Book much more than those “cultural elites” who fact-check the Times.
In a recent nationwide survey, nearly a fourth of those polled believed that “the values and morals of America are declining” due to “a lack of Bible reading,” and 56 percent said the Bible should have a greater role in American society. But how many read the Bible regularly? About a fifth. As Christianity Today put it, Americans revere their Bibles so much that they keep them in pristine, unopened condition.
David Brooks, House Republicans, and everyday Americans alike love to appeal to “Biblical principles” or “Judeo-Christian morality.” Actually reading the Bible, not so much. As the Virgin Mary told the Thessalonians, “The spirit is willing, but the flesh is weak.”
The philosopher Carlos Fraenkel has a beautiful piece in the Jewish Review of Books about his experience teaching an “underground seminar” on philosophy to Hasidic Jews. Although most were active businessmen with large families and heavy religious obligations, Fraenkel’s students sat up nights reading and discussing Spinoza, Maimonides, and Nietzsche. In addition to the demands on their time, these amateur philosophers faced a serious risk of ostracism from the ultra-Orthodox world that circumscribes their lives. “From the point of view of our community,” one explains, “studying these books is much worse than having an extramarital affair or going to a prostitute. That’s weakness of the flesh, but here our souls are on the line: apikorsus (heresy) means losing our spot in olam ha-ba (the world to come).”
It’s easy to ridicule the limits on intellectual freedom imposed by ultra-Orthodox Judaism. But they may be partly redeemed by their consequence: an overriding sense of the importance of thought, study, and right conduct. For Fraenkel, the emphasis on things of the soul over things of the body that traditional Judaism shares with philosophy is the core of Hasidism’s paradoxical appeal. Asked why he is interested in the ultra-Orthodox community, Fraenkel explains that “while I am not attracted to its content, I am intrigued by its form—a world that revolves around wisdom and God, rather than wealth, sex, power, and entertainment. [The students] are surprised when I say that from Plato to Spinoza most philosophers endorsed this ranking, if not the same accounts of wisdom and God. And they are stunned to learn that I would be very disappointed if my 2-year-old daughter grew up to value lipstick, handbags, and boys in sports cars more than education and ethics.”
But it often takes an outsider to envy the members of a closed community. Where Fraenkel sees an authentic thirst for wisdom in his rebellious students, they speak firsthand of the superstition, prejudice, and conformity against which they struggle day after day. Fraenkel would rather see his daughter follow the Satmar Rebbe than Kim Kardashian. But some of his students have stopped having children to avoid enlarging their sects.
Many academic philosophers bristle when their work is associated with broad claims about the “meaning of life”. Fraenkel and his students provide a vivid reminder that, while philosophy is more than a dorm-room bull session or self-help mantra, its significance is based on reflection on the purposes of human existence rather than feats of technical sophistication or analytic ingenuity. As a college teacher, I wish that I could impart to my bright, hard-working, well-credentialed students even a small portion of the seriousness and maturity that Fraenkel’s black-hatted friends brought to their reading and discussion. Although not scholars, the members of the underground seminar seem to be philosophers in the original and highest sense: lovers of wisdom.
Undecided voters lay inside a sleek fMRI machine in late 2007. The magnetic coil pulsed, scanning the blood flow in their brains. Images of Hillary Clinton, Mitt Romney, John Edwards, and other primary contestants flashed before their eyes.
The UCLA neuroscientists and Washington political operatives who ran the study presented their findings in a New York Times article, “This Is Your Brain on Politics.” The “voter impressions on which this election may turn” were displayed in a colorful slideshow of (statistical combinations of) brains: here the medial orbital prefrontal cortex is orange, indicating an “emotional connection” with Democrats; here the amygdala flashes, betraying anxiety about Mitt Romney.
At an American Enterprise Institute panel event last Monday, psychiatrist Sally Satel told the audience that this story alerted her to just how vulgarized neuroscience was becoming in popular culture.
“It really was a bit of a fiasco,” she told NYT columnist David Brooks, who was moderating a conversation with Satel and psychologist Scott Lilienfeld. “The fact that something lights up doesn’t mean you hate Hillary Clinton, or you’re going to vote for someone else. It almost read like a parody, the way they had boiled it down to an almost stick figure kind of narrative.”
It’s this “stick figure narrative,” formed out of neuroscientific research and inserted into law, politics, and culture, that Satel and co-author Lilienfeld seek to dismantle in their authoritative, accessible new book Brainwashed: The Seductive Appeal of Mindless Neuroscience.
The wild exaggeration of neuroscience—both of specific findings, and of the field’s primacy in understanding human nature more generally—has drawn the ire of savvy bloggers and tome-writing intellectuals for years. The exposure of Jonah Lehrer, neuroscience’s most prominent popularizer, as a plagiarist and a fabricator also occasioned a critical look at the popsci genre he championed. But Satel and Lilienfeld’s book may represent the high-water mark of anti-pop neuroscience writing so far: it has been widely reviewed, and Brooks plugged Brainwashed in his weekly New York Times column. (Brooks freely admits he himself has succumbed to neuromania: “I wrote a book a couple years ago of mindless neuroscience, and it did really well!” he quipped at the AEI event.)
Anti-pop neuroscience, as opposed to anti-neuroscience, is the key distinction. Satel and Lilienfeld want to clean up the riffraff because neuroscience done right is a sophisticated and promising field of inquiry. They are not here to argue, as Brooks did in his recent column, that the mind is not the brain, or even that we have free will outside of the causal chain of our neural firings.
Rather, the uses and abuses of neuroscience are more illustrative as a story of our tendency to get ahead of ourselves. Our perennial thirst for elegant mechanisms and overarching narratives, noble in its own right, can lead us to take lazy shortcuts and place our hope in the Next Big Explanation, whether phrenology, Freud, or Freakonomics. Culture, history, and politics are complicated, confusing, and mostly boring. With the recent successes of neuroscience, it’s easy to wish that the chatter of narratives, prejudices, habits, and emotion could be replaced with the clinical pings of the fMRI machine.
But for now, at least, it would seem that neuroscience has a long way to go before it supersedes our other ways of knowing: “This Is Your Brain on Politics” identified two 2007 candidates who had failed to fire up the neurons of swing voters, indicating impending trouble for their campaigns. They were John McCain and Barack Obama.
The New York Times Magazine includes a regular advice column under the title “The Ethicist”. Readers submit accounts of their dilemmas, to which the Ethicist proposes solutions. The founding Ethicist was the comedy writer Randy Cohen; in 2012 the column passed to the novelist Chuck Klosterman.
Cohen’s version of ethics was little more than the application of liberal politics to matters of lost wallets or unintentional eavesdropping. For Cohen, persons were neither good nor evil and acts neither right nor wrong, at least in any categorical sense. Rather, society was to blame for putting people in uncomfortable situations.
Cohen’s politicized version of situational ethics resembled a BoBo version of the Eichmann defense. Even so, it was preferable to Klosterman’s rudderless speculations. Last week, Klosterman advised a reader that it was ethical to submit the same paper to two college classes, even though this was likely against the academic integrity policies of the student’s university. The reason: “I can’t isolate anything about this practice that harms other people, provides you with an unfair advantage or engenders an unjustified reward.”
As a number of commenters observed, this argument is mind-bogglingly stupid. To mention only the most obvious objection, it ignores the probability that the student had already committed to not doing things like turning in the same paper for different classes. Many universities make observance of an academic integrity policy a condition of enrollment, and some make it a condition of submitting specific assignments.
So the very first thing the student should have done was check the standards he had either expressly or tacitly accepted. At the colleges where I’ve taught, students are informed early and often of the policies for this sort of thing, and sometimes have to sign documents indicating their understanding.
Moreover, it’s pretty clear that turning in the same paper for different classes does provide an unfair advantage over students who follow the rules. After all, they have to do much more work to complete their courses. In addition to his refusal to consider the ethical significance of the student’s voluntary commitments, Klosterman didn’t think very hard about the way college works.
In an old piece making the rounds on Twitter, the British sociologist Michael Young, who coined the term “meritocracy”, urges its removal from the public lexicon. Although a lifelong man of the Left, Young sounds remarkably like Charles Murray:
Underpinning my argument [in The Rise of the Meritocracy] was a non-controversial historical analysis of what had been happening to society for more than a century before 1958, and most emphatically since the 1870s, when schooling was made compulsory and competitive entry to the civil service became the rule.
Until that time status was generally ascribed by birth. But irrespective of people’s birth, status has gradually become more achievable.
It is good sense to appoint individual people to jobs on their merit. It is the opposite when those who are judged to have merit of a particular kind harden into a new social class without room in it for others.
Ability of a conventional kind, which used to be distributed between the classes more or less at random, has become much more highly concentrated by the engine of education.
A social revolution has been accomplished by harnessing schools and universities to the task of sieving people according to education’s narrow band of values.
With an amazing battery of certificates and degrees at its disposal, education has put its seal of approval on a minority, and its seal of disapproval on the many who fail to shine from the time they are relegated to the bottom streams at the age of seven or before.
The new class has the means at hand, and largely under its control, by which it reproduces itself.
Young’s diagnosis of the pretensions of the modern elite seems unimpeachable to me. As I and others have argued on this site, the current system of educational credentialing has the function of preserving and transmitting privilege, even though it was designed for much the opposite end. The conceptual hinge of this transformation is the ambiguity of the term “merit”. If we’re not careful to specify what we mean by merit, the (strong) instrumental argument for distributing tasks and responsibilities to those best able to fulfill them tends to slip into the (weak) moral argument that the most capable few deserve greater power and wealth.
What’s more, the close association of merit with educational achievement tends to depreciate abilities and dispositions that may be better suited to many positions. Consider what happened when bankers learned to consider themselves the smartest guys in the room.
It’s too late to get rid of meritocracy: both the word and the ideal it represents have been too deeply ingrained in our ethical culture. What we can do is insist that its ambiguities and disadvantages be acknowledged, particularly by those who claim to act in the public interest. As Young’s example indicates, this is a task in which many socialists, libertarians, and traditionalist conservatives can cooperate. As much as we disagree on other matters, we know that meritocracy is a dangerous illusion.
Reason’s Nick Gillespie flags Gallup polling data showing 54 percent of Americans believe the “federal government has too much power.” Of all things, this put me in mind of St. Paul’s second letter to the Corinthians, in which the apostle presented us with a lovely paradox: “For when I am weak, then I am strong.” Don’t get me wrong; I know Paul meant this in the sense that Christians should rely less on their own devices and recognize their spiritual impoverishment outside of Christ.
But I find Paul’s paradox useful when thinking about the size and scope of the federal government. On the one hand, yes: you can look at the overreach at the Internal Revenue Service and the Department of Justice and see a central government unaccountably trampling on civil liberties.
On the other hand, look at Apple’s illustrative case of tax avoidance. Charles Duhigg and David Kocieniewski write in the New York Times:
Setting up an office in Reno is just one of many legal methods Apple uses to reduce its worldwide tax bill by billions of dollars each year. As it has in Nevada, Apple has created subsidiaries in low-tax places like Ireland, the Netherlands, Luxembourg and the British Virgin Islands — some little more than a letterbox or an anonymous office — that help cut the taxes it pays around the world.
Call me crazy if you like, but I think Robert Reich is exactly right: “global capital, in the form of multinational corporations as well as very wealthy individuals, is gaining enormous bargaining power over nation states.”
Wouldn’t you know it, Gallup polling (from 2011) also shows that an even bigger majority of Americans believe that lobbyists, major corporations, and banks and financial institutions have too much power, along with the federal government.
What this means is that many Americans often feel dwarfed by big institutions and concentrated power, no matter whether those institutions are in the public or private sector. I understand and appreciate the libertarian solution to break up “bigness” wherever it’s found—to sever the bonds between corporate lobbyists, lawmakers, and bureaucrats. However, I think it’s reductive to look at those bonds as having been forged by years of “scratch my back and I’ll scratch yours” corporatist conspiracy. There will always be unseemly stories to be found on the money-in-politics beat. But as a theory of government, I find it less satisfying.
The truth is, as figures from Teddy Roosevelt to John Kenneth Galbraith recognized, government institutions grew larger as a means of “countervailing” private-sector power that grew endogenously in a free economy.
There is very little that can be done about this, it seems to me. Ideally, as individuals and as private citizens, we like to think we can bring this complex postmodern economy to heel. The reality is that we’re going to have to learn to muddle through it.
As Prime Minister David Cameron was visiting the U.S. last week, his party stumbled into another public-relations pit when an unnamed high-ranking associate of the prime minister’s was quoted characterizing the party’s activists as “mad, swivel-eyed loons.” (The comment was made in private, but within earshot of journalists.) That followed on UKIP’s impressive showing in local elections and a Tory backbench revolt against Cameron’s push for same-sex marriage, which passed with more members of the prime minister’s party voting against it than for it.
Hugo Rifkind in the Spectator traces these troubles to a generational divide:
Speaking almost factually, it’s pretty unlikely that the Tory grass roots are ‘swivel-eyed loons’. By most estimates, their average age is somewhere between 65 and 70, and swivelling your eyes behind bifocals rather defeats the purpose. Meanwhile, according to a poll by Survation this week, 33 per cent of those over 65 would vote Ukip tomorrow, and 13.4 per cent of everybody else would. Similarly, according to an ICM poll earlier this year, 37 per cent of over-65s fancy the idea of gay marriage, compared with 72 per cent of those below. This is glaring stuff.
… let us recognise Mr Cameron’s difficulty for what it is. Which is, essentially, that he faces a concerted fightback from an older generation that feels the world slipping from its fingers, and has had enough.
Or if you prefer, we can put the bellicosity on the other side, and identify an aggressive move … by a younger generation — or, to be more accurate, a couple of younger generations — who realise the world is finally theirs and wish to brand it with their stamp. This is why equal marriage, in particular, has become such a big deal, despite affecting relatively few people. It’s not just a symbol. It’s an early skirmish, between those who feel it is time to stop just living in their country and start owning it, and those who have owned it until now and don’t want to let it go.
It’s not just a matter of assertiveness, though. In the UK as well as the U.S., the cultural backdrops against which the older and the younger cohorts have grown up have been so different as to lead quite naturally to widely divergent views—indeed, opposite views of what constitutes stability and crisis, or who is aggressing against whom.
The Republican Party is in the same boat as Cameron’s Conservatives: however “grand” the GOP may or may not be, it is certainly the Old Party, as National Journal‘s Charlie Cook observed earlier this year.
The New Yorker‘s George Packer can’t decide what to think about 21st-century America.
On the one hand, Packer likes developments that enhance the lifestyles of the educated upper middle class: “marriage equality, Lipitor, a black President, Google searches, airbags, novelistic TV shows, the opportunity for women to be as singlemindedly driven as their male colleagues, good coffee, safer cities, cleaner air, photographs of the kids on my phone, anti-bullying, Daniel Day Lewis, cheap communications, smoke-free airplanes, wheelchair parking, and I could go on.” On the other hand, he’s sorry that these benefits aren’t more broadly shared. Life is pretty good in brownstone Brooklyn and its spiritual counterparts. But it’s gotten harder and harder in “urban cores like Youngstown, Ohio; rural backwaters like Rockingham County, North Carolina; and the exurban slums outside Tampa…”
So how can this be the best of times for gays, sufferers from cardiovascular disease, African American politicians, TV fans, ambitious women, and so on, but among the worst for the urban poor, agricultural workers, and overleveraged homeowners? Packer can’t quite figure it out:
We usually think of greater inclusiveness as a blow struck for equality. But in our time, the stories of greater social equality and economic inequality are unrelated. The fortunes of middle-class Americans have declined while prospects for many women and minorities have risen. There’s no reason why they couldn’t have improved together—this is what appeared to be happening in the late nineteen-sixties and early seventies. Since then, many women and minorities have done better than in any previous generations, but many others in both groups have seen their lives and communities squeezed by the economic contractions of the past generation.
Although his economic generalizations are accurate, Packer’s remark is historically and politically obtuse. Rather than shedding light on the profound divergence in Americans’ fortunes and expectations over the last few decades, it reflects a spiritual crisis of the BoBo elite, which is unwilling even to contemplate the possibility that its commitments to individual autonomy and expressive consumerism are incompatible with the egalitarianism that it pretends to favor.
As Jordan Bloom mentioned yesterday, Corey Robin has a provocative essay on the connection between Nietzsche and the “Austrian” economists in The Nation. The piece is titled “Nietzsche’s Marginal Children”, as if Menger, Mises, Hayek and Schumpeter were Nietzsche’s direct heirs. The actual argument is more subtle: “the relationship between Nietzsche and the free-market right…is thus one of elective affinity rather than direct influence, at the level of idiom rather than policy.”
According to Robin, both Nietzsche and the Austrians saw value as a subjective commitment under conditions of constraint rather than an objective contribution by labor. For this reason, they endorsed agonistic social relations in which individuals struggle to express and impose valuations to the limits of their differential strength, while rejecting egalitarian arrangements that attempt to give producers a fair share of the value they have generated. Although he was most interested in philosophy and art, Nietzsche also described the conditions necessary for cultural renewal as “great politics”. For the Austrians, by contrast, the marketplace was the setting for contestation over value.
Like Robin’s argument in The Reactionary Mind, this interpretation is bound to appeal to leftists who are already convinced that there’s something sinister about conservative and libertarian thought (see the comments at Crooked Timber here). But it has serious problems, which Brian Doherty and Kevin Vallier have already begun to point out.
For one thing, there’s nothing unique to Austrian economics about the subjective theory of value. As Robin acknowledges, the foundations of the so-called marginal revolution were laid by the Frenchman Walras and the Englishman Jevons, as well as the Austrian Menger. That wouldn’t matter if the influence of these writers had been especially strong in the milieu that eventually produced Mises and Hayek. But in fact, almost all modern economists, whatever their theoretical or political orientation, accept some descendant of Walras, Jevons, and Menger’s arguments. What’s more, Robin generally ignores the technical mathematical background of the marginal revolution, which he presents primarily as a debate in moral philosophy. That decision obscures the most important cause of the transformation of economic thought in the 19th century: the demand that economics become a science on the model of physics.
Robin is also evasive in his chronology. He acknowledges that “[a]round the time—almost to the year—that Nietzsche was launching his revolution of metaphysics and morals, a trio of economists [Walras, Jevons, and Menger], working separately across three countries, were starting their own.” But he doesn’t deal explicitly with the possibility that this temporal coincidence makes any connection between Nietzsche and marginal economics circumstantial.
It’s true that Hayek and his Austrian contemporaries received the new theories of value in economics in a cultural context influenced by Nietzsche. But that tells us nothing about those theories’ original inspiration—let alone their truth. In any case, the fact that marginal economics became dominant in a setting where Nietzsche had little or no influence, such as the British academy, suggests that the heroic individualism he so brilliantly articulated was by no means a necessary condition of the transformation of economics. And given the variety of reactions to Nietzsche in the 20th century, it’s clearly not a sufficient one.
It’s also crucial to remember that Nietzsche was not the only 19th century thinker who challenged the leveling tendencies of democracy and socialism. On the contrary, this concern is among the major themes of Tocqueville, Carlyle, Mill, Kierkegaard, Burckhardt, Freud, Dostoyevsky, and Pareto, to name only a few. Robin knows too much to ignore these names, some of which occur in the piece. But Robin’s focus suggests that they served, at most, as adjuncts or supplements to Nietzsche.
Robin’s central error, in other words, is an uncritical acceptance of Nietzsche’s evaluation of himself as a “fate” rather than an articulator, however brilliant, of ideas that were very much in the air of the 19th century. In this respect, Robin shows an odd affinity for Leo Strauss, who tended to reduce intellectual history to a decontextualized dialogue among great thinkers.
Noah Millman takes issue with some of the excerpts I’ve given from Schumpeter. But Schumpeter’s full view of the relationship between family, individualism, and capitalism can’t be captured in a brief quotation. If I had to give a close approximation of it, though, this is probably the most apt passage from Capitalism, Socialism, and Democracy:
In the preceding chapter it was observed that the capitalist order entrusts the long-run interests of society to the upper strata of the bourgeoisie. They are really entrusted to the family motive operative in those strata. The bourgeoisie worked primarily in order to invest, and it was not so much a standard of consumption as a standard of accumulation that the bourgeoisie struggled for and tried to defend against governments that took the short-run view. With the decline of the driving power supplied by the family motive, the businessman’s time-horizon shrinks, roughly, to his life expectation. And he might now be less willing than he was to fulfill that function of earning, saving and investing even if he saw no reason to fear that the results would but swell his tax bills. He drifts into an anti-saving frame of mind and accepts with an increasing readiness anti-saving theories that are indicative of a short-run philosophy.
People once accumulated capital largely for the sake of their progeny. Now that they are less inclined to think about progeny, that is one more reason (among others Schumpeter gives in his book) for greater concern with immediate satisfactions rather than long-term capital development. The motive for defending accumulation against redistribution by the state has also been weakened. These changes not only affect actual economic practices; they also condition the public’s receptivity to ideas that promise to solve short-run problems such as, say, unemployment by means of some sacrifice of capital. (To what extent unemployment actually is a short-run problem is something to think about, but this is where Schumpeter is coming from.)