Conservatives are out to lampoon and destroy Common Core. Their reasons for this are legion, but their main objections boil down to one giant fear: the centralization of education. As George Will stated in a Thursday Washington Post column, “What begins with mere national standards must breed ineluctable pressure to standardize educational content. Targets, metrics, guidelines and curriculum models all induce conformity in instructional materials.” But “must” this centralization truly happen? Are these well-founded predictions, or phantasmagoric fancies?
Homeschoolers are getting worried, too. Dr. Susan Berry wrote at Breitbart, “When SAT, ACT, and GED exams are ‘aligned’ with Common Core, homeschooled students—as well as students educated in private schools—may not be able to ‘opt out’ of the federally incentivized standards if they want to apply for college. These students could be pressured into adopting Common Core for curriculum at home so that they are familiar with the presentation of material on the newly aligned college entrance exams.”
First: all proponents and opponents of Common Core should take a good look at the standards themselves. When one actually reads the material, it becomes clear that the standards focus on skills, not content. To use an analogy: Common Core does not say how you should teach your child to ride a bike. That is up to your discretion. But Common Core will test your child’s ability to ride the bike proficiently. Thus, one cannot really “adopt Common Core for curriculum”—the standards provide no curricular content. One could use them to measure the difficulty and proficiency of one’s curriculum, but that is a different matter.
For some (myself included), this focus on skills is concerning. Predictions of centralization and impending educational doom may be overblown. But Common Core does prioritize grading and test scores over the actual content of learning. For those who believe quality content should be the end of education, and not merely a means to a good GPA, this is rather disconcerting. In application, this emphasis could be harmless—or it could push teachers and students to work merely for the grade (especially since teacher evaluations will now be tied to student performance on the new achievement tests). This would devalue the truest, fullest definition of education—one that values learning for its own sake. As Patrick Deneen put it in a past TAC article, “A basic utilitarian mindset now dominates the definition and understanding of education and how it thereby constrains, limits, and narrows the scope of education’s purposes solely to the debased end of work.”
One hopes the new tests may function similarly to the SAT—measuring a student’s ability to read well, rather than dictating what students should read. This would give teachers the flexibility to assign appropriate materials to their students, in alignment with their learning levels and context. But fear of grade repercussions could keep teachers from exploring materials with the desired scope and depth. Rather than ruminating over important, interesting problems, they may rush to study for the next exam.
Standards are important. And Common Core isn’t necessarily the boogey man painted by some in the conservative press. But in considering the standards’ adoption, we must ask ourselves important questions about education’s true meaning and telos. If we know what education is for, we can determine the best means to procure that end.
After collapsing on her kitchen floor from an apparent blood clot, Marlise Munoz was declared brain dead by a Fort Worth hospital. Her parents and husband told the doctors that Munoz would not want to be kept on life support. But as the family prepared to say goodbye to wife and daughter, the doctor gave them sudden and shocking news: the hospital would not terminate Munoz’s life, because she was 14 weeks pregnant. In a Tuesday New York Times article, authors Manny Fernandez and Erik Eckholm write,
More than a month later, Mrs. Munoz remains connected to life-support machines on the third floor of the I.C.U., where a medical team monitors the heartbeat of the fetus, now in its 20th week of development. Her case has become a strange collision of law, medicine, the ethics of end-of-life care and the issues swirling around abortion—when life begins and how it should be valued.
Munoz’s parents and husband do not want the baby. They want the doctors to pull the plug, per their original instructions. Munoz’s father, Ernest Machado, told the Times, “All she is is a host for a fetus. I get angry with the state. What business did they have delving into these areas? Why are they practicing medicine up in Austin?”
Munoz and her husband have a 14-month-old son, Mateo. The Times story includes a picture of the three of them: parents’ arms curled close around their infant son, smiling softly at him.
The hospital, in refusing to terminate Munoz’s life, is following Texas state law: Texas is one of two dozen states, according to the International Business Times, that prohibit doctors from cutting off life support to pregnant patients. The Texas law was passed in 1989 and amended in 1999.
Now NARAL Pro-Choice America has launched a petition to take Munoz off life support. ABC News quoted the petition in a Wednesday story: “The Munoz family deserves better than this—and it’s up to Texas attorney general Greg Abbott to show them that the state of Texas respects their wishes and their privacy.”
“If your goal is to legally enshrine the notion that pregnant women are incubators first and humans second, keeping their bodies alive to grow babies long after their minds are gone is a perfect way to do it,” wrote Slate author Amanda Marcotte. She said the family has expressed some fear that “the loss of oxygen that was enough to destroy Marlise Munoz’s brain probably did serious damage to her fetus.”
This story throws the difference between pro-life and pro-choice advocates into sharp relief. Views on life’s meaning, origin, and purpose weigh heavy in such debate—and few would deem this an easy decision for the family (or hospital) to make.
But at the same time, after viewing the picture of Erick and Marlise Munoz with their infant son, one can’t escape a feeling of bitter and painful irony. How could a father, so obviously loving and cherishing one child, want to terminate the life of another? Would it be wrong to extend Munoz’s life 19 (or fewer) weeks, to perhaps save her last child?
Students and faculty are not required to think anymore, laments spiked Education Editor Joanna Williams:
Too often it seems that universities today actually seek to prevent criticality and instead try to coerce groupthink among academics and students alike. One way this happens is through the enculturation of particular collective values. In the strange world of academia, an individual’s values and principles are no longer a private affair. Rather, they’re to be ‘given’ to you, which means you can be explicitly told which opinions to hold.
Williams has a good point. One wonders how Plato or Nietzsche would fare as professors in a modern university. Students who read the Republic are told not to mind Plato’s strange ideas about “sharing” women and children. But if Plato stood in a modern lecture hall, dressed in tweed suit and tie, would he be allowed to speak such revolutionary words? Or what of Nietzsche’s controversial “Übermensch”: in today’s world, still shocked and frightened in the aftermath of WWII, should such ideas even be voiced to the 18-to-22-year-olds who fill America’s classrooms? These two philosophers had brilliant and revolutionary ideas. Would modern audiences be willing to hear them for their brilliance, without rejecting them for their strangeness?
Williams critiqued professional standards laid out by the Higher Education Academy in her piece, mainly because they “[prescribe] values for lecturers to hold at all.” She writes, “That the HEA expects lecturers to demonstrate collective values … suggests criticality is no longer considered a fundamental part of the academic enterprise.” The real end of education, she reminds us, should not be inculcation of collective values. It should be the ongoing pursuit of truth and knowledge. But this criticism of “collective values” raises some questions: First, is it ever proper for a higher institution to promote “collective values”? If so, in what context?
Take, for instance, the small private college: especially at religious institutions, many lecturers hold certain “values” in common. Often, such values are a required component for teaching at or attending such an institution. Is this wrong, according to Williams’ thesis? Should the small Catholic college hire an atheist, to fight “groupthink” amongst its faculty and students? Surely, if a student received an F for writing an excellent paper on atheism, just because her school and professor are Catholic, we would consider it unjust. If the school threw out a professor who wanted to teach a literature class on the Koran, we should also see that as unfair.
Much depends on the way knowledge and values are taught and judged. Williams writes, “Although knowledge and values undoubtedly influence each other, losing the distinction between the two should be considered a major problem facing academics today … Expecting people to demonstrate they hold values that have been determined for them, irrespective of whether they individually agree with those values or not, creates a climate of uncriticality which is the exact opposite of what a university should be about.”
This topic can easily become “sticky,” especially when one considers the wide and disparate swath of ideas perpetuated in our culture. Questions of tolerance riddle today’s higher education world, especially at the private level. Determining where to draw boundaries takes wisdom and specificity. Nietzsche would not have “fit in” as a professor at just any university. But his work is important to all of us, and should not be denigrated merely because of his personality. True criticality, one might suggest, enables us to determine which “collective values” are important for the furthering of knowledge, and which are personal or opaque enough to deserve flexibility and analysis.
Many modern readers push themselves through speed-reading courses. If they haven’t time for a full course, there are YouTube videos and websites on the subject. Now, in the age of digital reading, there are additional speed-reading gadgets: tools like Feedly and Twitter enable users to absorb small bits of text in speedy fashion. A host of speed-reading apps has also come into vogue, enabling users to read against the clock with greater efficiency.
Efficiency. It’s the word of the age, according to Alasdair MacIntyre: in After Virtue, he argued that “efficiency” is our most prized virtue. While the Greeks of Homer’s time valued virtues like duty and honor, our age prefers more “managerial” virtues, efficiency being the most valued of all.
But have we lost something in our endeavors for efficiency? Curator contributor Brett Beasley says yes. In a Monday post, he mused on our changing reading patterns, as our culture passes from speed-reading to half-reading to complete neglect:
“Sharing” is the buzzword of our age, in which nearly all of what we read can be linked to, tweeted, emailed, attached, and downloaded within seconds. Mass digitization projects like Google Books and the Digital Public Library of America place more words within our grasp each hour, yet meanwhile we continue to hear reports that nearly a third of Americans did not read as much as one book in the past year. It’s strange, isn’t it? Reading often feels as easy as breathing. When I go on a road trip, I don’t have to make myself read the words written on the road signs and billboards. It just happens. But when it comes to anything longer than a few hundred words, the text seems to thicken and we have to push back against a surprising amount of resistance.
Why is it that we can spend significant time browsing menus and reading Buzzfeed articles, but roll our eyes when we scroll to the bottom of a news story and see the words “Page 1 of 12”? What is it about length that intimidates and frustrates us so?
At least in part, this annoyance is rooted in that modern striving for “efficiency.” We want our laughs, lunches, and letters as quickly as possible. We are increasingly aware of time’s incessant ticking: where church bells once heralded the hours, minutes now flick by on our phone and computer screens. We own our time, in a way that our ancestors did not. Countdown apps show us every millisecond whizzing past, set to our own schedules and deadlines.
Thus we, along with Beasley, remember “many thinkers and artists throughout history who have written and worked with a memento mori, or reminder of death, nearby. While we might pride ourselves on the nearly instantaneous speed with which we can deploy and make use of information online, in the end our time and our attention are finite, and we have to make difficult decisions about what is valuable enough to spend our time on.”
This is our second problem: in the age of information, reading is literally everywhere. Newsstands, Google News, even the nearest coffee shop—one needn’t go far to be bombarded with words. What shall we read first? What’s worth reading? What if we waste time reading something awful, when we could have been reading something else? “Stream, cloud, dust; now more than ever our text and our reading times are in need of a shape and an architecture,” writes Beasley. “Intentionally or unintentionally, each of us has a reading practice that shapes the way we live, think, and interact.”
What shape and architecture should we give to our reading life? One thing is certain: the way we read will affect the way we absorb information, and thus will shape the very way we live. Should we read Drudge-style—absorbing the most interesting headlines of the moment, discarding length for sensationalism? Or should we read in a more bookish, antiquated style: throwing “efficiency” to the wind, cherry-picking books we think will be the best, ignoring anything that isn’t timeless or classic?
Some would assume the latter is the “conservative” position. But it seems the best path is a narrow trail between the two extremes. There is pertinent everyday information we must absorb to make sound decisions. But such information doesn’t always foster deeper human flourishing and intellectual growth. We should digest the pertinent with efficiency, but not for efficiency’s sake.
Efficiency is a means to a greater end, a greater virtue: that of wisdom. Wisdom is “the quality of having experience, knowledge, and good judgment.” To have wisdom, therefore, one must have basic knowledge of the pertinent. But “good judgment” doesn’t come from gulping down news in a frenzied fashion. Good judgment requires thoughtful, prolonged, and careful meditation. It requires outside opinions, secondary sources, and at least some research. It requires a depth of reading inspired by thoughtfulness, as well as inquisitiveness. In order to get wisdom, slow reading is necessary: a careful, deliberate inculcation of timeless truths.
The real trick, then, is determining what to read fast, and what to read slowly. Beasley references the reading traditions of the monks, and we can learn something from their style. Their reading during the Middle Ages, he says, “was the centerpiece of daily life. Several hours of reading (or lectio) often fell in the middle of the day, with the rest of the day devoted to periods of meditation, prayer, and contemplation. While lectio ‘puts whole food in the mouth,’ meditation ‘chews it and breaks it up,’ prayer ‘extracts its flavor,’ and finally contemplation ‘is the sweetness itself which gladdens and refreshens.’”
Perhaps our modern reading challenge lies in learning when lectio is the best course, and when deeper meditation or contemplation is required. We must differentiate between the times when efficiency leads us to wisdom—and when efficiency becomes a distracting end in and of itself.
In John Steinbeck’s novel The Winter of Our Discontent, businessman Ethan Hawley is haunted by the presence of the town drunkard—his childhood friend, Danny Taylor. Once like brothers, the two men’s paths slowly diverged over time. Now, Hawley watches his old friend succumb to squalor and self-abuse. He cannot forget his guilt.
It reminded me of old Scrooge: Dickens’s “squeezing, wrenching, grasping, scraping, clutching, covetous, old sinner,” who lived with squalor and illness on his doorstep, yet didn’t care. He couldn’t see over his insurmountable mountain of greed.
Our current lives are filled with Ethans and Dannys, Scrooges and Tiny Tims. Take Mike Remboulis: he and his half-brother grew up in the same Queens apartment, but their fortunes have changed drastically over the years. New York Times reporter Rachel Swarns shared their story last Sunday:
Even amid lighthearted banter and savory plates, reality almost always comes rushing back: Beyond those restaurant doors, his big brother is teetering on the edge, barely hanging on. Mr. Remboulis, 51, is an aerospace engineer with a graduate degree. His half-brother, Glenn Yuzzi, 59, is a carpenter with a high school diploma. They grew up in the same Queens apartment, but today they live on opposite sides of the economic divide. It is a twist of fortune that haunts Mr. Remboulis. “I don’t want him to scrape by,” he said. “I don’t want my brother to drown.”
It seems these unfair disparities become more prominent in the American mind around Christmastide. After all, it’s freezing cold outside in most of America. Most of us are huddled in coats and scarves, drinking coffee, thinking about presents, food, and parties. In the midst of this happy holiday reverie, we see a man slumped on the street corner, a McDonald’s cup sitting empty at his side. Even if one isn’t surrounded by the homeless, it’s almost impossible to live in a town or city without knowing at least one family that has fallen on hard times. We grow up hearing so much about “Christmas spirit”—the spirit of giving that supposedly animates and transports us during the Christmas season. We read A Christmas Carol and How the Grinch Stole Christmas, and frown disapprovingly at the money-grubbing fiends.
Steinbeck set the opening of his novel on Good Friday. There’s death in the air. Over Friday and Saturday, change stirs in Ethan Hawley. But it isn’t change of a good kind. His money lust is awakening. From Saturday evening to Sunday morning, the morning of resurrection, Hawley begins to see himself as a snake that has just shed its skin. He’s transformed.
It reminded me of old Scrooge: he awakens on Christmas morning after a haunted nighttime vigil with ghosts. He too is transformed—but by losing his old self, completely reborn into newness: “I don’t know what day of the month it is!” said Scrooge. “I don’t know how long I’ve been among the Spirits. I don’t know anything. I’m quite a baby. Never mind. I don’t care. I’d rather be a baby.” Scrooge scatters his money like dust to the wind, overflowing with generosity and joy. To Tiny Tim, “he was a second father.”
Who will we be this Christmas? I scurry down cold city streets and see the pleading eyes, listen to stories of families “getting by.” Part of me wants to bark out like Cain, “Am I my brother’s keeper?”
But that’s what the Christmas story is about: the One who had everything became nothing, to save the lost brother. He bore poverty, sorrow, and shame in the winter of our discontent. And He promised us that, like Scrooge, we could be “born again.” He wouldn’t let us drown.
He truly was “better than his word. He did it all, and infinitely more.”
Reports that A- is the median grade in Harvard College have reopened the debate about grade inflation. Many of the arguments offered in response to the news are familiar. The venerable grade hawk Harvey “C-” Mansfield, who brought the figures to public attention, describes the situation as an “indefensible” relaxation of standards.
More provocative are defenses of grade inflation as the natural result of increased competition for admission to selective colleges and universities. A new breed of grade doves points out that standards have actually been tightened in recent years. But the change has been made to admissions standards rather than to expectations for achievement in class.
According to the editorial board of the Harvard Crimson, “high grades could be an indicator of the rising quality of undergraduate work in the last few decades, due in part to the rising quality of the undergraduates themselves and a greater access to the tools and resources of academic work as a result of technological advances, rather than unwarranted grade inflation.” Matt Yglesias, ’03, agrees, arguing that “it is entirely plausible that the median Harvard student today is as smart as an A-minus Harvard student from a generation ago. After all, the C-minus student of a generation ago would have very little chance of being admitted today.”
There’s a certain amount of self-congratulation here. It’s not surprising that Harvard students, previous and current, think they’re smarter than their predecessors—or anyone else. But they also make an important point. The students who earned the proverbial gentleman’s Cs are rarely found at Harvard or its peers. Dimwitted aristocrats are no longer admitted. And even the brighter scions of prominent families can’t take their future success for granted. Even with plenty of money and strong connections, they still need good grades to win places in graduate school, prestigious internships, and so on.
The result is a situation in which the majority of students really are very smart and very ambitious. Coursework is not always their first priority. But they are usually willing to do what’s necessary to meet their professors’ expectations. The decline of core curricula has also made it easier for students to pick courses that play to their strengths while avoiding subjects that are tough for them. It’s less common than it was in the old days to find chemistry students struggling through Shakespeare.
According to the Harvard College Handbook for Students, an A- reflects “full mastery of the subject” without “extraordinary distinction.” In several classes I taught as an instructor and teaching fellow at Harvard and Princeton, particularly electives, I found that around half the students produced work on this level. As a result, I gave a lot of A-range grades.
Perhaps my understanding of “mastery” reflects historically lower demands. For example, I don’t expect students writing about Aristotle to understand Greek. Yet it’s not my impression that standards in my own field of political theory have changed a lot in the last fifty years or so. In the absence of specific evidence of lowered standards, then, there’s reason to think that grade inflation at first-tier universities has some objective basis.
But that doesn’t mean grade inflation isn’t a problem. It is, just not quite in the way some critics think. At least at Harvard and similar institutions, grades are a reasonably accurate reflection of what students know or can do. But they are a poor reflection of how students compare to others in the same course. In particular, grade inflation makes it difficult to distinguish truly excellent students, who are by definition few, from the potentially much larger number who are merely very good.
Here’s my proposal for resolving that problem. In place of the traditional system, students should receive two grades. One would reflect their mastery of specific content or skills. The other would compare their performance to the rest of the class.
Raise your hand if you’re a conservative who has cited Edmund Burke without actually having read him closely.
Really—you’re all scholars of the Irish-born MP and oft-celebrated “father of modern conservatism”?
Okay, what did Burke mean by the phrase “the little platoon”?
Yuval Levin explains in his wonderful new book The Great Debate: Edmund Burke, Thomas Paine, and the Birth of Right and Left:
The division of citizens into distinct groups and classes, Burke writes, “composes a strong barrier against the excesses of despotism,” by establishing habits and obligations of restraint in ruler and ruled alike grounded in the relations of groups or classes in society. To remove these traditional restraints, which hold in check both the individual and the state, would mean empowering only the state to restrain the individual, and in turn restraining the state with only principles and rules, or parchment barriers. Neither, Burke thought, could be stronger or more effective than the restraints of habit and custom that grow out of group identity and loyalty. Burke’s famous reference to the little platoon—“To be attached to the subdivision, to love the little platoon we belong to in society, is the first principle (the germ as it were) of public affections”—is often cited as an example of a case for local government or allegiance to place, but in its context in the Reflections, the passage is very clearly a reference to social class.
Still feeling Burkean? Ready to go the pipe-and-slippers, Brideshead cultist route and declare yourself a loyal subject of the queen?
Levin reminds us that the context in which Burke wrote those words was a long-running intellectual dispute with a European-born radical, a man who was cheering on the secular revolution in France—and, oh, by the way, also one of the forefathers of our own revolution, favored by none other than Ronald Reagan himself—the Common Sense and The Crisis pamphleteer Thomas Paine.
That the rivalry between Burke and Paine cuts both ways through our hearts—this is precisely the kind of dialectic, if you will, that Levin hopes to provoke in the reader.
Make no mistake, though: Levin is a Burkean. In fact, he is the most eloquent exponent of Burkean conservatism, properly understood, since the George Will of 1983’s Statecraft as Soulcraft.
While scholarly and measured in tone, The Great Debate is a readable intellectual history that fairly crackles with contemporary relevance.
Indeed, The Great Debate is the must-read book of the year for conservatives—especially those conservatives who are profoundly and genuinely baffled by the declining popularity of the GOP as a national party. How, these conservatives ask, can America, the land of the rugged individual and conqueror of the frontier, choose statism and collectivism over freedom and liberty?!
Levin’s book provides the answer: You’re looking at the Democratic Party all wrong. It’s just as individualist as you are—maybe more so.
And that is the problem!
If you are literate today, it does not mean you can write — not even close to it in many cases. But if you were literate in 1863, even if you could not spell, you often could write descriptively and meaningfully. In the century and a half since, we have evolved from word to image creatures, devaluing the power of the written word and turning ourselves into a species of short gazers, focused on the emotions of the moment rather than the contemplative thoughts about consequences and meaning of our actions. Many everyday writers in the mid-19th century were far more contemplative, far more likely to contextualize the long-term meaning of their actions. They meticulously observed and carefully described because, although photography was the hot new medium during the Civil War, words remained the dominant way of communicating thought, memory, aspiration, hope.
Raasch’s theory is not a new one. Back in the 1980s, when the Internet was still in its primordial days and television was king, Neil Postman wrote Amusing Ourselves to Death. His book cautioned against the developing “Age of Show Business,” fed by television’s sensory, visual medium.
Postman identified three prominent “ages” in the history of information: first, ancient oral cultures preserved information through spoken records and stories. When the printing press and writing became more prominent, oral cultures dissolved into the “Age of Exposition”: a time when written records were perceived as holding the greatest truth. Then, as photography and videography developed, media began to change again—for the worse. Postman believed we would lose more than writing ability in the wake of the entertainment era: he warned of a depletion of mental and emotional capacity. He believed we would become as obsessed with pleasure as the humans in Aldous Huxley’s Brave New World. Postman’s descriptions of distracted, sensationalistic consumers align well with Raasch’s “species of short gazers.”
We can only imagine what Postman would have said about the Internet; perhaps Raasch gives us a taste when he describes it as “the Great Din”: “Today, throwing barbs and brickbats into the Great Din of the Internet has become as second nature as breathing … The Great Din requires no forethought, no real calculation of purpose or result, no contemplative brake, no need to seek angles or views beyond those that reaffirm or reassure what we think right now.”
Is this truly the future of media? Will all true, deep, thoughtful communication be lost in its havoc of pixels and pictures?
One interesting counter-opinion comes from former Daily Beast editor Tina Brown. Having recently left the world of journalism for event production, Brown has told reporters that she no longer reads magazines herself—in fact, she thinks “the whole writing fad is so twentieth century” (in the words of New York Magazine). But rather than warning of impending havoc and din, Brown calls people back to oral communication: “I think you can have more satisfaction from live conversations,” she said, adding that we are “going back to oral culture where the written word will be less relevant.”
If we experience the “death of writing,” as Raasch puts it, could we come full circle and return to the age of oral communication? Will grandfathers sit down with their grandchildren and tell them stories, like our ancestors so long ago? One can only hope; but if such an experience were truly to flower from “the Great Din,” it would be rather surprising.
Our modern world is obsessed with specialization. Yet this specialization is often unhealthy, both culturally and personally. Aeon Magazine contributor Robert Twigger suggests we need a new area of study—“polymathics”—to counter this monopathic obsession:
Polymathics might focus on rapid methods of learning that allow you to master multiple fields. It might also work to develop transferable learning methods. A large part of it would naturally be concerned with creativity — crossing unrelated things to invent something new. But polymathics would not just be another name for innovation. It would, I believe, help build better judgment in all areas.
The study methods Twigger prescribes have actually existed for quite some time, although in a slightly altered form, as the classical liberal arts. First developed in ancient Greece, the liberal arts encompassed those skills necessary for civic and personal freedom. The Greeks even emphasized the importance of physical athleticism, as Twigger suggests in his article. The “aretē” they sought fostered excellence of mind, soul, and body.
In modern academia, the liberal arts usually include a core curriculum that enables students to “master multiple fields.” The liberal arts classically emphasized a progression from rudimentary learning (the “grammar”) to practical application in communication and circumstance (similar to Twigger’s idea of developing “transferable learning methods,” “crossing unrelated things to invent something new,” and ultimately building “better judgment”).
Twigger’s emphasis on learning many fields is good. Specialization, while useful in many job settings, can disparage the interconnected and complementary nature of learning. Twigger reminds us that the humanities do advance educational growth in multiple subjects:
An intriguing study funded by the Dana foundation and summarised by Dr. Michael Gazzaniga of the University of California, Santa Barbara, suggests that studying the performing arts—dance, music and acting—actually improves one’s ability to learn anything else. Collating several studies, the researchers found that performing arts generated much higher levels of motivation than other subjects.
But note that Twigger is arguing from a claim of practicality: “polymathics will make you smarter, more creative, more humorous,” etc. This is a common attitude toward the humanities: if learning isn’t practical, it shouldn’t be practiced. Twigger points out the quantifiable practicalities of the humanities in his article, but he never mentions the purpose the Greeks sought: their vision of arete and freedom.
In a Wednesday New Yorker article, Lee Siegel argues that recent studies on literature—the ones claiming literature makes you more empathetic—soil the beauty of reading literature for its own sake. “Fiction’s lack of practical usefulness is what gives it its special freedom,” Siegel writes. “When Auden wrote that ‘poetry makes nothing happen,’ he wasn’t complaining; he was exulting. Fiction might make people more empathetic—though I’m willing to bet that the people who respond most intensely to fiction possess a higher degree of empathy to begin with. But what it does best is to do nothing particular or specialized or easily formulable at all.”
Perhaps the same is true for Twigger’s “polymathics,” the humanities, and the liberal arts. Despite their quantifiable benefits, one shouldn’t denigrate the “special freedom” of learning for its own sake.
No one would deny that American philanthropy is grounded in good motives. But as The New Atlantis contributor William Schambra points out in a detailed article, philanthropy can become as poisoned as any other human venture:
America’s first general-purpose philanthropic foundations — Russell Sage (founded 1907), Carnegie (1911), and Rockefeller (1913) — backed eugenics precisely because they considered themselves to be progressive. After all, eugenics had begun to point the way to a bold, hopeful human future through the application of the rapidly advancing natural sciences and the newly forming social sciences to human problems. By investing in the progress and application of these fields, foundations boasted that they could delve down to the very roots of social problems, rather than merely treating their symptoms … According to the perspective of philanthropic eugenics, the old practice of charity — that is, simply alleviating human suffering — was not only inefficient and unenlightened; it was downright harmful and immoral. It tended to interfere with the salutary operations of the biological laws of nature, which would weed out the unfit, if only charity, reflecting the antiquated notion of the God-given dignity of each individual, wouldn’t make such a fuss about attending to the “least of these.” Birth-control activist Margaret Sanger, a Rockefeller grantee, included a chapter called “The Cruelty of Charity” in her 1922 book The Pivot of Civilization, arguing that America’s charitable institutions are the “surest signs that our civilization has bred, is breeding and is perpetuating constantly increasing numbers of defectives, delinquents and dependents.” Organizations that treat symptoms permit and even encourage social ills instead of curing them.
Schambra traces the history of “philanthropic” eugenics through the years, along with its roots and causes. “Philanthropy’s involvement in eugenics should forever remind us that, for all our excellent intentions and formidable powers, we are unable to eradicate our flaws once and for all by some grand, scientific intervention,” he writes.
The article highlights some inherent flaws in philanthropy that conservatives, and isolationists specifically, are very sensitive to: namely, the extent to which our “compassion” is motivated by a desire to control, fix, and regulate matters that are not our business. It’s the “nanny state” paradigm, manifested in individuals like Mayor Bloomberg. It’s typically an accusation leveled at “compassionate conservatives.”
One commenter on a recent human trafficking article said he feared “a strong sense of self righteous, neo colonialist domination inherent in this kind of ‘cause’.”
However, in our fear of becoming meddlesome welfare statists, conservatives run the danger of becoming heartless. Paul Krugman leveled just such an accusation at Republicans in a Thursday column: “Republican hostility toward the poor and unfortunate has now reached such a fever pitch that the party doesn’t really stand for anything else — and only willfully blind observers can fail to see that reality.” Patrick Deneen aptly diagnosed this problem of attitude in a recent TAC blog post:
The motivation of charity is deeply suspect by both the Right and the Left. The Right—the heirs of the early modern liberal tradition—regards the only legitimate motivation to be self-interest and the profit motive. They favor a profit-based health-care system (one explored to devastating effect in this recent article on health care in the New Yorker), and a utilitarian university (the “polytechnic utiliversity” ably explored by Reinhard Huetter in the most recent issue of First Things). The Left—while seemingly friends of charity and “social justice”—are deeply suspicious of motivations based on personal choice and religious belief. They desire rather the simulacrum of charity in the form of enforced standardization, homogeneity, and equality, based on the motivation of abstract and depersonalized national devotions and personal fear of government punishment.
I do not think most conservatives (unless they really are Randian to the core) want to forsake true “compassionate conservatism”—just its current manifestation in political circles. How, then, does one exercise philanthropic sentiment properly?
The key, according to Schambra, is personal caritas (love). He writes,
loving personal concern is at the heart of charity traditionally understood. It can only be practiced immediately and concretely, within the small, face-to-face communities that Tocqueville understood to be essential to American self-government. There, the seemingly minor and parochial concerns of everyday citizens are taken seriously and treated with respect, rather than being dismissed as insufficiently self-conscious emanations of deeper problems that only the philanthropic experts can grasp.
One could say this is the localism movement’s compassionate conservatism. It is grounded in present, local needs rather than remote philanthropic endeavors. It seeks to love one’s neighbor, not to “fix” him.
Does this mean true conservatives shouldn’t get involved in global crises? It might depend on the person and situation. The concerns of Coptic Christians in Egypt are of immense and immediate concern to me, because I consider them my brothers and sisters in Christ. Do I believe all secular Americans should intervene on their behalf? No—it is not their responsibility in the way it is mine. But Schambra is right: perhaps we should first focus where we are planted, and then slowly, thoughtfully spread from there.
It is sad that the Republican Party, a political group filled with religious folk, often shows callousness toward the impoverished—either by refusing to help them, or by offering only conditional charity. The Christian faith is filled with instructions to unconditionally love and help the unfortunate. Consider this passage from Isaiah 58, in which God rebukes the nation of Israel for being “religious” by fasting, but neglecting the poor:
Is this not the fast that I have chosen: to loose the bonds of wickedness, to undo the heavy burdens, to let the oppressed go free, and that you break every yoke? Is it not to share your bread with the hungry, and that you bring to your house the poor who are cast out; when you see the naked, that you cover him, and not hide yourself from your own flesh? Then your light shall break forth like the morning, your healing shall spring forth speedily, and your righteousness shall go before you; the glory of the Lord shall be your rear guard. Then you shall call, and the Lord will answer; you shall cry, and He will say, ‘Here I am.’ … If you extend your soul to the hungry and satisfy the afflicted soul, then your light shall dawn in the darkness, and your darkness shall be as the noonday.
Notice that it doesn’t say, “Only cover the naked or feed the hungry if they aren’t being lazy.” God doesn’t say, “Undo the heavy burdens—unless they really need to work harder.” Neither does he say, “Make sure they show a change of heart or get converted before you help them.” This isn’t a gospel of cutting food stamps. Perhaps it is a gospel in which food stamps never should have been necessary in the first place. If conservatives—especially Christian conservatives—want to say “government should mind its own business,” then perhaps it is time they start minding theirs.