George Vanderbilt, a late 19th century heir to the family fortune and builder of the Biltmore Estate, reportedly read 3,159 books during his lifetime (approximately 80 books per year). He kept a list of the books he had read in a diary; his last book was Henry Adams’ third U.S. history volume.
Most of us wish we could amass the knowledge that represents. Books give us insights into the perceptions and perspectives of foreign minds. They widen our horizons, and foster our understanding of beauty. But few of us will surpass Vanderbilt’s reading achievements (unless we inherit large fortunes and thus become able to amass and devour the contents of a 10,000-book library). We lack the time available to Vanderbilt; he had neither work nor Twitter to distract him from his reading. Reading takes time—and in our technological, time-driven age, we’ve become ever more aware of how time-consuming reading can be. New Yorker contributor Rachel Arons wrote Monday of a recent proliferation of speed reading apps on the market:
As we’ve transitioned from print to screens, we’ve started clocking how long reading takes: Kindles track the “time left” in the books we’re reading; Web sites like Longreads and Medium include similar estimates with their articles (total reading time for “Anna Karenina”: eighteen hours and twenty-two minutes); in June, Alexis Ohanian, a co-founder of Reddit, published a book with a stamp on the cover advertising it as a “5 hour read.” … The fact is that little of what we read on the Web today is formatted in discrete pages, so it seems logical that, as reading online continues to supplant reading in print, hours and minutes will become increasingly useful units for measuring our progress.
I used to think that, if I tried really hard, perhaps I could read as many books as Vanderbilt. When I realized that this was probably an impossible goal, it felt something like a punch in the stomach: it was a moment of finitude.
Because of that moment, I could empathize with a girl I recently overheard talking with friends at Capitol Hill Books. Browsing the overstuffed shelves, she mentioned that bookstores often scared her, because she realized she “would never be able to read them all.” It’s only a matter of time before we realize that our to-read booklists can easily surpass the bounds of reason. There are so many tantalizing stories lying outside our grasp, and never enough time to read them all.
Some reject the infinitude stretching before them by deciding not to care: there is too much to ever possibly absorb, and the realization that we cannot have it all is frightening and disheartening. Others become reading automatons, determined to absorb as much information as possible before they die. Speed reading apps, despite their usefulness, can turn reading into a personal competition, a race to win. This often takes the joy out of reading and makes it a chore (though for some, competition may enhance the experience).
When we can, we should read for quality’s sake: savoring every book, re-reading the ones that enchant us most. Yet at the same time, not every essential read is worth savoring. Speed reading is useful for the accumulation of necessary knowledge. Slow reading is essential for the appreciation of written beauty. Perhaps our best reading choices lie at the junction of quality and quantity: we can speed read tedious or secondary works, then slowly absorb the masterpieces worth relishing.
Few of us will meet or surpass Vanderbilt’s incredible standard—even if the speed reading apps may help. But do we really want to read 3,159 books? The Preacher observes in Ecclesiastes, “Of making many books there is no end; and much study is a weariness of the flesh.” At some point, even the greatest bookworms must set down their books and live the life that enriches our readings with understanding. After all, that’s the lesson of the bookstore’s infinitude: our lives aren’t long enough to chase after the endless.
If you are literate today, it does not mean you can write — not even close to it in many cases. But if you were literate in 1863, even if you could not spell, you often could write descriptively and meaningfully. In the century and a half since, we have evolved from word to image creatures, devaluing the power of the written word and turning ourselves into a species of short gazers, focused on the emotions of the moment rather than the contemplative thoughts about consequences and meaning of our actions. Many everyday writers in the mid-19th century were far more contemplative, far more likely to contextualize the long-term meaning of their actions. They meticulously observed and carefully described because, although photography was the hot new medium during the Civil War, words remained the dominant way of communicating thought, memory, aspiration, hope.
Raasch’s theory is not a new one. Back in the 1980s, when the Internet was still in its primordial days and television was king, Neil Postman wrote Amusing Ourselves to Death. His book cautioned against the developing “Age of Show Business,” fed by television’s sensory, visual medium.
Postman believed three “ages” were prominent throughout information’s history: first, ancient oral cultures encouraged the preservation of information through spoken records and stories. When the printing press and writing became more prominent, oral cultures dissolved into the “Age of Exposition”: a time when written records were perceived as holding the greatest truth. Then as photography and videography developed, media began to change again—for the worse. Postman believed we would lose more than writing ability in the wake of the entertainment era: he warned of a depleting mental and emotional capacity. He believed we would become as obsessed with pleasure as the humans in Aldous Huxley’s Brave New World. Postman’s descriptions of distracted, sensationalistic consumers relate well to Raasch’s “species of short gazers.”
We can only imagine what Postman would have said about the Internet; perhaps Raasch gives us a taste when he describes it as “the Great Din”: “Today, throwing barbs and brickbats into the Great Din of the Internet has become as second nature as breathing … The Great Din requires no forethought, no real calculation of purpose or result, no contemplative brake, no need to seek angles or views beyond those that reaffirm or reassure what we think right now.”
Is this truly the future of media? Will we lose any true, deep, thoughtful communication in its havoc of pixels and pictures?
One interesting counter-opinion comes from former Daily Beast editor Tina Brown. Having recently left the world of journalism for event production, Brown has told reporters that she no longer reads magazines herself—in fact, she thinks “the whole writing fad is so twentieth century” (in the words of New York Magazine). But rather than warning of impending havoc and din, Brown calls people back to oral communication: “I think you can have more satisfaction from live conversations,” she said, adding that we are “going back to oral culture where the written word will be less relevant.”
If we experience the “death of writing,” as Raasch puts it, could we come full-circle and return to the age of oral communication? Will grandfathers sit down with their grandchildren and tell them stories, like our ancestors so long ago? One can only hope; but if such an experience were truly to flower from “The Great Din,” it would be rather surprising.
Brandon Stanton has made a name for himself by taking random pictures of people in New York City. How? That’s the question many are asking, as the New York Times reports that his new book Humans of New York “has become an instant publishing phenomenon.” The book is a compilation of hundreds of his images and interviews.
The project started on Stanton’s Tumblr blog and website, HumansofNewYork.com. Stanton, a 29-year-old Georgia native, began walking up to complete strangers on the street and asking permission to take their picture. He usually published the pictures with a brief snippet of their conversation.
The stories are by turns humorous and somber. Take, for instance, two contrasting recent posts—the first features a man unloading boxes of Coronas:
“What was the saddest moment of your life?”
“Probably when I got arrested after a bar fight in Vegas.”
“What was the fight about?”
“I don’t even remember.”
“Do you remember the most frightened you’ve ever been?”
“Probably when I had a gun pulled on me in Canada.”
“I think it started with us trying to pull a tree out of the ground, and ended in us chanting ‘USA! USA!’ Then somebody pulled out a gun.”
“I’m noticing a pattern.”
“Yeah, same group of guys.”
The second picture only shows a man’s shoes:
“My girlfriend and I aborted a child a couple of weeks ago.”
“I’m sorry for your loss.”
“We didn’t lose anything. It was a choice.”
“Were both of you equally on board with the decision?”
“She followed my lead, which made it tougher I guess. But I’ve got so much going on right now, and she just opened her own theater show. It’s just not the right time.”
“How’s the aftermath been?”
“You know, I always thought of abortion as a common thing. I’m a liberal guy. Pro-choice and everything. But I never imagined how bloody painful it was going to be.”
“Do you mind if I post your story?”
“With my picture? I’d prefer not.”
Why do people enjoy reading Stanton’s stories and looking at the pictures? Stanton himself doesn’t seem sure, though he credits it to empathy built through the images: “It seemed like a stupid idea, just taking pictures of people on the street,” he told the New York Times. “But there’s a comfort, an affirmation, a validation in being exposed to people with similar problems.”
This explanation may hold true for some—but it also seems a somewhat narcissistic motivation. Do we really only care to read the stories of people who affirm and validate our own problems? Or is there something deeper at root here?
Many of us can identify a moment when we had an overwhelming sense of the “other”: a moment in which our alienation from the mental, emotional, and physical existence of billions became sharp and poignant. Perhaps it happened while driving down a highway or staring out the window of an airplane, when we suddenly realized that there are millions of lives, stories, and perspectives we will never know. All human beings yearn for connection, for a destruction of mental and emotional alienation. We often have a sense that, if only we could, strangers on the street could become our kith and kin.
One of the greatest beauties of Stanton’s project is that it breaks down this barrier of finitude, if only for a brief moment in time. He enables readers to see and experience the perspective of a stranger. The viewer will probably never meet Stanton’s subjects in person—but in that moment, the reader and stranger share a connection. This is one of the beauties of literature: it bridges the divide of “other,” enabling us to taste and experience life beyond our own. Stanton’s project, with its juxtaposition of image and text, does the same. By cataloguing shared features of humanness, replete with longing, regret, and joy, he identifies a hidden community bridging all geographical, intellectual, social, and racial divides: a hidden community of simple humanness.
No one would deny that American philanthropy is grounded in good motives. But as The New Atlantis contributor William Schambra points out in a detailed article, philanthropy can become as poisoned as any other human venture:
America’s first general-purpose philanthropic foundations — Russell Sage (founded 1907), Carnegie (1911), and Rockefeller (1913) — backed eugenics precisely because they considered themselves to be progressive. After all, eugenics had begun to point the way to a bold, hopeful human future through the application of the rapidly advancing natural sciences and the newly forming social sciences to human problems. By investing in the progress and application of these fields, foundations boasted that they could delve down to the very roots of social problems, rather than merely treating their symptoms … According to the perspective of philanthropic eugenics, the old practice of charity — that is, simply alleviating human suffering — was not only inefficient and unenlightened; it was downright harmful and immoral. It tended to interfere with the salutary operations of the biological laws of nature, which would weed out the unfit, if only charity, reflecting the antiquated notion of the God-given dignity of each individual, wouldn’t make such a fuss about attending to the “least of these.” Birth-control activist Margaret Sanger, a Rockefeller grantee, included a chapter called “The Cruelty of Charity” in her 1922 book The Pivot of Civilization, arguing that America’s charitable institutions are the “surest signs that our civilization has bred, is breeding and is perpetuating constantly increasing numbers of defectives, delinquents and dependents.” Organizations that treat symptoms permit and even encourage social ills instead of curing them.
Schambra traces the history of “philanthropic” eugenics through the years, along with its roots and causes. “Philanthropy’s involvement in eugenics should forever remind us that, for all our excellent intentions and formidable powers, we are unable to eradicate our flaws once and for all by some grand, scientific intervention,” he writes.
The article highlights some inherent flaws in philanthropy that conservatives, and isolationists specifically, are very sensitive to: namely, the extent to which our “compassion” is motivated by a desire to control, fix, and regulate things not our business. It’s the “nanny state” paradigm, manifested in individuals like Mayor Bloomberg. It’s typically an accusation leveled at “compassionate conservatives.”
One commenter on a recent human trafficking article said he feared “a strong sense of self righteous, neo colonialist domination inherent in this kind of ‘cause’.”
However, in our fear of becoming meddlesome welfare statists, conservatives run the danger of becoming heartless. Paul Krugman accused Republicans of exactly this heartlessness in a Thursday column, writing, “Republican hostility toward the poor and unfortunate has now reached such a fever pitch that the party doesn’t really stand for anything else — and only willfully blind observers can fail to see that reality.” Patrick Deneen defined the problem with our attitudes in this regard well in a recent TAC blog post:
The motivation of charity is deeply suspect by both the Right and the Left. The Right—the heirs of the early modern liberal tradition—regards the only legitimate motivation to be self-interest and the profit motive. They favor a profit-based health-care system (one explored to devastating effect in this recent article on health care in the New Yorker), and a utilitarian university (the “polytechnic utiliversity” ably explored by Reinhard Huetter in the most recent issue of First Things). The Left—while seemingly friends of charity and “social justice”—are deeply suspicious of motivations based on personal choice and religious belief. They desire rather the simulacrum of charity in the form of enforced standardization, homogeneity, and equality, based on the motivation of abstract and depersonalized national devotions and personal fear of government punishment.
I do not think most conservatives (unless they really are Randian to the core) want to forsake true “compassionate conservatism”—just its current manifestation in political circles. How, then, does one exercise philanthropic sentiment properly?
The key, according to Schambra, is personal caritas (love). He writes,
loving personal concern is at the heart of charity traditionally understood. It can only be practiced immediately and concretely, within the small, face-to-face communities that Tocqueville understood to be essential to American self-government. There, the seemingly minor and parochial concerns of everyday citizens are taken seriously and treated with respect, rather than being dismissed as insufficiently self-conscious emanations of deeper problems that only the philanthropic experts can grasp.
One could say this is the localism movement’s compassionate conservatism. It is based in present needs, rather than remote philanthropic endeavors. It seeks to love one’s neighbor, and not to “fix” him.
Does this mean true conservatives shouldn’t get involved in global crises? It might depend on the person and situation. The concerns of Coptic Christians in Egypt are of immense and immediate concern to me, because I consider them my brothers and sisters in Christ. Do I believe all secular Americans should intervene on their behalf? No—it is not their responsibility as it is mine. But Schambra is right: perhaps we should first focus where we are planted, and then slowly, thoughtfully spread from there.
It is sad that the Republican Party, a political group filled with religious folk, often shows callousness toward the impoverished—either by refusing to help them, or by offering only conditional charity. The Christian faith is filled with instructions to unconditionally love and help the unfortunate. Consider this passage from Isaiah 58, in which God rebukes the nation of Israel for being “religious” by fasting, but neglecting the poor:
Is this not the fast that I have chosen: to loose the bonds of wickedness, to undo the heavy burdens, to let the oppressed go free, and that you break every yoke? Is it not to share your bread with the hungry, and that you bring to your house the poor who are cast out; when you see the naked, that you cover him, and not hide yourself from your own flesh? Then your light shall break forth like the morning, your healing shall spring forth speedily, and your righteousness shall go before you; the glory of the Lord shall be your rear guard. Then you shall call, and the Lord will answer; you shall cry, and He will say, ‘Here I am.’ … If you extend your soul to the hungry and satisfy the afflicted soul, then your light shall dawn in the darkness, and your darkness shall be as the noonday.
Notice that it doesn’t say, “Only cover the naked or feed the hungry if they aren’t being lazy.” God doesn’t say, “Undo the heavy burdens—unless they really need to work harder.” Neither does he say, “Make sure they show a change of heart or get converted before you help them.” This isn’t a gospel of cutting food stamps. Perhaps it is a gospel in which food stamps never should have been necessary in the first place. If conservatives—especially Christian conservatives—want to say “government should mind its own business,” then perhaps it is time they start minding theirs.
Often discussed in different sections of the newspaper or the blogosphere, the twin crises of health care and higher education are extraordinary in their similarities. Both are regarded as necessary goods for human flourishing whose costs are spiraling out of control. Both rely on a professional class that is becoming more specialized, losing the generalist who once cared for the “whole person.” Both have seen expanding intervention by the central government which has sought to provide access to the lower and middle classes. Both are believed by many conservatives to be properly reformed by means of market-based solutions. Both are the subject of intense contemporary political debate.
And both were once almost exclusively the province of the Church, and, indeed, can trace their institutional origins—hospitals and universities—as part of the Church’s charitable ministry.
This latter fact, it seems to me, sheds bright light on the common roots of the contemporary crisis of each area. The dominant voices in the debate in both areas—health and education—cleave closely to the contemporary party lines. On the Right, the case is made that a competitive market model will solve the ills of both health care and education. By allowing prices to be driven by supply and demand, and the motivations of the primary actors—doctors and professoriate, on the one hand, patients and students, on the other—to be largely self-interested, the market will resolve how best to allocate the relatively limited access to the best health care and the best institutions of higher education. On the Left, it is believed that the State should rest a heavy hand on the scales of the market, enforcing widespread access, suppressing costs (or providing subsidies), and forcing providers to conform to state-mandated expectations and standards.
Yet there is something fundamentally amiss with making provision of health and higher education contingent on market models and profit calculus, as both seem to be goods that are not subject to the same kind of calculus as automobiles and bubble gum. The very idea that doctors and teachers are or ought to act out of the motivations of self-interest, and provide services to their “consumers,” seems fundamentally contradictory to the kind of work and social role performed by each. The decline of the “generalist” in each sphere is indicative of a deeper crisis of the willingness to act on behalf of a broader conception of the good intrinsic to each profession and on behalf of the person being served, in favor of the specialization encouraged by modern canons of efficiency, productivity, profit, and rationalization.
At the same time, the State is rightly suspected of being unable to fundamentally improve or even maintain the quality of either sphere. It can doubtless assure access by the heavy hand of threats, but many rightly worry that the quality of care and education will deteriorate as a result. The State takes on the ersatz role of “generalist,” seemingly concerned for the good of the whole. It can only pursue that good by seeking to control pricing and access while influencing the ways “care” is provided, but it necessarily fails in caring for the vision of the whole that the actors of the professions are no longer willing or able to perform.
The debate as currently constituted represents a pincer movement aimed ultimately at the re-definition of each area—as we have seen in so many areas of contemporary life. While superficially opposites, proponents of each position in fact share a fundamental hostility to the original presuppositions that had informed the foundation of both institutions—the corporal works of charity central to the Church’s earthly mission.
In fact, it seems increasingly evident that practices such as health care and education are likely to fail when wholly uninformed by their original motivation of religious charity. Neither functions especially well based on the profit-motive or guided by large-scale national welfare policies. As the failure of the market model in each area becomes evident, the demands for the second—government intervention and control—have quickly followed. That both are reaching crises at the same time is hardly coincidental: both benefitted for a long time from the “social capital” accumulated as Church institutions, a legacy of cultures and practices that persisted for a long time even after the practitioners had ceased to embrace them. However, in both cases, the social capital is now depleted, and each now operates on a nonsensical combination of self-interested market motivations and taxation and threat-based national welfare policy. Neither is a fitting motivation or model for either sphere.
After Sunday mass at Holy Trinity, the parents left their four boys in Georgetown to drive to Griffith Stadium to join 27,000 fans to watch “Slingin’ Sammy” Baugh take on the Philadelphia Eagles.
Already a legend, Baugh was the greatest football player of his era. Record-setting passer, runner, punter, place kicker, defensive back. Yet, not until the fourth quarter did Sammy throw for a pair of touchdowns to finish off the Eagles 20-14.
Something else was happening that Sunday. As the scoreless tie went on, there came a series of public service announcements calling on admirals, generals, and officials to leave the stadium and report to their posts. Only when mom and dad left did they learn why.
It was Dec. 7, 1941, and the headline on the extra edition of the tabloid press sold outside Griffith Stadium read in big war type: “Japs Bomb Pearl Harbor!”
Seven years on, after a black Tuesday in the family on my 10th birthday, Nov. 2, 1948, the day Harry Truman waxed Tom Dewey, I was the privileged son taken out to see the Redskins face the same Eagles.
But now the Eagles had the NFL’s leading running back Steve Van Buren and the great All-Pro end Pete Pihos.
Surfing the web to confirm my memories, I came across some things I did not know then. Van Buren, an NFL immortal who would set all-time rushing records, had been orphaned as a boy in Louisiana.
Pihos had a more arresting story. His father had been murdered. An All-American at Indiana, he had his career interrupted. He had been with the 35th Infantry under Gen. George Patton, took part in D-Day, was commissioned a second lieutenant on the battlefield, and won a Bronze Star and a Silver Star for bravery.
And for a tiny fraction of what players make today, these tough men were battling it out in the ’40s in a boys’ game in leather helmets.
Washington was another city then, a deeply rooted city, not the cosmopolitan world capital of today where our multicultural elites all seem to come from somewhere else.
Yet, one still recalls from boyhood that when the Redskins would score the fans would all take up the team’s fight song written by Corinne Griffith, wife of owner George Preston Marshall. Redskin bandleader Barnee Breeskin wrote the music in the ’30s. Here is how it went:
Hail to the Redskins!
Braves on the warpath!
Fight for old D.C.!
Yeah, I know. Pure unadulterated racism. We just didn’t know it.
Fortunately, we now have sensitive souls like Ray Halbritter of the Oneida Indian Nation to tutor us in our depravity.
“By changing his team’s name,” Redskin owner Dan Snyder “can create a better historical legacy for himself—one of tolerance and mutual respect,” says Halbritter: “Native Americans do not want their people to be hurt by such painful epithets.”
Hurt? Native Americans are “hurt” by the Redskins’ name?
Years ago, I recall hearing a line I thought a magnificent tribute to the toughness, bravery, and perseverance of these peoples that the Europeans encountered and fought on American soil for centuries.
“There is no whine in the Indian,” the writer said.
What he meant was that these were people who stood, fought and died, and did not whimper. And it is that character trait so many teams from the Fighting Sioux of North Dakota to the Cleveland Indians of the Cuyahoga seek to capture in their adopted names.
Anonymous’ #OpMaryville #Justice4Daisy is in full swing, promoting the cause of Daisy Coleman, who at age 14 had non-consensual sex with 17-year-old Matthew Barnett, a popular senior football player at Maryville High. The rape charges were later dropped, but a 14-year-old whose blood alcohol content measured 0.13 seven hours after the encounter cannot possibly have consented. The charges against Barnett were dropped even though Daisy was left, unconscious and sexually assaulted, in front of her house in 30-degree weather for hours; even though there was the rape kit’s evidence; and even though her 13-year-old friend testified that she said “no” multiple times while a 15-year-old forced her to have sex with him. When the Kansas City Star broke this story, many were angered at the victim-shaming, lack of justice, and lack of virtue shown not only by the teenage boys but by the town as a whole. Some members of the internet hacktivist group Anonymous appear to have been as enraged as I felt, and have turned the force of their digital prowess on the alleged rapists, the Maryville authorities, and the town itself.
This is not the first time Anonymous has come out swinging to defend the helpless: it has run anti-rape and anti-cyber-bullying operations such as #OpJustice4Rehtaeh, #OpAntiBully, and #OpRollRedRoll, and taken down child pornography websites with #OpDarknet. The hacktivists obviously know what they are doing, combining social media campaigns with digital leaks and cyber attacks. Their digital literacy gives them an immense amount of strength; moreover, they can join up with others who share that strength. They are not unlike an army, albeit one with anarchical tendencies and no centralized leadership or digital meeting space; if they are not a political movement in the traditional sense, they are certainly political. Given groups like this, our society must engage traditional questions of political philosophy, bringing them into dialogue with modern life.
When Plato asked “what is justice?” at the beginning of The Republic, the Socratic interlocutor Thrasymachus answered by claiming that justice “is nothing other than the advantage of the stronger,” in other words, might makes right. One could pose the same question to Anonymous, a group that demands justice for victims, demands followed by the reminder: “We are Anonymous. We are Legion. We do not forgive. We do not forget. Expect us” and the promise that “justice is coming.” The members of Anonymous have a striking focus on justice, but it is unclear how thoroughly they have thought the concept through.
In an age when digital skills allow individuals wearing both “white” and “black” hats to access troves of information, we must ask: when is that right? When does it cross a line? Who is to judge, and on what grounds? Is crashing a website at least a semi-legitimate form of political protest, like the Occupy movements, or is it criminal behavior? Is releasing the names of rapists, and hacking the private accounts on which they admitted to their crime, deserving of a longer sentence than that faced by the rapists themselves?
A perusal of the Tumblrs and Twitter feeds associated with this group makes clear that many members have made legitimate critiques of our contemporary political culture, one that so often neglects the policies that would best serve voters. Further, their critique of the NSA and the stripping away of American freedoms is not without merit. However, anarchy for its own sake, revolt without an end, rarely creates the circumstances necessary for sustained and intelligent dialogue.
Americans have epicurean tendencies, according to Steven L. Jones of the Bird and Babe blog. Epicureanism was first espoused by the 4th-century BC Greek philosopher Epicurus, and posits “pleasure” as the supreme good of life. In Jones’ mind, Americans are epicurean in three key ways:
Aponia, the epicurean doctrine of avoiding bodily pain or exertion, explains the American fascination with efficiency and technology … Ataraxia, the epicurean doctrine of avoiding mental anguish (literally not getting yourself worked up), explains the perpetual criticism of Americans being apathetic, anti-intellectual or uninvolved in our political life … Agnosticism explains the general lack of concern of Americans for deep thinking about religion.
Peter Lawler has some corresponding thoughts at Big Think Tuesday, suggesting “people are less and less obsessed with the past, maybe especially (but not only) in America. We have no historical sense, a sense which could give us a sense of place, purpose, and limits, as well as the chastening that comes with reflection on experience.” A Thursday piece by Carina Chocano in Aeon Magazine complemented Lawler’s observations and tied them into Jones’ definition of Ataraxia. Her piece, “Je Regrette,” explored society’s reluctance to express regret:
Life is not mysterious, it’s mathematics. All we have to do is track our productivity, our spending, our steps, our calorie intake. All we have to do is count our friends and likes and follows. The illusion of control that these tools grant us over every aspect of our lives is powerful. There is always something we can do today to avoid regret tomorrow. To admit regret is to admit to a previous failure of self-control. ‘In the reigning economic models of decision, human beings are “calculating machines” who decide their preferences based on calculations of utilities and probabilities,’ Landman writes. ‘We deny regret in part to deny that we are now or have ever been losers.’ In a culture that believes winning is everything, that sees success as a totalising, absolute system, happiness and even basic worth are determined by winning. It’s not surprising, then, that people feel they need to deny regret — deny failure — in order to stay in the game.
According to these authors, modern culture is largely obsessed with pleasure. Thus we must avoid pleasure’s evil twin, pain, in all its physical, emotional, intellectual, and spiritual manifestations. This involves a purposeful forgetfulness, as well as a present fixation on positive sensations that undergird feelings of control.
Chocano thoughtfully points out that “The point of regret is not to try to change the past, but to shed light on the present. This is traditionally the realm of the humanities. What novels tell us is that regret is instructive. And the first thing regret tells us (much like its physical counterpart — pain) is that something in the present is wrong.” Regret, in other words, presses a blazing hot finger on areas of wrongdoing or foulness in our past. It alerts us to pain, and often to sin. No wonder such feeling is antithetical to Epicureanism.
Having just finished Nathaniel Hawthorne’s The Scarlet Letter, I found this idea of regret especially resonant. The book’s protagonist, Hester Prynne, is caught in adultery and forced to wear a scarlet “A” (for “adulteress”) as a public, permanent chastisement for her sin. As the years unfold, regret molds her thoughts and actions: she grows humble, penitent, and servant-hearted. The Puritans begin to soften toward her. Hawthorne writes that “not the sternest magistrate of that iron period would have imposed” the scarlet letter upon her any longer. Yet she continues to wear it. Why?
Hester, it seems, did not want to forget. Her life had been shaped and marked by this letter. Though it would undoubtedly have brought more pleasure to cast it away (as she temporarily does in a moment of passion), she continued to wear it. Perhaps she recognized the good wrought through evil signified in that letter. Perhaps she understood she was wiser and stronger for that letter.
Historical characters had a different perspective on the past than our modern populace. Their everyday activities (sewing, weaving, cooking, cleaning, farming) all emphasized the permanence and interconnectedness of things. Garden vegetables found their way into evening dinners and canning jars. Wool became yarn, cloth, quilts and socks. Nearby trees became fences and firewood. Life functioned as a tapestry: stretching from its origins into the future, never wasted, never fully forgotten. Even the simplest error became an opportunity to regrow, reuse, or remake something.
Should parents control what children read? Writer Neil Gaiman says no: in a lecture published in Tuesday’s Guardian, he argues the idea of “bad books” for children is “tosh,” “snobbery and foolishness”:
There are no bad authors for children, that children like and want to read and seek out, because every child is different. They can find the stories they need to, and they bring themselves to stories. A hackneyed, worn-out idea isn’t hackneyed and worn out to them. This is the first time the child has encountered it. Do not discourage children from reading because you feel they are reading the wrong thing. Fiction you do not like is a route to other books you may prefer. And not everyone has the same taste as you. Well-meaning adults can easily destroy a child’s love of reading: stop them reading what they enjoy, or give them worthy-but-dull books that you like, the 21st-century equivalents of Victorian “improving” literature. You’ll wind up with a generation convinced that reading is uncool and worse, unpleasant.
Gaiman has a point. Many parents take comics away from their children, fearing they’ll never read “real stuff”—yet often comics serve as a bridge into deeper material. Victorian morality tales usually present cardboard, cookie-cutter pictures of childhood life. Little girls and boys don’t want to read them—they can see they’re fake.
But should children have free rein over reading material? While exploration is key to fostering imagination, it is also true that strong ingredients build a healthy mind: just as most parents won’t allow their children to only eat junk food, so children should not be allowed to only consume sugar-coated reads. Parents should encourage children to pick up substantive books. This encouragement needn’t “destroy a child’s love of reading,” as Gaiman writes. The very best books merge excellent content with delightful style. They foster knowledge while appealing to the imagination.
In case it might be helpful to readers, here is a list of titles—many Newbery winners and classics—that seem to fit this description. The books are organized loosely by complexity.
The Little Prince, by Antoine de Saint-Exupéry
This book tells the story of a stranded airplane pilot who meets an otherworldly little prince in the Sahara Desert. The little prince is a fascinating character, and the author’s watercolor illustrations are beautiful. There is a gorgeous pop-up book version.
The Phantom Tollbooth, by Norton Juster
The Phantom Tollbooth was a personal childhood favorite. Its protagonist, Milo, is a bored little boy who discovers a “phantom tollbooth”—and with it, an imaginative world in which numbers, words, music, and sounds come to life.
A Wrinkle In Time, by Madeleine L’Engle
This work is another imagination-stretching tale: Meg Murry’s father, a government scientist, has disappeared. Along with her little brother Charles Wallace, Meg sets out on an intergalactic adventure to find him.
At the Back of the North Wind, by George MacDonald
George MacDonald has written several other lovely children’s books—The Princess and the Goblin is a must-read. But this book is perhaps his sweetest. A little boy named Diamond meets the mysterious lady North Wind one blustery night, and she brings him on many adventures through the night sky.
The Tale of Despereaux, by Kate DiCamillo
DiCamillo’s book was recently adapted for film, but of course the book is better. It has lovely, interesting characters: perhaps most notably, the deeply conflicted rat Chiaroscuro.
Recent studies have shown that an increasing number of Americans live alone: the number has more than tripled since 1970, to 33.2 million. Many younger adults leave home for college and never return: a 2008 Pew poll showed that college graduates are far less likely to live in their birth state, and most young people still living in their hometown want to move in the next five years. Many are moving to America’s urban centers—but this city life tends to encourage a lonelier, more isolated existence.
It is true that not everyone wants to live in the city. In a Mere Orthodoxy article, Jake Meador pointed to a net increase in farms that—though small—could herald a return to more landed, agricultural communities. Meador’s story suggests some in the younger generation view this rural lifestyle as both economically and socially gratifying. Perhaps these moves signal a discontent with the isolating atmosphere of cities. The difficulty of getting “plugged in” becomes apparent when considering these common facets of city life:
First, the urban center’s “commons” is often too crowded and busy to foster public discourse. In small towns, even a grocery store becomes a meeting place for locals. But in the modern city, stores are usually crowded with busy customers. The city’s very magnitude creates time and spatial inconveniences, making it unlikely for acquaintances to meet in passing. The closest urban equivalent to a “commons,” it seems, might be the modern bar or coffee shop. Take the TV show “How I Met Your Mother”: the show’s starring foursome always congregates at their favorite New York City bar. The modern Starbucks may serve a similar function: a recent Boston study revealed that local Starbucks stores fostered sociability and friendliness among their customers, who were encouraged to loiter, sit, and talk. Sadly, Boston’s local indie coffee shops were described as less open and welcoming. While they still had the best “atmosphere,” the baristas and even customers came across as “aloof.” Such places offer a niche for coffee connoisseurs, but they create less a commons than a club.
Another roadblock to fostering community connectedness is the city’s prevalent focus on international and national news. Perhaps this arises from urban diversity: a city with panoramic demographics reflects global tastes and interests. However, it is difficult to learn from the front page of a city newspaper what is going on locally. Most city media fixates on the national arena—especially if one lives in the D.C. metro. Without knowledge of local goings-on or concerns, the community becomes disoriented. Its inhabitants focus on other places and other people.
Private institutions can help lend a sense of context and community to the city. Those who attend church or some other community organization on a weekly basis should be able to build local acquaintances. But it is also true that the size of city churches does not always lend itself to such mingling. Metro churches often have huge congregations. It is difficult to establish true friendships in such crowds.
After considering these difficulties, it makes sense that urban dwellers often live alone. Battling isolation is a difficult feat. One must find loopholes in at least some of the above difficulties in order to begin establishing urban community. Perhaps the growth of such local endeavors as the farmers market and (hopefully friendly) local stores will foster a more open, active commons. Promulgation of the “slow food movement” could encourage a more relaxed and community-centric style of gathering. Personally, finding a commons in my community has been difficult. When walking into a local coffee shop, one isn’t likely to see a familiar (or even a particularly friendly) face. But there is a wonderful local farmers market, and it’s probably the closest I’ve gotten to actual dialogue with other locals. True, my last attempt didn’t last much longer than “Hello, how are you?” “Fine, thank you.” But there’s always next Saturday.