Arlington has become the place to live if you’re a millennial, reports Patricia Sullivan for the Washington Post. Census numbers show that young adults between 25 and 34 years of age make up more than a fourth of the county’s population.
What draws them to Arlington? There are the luxury apartments, the millennial-friendly Whole Foods, the county’s “yoga boom,” and its plethora of local bars and restaurants. There’s also an abundance of trails and easy Metro access into D.C. But as the millennials grow up and start having babies, all the consumer goods that drew them to Arlington may not be enough to keep them there:
As millennials have children, their priorities expand to include good schools and family-friendly neighborhoods. A condo or townhouse can suddenly seem too small, and a bigger place in the same neighborhood may be too pricey.
Six in 10 surveyed Arlington residents between the ages of 25 to 34 told the county’s affordable-housing study group last year that it’s somewhat likely or very likely that they will move out of the county within five years because of housing costs.
… Historically, young adults have moved into Arlington starting around age 20 and left for other suburbs in large numbers once they reach their early 30s.
… Elizabeth Hardy, Arlington’s planner and demographer, said the county is planning a study of whether the out-migration of Arlington’s 30-somethings will continue. “Once we have an understanding of the factors that cause people to come or go, we could address how to keep them here,” she said.
This is a dilemma that the entire D.C. metro area seems to struggle with. The city is known for being transitory, a place whose political nature encourages a sort of temporary atmosphere. People come and go, and it’s difficult for any distinct culture to develop around anything except politics. Young people, especially, tend to migrate out of the city once they begin to develop a desire for roots. And who can blame them? Not only is housing exorbitantly expensive in the metro area—it is also difficult to build community in a place where people are so often moving and rootless.
Alexandria is perhaps one well-known exception to this rule: its history, rich local community, and vibrant urban fabric have helped to keep a distinctive—and permanent—culture alive. It is the second most millennial-populated city in the metro area, according to Sullivan’s article, yet it also boasts a diverse array of ages and ethnicities. The millennials I know who’ve come to Alexandria are determined to stay, if they can. Yet they face the same housing prices, the same tight-space issues, that those in Arlington face. What convinces them to stay?
The main difference between Alexandria and Arlington seems to be that the former has developed a community of stickers: people who love Alexandria for its own sake, and stay if it is at all in their power to do so. They’ll deal with the cramped spaces and high costs, because they love the city and believe it is a neighborhood and community worth treasuring.
Arlington, however, has developed a culture of boomers: people who circulate according to the jobs available, housing costs, and their time of life. While that isn’t necessarily bad in the short-term, it does create long-term consequences—it means that while Alexandria has rich local traditions and a strong fabric of community, Arlington struggles to develop these things. Its “culture” is a bit more commercial, reliant on trends and fashions. It’s built itself into a different sort of community hub, but it’s one almost exclusively for singles and young couples.
The difference between the two cities may also stem from the fact that Arlington is a bit more sprawling, without a defined historic center that draws community and commerce in the way Alexandria does. The heart of Alexandria still seems to draw people together: it’s not just a place for tourists, but also a hub for arts and commerce, community events and family outings. This helps people connect in a meaningful and lasting fashion. But Arlington’s downtown mainly consists of commercial areas: Clarendon, Courthouse, Rosslyn, as well as Pentagon City and Crystal City. These areas offer a variety of shops, restaurants, bakeries, a couple bookstores, some nice parks—but they are also populated with traffic-laden roads, imposing glass and concrete buildings, sterile sidewalks. They aren’t necessarily beautiful, walkable, or quiet, in the way Alexandria is.
It’s worth considering how to make the millennials stay in Arlington: both for the millennials’ sakes, and for Arlington’s. It is difficult to build a lasting community when you constantly have to uproot yourself and move to a more affordable place—and if that place is the suburbs, it may present other difficulties for these young people, as well. Meanwhile, cities like Arlington need to figure out ways to build this rich heritage and community fabric: it develops their commerce, gives them a more vibrant local culture, encourages a more diverse and connected populace. These are the sorts of things that help a city last.
On Wednesday morning, I woke up and started making coffee. I checked our website, and saw Rod’s beautiful blogpost about the passing of his father. You can see in his writing the love and joy, coupled with heart-melting grief, that makes the death of a Christian so beautifully painful. We hate their absence, yet are joyful for them. We know they are tasting better things than anything this world has given them.
As I smelled the coffee, took in the morning light, I remembered that it was this exact time last year when my grandmother was dying—when we were gathered around her bedside, spending our last days with her.
As I reflected on the year that has passed, I could see that her death has in many ways changed me—encouraged me to make hard decisions, ones that are life-altering and even frightening. I’ve been trying to live in a way that would delight her and give her joy, if she were still here. It’s easier for me to live in a mode of independence—shirking all unnecessary commitments and relationships, sinking into work, spending my free hours in solitude—than it is for me to embrace family with open arms, make myself vulnerable, reach out to those I don’t know or understand as I ought, open my life to new loves, obligations.
It’s been a year of trying to emulate her virtues, and it’s been difficult. To be a hostess on an Elaine level is no easy task. I think it will take me a lifetime to learn her art, the effortless way in which she blessed through food and drink, beauty and rest. As of now, I often feel that I’m following the forms, without quite capturing the spirit. But this is, I’m reminded, how spiritual discipline often works, as well: we say the prayers, even when we don’t feel them or understand them with all the crispness and beauty and devotion that we ought. We start with form, and grow in grace. The virtues of comfort, grace, sweetness, and service threaded through her life. It seemed effortless, but I am sure it was not. After all, was she not also a quiet and independent person, oftentimes? Private and thoughtful, slow to share her feelings? I think she must have grown in it, too.
Earlier this week, I read David Brooks’s column about how we make big decisions—life-altering ones. How do we know whether something so transformative and challenging will give us joy? How do we decide to pursue it? At the end, he suggests a method:
We’re historical creatures. We have inherited certain life scripts from evolution and culture, and there’s often a lot of wisdom in following those life scripts. We’re social creatures. Often we undertake big transformational challenges not because it fulfills our desires, but because it is good for our kind.
… Most important, we’re moral creatures. When faced with a transformational choice the weakest question may be, What do I desire? Our desires change all the time. The strongest questions may be: Which path will make me a better person? Will joining the military give me more courage? Will becoming a parent make me more capable of selfless love?
… Which brings us to the core social point. These days we think of a lot of decisions as if they were shopping choices. When we’re shopping for something, we act as autonomous creatures who are looking for the product that will produce the most pleasure or utility. But choosing to have a child or selecting a spouse, faith or life course is not like that. It’s probably safer to ask “What do I admire?” than “What do I want?”
What do I admire? Or rather, who do I admire? When making tough life choices, I find myself reckoning by the history I know. I call to mind the lives that have guided and loved me. And when I begin to shake in my shoes with the enormity of commitments, potential failures, unknowable transformations, I think of those saints who’ve shaped and blessed me in the past. Motherhood, for instance, is a frightening thought: a whole new world of commitment, challenge, potential failure. But I think of my mother—the professional ballerina who hung up her pointe shoes and settled into family life without a regret. I think of my grandmother, resting back in her armchair, watching her living room swarm with children and children-in-law, myriads of grandchildren young and old, all the voices warm with laughter and conversation. And I remember the sweet, joyous smile on her face.
Life scripts. They guide us. Rod found himself beckoned home through the life script of his sister, Ruthie. Despite all the challenges, he has found opportunities to bless and grow in that community. In this year, I’ve found myself increasingly guided by the life script of my grandmother. Her absence has left a gaping hole in my heart, in many ways—a yearning for her, and a yearning for what lies beyond. But there’s also a new resolve in my heart, to follow in the footsteps of the giants who’ve come before me, to grow in grace, and to always ask that question: who do I admire, and how can I emulate them?
Frank Bruni wrote a forceful column against evangelical Christians voting for Trump on Monday. It’s worth reading the entire thing, but here’s an excerpt:
Let me get this straight. If I want the admiration and blessings of the most flamboyant, judgmental Christians in America, I should marry three times, do a queasy-making amount of sexual boasting, verbally degrade women, talk trash about pretty much everyone else while I’m at it, encourage gamblers to hemorrhage their savings in casinos bearing my name and crow incessantly about how much money I’ve amassed?
Seems to work for Donald Trump.
Polls show him to be the preferred candidate among not just all Republican voters but also the party’s vocal evangelical subset.
He’s more beloved than Mike Huckabee, a former evangelical pastor, or Ted Cruz, an evangelical pastor’s son, or Scott Walker, who said during the recent Republican debate: “It’s only by the blood of Jesus Christ that I’ve been redeemed.”
… What’s different and fascinating about the Trump worship is that he doesn’t even try that hard for a righteous facade — for Potemkin piety. Sure, he speaks of enthusiastic churchgoing, and he’s careful to curse Planned Parenthood and to insist that matrimony be reserved for heterosexuals as demonstrably inept at it as he is.
But beyond that? He just about runs the table on the seven deadly sins. He personifies greed, embodies pride, radiates lust. Wrath is covered by his anti-immigrant, anti-“losers” rants, and if we interpret gluttony to include big buildings and not just Big Macs, he’s a glutton through and through. That leaves envy and sloth. I’m betting that he harbors plenty of the former, though I’ll concede that he exhibits none of the latter.
… I must not be watching the same campaign that his evangelical fans are, because I don’t see someone interested in serving God. I see someone interested in being God.
Trump’s widespread and enthusiastic following, along with his near-incessant domination of the news lately, has baffled me as well—considering his record and bombastic comments. Yet if this article from the Daily Caller is correct, then nothing seems to be able to stand in Trump’s way… not even his very un-conservative past.
Seeking answers, I’ve been trying to pull the pattern together. Slowly, conversations and news interviews have begun to reveal it: I’ve heard and read that Trump is different, refreshingly so. He doesn’t deal in the regular political BS. He’s not perfect, but no politician is. He’s not right on a lot of things, but on the things he gets right, he would actually get stuff done. People just want someone who will be a “fighter.”
Yet these people may fall into the trap of supporting someone merely because he seems refreshing, straightforward, and different. In doing so, they forget that the devil is truly in the details. They get so captivated by the simplicity, they refuse to look for the snares.
The reasons for not supporting Trump have seemed obvious on the face of it—he’s not really a conservative, by any measure of the word. Yet I’ve learned that his lack of conservatism does not deter everyone—because he’s different, refreshing, promises to get things done. He promises to overturn the Washington gridlock that people so thoroughly detest. And it’s definitely a tempting thought, isn’t it? Everyone hates Washington stagnation, the way nothing gets done, the way reform constantly gets stalled. It’s maddening.
But we must consider what a president is supposed to be—and why Trump is not suited for that office. Beyond issues of gentlemanliness, decorum, and diplomacy—skills in which he’s sadly lacking—Trump also lacks a necessary humility and appreciation of limits. The executive branch of our government is not meant to be dictatorial; the president is not supposed to be able to initiate top-down reforms according to his every whim and fancy. We need a president who respects the balance of powers put in place. We need a president who understands what his constitutional constraints are, and respects them. Do we really think Trump would be that man? There are few constraints he seems willing to respect. He’s more of a steamroller than he is a preserver or upholder.
Sometimes I think we like crusaders a little too much. We like their charisma and power, the way they promise to make all things right. We may not agree with everything they say, but at least they say it well. At least they promise to move us in a direction, to prevent us from sinking into stasis—or into quicksand. But action and power should never be praised for their own sakes. When a character such as Trump avows that he has changed from his more liberal days—yet says and does nothing to explain such a substantial transformation—it seems best to distrust him. Especially when he’s running for the most powerful and precarious office in the nation.
I don’t think that Christians should vote for a candidate simply because he is a Christian. Just because Trump isn’t a virtuous candidate doesn’t mean we need to vote for Ben Carson or Ted Cruz, both of whom have a more thoroughly Christian background. But neither do I think we should ignore the moral character of the candidates, or dismiss that consideration lightly.
Additionally, we must consider how our desire for “simplicity” may, in fact, have damaging repercussions for our larger witness to a hurting and broken world. There are people out there who can’t skip over the details, who can’t ignore the everyday realities that Trump may dismiss with a wave of his hand. There are people for whom the mass deportation of illegal immigrants would have immediate, painful, even tragic results. This doesn’t mean that we don’t consider things from a careful policy perspective. It doesn’t mean that we reject important reforms just because they’re painful. But it does mean that we bring a needed seriousness, grace, and concern to the conversation. It means that we don’t just embrace the funny, charismatic, easy-solution candidate. Rather, it should urge us to seek out as much information as possible. It should force us to research deeply, talk with a diverse set of people, ponder carefully the possibilities set before us. It should in fact turn us away from sound bites, not toward them.
Miya Tokumitsu explores the meaning and implications of curation in her new TNR article:
… The qualifier “curated,” begins to appear with increasing frequency in published books in the early 1970s, precisely during the era of post-war economic liberalization and The ‘Me’ Decade, during which, according to Tom Wolfe, it became acceptable and good to spend time “…polishing one’s very self…and observing, studying, and doting on it.” (Indeed, in this passage, Wolfe describes the self as akin to a museum object.) The appearance of “curated” in print tracks steadily upward during the individualist, body-sculpting, self-improving, “no such thing as society” 1980s. The great value placed on the individual as the only valid social institution naturally elevated the consequence of previously quotidian things generated by the simple act of living, like lists and opinions. These things began to be worthy of the same white-gloved treatment and cultural esteem once reserved for fine art.
Essential to personalization is the aura of control. Curation of the commonplace not only elevates preference but also implies a sense of order that is determined by the individual. It imparts a sense of self-determination and dominant power much in the manner of 401-k investment portfolios and small-business entrepreneurship. Under neoliberalism, every individual is his own capitalist, his own world-maker. “Freedom” isn’t security in a just society, but the ability to shop—for a healthcare plan on market exchanges, for primary schooling, for stocks in your retirement plan (if you’re lucky enough to have one of those). We’re all masters of our tiny, curated realms.
Tokumitsu considers the ways in which self-curation has become incredibly commonplace on websites such as Facebook, Twitter, and Instagram. The curation of the common, meanwhile, can easily be seen on platforms such as Pinterest, Tumblr, and Spotify. You may pin great works of art to your Pinterest board, or listen to Bach on Spotify, but most users find these tools useful for collecting blog recipes and home-and-garden decor ideas, or for discovering new pop singles and indie hits.
As the above passage probably hints at, Tokumitsu does not look kindly on our modern patterns of curation—indeed, she writes that “In bestowing great importance to ‘just picking stuff,’ curation in its contemporary, ecumenical sense reinforces many of the personal values promoted by neoliberalism: atomized individualism, the thrall of personalization, aestheticized control, and, of course, consumption-as-authenticity.”
But people have always been curators of the common and quotidian. Before Pinterest, people put together scrapbooks in the 1800s and 1900s, cutting out their favorite images from magazines. Before Spotify, music book publishers put together favorite song collections for performance at social gatherings. Others compiled abridged renditions of classic or popular books for curious readers. And one can’t forget that the personal library is one of the oldest and most widespread methods of curation we’ve used over the centuries.
The main difference, of course, is that private curation used to be exactly that: private. Unless you were a celebrity, your favorite music, book, and style collections were not of interest to the general public. You may have had a wide collection of recipes, gathered from relatives, old friends, and cookbooks—but you wouldn’t have had the means (or, usually, the desire) to share them with the world. What the internet and social media have done is take our private collections and make them globally shareable. Tokumitsu writes,
The personalization and creativity connoted by today’s popular understanding of curation also relate to the projection of certain kind of authenticity [sic]—one that is publicly visible and determined by consumption. Hence the eager embrace of “curation” within the spheres of social media and retail shopping. These are the arenas in which we can most easily construct microcosms and publicly projected pastiches of our selves, structured entirely by our own preferences.
Despite the problems with widespread curation, there is also a real need for people to provide such services in the midst of massive creation. Every day, we face an incessant outpouring of news, music, books, and fashions. Within those categories, there are numerous genres, trends, beats, preferences. It is often necessary to help people navigate these swaths of information. The best way to do this, it would seem, would be to provide the populace with knowledgeable experts (and/or enthusiastic amateurs) who can choose worthwhile pieces of information for their attention.
This is what the news is supposed to do, although the growing popularity of preference-based news—picking and choosing who and what you read according to your own personal agenda—is rapidly changing the way in which we read and curate media. If you’re a staunch Tea Party Republican, you may watch Fox News. If you’re an east coast liberal city dweller, you may read Vox. But few will regularly get their news from both. (I should note that many TAC readers break such stereotypes, and enjoy reading what they don’t agree with. I’m always thankful for their input on this and other blogs.)
Tokumitsu makes an important point about the role our desire for control plays in modern curation. This would especially ring true in the obvious self-curation of such platforms as Facebook and Instagram—but it’s also true that all curation contributes to our public persona, the way people perceive us. The fashions and recipes you put on Pinterest, the bizarre tunes you may listen to on Spotify—people begin to see you according to your tastes. The stakes are higher when all curation is public. And this is why, it would seem, we begin curating ourselves: hiding the aspects of ourselves we fear won’t be accepted, ignoring the things we don’t like or agree with, filling our lives instead with all the pretty, affirmative, “nice” things that are most likely to garner support from the people in our social circles.
But truly objective curation can be a gift, when used not for the promulgation of the curator’s persona, but rather for the larger goal of promoting the beautiful, thoughtful, or good. If Tumblr isn’t just a place to make ourselves look cool or creative, but rather a place to conserve or share lovely and useful things, then it serves a good purpose. If Spotify isn’t just a place where you listen to music you want others to see you listening to, but rather a place where you savor treasures old and new, it avoids the pitfalls that Tokumitsu is identifying.
When we curate lists, Pinterest boards, playlists, or other things publicly, it would seem that at least one important question we should ask ourselves is this: are we more concerned about how this or that item makes us look, or about the item itself? Sometimes the best test of such a question is to make our curation completely private, and to see if we delight in it to the same extent that we did before.
Many people had no idea Ashley Madison, a “dating website marketed at would-be adulterers,” existed before this week. Now, they’re not likely to forget it. The site’s hacking has resulted in the release of information from 32 million users of the site. As these users’ emails—and thus, their identities—have come to light, a manhunt has begun.
The site’s very existence seems rather odd, at least at first glance: who would give their personal information to a website publicly set up to help you cheat on your spouse? The risk hardly seems worth taking; one would expect some, at least, to fear that their membership would come back to haunt them.
But while it’s impossible to know exactly why so many signed up for Ashley Madison accounts—with their work emails, no less—one can imagine that there was an extent to which the website’s mere existence, its promise of a sheltering and complicit community, soothed many consciences.
Because that’s what Ashley Madison did: it organized and fostered a community around cheating. We speak of the importance of private associations, their ability to inculcate habits of virtue. But here, we see the opposite: we see an association fostering and even facilitating vice. And this is the dark side of community that we forget about: we forget that peer support and approval will motivate us to do things we may otherwise have avoided—or at least felt guilty about.
But in the past several days, in the massive manhunt for guilty parties, we see a social ethic of truth and moral indignation (tinged by revenge and a lust for sensationalism) engulf Ashley Madison’s community of complicity. Across the globe, people are searching for names they know. Journalists are seeking out the well-known, eager to bring them to justice. One such person is Josh Duggar—a lobbyist for the Family Research Council, whose conservative Christian family starred on TLC’s now-cancelled “19 Kids and Counting” show. It’s bad enough to see political and military emails appearing with such frequency in the files hacked from Ashley Madison. But to see a man whose vocation was to defend and promote conservative family values amongst them is definitely saddening.
The whole episode reminds me of Nathaniel Hawthorne’s The Scarlet Letter—a book whose community may seem like the opposite of Ashley Madison’s. Yet they are not entirely different: Hawthorne’s Puritan town casts out Hester Prynne because she has a child out of wedlock—yet unbeknownst to them, the father of her child is the town minister. While Hester continues to live on the outskirts of town, viewed with disdain or suspicion, Dimmesdale is revered and loved by many. When the truth finally comes out, they are shocked. The New England town has its own comfortable vices, which it condones or overlooks because they’re held in common. Solidarity can foster vice, as well as virtue—in this case, the continued maltreatment and punishment of a single mother.
This week, I shared a Mere Orthodoxy blogpost from Alastair Roberts, in which he compares our online community to a Jane Austen village: both, he wrote, have a propensity to become echo chambers, affirming our vices and encouraging us in typical forms of behavior. Both urge us to pick up our pitchforks and castigate the unpopular or disliked, when the opportunity presents itself.
Here, too, we see the village venom brewing. The Puritans’ judgment of Hester and corresponding ignorance of Dimmesdale isn’t so far removed from our own eagerness to castigate Josh Duggar and his Ashley Madison compatriots, coupled with our desire to overlook other moral woes simmering beneath our societal surface. It’s always easier to pick out and publicly shame the cheaters and adulterers than it is to look at the vices within.
This isn’t to say that the Ashley Madison users shouldn’t receive some scrutiny—especially those who have put themselves in the public eye, and are therefore subject to public accountability.
But I can’t help but wonder what is spurring on this massive investigation into Ashley Madison users: is it a desire for justice, for public accountability? Is it a desire for sensationalism, for tawdry details to come to light? Or is it a desire for self-satisfaction, for an opportunity to shake our heads at those naughty and stupid people who used the dating website?
I’m sure all of the above feelings are involved in the exposures going on. And I can’t help but think of Dimmesdale, hiding beneath the surface of his town’s consternation. It makes me wonder—in the midst of our public outcry over the obvious and embarrassing, what hidden sins are we forgetting, or choosing to forget?
Ever felt a sense of nostalgia for the quiet country villages of yesteryear? Well, you need not feel too deprived, argues Alastair Roberts in a Mere Orthodoxy blogpost: modern life, awash as it is with social media interactions, is quite similar to Jane Austen’s Meryton—at least, similar in its pernicious effects.
The ‘density’ of [our social] environments and the closeness of the bonding within them produce a cosiness that is welcome for many, but which is generally quite resistant to contradiction, conflict, criticism, and genuine difference. Such characteristics and behaviours as likeability, empathetic connection, mutual vulnerability and mutual affirmation, personal resonance, relatability, and inoffensiveness are essential to the operation of such environments, but these characteristics and behaviours largely preclude openness to criticism and challenge of the group and its conforming members. Those who make firm criticisms will readily be classed as ‘haters’ or enemies of the group and driven out with hostility, while the group reaffirms itself and its members of their rightness and the vicious character of all opponents, reinforcing all of their prejudices and steadily inuring all members to criticism.(4) Such communities will also often engage in rigorous ‘policing’ of deviant viewpoints and, like the stereotypical mediaeval villagers, will frequently enact swift and merciless mob justice upon those who do not conform as they ought.
… Austen insightfully recognized the manner in which our delight in tight-knit, pleasant, and agreeable communities—and in conversations marked by ‘mutual satisfaction’—renders us susceptible to deep distortions of communal discourse, knowledge, and judgment. When we are all so relationally cosy with each other, we will shrink back from criticizing people in the way that we ought, voluntarily muting disagreement … . In such contexts, a cloying closeness stifles the expression of difference and conversations take on a character akin to the ‘positive feedback loop’ that existed in Wickham and Elizabeth’s conversation, where affirmation and assent merely reinforced existing prejudices. In such contexts, communities become insular (a tendency that can be exacerbated by algorithms), echo chambers of accepted opinion, closed to opposing voices.
We can see these tendencies on social media today—in the quick spread of viral hashtag campaigns, in the way people congregate around (or against) major celebrities or politicians, and bully all who do not join them. How do we combat these tendencies?
In order to escape “social saturation,” Roberts calls for opposing voices (in the case of Elizabeth and Meryton, a Mr. Darcy who will speak the truth with ruthless persistence) and for a separation from the toxic influences of the village (Elizabeth’s journey to Charlotte Lucas becomes a crucial turning point in the novel, as she confronts her biases and prejudices for the first time). The first may be discoverable, even online. Much depends on the social media community that we curate. The second, however, is difficult to accomplish: “Our communities can follow us almost everywhere we go and, unless we are determined to escape them and to resist their encroachments, the privacy and solitude that we require for self-presence and introspection will no longer so naturally afford themselves to us.”
These words reminded me of a book review I read yesterday, one that addressed both a need for opposing voices, and for places of introspection and mental quiet. In his review of The Meaning of the Library, Brian Bethune specifically considers one librarian’s explanation for our continued (if not growing) need for the space:
[Libraries] had better survive, argues James Billington, who considers them crucial in the defence of global democracy, for the librarian-less Internet is no substitute. Billington, head of the world’s largest “foraging ground for the pursuit of truth,” the 158-million-item Library of Congress, writes that guidance through a knowledge jungle is invaluable. Even more important, online life resembles an echo chamber, while in a library, contradictory arguments sit side by side on a shelf. That makes the latter, Billington proclaims, the world’s best “antidotes to fanaticism.”
Both Billington and Roberts see online life as an echo chamber, a place of escalating conflict or social homogeneity, a place where it seems increasingly impossible to find or formulate objective, thoughtful, nuanced opinions. But Billington has at least a partial solution to Roberts’s problem: he sees the library, with its physicality, its volumes of Darcy-like truth-telling, and its quiet sense of presence, as an antidote to our problem.
The problem is, of course, that few people frequent libraries with any sort of regularity. But it seems that bookstores could in fact create a similar environment, and provide a similar antidote. The key elements of Billington’s solution seem to be the physicality of books, and their straightforward presentation of arguments or opinions that may be completely novel—and/or irksome—to the reader. These elements, coupled with an environment that encourages quiet reflection, can spur the visitor to read, to reflect, and perhaps to overcome their social biases.
This is something that television cannot do—saturated as it is with advertisements and entertainment. Even when we see opinion presented in the form of a documentary, it rarely comes in a form that is suited to thoughtful consideration. To keep the watcher engrossed, the filmmaker must use elements of sensationalism: emotive music, frightening or emotional images, hyperbolic statements, dazzling camera shots. All of these things impress the viewer, often beyond the actual subject matter of the argument itself. Moreover, television’s otherness encourages a sort of mental and physical separation on the part of the watcher. It fosters passivity and acceptance, perhaps even more so than the Internet.
This is also something that radio cannot do, though it encourages a greater degree of interaction than television. But radio also falls prey to the scare tactics and emotive sensationalism of television—along with some of the communal shaming tactics of the Internet. One need only listen to talk radio for a short amount of time to see these elements in action.
It seems that books—with their combination of physicality and otherness, quietness and intellectual confrontation—present the best way for us to combat the toxic echo chambers of social media. They pull us out, give us a place of escape, confront our intellectual and spiritual weaknesses, then send us back into the online community better than when we stepped out of it.
Is dating on the verge of extinction? In an article featured in its September issue, Vanity Fair addresses the fearful world of Tinder—and the toll it’s taking on traditional sorts of courtship:
Hookup culture, which has been percolating for about a hundred years, has collided with dating apps, which have acted like a wayward meteor on the now dinosaur-like rituals of courtship. “We are in uncharted territory” when it comes to Tinder et al., says Justin Garcia, a research scientist at Indiana University’s Kinsey Institute for Research in Sex, Gender, and Reproduction. … People used to meet their partners through proximity, through family and friends, but now Internet meeting is surpassing every other form. “It’s changing so much about the way we act both romantically and sexually,” Garcia says. “It is unprecedented from an evolutionary standpoint.” As soon as people could go online they were using it as a way to find partners to date and have sex with. In the 90s it was Craigslist and AOL chat rooms, then Match.com and Kiss.com. But the lengthy, heartfelt e-mails exchanged by the main characters in You’ve Got Mail (1998) seem positively Victorian in comparison to the messages sent on the average dating app today.
Damon Linker writes at The Week that he’s fearful of what this means for his kids:
I want them to enjoy the fulfillment that can only come from devoting themselves to something that transcends the self — a spouse, a child, a family. I want them to experience falling in love and feel their hearts opened to hopes of a higher, more enduring form of happiness. I want them to experience the rarer and more precious goods that follow from the disciplining of their baser instincts (like the animal desire to copulate with a different sexual partner every night of the week) in order to reach an end that’s pursued for its own sake rather than for the instantaneous rewards it brings.
But The New Republic‘s Moira Weigel retorts that such responses are reactionary, and part of dating battles that we’ve had throughout history. After tracing the history of such dating wars from the 19th century through the present, she adds, “Even a short survey makes it clear that every generation has thought that the next generation was dating wrong. … The death of dating genre tends to treat each new form of courtship as a moral aberration. This is silly.”
Weigel doesn’t seem to appreciate the progressive edginess of the dating world, from the first examples she provides (like consternation over women meeting strangers in public) to the modern hookup culture that Vanity Fair discusses. And she fails to understand why parents such as Linker aren’t exhibiting a merely reactionary horror, but rather a legitimate shock over how far we continue to progress, as a culture, in our sexual freedoms.
Weigel does admit that we are progressing into new territory—she just doesn’t think the change is a moral matter: “New practices like hooking up have less to do with a moral apocalypse than with the evolution of the economy,” she argues.
Young people today are told to be flexible and mobile in all other aspects of our lives; we are told to be eternal entrepreneurs of ourselves, and that we cannot count on steady gigs or fixed contracts or benefits. Why would this not apply to our love lives, too? Why shouldn’t Tinderellas use an Uber for romance and sex when they use one for everything else?
It is true that our media and environment influence us in powerful ways. Financial insecurity and unemployment are a couple of factors that often dissuade young people from marriage. So is having experienced the divorce of one’s parents. Technology, some argue, has encouraged a focus on immediate pleasures—fostering a short attention span and an appetite for the instantaneous.
The question is, of course, whether Weigel is right: that these things are not ultimately moral issues, but merely a change in the way we as humans work. A stage in relational evolution, perhaps.
Linker seems to suggest that these are moral issues. He believes that, in the age of liberality, we’ve stripped young people of such moral language, and they are merely responding with the vocabulary we’ve given them: “God? Nature? Won’t the world be better off without those musty old ideas limiting our freedom, hovering over our heads, judging us, weighing on our conscience?” And Rod Dreher has responded to the Tinder story with moral arguments against this growing trend.
But many people concerned with the hookup culture are people who have glimpsed what dating could be, and what it can practically or relationally offer to young people: namely, something more than pure sexuality and momentary physical satisfaction. Dating offers the opportunity to appreciate the entirety of the human person: mind and soul, as well as body. Dating can be about getting to know a person, and learning to appreciate them in all their various virtues and defects. It can offer an opportunity to develop a friendship, one based on more than mere physical attraction.
And this is what we may lose, if we go from dating to merely hooking up. We lose that gift of appreciation for and friendship with one person, learning to know them in a full way that honors their personhood and individuality—in a way that sees and appreciates them for more than their physical anatomy.
Aside from the person being dated, there’s often a community that springs up around a relationship. Your friends mix and mingle; you meet new people and form new social bonds. As the Vanity Fair article notes, a principle of proximity often guides traditional dating relationships—a principle that not only helps cultivate closeness with the boyfriend or girlfriend in question, but also binds you to a people and place.
Losing this would be a sad thing. Dating, regardless of whether it ends in marriage, affords us an opportunity to know and appreciate other people in a unique way. It also helps us grow personally—forcing us to confront our own vices, teaching us to overlook annoyances and give up our own desires in order to serve someone else. Replacing that gift with the temporary yet exciting opportunities of hooking up may seem liberating, but one wonders whether we’ll eventually come to feel ourselves cheated out of something greater and deeper.
When Megyn Kelly leveled her (I think good) question about women to Trump during Fox News’s presidential debate last Thursday, and received his disdainful response, my first thought was, “How ungentlemanly.” Regardless of Trump’s political views, churlish insults do not encourage a civil or thoughtful political discourse—they are designed only to enrage and insult.
Yet there are many who seem to revere Trump for his lack of political correctness—for the way in which he laughs in the face of the “PC” police. These Americans are tired of the way in which we spread a cloak of niceness over political discourse, and the resulting quagmire that we face.
You need look no further for evidence of this “niceness” promotion—and its resulting paralyzing effect—than The Atlantic’s recent article, “The Coddling of the American Mind.” In it, authors Greg Lukianoff and Jonathan Haidt consider the escalating crackdown on anything even remotely controversial on college campuses:
The thin argument “I’m offended” becomes an unbeatable trump card. This leads to what Jonathan Rauch, a contributing editor at this magazine, calls the “offendedness sweepstakes,” in which opposing parties use claims of offense as cudgels. In the process, the bar for what we consider unacceptable speech is lowered further and further.
Since 2013, new pressure from the federal government has reinforced this trend. Federal antidiscrimination statutes regulate on-campus harassment and unequal treatment based on sex, race, religion, and national origin. Until recently, the Department of Education’s Office for Civil Rights acknowledged that speech must be “objectively offensive” before it could be deemed actionable as sexual harassment—it would have to pass the “reasonable person” test.
… But in 2013, the Departments of Justice and Education greatly broadened the definition of sexual harassment to include verbal conduct that is simply “unwelcome.” Out of fear of federal investigations, universities are now applying that standard—defining unwelcome speech as harassment—not just to sex, but to race, religion, and veteran status as well. Everyone is supposed to rely upon his or her own subjective feelings to decide whether a comment by a professor or a fellow student is unwelcome, and therefore grounds for a harassment claim. Emotional reasoning is now accepted as evidence.
If our universities are teaching students that their emotions can be used effectively as weapons—or at least as evidence in administrative proceedings—then they are teaching students to nurture a kind of hypersensitivity that will lead them into countless drawn-out conflicts in college and beyond.
It is quite ironic to see reports of Donald Trump’s uncouth political tantrums interspersed with such talk of the “micro-aggressions” and “trigger warnings” that pervade college campuses. But what may not be obvious on the face of it is that the boorishness of Trump and the uber-sensitivity of the modern college student are related: they are derivatives of an age that has lost a standard for public discourse, an understanding of proper limits or behavior. This is something Mark Mitchell has discussed over at the Front Porch Republic—he argues that we’ve lost what Edmund Burke called “the spirit of the gentleman,” the architect and animator of “all the good things which are connected with manners and with civilization.” Mitchell writes,
Forms and limits are not welcomed in a culture that sees freedom as the highest good, a culture that fairly worships at the altar of individual choice. The history of the liberal project has been a steady and determined attempt to defy limits, to destroy forms, to expand the idea and practice of liberation to all spheres of existence. How can the idea of the gentleman, the essence of which necessarily depends on the propriety of limits, co-exist with the goals of liberalism? One admits of limits and finds nobility in respect for them; the other finds limits offensive and seeks to break down any hint of limitation, form, or residue of difference.
Ours is a culture that worships freedom, albeit in varying ways—both Trump and the college students discussed in Lukianoff and Haidt’s article are touting their version of freedom and individual choice. In Trump’s case, it’s a freedom from political correctness, a freedom to say or do whatever he pleases. In the students’ case, it’s freedom from critique or offense, a freedom to live in emotional security without fear of chastisement or judgment. The freedom of the one to say or do whatever he pleases will inevitably clash with the freedom of the others to live in a state of politically correct security. How do we police modern discourse when the righteous indignation of the one camp is so discordant with the fierce fury of the other?
This is what the “spirit of the gentleman” used to provide: a reasoned, courteous atmosphere in which public discourse could take place—where opinions could be stated without savagery, and received without rancor. The problem is that gentlemen have fallen out of favor on both left and right—for reasons Mitchell makes clear in another FPR post.
The gentleman is unpopular with the left and “PC” crowd because, in Mitchell’s words, he “is one who is willing and able to judge well. He is discriminating in his judgments and does not shy away from making hard distinctions even when they cause him discomfort and even when he is forced to stand alone [emphasis added].” Such discriminating value judgments will not be honored on the modern university campus, nor even in the larger political world. Yet there seems to be a right way in which to make judgments about problems, policies, and people—and the gentleman knows how to do it. Modern students shy away from those distinctions in the name of being inclusive or “PC,” but in fact, their ability to speak with any sort of moral clarity or purpose is compromised by their refusal to make such value judgments.
But this is not to say that the gentleman would applaud the brash tones of Trump or his followers. To the contrary—Mitchell states further on that the gentleman is characterized by decorum and propriety: “A sense of propriety, when properly formed and not merely a sense of personal dignity, requires an awareness of other people. A person with a well-developed sense of propriety makes other people feel at ease.” The gentleman is also amiable: “An amiable man is a good conversationalist who is interested in the people with whom he speaks. He is not self-absorbed nor is he so self-conscious that he refrains from engaging with others. … An amiable man is not a boor who cares only for the sound of his own voice.”
To suggest that Trump has any of the above qualities would indeed seem laughable. But one must be quick to note that his lack of gentlemanly virtues has won him accolades, attention, and poll approval. How are we to see and hear more of gentlemen in our public discourse, if we foster and encourage their very opposites?
The same is true of the attitudes we foster on college campuses: if good and thoughtful professors are afraid to speak up in their classrooms, how will we foster another generation of discriminating (yet amiable) thinkers?
The sad truth is that we won’t. The reasoned voices in our media, politics, and academia will devolve into one of two camps: the shouters, or the mute.
As the media begins to buzz with anticipation over the first televised Republican presidential debate—taking place this evening on Fox News—it’s difficult not to feel disillusioned before it even begins.
Because, along with (I assume) at least a few other millennials out there, I don’t think there’s a single candidate I want to vote for. And the shouting matches that are sure to build over the next several months are only going to further entrench that viewpoint.
The problem is that the politics of the millennial generation are more diverse, issue-based, and flexible than those of our forebears. Few of us are fanatically partisan. Many of the conservative young people I talk to are passionate about their pro-life beliefs, but lean libertarian on issues such as foreign policy and same-sex marriage: they want to be less involved in conflicts overseas, are disillusioned by hawkish rhetoric, and open to diplomatic measures such as the Iran deal. Be they religious or no, many believe that the government should not ban gay marriage. They have a greater appreciation for compassion and nuance in the illegal immigration debate, and are looking for realistic, positive solutions to the problem. They’re very interested in fighting poverty and social injustices, and are more likely to have an opinion on police brutality or inner city violence. Many are skeptical of the anti-environmentalist leanings of the Republican Party, and want to see a candidate who prizes sustainability and ecological restoration. They don’t just listen to Rush Limbaugh and watch Fox News—some read Wendell Berry, peruse the Daily Beast’s website, watch Jon Stewart’s “The Daily Show.”
But they will get none of this nuance from the Republican candidates who mount the stage this evening. Because the politicians on stage aren’t speaking to conservative-leaning millennials or their interests: they’re using the same tired rhetoric and debate tactics they’ve used for ages, rehashing the same GOP establishment statements and talking points. Though some of the politicians on stage were expected, even hoped, to be different, they’ve made their share of mistakes, and have slowly slipped from the limelight.
Republicans aren’t necessarily ignoring young voters—it’s just that they’re trying to reach them with technology, with fancy websites and Snapchat accounts. And as much as I appreciate a well-designed website, it’s rather offensive to see politicians thinking they can win me over with aesthetics and gadgetry, rather than with meaningful platforms and ideas for reform.
And they have a great chance—if they would only embrace it—to actually win over young voters. Though they’re the generation most likely to self-identify as liberal, many millennials were disillusioned by Obama’s presidency, and appear open to change.
Unfortunately, change isn’t what they’re going to get—at least not with politicians like Donald Trump (whose supporters are “more likely to be male, white, older, with less education“) dominating the political discourse. Newer candidates like Marco Rubio are sticking to the party line on issues like foreign policy (as A.J. Delgado put it in our latest magazine, “If you’re clamoring for the … policy of the George W. Bush years, Rubio’s your guy. If you want something pensive, positive, and fresh—he probably isn’t.”) The libertarian-leaning Rand Paul has adopted a caution that has left him well behind other contenders, and unlikely to catch up. Scott Walker, while lacking in any obviously detrimental political baggage, has also never commented deeply on schismatic social and foreign policy issues—thus making him likely “to simply adopt whatever the GOP consensus is on the topic,” as Sean Scallon pointed out in May.
Perhaps new hope will come to light after tonight’s debate. Perhaps a candidate will surprise us with a fresh or thoughtful take on a typical partisan issue. But unfortunately, that’s not usually how televised debates work. Indeed, in the age of Twitter and Facebook, it seems harder than ever to foster a really deep and meaningful discourse.
But we can keep learning about the candidates and their politics, educating ourselves on the lesser of the evils they represent. We can learn more about the Democrats running for president and the political views they represent—some, like Jim Webb, are far more interesting alternatives for the independent-leaning conservative than Donald Trump or Rick Santorum.
And perhaps, if the Republicans take another hard hit in 2016 amongst millennial voters, they’ll finally learn what they should have learned in 2012: you can’t win over young Americans with fancy websites, Twitter accounts, and hashtag campaigns. Unless you offer meaningful, thoughtful reform, we’re going to look elsewhere.
John Green’s work isn’t too difficult to recognize: it usually features two ironically funny, slightly introverted, hipster teens. It also usually features deep—and rather dark—musings on the nature of being, and the inevitability of death. Sometimes, these musings are on the nihilistic side, featuring a gloomy solipsism that pulls oddly at the fabric of the teen novel. This was especially true of his astoundingly popular book The Fault in Our Stars: a romance about two teens, each fighting cancer, each obsessed with an eccentric novel that eventually leads them to Europe and back. The main character in TFiOS is a teenage girl, Hazel Grace, who knows death is an inescapable part of her life, and learns how to cope with it.
Paper Towns is a bit different: it follows protagonist Quentin, who has had a lifelong crush on his neighbor, the mysterious and lovely Margo. She’s known for fantastical exploits and adventures—the sort indicative of a spontaneous and fun-loving nature. Though the two were friends as children, they’ve slowly grown apart. But Quentin never stops admiring Margo from a distance. One night, she lets Quentin into her world: she invites him on a whirlwind excursion, fraught with mischief and pranks. The next day, however, she disappears. Quentin is devastated, until he discovers clues that Margo left behind for him—slips of paper, circled words on a record album, highlighted words in a Walt Whitman poem. He becomes obsessed with one purpose: to find Margo, and declare his love to her.
This novel doesn’t touch on death in the same way The Fault in Our Stars did. Rather, it’s more about our ability to know and understand the “other”: the people in our lives that we think we know and love, but so often misunderstand. The book is replete with quotes from Whitman’s Leaves of Grass: a work in which Whitman slowly begins to identify with others to the point of becoming them. As Quentin reads Whitman and searches for Margo, his interactions with his best friends—Ben and Radar—begin to reveal ways in which he doesn’t understand or know them truly. Radar tells him, “You know your problem, Quentin? You keep expecting people not to be themselves.” Meanwhile, Quentin begins to see new parts of Margo, the girl he’s always admired from a distance, and he begins to wonder: has he ever really known her? Who is Margo, really?
The problem with both the book and the film is that even the “real” Margo seems fake. Our progression toward the authentic Margo seems almost illusory, as she ever remains a dreamy, impossible version of female accomplishments and eccentricities. Green is trying to make a point about how we mystify, and even deify, “the other”—but since Margo never becomes really human, the book falls flat in making this point.
Quentin’s friends are real: they have flaws, quirks, weaknesses. Margo, however, is an almost-perfect mystery: her “weaknesses” are feelings of not fitting in, worries about acting fake or inauthentic. Yet she’s still the girl who breaks into SeaWorld in the wee hours of the morning, the girl who writes in a jumble of capitals and small-caps because “the rules of capitalization are so unfair to words in the middle.” She has a mammoth, diverse vinyl collection, one that she hasn’t told anyone about. She reads Walt Whitman and Sylvia Plath. She remains ever a manic pixie dream girl: a character that Nathan Rabin once wrote “exists solely in the fevered imaginations of sensitive writer-directors to teach broodingly soulful young men to embrace life and its infinite mysteries and adventures.” The manic pixie dream girl is “the most deliciously delirious young woman, always up to her false eyelashes in madcap romps,” as Neda Ulaby wrote for NPR. Doree Shafrir wrote for The Daily Beast that indie films are particularly obsessed with this female character, who is “detached, mysterious, impulsive,” loves indie bands, and is “always just out of reach, making herself scarce at crucial moments.” That’s Margo, alright.
Interestingly, Vulture’s Matt Patches accused John Green of creating the first manic pixie dream boy with his TFiOS character Augustus Waters, Hazel Grace’s boyfriend: “He’s a bad boy, he’s a sweetheart, he’s a dumb jock, he’s a nerd, he’s a philosopher, he’s a poet, he’s a victim, he’s a survivor, he’s everything everyone wants in their lives, and he’s a fallacious notion of what we can actually have in our lives.”
Of course, the great point (*spoiler*) of all Green’s stories is that we don’t often get what we want—whether it’s death, distance, or misunderstanding, something always separates us from happily-ever-after. But that doesn’t make his characters any more real, any less aspirational and crush-worthy. Regardless of whether Hazel ends up with Augustus, or Quentin ends up with Margo, there’s no question that she’s the perfect girl, and he’s the perfect guy. It’s just fate, really, that gets in the way.
The film “Paper Towns” is, in this sense, a little more redemptive than the book. Green ends his novel on a deeply philosophical note (as he is wont to do), with both Quentin and Margo pondering metaphors that properly describe existence, and our ability to understand each other. It’s touching, it’s interesting—it’s even a bit profound in places. But it still casts each of them back on themselves, in lonely individualism. It leaves us as islands, beckoning to each other in the dark, never able to fully understand or reach each other.
The film, however, brings the plot back to Green’s strength, the camaraderie and delight that makes his work worth reading: the friendships. In all Green’s books, the friendships are diverse, funny, heart-touching. The side characters are often more real, relatable, and endearing than the protagonists. And this is perhaps especially true of Paper Towns, in both its literary and cinematic forms: Ben (played by Austin Abrams in the film) and Radar (Justice Smith) are fantastic friends. The film, in its conclusion, reminds us why such friends—the ones we’ve known and loved for ages, who annoy us to death but remain loyal—make life truly miraculous, and help us bridge the gaps between the islands.
Green tried to tell the story of a manic pixie dream girl who wasn’t. But instead, he tells a story of disillusionment, misunderstanding, and the friendships that help us overcome both. Near the end of the book, Quentin says, “Imagining isn’t perfect. You can’t get all the way inside someone else.” But as Radar pointed out earlier, that’s not the point: we are supposed to love what we don’t understand, appreciate what we are not, and recognize the mystery and beauty of those around us—warts and all. That’s what friendship is.
“The point of having a child is to be rent asunder, torn in two.”
In light of the Planned Parenthood videos released over the past two weeks, reading those words yesterday stirred a little irony and sadness in me. True: parenthood is about the disruption of the self. But how often, in today’s world, have we reversed those roles—rejecting parenthood in favor of rending an unborn child asunder?
“Here is the truth about abortion: It kills an unborn baby,” Damon Linker wrote for The Week last Wednesday. “We all know this—and with continuing advances in ultrasound technology, it’s something we know with greater and greater certainty all the time.”
Each video by the Center for Medical Progress has exposed ways in which Planned Parenthood harvests the organs of unborn babies for research. Some protest that this is an entirely legal process, one that does not benefit the organization monetarily in any way. But Matthew Lee Anderson responded well at Mere Orthodoxy when he wrote,
Planned Parenthood may not make a single dime off of participating in such a system. But they are still in a “market” where the other people and institutions who do benefit from receiving the ‘fetal tissue’ doubtlessly reciprocally support Planned Parenthood in other ways, if only through donation and political support. The practice of treating infant bodies as products in a transaction should itself shock us, regardless of who profits from it.
One problem with the debate surrounding these videos—and the pro-choice / pro-life debate as a whole—is that everything becomes muddied with language that is euphemistically (or dysphemistically) skewed in order to fit the purposes of the speaker. As Anderson writes, “Abortion requires not only the dismemberment of the human body in fact, but in our speech as well. … We cannot allow ourselves to see the baby as a whole, integrated, living organism.”
The pro-choice movement’s language can also be skewed in its discussion of motherhood. Even as Planned Parenthood and pro-choice feminists have painted themselves as supportive of women’s empowerment and right to choose, they have often denigrated the honor and joys of motherhood. They see womanhood as fulfilling only within certain bounds. And while their motherhood script is employed with good intentions, out of a desire to help support women and their life choices, it may practically achieve the opposite.
This week, a friend sent me this Salon Q&A with cultural commentator Camille Paglia. In it, Paglia says feminists abdicated the realm of motherhood for the workplace:
My explanation is that second-wave feminism dispensed with motherhood. The ideal woman was the career woman–and I do support that. To me, the mission of feminism is to remove all barriers to women’s advancement in the social and political realm–to give women equal opportunities with men. However, what I kept saying in “Sexual Personae” is that equality in the workplace is not going to solve the problems between men and women which are occurring in the private, emotional realm…
It is exactly this sort of linguistic erasure that has given rise to the sort of motherhood story told by Sarah Manguso in Harper’s Magazine: she speaks of the vehemence with which she opposed the idea of motherhood—the way in which she equated it with a loss of all purpose, personhood, and joy. “Before I had my son I was convinced that motherhood would ruin my writing and cause a profound loss of self that would never be compensated,” she says. But then, her perspective began to change:
Women who deride motherhood as merely an animal condition have accepted the patriarchal belief that motherhood is trivial. It’s true that motherhood can seem trivial to women who have been insulated from the demands of others; they are given few reasons to value motherhood and many reasons to value individual fulfillment. They are taught, as I was, to value self-realization as the essential component of success, the index of one’s contribution to the world, the test of our basic humanity. Service to the world was understood as a heroic act achieved by a powerful ego. Until I’d burrowed out from under those beliefs, being a writer seemed a worthier goal than being a mother.
This “burrowing out” led Manguso into motherhood, a vocation which she says has made her life full and happy, despite her gravest fears: “My old self is indeed gone, but I perceive the world more carefully and more lovingly than before because I am more aware of the effects of love and of time on an individual person.”
The modern feminist movement often equates motherhood with a loss of self-fulfillment and freedom, with cultural subjugation, and with a gross abandonment of the economic sphere. Yes, motherhood can be okay—but only if you don’t have a child too young, and are still participating in the workforce. Motherhood may be all right—if it happens entirely according to your preconceived plan, and you have a partner who willingly and equally shares all burdens of childcare and provision.
The purpose of these cautions is to invest women with a sense of control over their lives and child planning. But it seems that their unintended consequence is often more damaging: any woman whose childbearing story does not fit this script suddenly sees her life, or her child, as anti-woman, anti-freedom. The 19-year-old pregnant woman sees no possibility of having her child and being self-fulfilled. The young businesswoman who planned on having kids seven years down the road, surprised with an unexpected pregnancy, suddenly fears that she’s thrown away all possibility of vocational success and attainment.
Yet these stories are often untrue and misleading. They distract us from the reality that life is always uncontrollable. And they distract us from the possibility that—as Manguso points out—while motherhood may indeed lead us out of our present self, in that leading out, we may find a new and better self awaiting us.
The Planned Parenthood script is very selective in how it talks about the unborn, because the more you begin to see that fetus as a baby, the harder it becomes to terminate—the harder it becomes to see its heart, head, and hands as mere lumps of tissue, mere organs to be harvested.
But the Planned Parenthood script is also very selective in how it talks about motherhood, because the more you see childbearing as fulfilling, life-giving, even empowering, the less likely you will be to see that pregnancy as unwanted, that child as a burden.
It’s important not to ignore the difficulties of motherhood. As W. James Antle III wrote for The Week, “asking a woman to carry an unwanted pregnancy to term is not a small ask.” Elizabeth Stoker Bruenig wrote compellingly for TAC some time ago that the pro-life movement should be on the front lines, helping mothers without resources or support to get the help they need. “There are women out there having abortions despite being ideologically committed to not having them, or who at least wouldn’t have them, were circumstances different,” she wrote.
And we can help those women: first, by changing the way we talk about motherhood—by helping women see that it can actually be empowering and enjoyable. But second, we must assist mothers who are overwhelmed and frightened by the prospect of having a child. It’s an understandable fear, but one that a new rhetoric, pro-life toward both child and mother and respecting the personhood of both, can hopefully answer with healing and hope. Because whether a woman chooses abortion, or whether she chooses motherhood, a life will be rent asunder. The challenge is in helping her decide which life will be torn.
The New York Times tells a sad tale of suicide and depression at the University of Pennsylvania—a growing trend amongst universities throughout the nation:
Ms. Holleran was the third of six Penn students to commit suicide in a 13-month stretch, and the school is far from the only one to experience a so-called suicide cluster. This school year, Tulane lost four students and Appalachian State at least three — the disappearance in September of a freshman, Anna M. Smith, led to an 11-day search before she was found in the North Carolina woods, hanging from a tree. Cornell faced six suicides in the 2009-10 academic year. In 2003-4, five New York University students leapt to their deaths.
Nationally, the suicide rate among 15- to 24-year-olds has increased modestly but steadily since 2007: from 9.6 deaths per 100,000 to 11.1, in 2013 (the latest year available from the Centers for Disease Control and Prevention). But a survey of college counseling centers has found that more than half their clients have severe psychological problems, an increase of 13 percent in just two years. Anxiety and depression, in that order, are now the most common mental health diagnoses among college students, according to the Center for Collegiate Mental Health at Penn State.
Soon after Ms. Holleran’s death, Penn formed a task force to examine mental health on campus. Its final report … recognized a potentially life-threatening aspect of campus culture: Penn Face. An apothegm long used by students to describe the practice of acting happy and self-assured even when sad or stressed, Penn Face is so widely employed that it has shown up in skits performed during freshman orientation.
While the appellation is unique to Penn, the behavior is not. In 2003, Duke jolted academe with a report describing how its female students felt pressure to be “effortlessly perfect”: smart, accomplished, fit, beautiful and popular, all without visible effort. At Stanford, it’s called the Duck Syndrome. A duck appears to glide calmly across the water, while beneath the surface it frantically, relentlessly paddles.
… Citing a “perception that one has to be perfect in every academic, cocurricular and social endeavor,” the task force report described how students feel enormous pressure that “can manifest as demoralization, alienation or conditions like anxiety or depression.”
What is it about college that encourages this incessant pressure to achieve, and these resulting moments of crisis?
It seems our performance-based education culture must play a sizable role in this: quantified measures of skill turn learning into a competition. As Charles Tsai just wrote for Medium,
Books and books are being written about how schools operate like factories and treat students as clones of one another, training them to be compliant workers rather than people who think for themselves. Even the elite schools, the ones that actually produce the future business executives and presidents, do this. They make students jump through arbitrary hoops and use the hoops to rank and sort them.
This encourages students to measure themselves against other high-performing students, and fosters a culture of anxiety in which we’re all striving for some impossible-to-attain first-place prize.
Even the style of learning, and the character developed, is affected by this performance emphasis, which values quantitative skills over qualitative goods: it lauds athletic prowess, the 4.0 GPA, the stunning internship portfolio, but passes over more subjective, immaterial goods such as critical thinking, mental development, and thoughtful class participation. It does not recognize a healthy social life, a deepening interest in the culinary arts, or the business aplomb of a budding entrepreneur.
What is taught can even become secondary to the grades one receives, and to the resulting benefits realized by 1) the student in his or her career pursuits, and 2) the school in financial support and academic acclaim.
It’s also important to consider the crisis of choice that many students face when they enter college. They’re often apart from any familial or communal support system, entrusted with total authority and autonomy, and given little to no system of rules or guidelines for navigating looming dilemmas—save the popular yet confusing “follow your heart.”
It shouldn’t be surprising that students put into this situation often suffer a crisis of identity, or at least one of direction, as they embark upon this new world. What their parents or community prized may no longer be enough: what they thought great may suddenly appear mediocre. Some may abandon the rules of their past for a new, nearly anarchical, exploration of self and its wants. Others may throw themselves into the rigid rules of the new performance-based system they are faced with. But both paths are often damaging.
The Times article is right to note the role often played by demanding parents, as Julie Lythcott-Haims pointed out in Slate a couple of weeks ago:
In 2013 the news was filled with worrisome statistics about the mental health crisis on college campuses, particularly the number of students medicated for depression. Charlie Gofen, the retired chairman of the board at the Latin School of Chicago, a private school serving about 1,100 students, emailed the statistics off to a colleague at another school and asked, “Do you think parents at your school would rather their kid be depressed at Yale or happy at University of Arizona?” The colleague quickly replied, “My guess is 75 percent of the parents would rather see their kids depressed at Yale. They figure that the kid can straighten the emotional stuff out in his/her 20’s, but no one can go back and get the Yale undergrad degree.”
We should be seriously concerned by a culture in which academic accolades and prestige outweigh our concerns for inner emotional wellbeing.
The Times article also looks at our tendency to strive toward perfection, and all the problems that stem from such behavior. While this tendency is real, the problem with an anti-perfectionism backlash is that it can easily lead students down the aforementioned anarchical path, in which they run helter-skelter from goal to goal, hobby to hobby, striving for fulfillment. We want to find that middle ground where we can encourage students to be themselves, without throwing all rule books or life goals out the window.
Giving students a better college experience seems to involve the refocusing of education on lasting, permanent goods—beyond one’s GPA and extracurricular performance. It should also involve the encouragement and establishment of strong community supports—on and off campus—to invest in a student as he or she grows. It is easy to become distracted by academic accolades. But what should students walk away with when they leave college? Hopefully, they should have a toolbox of critical thinking and experiential skills to help them navigate the world, along with a community of friends and mentors to assist them on their paths. Though these things can’t be quantitatively measured, they will help students overcome their perfectionism and depression, and hopefully help them emerge into the light of confidence and community.
She was an assistant U.S. attorney for the Eastern District of Pennsylvania during the Vietnam era. It was the early 1970s, and she believed in the war at the time. But one day she rode a train from Haverford, Pennsylvania, along with wounded soldiers from Valley Forge hospital. Many of them had lost arms, legs, eyes. Seeing those young men on the train shocked and changed her: she became familiar with the cost of war—and was convinced it cost too much.
The young woman on that train was Faith Ryan Whittlesey. She could never guess the role she would soon play in fighting the Cold War and advocating a prudent foreign policy—rising from her law career and work in Pennsylvania politics to serve the Reagan administration, both in the White House and overseas as an ambassador to Switzerland. At home and abroad, in public and private life, she would quell conflicts and champion a reality-based conservatism, bridge seemingly impassable diplomatic chasms, and take staunch if unpopular stances for what she believed to be right. Her gentility and iron resolve would soon launch the woman on that train into the national and international spotlight.
Born in 1939, Whittlesey was brought up by an Irish Catholic father and Methodist mother in northern New York. Her parents were both Republican—her mother was a social worker skeptical of the government’s “ability to do anything positive in society,” Whittlesey says. After high school, Whittlesey received a full-tuition scholarship to Wells College in upstate New York, where she majored in history. Afterward, she attended the University of Pennsylvania Law School, where she met her husband-to-be, Roger Whittlesey. They married in 1963 and went on to have three children.
Three years later Roger became president of the Young Republicans of Center City Philadelphia and in 1968 executive editor for Richard Nixon’s presidential campaign in Pennsylvania. Meanwhile, Faith worked as a law clerk to federal Judge Francis Van Dusen, then as special assistant attorney general. Historian Thomas Carty, author of a Faith Whittlesey biography entitled Backwards in High Heels, emphasizes that she was working in what was “overwhelmingly a man’s world”: more than 95 percent of U.S. judges and lawyers were men.
In 1972 Roger was invited to run for the seat representing the 166th district of Pennsylvania in the state House of Representatives. He declined, but Faith decided to run in his place. When the district’s Republican establishment refused to endorse her, “I cried and cried,” she says. But Roger told her, “This is the best thing that could have happened. Now we owe them nothing, and we will beat the pants off them!”
He was right: starting as an outsider, Faith forged her career independent of party pressure. She won the Republican primary against six men and came to prize the fact that she wasn’t beholden to the establishment. This independence helped differentiate Faith from the Vietnam and Watergate-era scandals: “As a woman and mother, she brought a special quality that allowed her to mobilize housewives and young people who might otherwise eschew politics as dirty and corrupt,” Carty says.
But in March 1974, tragedy struck: after a long struggle with depression, Roger Whittlesey committed suicide. Faith was suddenly a single mother and sole provider for her family. Despite her grief, she says, “I was totally consumed by the task in front of me. I had very little free time, running a large house by myself.” She entrenched herself in Pennsylvania politics, building a reputation for her maverick stances and honesty.
In 1976, she clashed with Pennsylvania’s Republican Party leaders over the presidential nomination. President Gerald Ford was the establishment candidate, but Faith Whittlesey favored the outsider: California Governor Ronald Reagan. She valued his commitment to limited government and his resolve to fight communism. Though Reagan did not win that year, Whittlesey supported him again when he decided to run in 1979. She became co-chairman of Pennsylvania’s Reagan-for-President Committee and traveled all over the state delivering speeches.
“I became fluent in explaining Reagan’s foreign policy during that period,” Whittlesey says. “I had to go out into the far reaches of Pennsylvania to explain it to ordinary people.” In November, when he won the presidency, Reagan also won Pennsylvania.
In 1981, Reagan offered Whittlesey the position of ambassador to Switzerland. She had extensive knowledge of European culture, having visited several times as a student, and was an eloquent, passionate advocate for Reagan’s policies. Nevertheless, her appointment was “controversial,” according to Boston University Vice President Doug Sears, who was a member of Whittlesey’s embassy staff in Bern. In the Foreign Service, he relates, career appointees often disparage political appointees.
Career diplomats don’t mind if political appointees “just do the social scene, if they don’t do anything.” But “what made [Faith] special was that she was serious … about doing her job—and not the way the career people told her to do it, but the way she believed to be right.”
Whittlesey invited many distinguished guests to the embassy—leading Swiss bankers and businessmen, ambassadors from other countries, college professors, authors, and others—in hopes of better explaining Reagan’s policies to the Swiss. Sears says he was always a little terrified of these embassy dinners: they were strictly diplomatic in nature and each staff member was instructed to foster discussion with Swiss guests.
“Your job was to find someone, engage in conversation, and make sure that there were opportunities to advance the U.S. position on something correctly,” he says. “She’d look daggers at you if you were off in the corner with another American.”
Ambassador Whittlesey traveled to Swiss cultural and political events at every opportunity and accepted interviews with local and national media outlets—sharing Reagan’s foreign-policy vision wherever she went. During her travels, she met a student at the University of Zurich named Patricia Schramm. Schramm calls Whittlesey “a tough, iron lady. She was a very outspoken, determined ambassador.” Schramm remembers vividly a time when Whittlesey spoke at the University of Zurich about the Iran-Contra affair. The event quickly turned chaotic: anti-American protesters tried to steal the American flag out of the room and a violent tussle ensued. “There was a lot of anti-Americanism in Europe at that time,” Schramm observes. “But she was out defending the U.S. position.”
Whittlesey was often “stirring up attention in the press,” according to Sears. “There were a lot of big issues on the world stage that made Ronald Reagan controversial. But if you know of one Switzerland ambassador who is still believed and looked up to, it’s Faith.”
This is because, despite her staunch “iron lady” beliefs, Whittlesey was also a gracious diplomat. At the very beginning of her term as Swiss ambassador, she faced a tense U.S.-Swiss disagreement over U.S. stock markets and Swiss banks. “U.S. authorities pressured the Swiss to alter longstanding national laws to protect client privacy in the financial industry,” Carty explains. When Whittlesey discovered that the U.S. Treasury Department was preparing to file lawsuits against Swiss banks, she put pressure on U.S. officials to come to Switzerland and hold discussions with the bankers.
“Whittlesey personally coached both sides to pursue a mutually acceptable solution that would maintain the United States’ reputation as a trusted trade partner to Switzerland,” says Carty. The two countries came to a solution, and the crisis passed.
This desire to both understand and explain characterized Whittlesey’s work. “She projects a conservatism characterized by humility and respect for differences,” says Carty. “Rather than attempt to impose U.S. values as universal, she applied the same limited government principles to foreign policy as she did to domestic policy.”
After she had spent just over a year in Bern, Reagan’s Chief of Staff James Baker called Whittlesey and offered her the position of director of the Office of Public Liaison back in Washington. She accepted, eager to share Reagan’s vision with an American audience.
“Reagan’s foreign policy was strongly opposed—not only by Democrats, who ferociously opposed it, but by the establishment wing of his own party,” Whittlesey says. “He was called a warmonger and a cowboy.” Whittlesey began a careful and thorough outreach program from within the White House, seeking to explain Reagan’s foreign and domestic programs to a variety of interest groups, as well as to the media.
“We engaged the public, explained Ronald Reagan’s policies,” she says. “But we were just explaining: not trying to influence public opinion, but to let people know that his policies were based on facts, so they could make up their own minds once they heard the facts rationally presented.”
It wasn’t the first time she had defied expectations: when Whittlesey campaigned for the Pennsylvania legislature in 1972, she was pregnant with her third child. As the November election neared, her pregnancy “became an object of contention,” says Carty. “Some Republican leaders cautioned Faith that tradition-bound Catholic and other voters might not vote for a pregnant candidate, and one Pennsylvania Republican leader predicted to her that she would lose the seat because of her pregnancy.”
But Faith won, and in ensuing years she cultivated a strong following of women volunteers throughout the county. Feminist groups opposed or ignored her—as Whittlesey told Carty, “They were really not ‘for all women,’ just for certain, mainly Democratic, bigger-government women.” But “ordinary women, mostly housewives and professional women, came out in droves to help me.”
Despite her disagreements with liberal feminists, Whittlesey never stopped fighting for women’s place in the political world. “The smoke-filled rooms are filled with men. I was certainly not invited in. I fought my way in,” she is quoted as saying in Carty’s Backwards in High Heels. (The title itself is a reference to how much harder Ginger Rogers had to work than her costar Fred Astaire: she had to do everything he did, only backwards and in high heels.)
Whittlesey was certainly an outsider in the White House: from March 1983 to March 1985, she was the only woman on Reagan’s 18-member senior staff. Upon her arrival in Washington, she found to her surprise that Baker and Deputy Chief of Staff Michael Deaver were planning to assign her specifically to “women’s issues.” She found the proposal demeaning. “None of these men wanted to deal with the women members of the establishment Republican Party,” she told Carty. “They wanted me to do that, which was, I believe, a completely sexist approach. In my entire political career I had repeatedly declined to be pigeonholed as someone dealing with women’s issues.”
Whittlesey “had disrupted her family life for the opportunity to serve the Reagan administration by promoting the president’s full range of administration policies,” Carty writes. She wasn’t about to be relegated to the role of “token woman on the senior staff.”
Many in the White House were worried about the “gender gap” in Reagan’s support and wanted Whittlesey to deemphasize his pro-life stance in order to garner more female voters. But she refused to ignore or gloss over Reagan’s stand. She believed a strong conservative agenda, well explained, could appeal to voters regardless of sex. Syndicated columnist Sandy Grady wrote at the time, “Women who want a strong White House voice on feminist issues, abortion, and lower Pentagon spending may not be thrilled by Mrs. Whittlesey’s views. She’s probably as far right as Reagan himself.”
In April 1985, Whittlesey returned to Switzerland for a second term as ambassador. She served there for three years, then resigned in 1988 and moved back to New York City to be closer to her family. That fall, a delegation from the American Swiss Foundation asked her to assume leadership of their association. The organization was created in 1945 to foster greater knowledge of Switzerland’s unique political traditions among an American audience and to build strong private connections between the two countries. Whittlesey accepted the position.
The most prominent of her projects at the foundation has been the Young Leaders Conference: a travel program that brings together young American and Swiss professionals. Those who attend the conference travel around Switzerland participating in a weeklong series of events, learning more about Swiss culture and government.
“Through this program, Faith has brought together over 1,000 leaders,” says Schramm, who now serves as president of the foundation. “Tocqueville talked about the importance of private associations, and her work exemplifies that. She’s built a very active foundation into a very important and powerful private association.”
“The Young Leaders Program is her baby,” Sears says. “It builds a class of people who know Switzerland better and generates a personal network. To the extent that we continue to have a good relationship with Switzerland, it will be because of those personal networks and private relationships.”
Whittlesey’s work in private diplomacy is not limited to Switzerland, however: she has also made six trips to China since 1979 and has worked extensively to better the country’s relationship with the United States.
She never thought China posed the threat to the U.S. that the Soviet Union did. “It was becoming clear to me and others,” she told Carty, “that the Chinese would be increasingly important to the U.S.—and in world events—and that communication and better understanding were desirable, despite our continued strong disapproval of their oppressive governing system.”
On a trip to China in 2005, she was alarmed at the hostility she sensed amongst the Chinese toward the U.S. because of the Iraq War and George W. Bush’s interventionist policies. “They seemed to think everyone in the U.S. was a warmonger,” she says. Whittlesey tried to explain that she—along with many other Americans—did not agree with the administration’s bellicosity and was in fact desirous of a more cautious foreign policy.
The next year, she was invited to bring a larger delegation to China and seized the opportunity to introduce the country’s officials to a broader array of conservatives. Her delegation included Andrew Bacevich of Boston University and Georgetown University Professor Joshua Mitchell.
Whittlesey was a “constant diplomat” on the trip, Mitchell recalls. She presented her delegation as one with reservations about U.S. policies in the Middle East. The team was diverse, and its participants disagreed on various political issues. But uniformity was not the point—indeed, Mitchell says, “she knew there were differences among us. She knew the dance we’d be doing together with different positions. She just let it happen, she didn’t dictate.”
“I brought groups over there to have discussions,” Whittlesey says. “It was private public diplomacy. But I believe we have to work on a peaceful resolution of our differences with China. I know some in the Republican Party believe a conflict with China is inevitable, but I believe it would be disastrous. We have to prepare for the worst-case scenario but be constantly working on a best-case scenario.”
Whittlesey often surprises those who view Reagan’s foreign policy as interventionist or neoconservative. She adamantly argues that Reagan would not have supported the wars in Iraq and Afghanistan.
She expressed this view one day at the Heritage Foundation’s showing of “Reagan,” a documentary on the president released in 2011. There’s a section at the end of the movie, Whittlesey points out, which “purports to show why Reagan would’ve supported the Iraq War.” After the viewing, Whittlesey stood up. She said, “I had the privilege of working with Reagan for eight years, and I think differently. I don’t think Reagan would’ve supported these wars.”
While Reagan had a clear understanding of the threat posed by the Soviet Union, Whittlesey explains, “He decided the benefits of launching a land war would not justify the cost, and he did not respond militarily. He was cautious and prudent: he believed in giving freedom fighters in other nations material means to fight tyranny but not American boys and girls.”
Sears says he was surprised when he first heard Whittlesey express uncertainty about the wars in Iraq and Afghanistan. “In the fullness of time, we’ve seen her skepticism validated,” he says. “She’s always watching, she understands human nature and motivation. And she cherishes life. She’s known some hard times, her life has been marked by some tragedies, and you can feel it when you’re with her. The idea that we’re sending young men and women into conflict without a clear understanding of why really gets to her.”
Whittlesey believes the current plight of religious minorities and refugees in the Middle East is a direct result of U.S. interventions in Libya, Syria, Iraq, and Afghanistan. “Every time we go in and change a government in the Middle East, it seems to get worse,” she notes. While we should be grateful for our unique political system, she says, we mustn’t forget that it is inextricably tied to our historical background. “To send our precious young men and women to impose this system on other countries that don’t have our historical background is folly, and a waste of their lives and limbs.”
Schramm observes that America’s relationship with much of Europe has been severely damaged by foreign interventionism. “Faith’s example of diplomacy is to go to that country, listen to the leaders and what they think, take them seriously, and let their opinions flow into the public discourse in the U.S.,” she says. “This would prevent us from running roughshod over foreign relations.”
This sort of diplomacy deeply influenced Whittlesey’s standing amongst the Swiss: despite the unpopular stances she often had to take during her years as ambassador, Carty writes that she cultivated a strong friendship between the two countries, “based on shared understanding and mutual respect.” Sears says Whittlesey is “revered” in Switzerland: “I can’t think of any American ambassador who is respected as she is.”
Yet despite the esteem she has garnered overseas, Whittlesey’s name is still relatively little known in Washington. “She leads discreetly in the background,” notes Schramm. “In a way, perhaps, she is a bit of a loner. People don’t really understand her politics: how can you be a former Reagan Republican who is against interventionism abroad?”
But in this sense, as Mitchell notes, Whittlesey is simply a traditional conservative, who believes that limits and tradition still matter. “She is trying to reestablish what Republicanism should mean to the broader public,” he says. “She wants a refined view in simple formulations. She will fight if necessary, but she starts with diplomacy. Decorous conservatism—this is Faith.”
Gracy Olmstead is an associate editor of The American Conservative.
What’s the point of handwriting? In Hazlitt Magazine, Navneet Alang argues that it gives us “practical and symbolic resistance to the pre-programmed nature of the modern web” by helping us assert our own voice and style, outside of a technological box:
Handwriting is profoundly bodily. Like an exaggerated, intensified version of the sweeps and swipes we use on a tablet, writing by pen can make muscles ache. Write while crying and one’s hand becomes shaky, write with excitement and watch the swirls and loops of one’s arcs become wild—an inky neurochemical expression that type just can’t replicate or capture. … To write by hand is to always foreground an inevitable uniqueness, visually marking out an identity in opposition to, say, this font you’re reading right now.
Handwriting is inherently physical, and an expression of the individual, Alang writes. But it is also, I would argue, inherently social and familial. Take, for instance, my own handwriting: it’s far from perfect, still strewn with the inconsistencies that Alang rightly notes are the nemesis of the aspiring writer. But when I look at my own writing, I see a variety of interesting, personal histories reflected therein:
I see the graceful, pirouetting “f,” “l,” and “s” of my mother. She was a ballerina, and I’ve always felt her script lyrically matches her dancing past. It has a sort of artless yet playful poise to it.
But in my script, I can also see the strong capitals, the precise “c” and “t” of my father’s accountant hand: his firm and meticulous script brought glorious form and concision to every post-it note and schedule. It is, interestingly enough, very similar to his banker mother’s handwriting.
My “y” and lower-case “g” are my farmer grandfather’s: he adds a graceful, arcing loop to their tails. It’s such a poetic flourish, one that I always thought slightly ironic (yet beautiful), coming from the practical, jocular man I called grandpa.
As you can see, I’m something of a handwriting thief: stealing my favorite artistic forms from the people I love most. But there’s something fun and even comforting to look at a piece of paper, strewn with journalism notes, and to see an entire family history therein.
Handwriting also tells a history, if you trace it through time: from a child’s first shaky alphabet, to an adult’s final shaky words. Here again, we see life coming full circle. We see a person’s progression, maturation, decay. My other grandfather—a pharmacist—used to write me four- or five-page letters in a meticulous cursive. With age, he’s switched to cards, and careful yet faint capitals, as his hand trembles too much to write in his former script. Regardless, his distinctive style is there, shouting to me from every addressed envelope I receive from him in the mail, calling to mind a whole history of correspondence and affection.
People talk about the resurrection of handwriting via social media platforms like Instagram or Pinterest—but the problem is that what we’re seeing there isn’t necessarily handwriting, but rather a (worthy and laudable) resurrection of typography. Here, it seems, we need to differentiate between three different, important terms:
First, there’s handwriting: this is the subject at its most basic. It is very simply “writing done by hand.” Nothing specific here: it could be cursive, print, italic, what have you. It’s just the simple (and importantly physical) work of writing with your own hand, as opposed to using a keyboard.
Second, there’s penmanship: this is a more stylistic and specific term, referring to the way in which we write by hand. It carries with it a certain feeling of quality and finesse. A person with good penmanship can easily write a letter in polished cursive; a person with good handwriting may only be doing so in a simple printed script.
Finally, we have typography: a word that is important to include, because the world of graphic design and printed media has had such a tremendous impact on the way in which we write. Many of the artists you see on Instagram and Pinterest are not sharing examples of handwriting or of penmanship: they’re creating extremely stylized, beautiful letters that are exemplars of their craft—not an enunciation of their particular style or personality (in other words, of their penmanship).
Typography is increasingly popular today. But the simple, quotidian task of writing things down via “handwriting” is growing more and more rare. And thus penmanship—the daily practice of writing by hand, of cultivating a personal style and method of writing—is perhaps close to extinct.
There are interesting scientific studies that Alang refers to, in which we learn that handwriting may develop important cognitive skills and promote memory retention. But what his piece points to is that there are also qualitative goods involved in the work of handwriting that we may lose, if we abandon it altogether to technological devices. Even as computers produce a prolific, perhaps endless amount of stylized fonts for us to choose from, we may lose the humanness involved in the writing and development of letters. And this would be a sad fate—not just because of the physicality of handwriting that Alang considers, but because penmanship is, in my mind, a deeply familial and personal thing.
Were I to abandon all handwriting, and with it all practice of penmanship, the traces of lineage I see in my script would probably begin to die. The letters from my grandmother, in her spidery hand, may even become indecipherable as my eye grows unaccustomed to a human’s hand—even as I, as a print designer and editor, may grow ever more shrewd in recognizing the differences between fonts like Times New Roman and Garamond.
There are certain human traditions worth preserving, because they are inherently good—regardless of their utilitarian value. They bind us to the past, and give meaningful beauty to our present. Whether I am penning a letter and mimicking my mother’s loopy, graceful “f,” or whether I replicate my grandmother’s angular capitals as I fill out the crossword puzzles she loved, I am participating in a graceful linguistic and artistic tradition that threads through the past, and into the future. It’s not a cumbersome or annoying discipline: it’s a dance.
Following Hillary Clinton’s first major address on the economy this week, Jeff Spross wrote Wednesday for The Week that Clinton’s “focus on freeing up women to enter the labor force,” while laudable, is “also too narrow and uni-directional”:
She needs an approach that is both broader and more radical: giving all Americans, men and women alike, more control over when they participate and don’t participate in the labor force. … We should be working towards a society where men and women can negotiate with one another, with equal resources to pick their own version of the good life. And so that when they have to compromise that ideal, they can do so on their own terms.
Whether women are working less or working more is the wrong question to ask. Rather, ask if they’re doing what they want to do, or what society and the economy demand they do.
It’s interesting that we live in a time when society (or certain political groups within society) is single-mindedly focused on driving women into the workforce, seeing full-time work as a superior lifestyle for mothers. Contrast this with a piece also published yesterday, in which Wednesday Martin considers the “captivity of motherhood” experienced by women in the 1960s:
Johnson was groping toward feminism’s second wave before it came to be, feeling for a toehold. She listed, in her catalogue not of grievances so much as unsentimental facts about the lives of herself and her conspecifics, the following: isolation, worries about illness and money, and sexual boredom. She referred to the whole schmear as “the housewife’s syndrome, the vicious circle, the feeling of emptiness in the gap between what she thought marriage was going to be like and what it really is like.”
… A participant-observer in the bizarre social reality she describes, Johnson conveys without rancor the existentially isolated, restless feeling that my mother’s generation grappled with, and that Friedan wrote about when she quoted one of her subjects: “I’m a server of food and a putter-on of pants and a bed-maker, somebody who can be called on when you want something. But who am I?”
And Johnson might have added, “Why am I the only one?” Johnson starts off with the demographically inflected observation that women are cut off from their extended families and friends by the idealized nuclear family, with its ostensibly perfect evenings of popcorn and TV in PJs, but moves toward darker realities, including the impact of this existence on female consciousness.
This latter paragraph illustrates the geographical and urban components of Johnson’s problem: compare the isolation described above with this account from Benjamin Schwarz of motherhood amongst working-class families in London up until the late 1950s:
…Working-class life was defined by an idiosyncratic approach to what is “always one of the great and indispensable functions of any society,” as Willmott and Young put it: the task of caring for children. That approach emerged from the distinctive relationship of the working-class mother—the sainted, mythic figure known as “Mum”—to her married daughters, a relationship that had developed from a universal truth: “Child-rearing,” Willmott and Young wrote in a deceptively obvious observation, “is arduous, it is puzzling, it is monotonous…” But a daughter’s “work can be less arduous because it is shared; her life less lonely because she has someone to talk to; the behavior of her children less perplexing because she has someone whose experience she can draw on.”
For a married working-class daughter whose husband was at work, her Mum was “the person with whom she can share the mysteries as well as the tribulations, the burdens as well as the satisfactions, of childbirth and motherhood.”
The result was a flourishing matriarchy, in which a woman’s authority and stature grew with age and in which—thanks to the proximity of employment for the menfolk and the geographically compressed layout of working-class districts—the households of Mum and of her married daughters’ families were, and in fact had to be, nearly always at most a few blocks away and often on the same street. (In a 1956 study of a Liverpool neighborhood, the sociologist Charles Verker revealed a not-atypical arrangement in which the households of one extended family—mother, daughter, father-in-law, sister-in-law, uncle, three cousins—occupied eight of the 22 houses of a short street.)
Children were raised as much in Mum’s house as in their own; married daughters would see their sisters and their sisters’ children at Mum’s, usually daily; their husbands would regularly have their supper after work at Mum’s; family “popping in” for a cup of tea and a chat was the norm—Mum and her daughters saw each other on as many as a dozen separate visits each day. The sons-in-law who gathered at Mum’s were usually drinking friends, and the friends and workmates of each became part of the others’ vast network of acquaintances.
In this world, Willmott and Young explained, in which daughters turned to Mum “in the great personal crises and on the small domestic occasions,” daughters and mothers would “share so much and give such help to each other because, in their women’s world, they have the same functions of caring for home and bringing up children.”
Separate clusters of extended families formed the working-class neighborhood. Its residents knew each other with an intimacy of detail and often through several points of connection—the people a young mother encountered while doing her marketing were her or her parents’ childhood playmates and the friends and relatives of her husband or her brothers- and sisters-in-law. Characterized by these informal, intimate, multilayered social networks, and by what Willmott and Young called “a sociable squash of people and houses, workshops and lorries”—in which the local pub was at the end of the block, the store for daily provisions around the corner, and a destination five minutes’ walk away was considered in a different neighborhood entirely—this was a largely static, tremendously local, intensely parochial realm.
This strikes a remarkable contrast to the suburban housewives described by Martin, “cut off from their extended families and friends by the idealized nuclear family.”
And the problem persists today—Martin says that even the “privileged mommies” she’s met are most often “cut off from extended families,” and thus “depend at least in part on nannies to help them with the heavy lifting of motherhood.” The disintegration of the home economy and its corresponding networks of support has developed steadily throughout time, and it’s had consequences for mothers.
When we talk about the plight of the 1960s housewife, we don’t talk enough about the role the rise of the suburbs—and increasing familial atomization—may have played in her isolation. Cut off from the rapport and community infrastructure Schwarz describes in his article, she was increasingly alone. Housework, child care, cooking, and cleaning responsibilities fell squarely on her shoulders—without any of the corresponding social or familial supports that may have alleviated those burdens.
Yet as our society industrialized and urbanized, women came to understand their feelings of loneliness and isolation as a problem of economics and inequality. While that understanding may not be entirely false, there were other social, cultural, and geographic dilemmas involved. And without fixing those, the problem has continued to simmer beneath the surface.
Why? Because not all women have found contentment within the workforce, within a 9-to-5 job. Martin suggests that a lot of this discontent comes from a still-rigid set of options available to women: a lack of institutional flexibility, coupled with a lack of benefits, that makes it difficult for a woman to “have it all” (job plus kids). Spross, too, despite his welcome acknowledgement that women should be able to choose their venue of work, be it in or outside the traditional workforce, continues to look to economic and career-related policies to assuage the problem. This approach reinforces the idea that a stay-at-home mom’s loneliness or search for meaning can be fixed with economic band-aids. But it rests on an entirely narrow and fractured understanding of the human person, whose nature is decidedly social, relational, intellectual, and emotional, as well as vocational.
Last week, I talked to Joel Salatin, sustainable farmer and author, about the interns who come to his farm. He noted that several of them are coming from white-collar desk jobs: one current intern worked 10 years as an engineer for a Fortune 500 company. He got tired of computers and cubicles, Salatin said. “Our trajectory of institutional schooling, to student loans and college, to a white collar job keeps people from discovering other passions,” he said. “Lots of our interns were burned out by white collar careers. They tell me, ‘I always wanted to do this, but I was told I couldn’t.'”
While increased flexibility and benefits for women are of course beneficial and important, they overlook the other problem we face: that of a culture whose fabric has become increasingly rigid and atomized, in which women are separated from the familial, communal, and institutional supports that could help give them a sense of meaning and purpose. It’s a culture that gives smart, talented women one trajectory for their gifts, and tells them other pathways are a “waste,” or a sure-fire road to loneliness and dejection.
But if there’s one thing I’ve learned from the stay-at-home moms, homeschoolers, and retired ladies who I met growing up, it’s that everything is a matter of perspective, location, and relationship. What are you doing with your time? Are you living in a place that increases or decreases your ability to connect in a meaningful way? Are you reaching out to others, building an infrastructure of community and camaraderie around your family?
These questions may not be sufficient to assuage all the problems or inner conflicts that a mother faces—but they present a more holistic solution to her problem than the purely career-centric focus that modern society most often presents her with.
Some community colleges are dropping the word “community” from their name—and with it, Anna Clark argues at Next City, they are dropping an idea and distinction that is very important:
In all, more than 80 schools have cut “community” in the past 30 years, with at least 40 doing so in the last decade. … By assuming a name that has the ring of a traditional four-year school, community colleges are playing into the stereotype that they are less valuable than their counterparts. They give credence to the second-class stigma.
… [Community colleges] deserve to be taken on their own terms. Traditional education metrics, like the time it takes to graduate, aren’t neatly comparable. The Department of Education’s measure of graduation rates is designed for students who enroll in the fall as first-time degree-seeking undergraduates who attend school full time. None of that describes the majority of community college students. Most attend part-time, begin in “off” semesters, transfer to other schools and, in many cases, they are not seeking a degree or certificate when they sign up for classes. All of this distorts the statistics on community colleges, confusing “graduation” as a synonym for “educational completion.”
Because they serve a different population — including working adults, like my parents — their singular value should be recognized. And when it comes down to it, despite the snide nicknames, people get this. A June Gallup poll revealed that Americans are about as likely to describe the education at a community college as “excellent” or “good” as they are to positively rate four-year schools. With the national student loan debt at a preposterous $1.2 trillion, that fact deserves a spotlight.
Clark makes some good points here, and helps draw out some of the distinctions that make community colleges a “community” venture: they’re about serving a wider, more diverse swath of the local population. Less about academic prestige and imposing admissions standards, they emphasize vocational training and accessibility. They provide a starting point for high schoolers eager to get some prerequisites out of the way, and flexible schedules for working adults trying to obtain a degree. I once profiled a community college back in Idaho that provided classes to students with intellectual disabilities. As one mother told me, “The more the student is part of the college life and community, the more the impact on their skill acquisition and their connection to the community at large.”
In turning away from their “community” role and nomenclature, these colleges may in fact be making a dangerous turn away from the local, and from the vocation they serve within that sphere: Clark notes that when Jackson (Community) College “announced its new name in 2013, officials boasted about how it will make them more marketable to international students. This is a surprising turn for a campus founded in 1928 to serve the people of rural mid-Michigan.”
As Zachary Michael Jack noted last month at Front Porch Republic, academia has grown increasingly rootless over the past several decades, even as the “local” becomes more popular in other venues of society: “America’s colleges and universities, particularly those in suburban and rural areas, continue to militate against students’ love of home. … Community colleges and small liberal arts campuses like mine that draw 75 to 80 percent of their students locally are widely and wrongly disparaged as ‘commuter schools’ by elitists, and yet they have gained market share precisely because they have helped students build bridges between ‘dual citizenships’ at home and at school.”
Clark’s article should prompt us to consider the role that community colleges can, and should, play within their local community. Though four-year universities are important, they are often migratory institutions, drawing young people away from their towns and cities of origin—and often saddling them with mountains of student debt in the process. If community colleges fulfill an important niche in America’s academic society, as Clark argues, we must consider whether this name change constitutes a deeper change in their method, role, and philosophy—and whether that change could have deleterious consequences.
Ready for your library to go all-digital? That day may be coming sooner than you think—as the Washington Post reports:
Around the country, libraries are slashing their print collections in favor of e-books, prompting battles between library systems and print purists, including not only the pre-pixel generation but digital natives who represent a sizable portion of the 1.5 billion library visits a year and prefer print for serious reading.
Some of the clashes have been heated. In New York, protesters outside the city’s main branch have shouted: “Save the stacks! Save the stacks!” In Northern Virginia, the Fairfax County library system chief recently mused that the Friends of the Library were no longer friends — a feud fueled by outrage over a print collection that has shrunk by more than 300,000 books since 2009. The drop in the District is even more dramatic: Nearly 1 million books have vanished since 2009.
… “We’re caught between two worlds,” said Darrell Batson, director of the Frederick County Public Libraries system in Maryland, where the print collection has fallen 20 percent since 2009. “But libraries have to evolve or die. We’re probably the classic example of Darwinism.”
Batson’s words here reveal an unspoken—and flawed—assumption that undergirds many of these library transformations, and leads to a lot of misconceptions in the print vs. online debate over books. Namely, he’s assuming that technology always equals evolution, that embracing new technological fads is an essential part of progress.
But is this true? The new isn’t necessarily more durable or preferable—indeed, the new is often flimsy, unpredictable, and prone to unintended consequences. It has not yet withstood the test of time. In regard to book technology, specifically, a recent study has shown that e-reader users retain less than those who read on paper. One professor reports that over 92 percent of students she surveyed “said they concentrate best when reading a hard copy.”
Technology, while an incredibly useful tool, is not necessarily better just because it is new. And the mobs of protesting bibliophiles who want their codexes back should be at least somewhat indicative of this. They’re not all reactionaries—indeed, as the Post points out, “One survey … found that just 5 percent of millennials read only e-books. Twenty-one percent of the millennials said they read more hard copy than e-books, and 34 percent reported that they only read print.” Even the youngsters still like print. They like e-readers, too, but their enjoyment of one does not exclude their use of the other. They’re “hybrid readers,” enjoying both mediums in different venues, at different times, according to need and the occasion.
It is also worth mentioning that this technological transition brings a considerable expense to libraries: e-books are more expensive, says the Post, in part because “publishers fear large databases of free e-books will hurt their business.” Yet the share of library budgets spent on e-books has grown from 1 percent to 7 percent, while the share spent on print books has fallen from 67 percent in 2008 to 59 percent in 2015.
It isn’t obligatory—nor is it commendable, necessarily—for libraries to shun all new technological fads and tools. But it is important that they don’t shirk or shun the treasures of the past as they embrace new and promising gadgets.
Books have withstood the test of time for a reason: they’re excellent communicators, storytellers, and informers by their very medium. We love them exactly how they are. And some of us think that you don’t have to throw them out in the name of progress—indeed, such a hasty move may represent a regression, rather than an advancement. Rather, the trick is in preserving the best of the old, while finding ways to incorporate the best of the new.
Your parents’ income may play a large role in the major you select: in “Rich Kids Study English,” Joe Pinsker considers the elite bias toward studying the arts, history, and other less practical majors:
Kids from lower-income families tend toward “useful” majors, such as computer science, math, and physics. Those whose parents make more money flock to history, English, and performing arts. …With average earnings for different types of degrees as well-publicized as they are—the difference in lifetime earnings among majors can be more than $3 million, one widely covered study found—it’s not hard to imagine a student deciding his or her academic path based on its expected payout. And it’s especially not hard to imagine poorer kids making this calculation out of necessity, while richer kids forgo that means-to-an-end thinking.
Pinsker’s article draws us back to the discussion of the liberal arts and their relevance (or lack thereof) that has proliferated over the past few years. As B.D. McClay pointed out at The Hedgehog Review, many defenders of the liberal arts get caught up in trying to prove their vocational “usefulness.” Yet such arguments often, necessarily, fall through: “Pure mathematics has no place in a scheme of education that is about utility,” McClay notes. “Neither do the observational sciences, which are—despite being of great importance in the history of science—politely shown the door in pop accounts of the discipline. The fine arts, which have always depended on patronage for survival, will never be able to justify themselves on the grounds of utility.”
The liberal arts were taught in classical antiquity to freemen, and focused on non-vocational subjects seen as having a larger intellectual, philosophical, or political value. Yet such an education was still seen as having a practical value, considering the civic role that ancient Greeks and Romans played in the operations of their government.
What Pinsker’s research indicates is that only the rich think they can afford to learn something that isn’t useful to modern life’s larger goal (namely, procuring a secure and profitable career). While this makes sense from a practical point of view, it also indicates a changing attitude toward larger perceptions of civic duty and involvement, and what such duties entail. In today’s society, we are still called upon to vote, to engage in important local and national political decisions. Is it still necessary that we have a well-rounded and thorough understanding of non-vocational subjects such as philosophy, mathematics, economics, and history in order to make wise decisions in these matters?
Writer and philosopher Susan Neiman thinks so—in a recent Q&A with Salon, she talks about our increasingly infantile age, and points to its political-intellectual roots:
The social structures within which we live are constructed so as to keep us childish. The state has an interest in preventing us from thinking independently, and it cultivates and exploits our worst tendencies in order to do so, for grownup citizens are more trouble than they’re worth. The state’s desire for control and our own desire for comfort combine to create societies with fewer conflicts, but they are not societies of adults.
… It’s remarkable that though we are constantly told to exercise our bodies regularly, we hear very little about the importance of exercising our minds after we’ve finished our formal education. Reading Rousseau and Kant is one way to do so. … As Rousseau and Kant teach us, society has an interest in our not reaching maturity. By encouraging our most infantile characteristics, and diverting us from the truly important adult questions, it distracts us from the social problems that need to be solved.
While I would refrain from making a blanket “It’s the state’s fault” statement when considering our society’s love of youth, Neiman is right that our intellectual immaturities often prevent us from participating in civic matters with the necessary discretion, thoughtfulness, and prudence. The question of why we exercise our bodies and not our minds only points to this immaturity: we prize a fit body because it is more youthful and likely to produce longevity. But in our free time, we feed our minds on Netflix and Facebook surfing. The liberal arts are focused on developing healthy patterns of thought, robust intellectual habits and virtues. They teach us how to think, how to engage with important subjects, how to read and speak with a careful intelligence. By focusing only on career and monetary compensation, not on intellectual sagacity, our education system reinforces materialistic habits—and thus bolsters an idolatry of youth and comfort that (inadvertently or no) can encourage the dominance of the nanny state.
Of course, such problems aren’t solved by someone deciding to become an English major. But more thorough study of the liberal arts, in one’s private time as well as in one’s academic studies, may help reverse the trends of immaturity that we see scattered throughout our culture. Such studies may not lead us to greater financial benefit—but they will cultivate long-lasting political and cultural goods.
When I think of the July Fourths of my childhood, one stands out in particular. We had spent the weekend at a large family reunion in Cascade, Idaho, where my cousins and I biked along the lake, played chess, and created a makeshift parade with flags duct-taped to our hats. During mealtimes, the mothers, aunts, grandmothers, and great-aunts brought out pasta salad, baked beans, casseroles, and all those other lovely northwestern comfort foods. We would load up our paper plates, sit around picnic tables, and listen to my great-grandfather, his sister, and brothers—all nearing or in their 90s—swap stories about the old days.
The night before the Fourth, my sister, cousin, and I travelled home with my grandparents. We spent the next morning helping them unpack supplies from their camper, and then Grandma fed us in a splendid fashion. That night, we watched fireworks from the fields next to their house: fields that led up to a steep cliff, overlooking the valley beyond. I was cold, so Grandma gave me her red jacket. Like any splendid grandmother, she wore seasonal clothing with aplomb. This unbelievably soft jacket had a bejeweled flag pinned to the left shoulder.
I’ve often wondered why that Independence Day stands out so strongly in my memory. Perhaps it’s because it embodies many of the things I love and feel devotion for: strong family ties, rich heritage and history, distinctive local food cultures, bright aesthetic displays of devotion.
In recent years, my Fourth of July has looked quite different. There were some college summers spent at Nationals baseball games in D.C., grabbing food in Georgetown and watching fireworks on the Potomac. Post-marriage, I joined a new clan, and began to acclimate to their Independence Day culture: there were fewer casseroles and more salads, but the close family dynamic remained.
What I’ve missed most on the East Coast, perhaps, is that rich tapestry of inter-generational history and context that gives permanence and meaning to one’s celebration. Washington’s Fourth of July revelers are a less rooted bunch—many of them interns and recent college graduates, who scan the homepages of the Washington Post and Washingtonian for the best hole-in-the-wall restaurants and rooftop bars from which to watch fireworks and drink booze. They have no family get-togethers to attend. Even my husband’s close-knit family members are immigrants to this area, originally from the Midwest.
Yet there is an aura to Washington’s Fourth that is distinctive and interesting: it involves a sense of intellectual camaraderie and depth that is very different from the Independence Days of my past. These days, I am better versed in Tocqueville and the Federalist Papers. I’ve visited monuments and written extensively on American politics; I’ve met congressmen and attended countless policy panels. These all give me an interesting tapestry of intellectual and political distinctiveness through which to view July 4th. They even give me a slight sense of kinship with the various interns, politicians, and think tankers scattered throughout Washington.
But I’m also learning that a little bit of homesickness is almost a necessary piece of Independence Day, as we grow older. It reminds us where we’re from: of the local ties that bind us, the fond remembrances that truly encapsulate our American experience. Without that sense of loss and homesickness, we wouldn’t understand fully what it means to love a place and its people.
So this year, as I gather to eat with new family and watch fireworks with old college friends, I’ll also be thinking of Grandma and her soft red jacket—of watching fireworks burst bright colors across the shadows of Idaho fields, and the sweet tapestry of family and history that brought me to this new place.
How ought we to read? In the New York Review of Books, Tim Parks considers a Vladimir Nabokov quote which promotes the intense reading of a few over the broad perusal of hundreds—and he wonders, is Nabokov right?
“When we read a book for the first time,” Nabokov complains, “the very process of laboriously moving our eyes from left to right, line after line, page after page, this complicated physical work upon the book, the very process of learning in terms of space and time what the book is about, this stands between us and artistic appreciation.” Only on a third or fourth reading, he claims, do we start behaving toward a book as we would toward a painting, holding it all in the mind at once.
… The ideal here, it seems, is total knowledge of the book, total and simultaneous awareness of all its contents, total recall. Knowledge, wisdom even, lies in depth, not extension. … Since a reader could only achieve such mastery with an extremely limited number of books, it will be essential to establish that very few works are worth this kind of attention. We are pushed, that is, toward an elitist vision of literature in which aesthetic appreciation requires exhaustive knowledge only of the best.
… So, is this an ideal attitude to literature? Is Nabokov right that there is only rereading? Does the whole posture, both Nabokov’s and that of critical orthodoxy, bear any relation to the reality of our reading habits, particularly in a contemporary environment that offers us more and more books and less and less time to read them?
Meanwhile, Ken Kalfus writes a relatable—and even amusing, though in an almost tragic way—piece in the New Yorker about the way we shop for books now:
Bookstores have become places of regret and shame. We once enjoyed shopping in them or simply looking in their windows, back in the days when they were ordinary retail establishments. They were like stores that sold shoes or hats, but with more appealing merchandise. Now they’ve taken on moral significance. Buying a book and choosing the place to do so involve delicate and complicated considerations. You may fail to do the right thing.
… My remorse enfeebles me. I recognize that I’m no longer thinking about the essence of the reading experience or the book I want to buy, which in the depths of my moral rumination has been turned into simply another form of consumption, and not even that, but rather the aspiration to consume.
For the bibliophile, these dilemmas are deeply understandable. We’ve all confronted that large and looming bookshelf, considering furtively—or even fearfully—what we should buy… or whether we should buy anything at all, since we’ve probably got three or even fifteen books on our shelves that are as yet unread. We think of the treasure troves left to be discovered, the talented authors whom we could support through our purchases, the tiny indie bookstore feebly scraping by, day after day. And all of a sudden, choosing a book becomes a monumental, even moral, task.
Are other pastimes saddled with this moral weight? Few of us spend so much time and mental energy perusing Netflix or movie theater listings. True, we may be overwhelmed by choices; but in the end, we’re merely seeking some evening entertainment. And unless one is a true film enthusiast, these cinematic choices don’t leave us in any sort of moral quandary or panic.
So why is reading different? Perhaps because, for many of us, it’s more than entertainment: it’s part of a larger search for truth, goodness, and beauty. It’s a way we delve deeper into our souls, and the souls of others. It often leaves us shaken and transformed. As Nabokov points out, the longer we spend immersed in a particular work, the more we begin to know and love it—and the more it begins to change us. Reading leaves a more indelible mark on the human mind than most other forms of entertainment. Thus, choosing a book is often like choosing a particular course for one’s future: mapping out the free hours of the coming days, yes, but even more, mapping out a new mental and spiritual journey for the self to embark upon.
Through our reading, we come to know and love writers. We often find ourselves compelled to keep buying and reading their material, seeking to know them better—and seeking to support their work.
Through our reading, we come to know and love bookstores. They leave us with deep sensory impressions, fond memories of serendipitous discovery, a lingering thirst for joys yet undiscovered. We frequent our favorites with religious devotion—memorizing the layout of their shelves, seeking their most comfortable chairs.
For the bibliophile, reading is a monumental part of living. And thus, the choosing of every book must be considered deliberately and thoughtfully. Which brings us back to the two questions that Parks and Kalfus consider: first, should we seek quality or quantity in our reading? And second, what moral claims should lie heaviest on our hearts in the choosing of a book?
Back in 2013, I wrote about George Vanderbilt’s enormous library and voracious reading habits: he reportedly read 3,159 books in his lifetime (approximately 80 books a year). I used to think that such a feat would be achievable for me—3,000 didn’t even sound like that many. But the older I get, the busier life becomes… and getting through 40 or 50 books in a year seems like a monumental accomplishment. I’ve realized that Vanderbilt’s record will never be mine—and that it doesn’t have to be. Instead, it’s become clear that I ought to savor each book: re-reading my favorites, while still seeking out those essential reads that stretch brain and attention span in healthy ways. Speed reading is useful for the accumulation of necessary knowledge, while slow reading is essential for the appreciation of written beauty. Perhaps our best reading choices lie at the junction of quality and quantity: we can quickly peruse tedious or secondary works, then slowly absorb the masterpieces worth relishing.
And what of the purchasing of books? William Giraldi touched on this struggle in his excellent piece on personal libraries earlier this year. “Agonizingly aware of the human lifespan,” he wrote, “the collector’s intention is not to read them all, but, as E.M. Forster shares in his essay ‘My Library,’ simply to sit with them, ‘aware that they, with their accumulated wisdom and charm, are waiting to be used’—although, as Forster knows, books don’t have to be used in order to be useful.”
To the non-bookworm, such sentiments probably sound ridiculous and expensive. But to the bibliophile, such a statement bears witness to the dilemma we all face, the tightrope we walk on every trip to the bookstore or the bookshelf: what to read, what to re-read, and why are questions that must be weighed in the balance. Thankfully, a good book is worth all the work.