“Does GPS kill curiosity?” That’s the title of a piece David Sturt and Todd Nordstrom wrote over at Forbes yesterday. Ironically, their article doesn’t really give an answer—their points on geocaching and the criminalization of curiosity are metaphors for a discussion of creative innovation in the workplace. They encourage businesspeople to “go somewhere ‘in-between’,” to “lose your map,” but all in a figurative sense. So despite the options offered to innovators in the workplace, we’re left with this question:
… Has the true explorer died within our culture? Have the pathways, roadmaps, proven strategies and GPS units killed off the spirits of great explorers like Magellan, Lewis and Clark, and Marco Polo?
It is good to consider the effect GPS systems have had on our culture. They have greatly enhanced the ease of travel—the ability to get from point A to point B—but they also make it more difficult to “go somewhere ‘in-between,’” as Sturt and Nordstrom write. We journey, most often, on freeways that are disconnected from the social and architectural fabric of the communities we pass. We are often too busy noting our estimated time of arrival and upcoming traffic patterns to enjoy the passing landscape. The GPS always recommends the most efficient route—but it takes no note of scenic or historic importance. This is great when you’re in a rush, but it can be damaging on road trips, when we’re meant to see and savor.
GPS-navigated travel can also encourage a sort of mental laziness in the driver. We aren’t forced to remember which turns we take, or which roads we’re driving on; we merely follow the GPS’s step-by-step instructions. Contrast this with traveling by map (even a printed-out Google map): while the GPS feeds us baby bites of navigation, the map forces us to pay careful attention to every sign we pass, every twist and turn of the road. We recognize landmarks and road signs, and can easily find our way a second time, sans map.
Maps, of the smartphone and printed variety, are incredibly useful tools. I’m not saying we should stop using them. But there are times when, perhaps, we should consider taking Sturt and Nordstrom’s advice more literally—when we should seek out “in-between” places, rather than focusing on “getting from destination to destination.” Some of the best places are “in-between” larger places: small towns nestled up on mountain roads, scenic hikes tucked away from urban bustle, hole-in-the-wall restaurants hiding from more bustling thoroughfares. Our explorations of place should involve a desire for detours, and a willingness to stop.
We should also be willing to put down the maps. As Sturt and Nordstrom put it, “All of us have grown up in a world where our outcomes have been directed, and expected … we are suggesting you let your curiosity be your guide into some unknown territory.” Following maps—whether a smartphone GPS system, or merely our own travel-worn steps in a familiar town—can prevent us from discovering new and beautiful things. Setting down these guides enables us to discover beauty and mystery, both on the road and in our most familiar places.
Focus is a difficult thing to muster these days. As David Brooks wrote in a recent New York Times column, we are all “losing the attention war. … Many of us lead lives of distraction, unable to focus on what we know we should focus on.” This affects many of our daily activities, especially those that require sustained mental concentration. Reading poses a minute-to-minute challenge: the reader always feels the pull of the next Twitter story, the latest email, the newest Facebook post. Whether reading news articles or books, we feel the distractions itch at our brains.
Yet despite our society’s general lack of focus, we aren’t necessarily abandoning long books—The Goldfinch, one of the most popular novels on the market right now, is 771 pages long. But this lack of focus does mean that modern books increasingly cater to the short-term attention span. Tim Parks explains at the New York Review of Books:
Never has the reader been more willing than today to commit to an alternative world over a long period of time. But with no disrespect to Knausgaard, the texture of these books seems radically different from the serious fiction of the nineteenth and early-twentieth centuries. There is a battering ram quality to the contemporary novel, an insistence and repetition that perhaps permits the reader to hang in despite the frequent interruptions to which most ordinary readers leave themselves open. Intriguingly, an author like Philip Roth, who has spoken out about people no longer having the “concentration, focus, solitude or silence” required “for serious reading,” has himself, at least in his longer novels, been accused of adopting a coercive, almost bludgeoning style.
Parks suggests that, in this world of limited attention spans, the serious writer may have to parcel out his or her writing into shorter sections or volumes if anything like the eloquent, expansive reading of prior eras is to remain. An idea worth considering: could this trend toward short reads give rise to a deeper appreciation of poetry? Poetry is thought-provoking and often mind-taxing, true—but it’s short. Or at least, some of it is. It would be interesting to see whether it makes a comeback.
Another possibility worth considering: what if we brought back the short story, or the serialized novel? The New Yorker publishes short fiction, but few other large journalistic publications do. Yet serialized fiction once occupied a considerable place in the periodical press. What if modern novelists, like Charles Dickens in days past, published their novels in publications like the New York Times or TIME magazine—one chapter at a time? I think the public would love it—and it could help print publications rebuild an audience they seem to be steadily losing.
Dear Mr. Dawkins,
You’ve said lately that fairy tales are quite harmful. Your reason for thinking this is simple, and true: you told attendees at the Cheltenham Science Festival, “I think it’s rather pernicious to inculcate into a child a view of the world which includes supernaturalism … Even fairy tales, the ones we all love, with wizards or princesses turning into frogs or whatever it was. There’s a very interesting reason why a prince could not turn into a frog – it’s statistically too improbable.”
But shortly after, you did add a caveat to those statements—you noted that you do not “condemn fairy tales. My whole life has been given over to stimulating the imagination, and in childhood years, fairy stories can do that.” But you still wondered, understandably, if fairy tales “inculcate into a child’s mind supernaturalism … that would be pernicious. The question is whether fairy stories actually do that and I’m now thinking they probably don’t.”
There are two reasons I think fairy tales are important, and I wonder if you’d consider them—especially the first. I don’t know if you’ll like the second, because I think it may confirm your worst fears.
The first reason is one C.S. Lewis (I know you’re probably not a fan of his, but bear with me) posited in his essay “On Three Ways of Writing for Children.” There, he suggests that fairy stories impart an important—and very real—courage to their readers, through metaphor:
… Since it is so likely that [children] will meet cruel enemies, let them at least have heard of brave knights and heroic courage. Otherwise you are making their destiny not brighter but darker. Nor do most of us find that violence and bloodshed, in a story, produce any haunting dread in the minds of children. As far as that goes, I side impenitently with the human race against the modern reformer. Let there be wicked kings and beheadings, battles and dungeons, giants and dragons, and let villains be soundly killed at the end of the book. … It would be nice if no little boy in bed, hearing, or thinking he hears, a sound, were ever at all frightened. But if he is going to be frightened, I think it better that he should think of giants and dragons than merely of burglars. And I think St. George, or any bright champion in armour, is a better comfort than the idea of the police.
Lewis saw a potent metaphorical force in the fairy tale: it helps children battle the pains and frustrations of reality through its images of valor and heroism. None of us ought to read children news stories about serial killers and tragic accidents—these are too graphic and frightening for young minds. But by reading them stories of evil monsters, and by telling them of knights and heroes who bravely stood up to such monsters, we give them greater mental and moral strength. When they grow older, they’ll have to fight their own real-life villains and calamities. The fairy tale’s metaphorical power lends them real strength as they grow.
There has never been a time in human history when knowledge was so readily available to the average person. The vast annals of the Internet beckon to each person with infinite possibility. But this limitless compendium, despite its many good qualities, has its dangers as well. Karl Taro Greenfeld explained some of these drawbacks in a New York Times article last Saturday:
What we all feel now is the constant pressure to know enough, at all times, lest we be revealed as culturally illiterate. So that we can survive an elevator pitch, a business meeting, a visit to the office kitchenette, a cocktail party, so that we can post, tweet, chat, comment, text as if we have seen, read, watched, listened. What matters to us, awash in petabytes of data, is not necessarily having actually consumed this content firsthand but simply knowing that it exists — and having a position on it, being able to engage in the chatter about it. We come perilously close to performing a pastiche of knowledgeability that is really a new model of know-nothingness.
He is right to point out that our knowledge is often of a shallow sort: the kind gleaned from a swift perusal of headlines, or a lunchtime browsing of 140-character tweets. But even if our knowledge is elementary, we feel we must have it. Ignorance—or at least, a humble acknowledgment of ignorance—is largely taboo. If we are ignorant about something, it’s usually best to hide it.
Leah Libresco noted in a recent TAC article that the satiating of curiosity via Google has encouraged a pervasive ignorance-guilt in our culture. Libresco notes that the website “Let Me Google That For You” (LMGTFY.com) “exists to rebuke those who ask a friend something that they should have googled … The unstated premise is that asking for help is a rude imposition, one that reveals incompetence or laziness.”
Outside the realm of friendship, this fear of “incompetence” seems to run very deep. In the career world, it’s almost dangerous to be ignorant. When writing resumes, going to job interviews, talking to colleagues, or chatting at happy hours, we must be—above all else—knowledgeable. It seems imperative that we know the code words, the vague cultural or technical references. And in the larger cultural conversation, at least a minimal awareness of the literary, musical, and artistic world is a prerequisite for proper participation. “Whenever anyone, anywhere, mentions anything, we must pretend to know about it,” Greenfeld writes. “Data has become our currency.”
This cultural tendency is dangerous for a couple of reasons. First, it de-emphasizes “longhand” knowledge. A miniature collection of facts and curiosities is a useful toolbox to have at one’s disposal, but the best knowledge—the sort that guides our souls and enlightens our minds—ought to run deeper. As Greenfeld notes, we must allow ourselves to be “lost in the actual cultural document itself,” whatever it might be, rather than take the more popular alternative: “to mine [the document] for any valuable ore and minerals — data, factoids, what you need to know — and then trade them on the open market.”
Additionally, our emphasis on bloated erudition has created an atmosphere in which it seems impossible to say the words, “I don’t know.” They almost seem rude. They convey all the things our society despises: technological incompetence. Lack of erudition. Shallow-mindedness. Amateurishness. In this world, the ignorant are scorned—even if theirs is a thoughtful or humble ignorance.
Taking back the words “I don’t know” is important—for while such words can imply intentional ignorance or apathy, they can also signal a bevy of virtues: humility, teachability, an eagerness to truly learn. Unfortunately, a website like LMGTFY.com exists to deride and sneer at such humility. It leaves us thinking that, in a technological age, all knowledge—even the specialized sort—has become common knowledge. And in a world of common knowledge, humility is no longer a virtue.
The New Trad is a poetry journal and self-described experiment, published by a small press in Sydney, Australia. This little literary platoon is determined to revive poetry’s place in the public consciousness, come what may. They are well aware that they face an uphill battle, but their resolve to eschew free verse and highlight the importance of place and rootedness is admirable.
The introduction makes a valiant—and largely successful—effort to describe the two-part decline of poetry in the public sphere. On the popular side, the song and the novel largely displaced the poem during the second half of the twentieth century:
The people’s poet of the nineteen-sixties was not Ted Hughes but Bob Dylan. The popular song is what impinges on the traditional territory of the poem, forcing it to deform itself to justify its existence, much as the photograph did to painting, film to theatre, or science to philosophy. (p.9)
But changes in popular taste are not solely responsible for poetry’s endangerment. The editors accurately observe a recalibration among the intellectual milieu and academia writ large from a vertical to a horizontal orientation: “Above was replaced with ahead; the promise of a kingdom of heaven was exchanged for the promise of the Enlightenment: a self sufficient humanity, striding forward into a future of progress, peace and self-mastery.” (p.7) The unfortunate byproduct of this rearrangement is the poem’s obscurity: the connection between poet and reader is lost. A true commitment to form is replaced with lines devoted to the author’s own internal monologue, diminishing their impact on his readers.
Now that the poem is chiefly a means of expressing individualism, it has lost much of its original potency in evoking a particular time and place. Poetry is a cultural heirloom in a way the novel is not. Novels, by design, are narratives subject to the author’s vision. Poetry, even in epic poems that carry a narrative, is imbued with historical and cultural ties that are bound up with identity. Homer’s influence on the Ancient Greeks, for example, was inestimable not only in its artistic contribution, but in its cultural legacy. In other words, poetry has a rootedness—both in its structure and in the themes it evokes.
In its first volume, New Trad seeks to bring back the locality of metered verse, mostly modeled on Ancient Greek rhyme and meter patterns. The raw emotional power the editors find in the confessional poetry that dominated the postmodern period (Sylvia Plath’s Ariel was the groundbreaking work in this arena) is preserved in the poems printed here, but rhyme scheme and meter are the invisible structure holding it all together. The journal’s first edition is divided into three parts: the first is a handful of short poems submitted by writers and academics; the second contains two academic papers on meter, the latter of which introduces a segment of an epic poem, written in an Icelandic style, that constitutes the final section. You can get through the volume in an afternoon.
Ian Marcus Corbin’s excellent review of a book on Albert Camus brought to mind some important—and easily forgotten—truths about the intersection of the abstract and the particular, specifically as they apply to the realm of writing. One of my favorite passages from the article:
There is no way for a thinker—or indeed, a user of language—to eschew abstraction entirely, of course, but Camus was deeply attuned to the dangers of excessive abstraction. This may not sound particularly heroic, but it can be, and it certainly was in Camus’s day. Camus’s peers, mid-century French intellectuals, were all too susceptible to the raptures of abstraction. The Left Bank bien pensants were, with few exceptions, stalwart armchair Marxists, obliquely aware that the divine dream of the worker’s paradise was exacting a brutal toll on the actual humans of the Soviet bloc, but blissfully unmoved by this fact. Camus publicly, angrily, charged that their fixation on beautiful ideas made them insensate to the ugly cost such ideas imposed on the much-beloved proletariat. And indeed, it is now difficult—impossible—to think Camus wrong.
In contrast to other philosophers of his day, Camus couldn’t turn a blind eye to the pain—and beauty—of his world. Corbin notes that he travelled to executions and wrote about them in “excruciating detail.” He was a man “entranced” by the real—a man who once wrote, “There are, before our eyes, realities stronger than we ourselves are. Our ideas will bend and become adapted to them.”
These descriptions of Camus’s focused writing and living reminded me of a New Yorker piece I read yesterday about Nellie Bly—a groundbreaking female journalist who, in perhaps her best-known feat, feigned mental illness in order to report on maltreatment and abuse inside an asylum. The sheer grit this required amazes me. Bly’s undercover journalism demanded courage, persistence, and a deep love of humanity.
But long, person-focused stories like hers seem to dominate the media less and less, unfortunately. And though it’s true that shrinking newspaper budgets may play some part in this, I think technology, nationalization, and the “datafication” of the news are the primary culprits.
First: it’s becoming increasingly easy for journalists (myself included) to work in front of a computer all day. While some still go out into the world and interview real people, interviews are increasingly easy to conduct over the phone or by email. And the web is a machine that can dangerously curate our experience of the world: Google weeds out news stories and websites that it thinks users won’t want. It becomes increasingly difficult to encounter people who defy our stereotyped vision of the world. We deal mostly in trends, blog stories, and clickbait. It’s much easier to lose the human face in this world.
The primary realm of on-the-ground reporting remains the small-town newspaper—and sadly, these newspapers are suffering most in our current economy. I think it’s vital that these small-town presses stay alive. They play an important role in the lives of their citizens, and do much of the work that newspapers were principally created for. National newspapers, while still offering us valuable information, focus by design more on the aggregate than on the individual. They’ll cover a person, occasionally—but only if the person is notorious, or emblematic of a larger trend or movement.
It is true that there are still beautiful news features that focus on the human person, as an individual. But it often seems more useful, pragmatic, and precise to measure the human species (or voter bloc, or nationality, or gender) as a whole, and write about that. Thus, our stories change—rather than writing about a local single mom, we write about “Why America’s Single Mothers Struggle With So-And-So.” Instead of writing about a young woman struggling to find a good job, we write about “The Confidence Gap.” Instead of discussing gentrification in Seattle or San Francisco, we write about “How Gentrification Hurts People” (complete with pretty graphs).
This rebuke isn’t just for websites like Vox, which focuses perhaps more than most on the quantified news story. Many of us journalists nationalize and abstract our news stories. After all, we’re told to—the news cycle and the commenters reinforce, and even demand, abstraction and quantification. A writer who speaks from experience, or tells a singular story, may receive the retort: “Well, that may be true for you or your source. But where are the numbers? This seems like an isolated incident.” We begin to believe that personal stories no longer matter—unless they fit within a well-known trend or datafied truism.
David Brooks made some excellent observations on this tendency in his Thursday New York Times column:
…[A]cademic research offers a look at general tendencies within groups. The research helps you to make informed generalizations about how categories of people are behaving. If you use it correctly, you can even make snappy generalizations about classes of people that are fun and useful up to a point.
But this work is insufficient for anyone seeking deep understanding. Unlike minnows, human beings don’t exist just as members of groups. We all know people whose lives are breathtakingly unpredictable: a Mormon leader who came out of the closet and became a gay dad; an investment banker who became a nun; a child with a wandering anthropologist mom who became president.
… By conducting sensitive interviews and by telling a specific story, the best journalism respects the infinite dignity of the individual, and the unique blend of thoughts and feelings that go into that real, breathing life.
Thus, Camus and Bly wrote about the individual. They still acknowledged the big picture, and made sure to write about it. But they didn’t forget the inconsistent, unpredictable beauty of the individual.
Neither should we: though we may (and perhaps still should) write stories about “Why Reading Is Important for Everyone Everywhere At All Times,” or “How Women Can Conquer the Inequality Gap,” we should also write stories about Tom, Harry, Mary, and Anne. Because theirs are the stories, says Camus, that are real.
A couple of years ago, I started using Goodreads: it’s a useful tool for keeping track of books read and enjoyed, and a great place to discover books not yet read. But now I’m considering stepping away. And it has nothing to do with the social network itself—it has to do with me, and my susceptibility to self-consciousness as a reader.
Tania Unsworth, the author of three books, described this tendency well in her recent post on Nerdy Book Club. She reminisces about reading as a child—the utter abandonment it proffered—and compares it with her reading now:
There was an intensity to reading then, a kind of total involvement in story that is hard to reproduce as an adult. I know too much now about tired plots and clichés. I am always comparing one thing to another, recognizing devices, identifying styles. No matter how good or bad I find something, I’m always aware of my response, slightly detached, consciously enjoying or not enjoying.
She writes of a time when she was a “girl of eleven, with a flashlight under the covers, devouring The Chronicles of Narnia”—when there was a complete immersion in the world of the novel, when one connected with a book’s protagonist and experienced the world through the eyes of the “other” in a powerful, beautiful way.
Why do readers lose this sort of joyous abandonment?
Perhaps it starts with book-based essays and college papers: with the constant call to analyze, quantify, and measure what we’re reading. This is, to some extent, unavoidable. But it doesn’t end there: the social media world encourages us not to do things for their own sake, but rather for the approval and attention of our burgeoning online audiences. We don’t just read according to the suggestions of others; we don’t just join book clubs. Rather, we Instagram pictures of ourselves reading, join a social network where we can show off our huge bookshelves, and post smart-sounding quotes on Facebook. I’ve done this—perhaps we’ve all done this. The problem is that it uses the author, the book, and the protagonist for our own personal, selfish ends. It makes the book about us, rather than about the story itself.
There’s also the siren call of list-making: we all love watching a list of accomplishments grow. This is perhaps the biggest reason I personally find Goodreads dangerous: it enumerates the books I’ve read and organizes them into lengthy, admiration-worthy lists. It encourages me to look not at the quality of the reads, not at the specific beauties of various works, but to admire and venerate the amalgamated, monstrous whole. Thus I begin to rush: I want to finish this book, that book, and the next—not to meet a deadline, not because I’m so engrossed in the book I can’t stop—but merely to check another book off my list.
Perhaps, for a writer and occasional book critic, this sort of self-conscious reading is somewhat inevitable. But I do want to re-experience the animation and passion Unsworth describes in her article. There is a beauty to the imagination that deep reading requires. Whenever we read for criticism, for an audience, or for the joys (and dangers) of list-making, we will always have something besides the story in mind: whether ourselves, or our audience.
Conservatives have developed a tendency to deify the rural, as Matt K. Lewis noted in a Thursday story at The Week. This idolization probably stems from a variety of influences—Lewis mentions “the influence of religion (think the Garden of Eden versus the Tower of Babel), philosophy (Rousseau’s notion about noble savages), and various ideas during the time of America’s founding (Thomas Jefferson’s agrarianism, for example).”
But what of the city? Isn’t the city one of the classical bastions of culture? Indeed it is. And Lewis argues that rural-worship is not, in fact, true to the roots of classical conservatism:
Much of conservatism — free markets, for instance — is premised on the notion that more people equals more ideas. (This, of course, is inconsistent with a more traditional, populist strain of conservatism.)
This more optimistic brand of conservatism gained a foothold when economists like Julian Simon and Ester Boserup took on the Malthusian catastrophe argument (which erroneously predicted that global overpopulation would lead to mass starvation), and instead argued that more people equals more ideas, innovation, and yes, prosperity.
When you think about it, it makes sense. Rural societies tend to work on subsistence (you eat what you grow), but cities lead to things like cooperation, specialization, and trade. These things make us rich. Cities are the areas where these things are magnified. More people — constantly bumping into each other — leads to all sorts of inventions and human flourishing.
He adds a bit later,
Perhaps the most ironic thing about other conservatives adopting an anti-city worldview is that it is partly based on a pernicious lie advanced by the high priest of romanticism, Jean-Jacques Rousseau.
Rousseau essentially invented his own creation myth out of whole cloth. It differed greatly with the Christian understanding of creation, inasmuch as instead of viewing man as a fallen creature (due to original sin), Rousseau envisioned early man as a sort of noble savage. It wasn’t until man recognized the concept of property and ownership, Rousseau argued, that he became greedy and corrupted. In that view, a simple life is good and pure. A modern urban life is dirty and wrong.
… As a boy growing up in rural western Maryland (seriously, this was physically and stylistically closer to West Virginia than Baltimore), it was instilled in me that country folks were God-fearing, salt-of-the-earth types and that big city folks weren’t. The sense wasn’t just that cities were different, but that they were somehow morally inferior.
Lewis’s post struck many chords with me. As a “crunchy con”-leaning girl with strong ties to farmland and agrarian culture, I love to champion a Wendell Berry-esque conservatism that savors the beauty of fields, farmland, and small-town community. Indeed, Wendell Berry’s writings—though excellent and full of good thoughts—do have a tendency to revere the rural and unfairly criticize the cosmopolitan. Sadly, many conservatives (myself included) confuse this love of the “pastoral” with a proper love of “place.” We think that, in order to be “rooted” to a specific plot of land, we must be rooted in a country haven.
However, when one really considers the urban nature of America, it makes no sense—and indeed, it would be detrimental—for all of America’s conservatives to abandon urban areas and cosmopolitan centers for a secluded country lifestyle. We may need our countryside Benedictine havens—but we might also need a few similar havens within the city itself.
It’s Saturday—the day of waiting, the day of quiet. The day when disciples quaked behind closed doors, and darkness covered the lands, and the Son of God lay in a tomb. The day of aching, grieving, seething pain.
That 24-hour cycle of numbness and fear throbbed through Jesus’s disciples, through the people who were “looking for the kingdom of God,” like Joseph of Arimathea. It was only after Jesus was dead that Joseph and Nicodemus finally exposed their allegiance—they took Jesus’s body and wrapped it in a linen shroud with some 75 pounds’ worth of spices. They wrapped his body in their own allegiance and love, telling the world whom they followed.
And the women followed and saw—the women who had cared for Jesus, ministered to his traveling troupe—they followed him from road, to cross, to tomb. They didn’t fear the blood or turn away. They didn’t run and hide. They followed and watched, then went to prepare their spices and ointments for His body. But first, on the Sabbath, “they rested according to the commandment.”
How do you rest when your hopes and dreams are lying in a grave?
We live in a culture of pain. So often, our response to the world’s pain and death is either cynicism or despair. Author Leslie Jamison writes in her essay, “Grand Unified Theory of Female Pain,” that we live in a post-wounded culture (she limits this to women, but I think it could apply to much of our world):
The post-wounded posture is claustrophobic: jadedness, aching gone implicit, sarcasm quick on the heels of anything that might look like self-pity … Their hurt has a new native language spoken in several dialects: sarcastic, jaded, opaque; cool and clever.
This is a world that has screamed with the pain of genocide, holocaust, terror and war. It’s a world in which 55 million babies have been aborted since Roe v. Wade in 1973. It’s a world of shunning and racism, hate and abuse, violence and fear. We grow accustomed to the stories—we look back on anniversaries and shrug our shoulders: What could we have done differently? Perhaps nothing. We sit in the silence and nurse our aching wounds. We begin to believe the lie: we were made for this bleak, hostile, hurting world. We were made for death and destruction.
Though often mistaken for a partisan ideology, true conservatism represents a much richer intellectual tradition.
We’ve just published a new Kindle ebook, The Essence of Conservatism. It contains a series of classic essays by thinkers like Andrew J. Bacevich, Roger Scruton, and Daniel McCarthy. The collection addresses the legacies of Irving Babbitt, Wilhelm Roepke, and Evelyn Waugh; examines the forces of rising inequality and militarism; and points toward local stewardship and “high church” conservatism.
These works help clarify and retrieve conservatism’s deepest insights, demonstrating their essential relevance to the present moment. Buy the Kindle e-book on Amazon for just $2.99!