Gracy Olmstead is a writer and journalist located outside Washington, D.C. She’s written for The American Conservative, The Week, National Review, The Federalist, and The Washington Times, among others.
America is increasingly polarized.
That isn’t news to anyone who’s been following the social research of the past couple of years. After the 2016 presidential election, David Wasserman of FiveThirtyEight wrote that “America’s political fabric, geographically, is tearing apart,” and suggested this should be seen as a “flashing danger sign.” In Fractured Republic, published in May 2016, Yuval Levin wrote of a hollowed-out society in which mediating institutions and social capital had all but disappeared from American life, leaving in their wake a jaded individualism and growing political rancor.
But a new Pew Research Center poll suggests that this polarization—across geographic, cultural, and political lines—is growing even more pronounced with time. Our political differences are strengthening, with an increasing number of urban Americans moving further left and more than half of rural voters (54 percent) declaring their allegiance to the GOP. What’s more, most urban and rural Americans see themselves as judged and misunderstood by each other, with a majority from both groups saying those who don’t live in their types of communities have a negative view of those who do.
Urban and rural divides are not new, as University of Wisconsin political scientist Kathy Cramer told the New York Times. What’s unique about our moment, however, is that “cultural divides overlap with political divides, which overlap with geography,” creating a maelstrom of suspicion and disconnect.
This remarkable growth in polarization leads the Times to ask an important question: are we sorting ourselves, increasingly moving to fit in with those in our “camp”? If not, how and why are the numbers becoming so extreme?
Cramer, for her part, suggests that place-based resentment is becoming a sort of identity marker, especially as politicians employ “us versus them” rhetoric. Shopping at Whole Foods or going to the gun range have increasingly become political acts, talismans of personality and place with markedly partisan affiliations. Our sorting seems to have more to do with an increased tendency to tie cultural and social acts (as well as geographic identity) to politics than it does with a marked shift in our habits or moving patterns.
Alongside these differences, however, the Pew poll also shows remarkable (and somewhat alarming) similarities between urban and rural communities. Both groups are about equally worried over the impact of the opioid epidemic on their neighborhoods. Both are worried about job availability. Young people from both are more mobile and restless—although “Roughly a third (32%) of young adults in rural areas say they are very or mostly dissatisfied with life in their community; this is significantly higher than the share of young adults in suburban areas who say the same (21%).”
About four in 10 Americans across geographic divides say they don’t feel attached to their current communities. While knowing one’s neighbors, owning one’s house, and living in one place for a long period of time all increase the chances of community involvement and satisfaction, only three in 10 Americans say they know most or all of their neighbors—and a third say they would move away if they could. While a greater percentage of rural folks say they know their neighbors, that doesn’t mean they interact more often. Indeed, according to Pew, community involvement doesn’t vary much by community type: “Among those who know at least some of their neighbors, rural Americans are no more likely than their urban and suburban counterparts to say they interact with them on a regular basis.”
Obviously, these figures could be worse. Most Americans say they still know at least some of their neighbors; large numbers in urban, suburban, and rural communities say they remain close to—or have moved back towards—their families. But there’s still a marked sense of alienation, suspicion, and discontent displayed in this poll. Not only do disparate American communities suspect each other of unkindness and disrespect, many have retreated from neighborliness and association within their own circles.
These findings reminded me of the suggestion in Patrick Deneen’s recently released Why Liberalism Failed that the political ideology of liberalism drives us apart, making us more lonely and polarized than ever. As Christine Emba writes in her Washington Post review of Deneen’s book:
As liberalism has progressed, it has done so by ever more efficiently liberating each individual from “particular places, relationships, memberships, and even identities—unless they have been chosen, are worn lightly, and can be revised or abandoned at will.” In the process, it has scoured anything that could hold stable meaning and connection from our modern landscape—culture has been disintegrated, family bonds devalued, connections to the past cut off, an understanding of the common good all but disappeared.
That latter loss—of a common understanding of the good—seems particularly applicable to the Pew poll’s findings regarding polarization. Although our country has always struggled with an urban-rural divide, it could be that our lack of a common conception of the good has made it even worse. Left and Right subscribe to different liberal tenets that tear at association and community: on the Right, “classical liberalism celebrated the free market, which facilitated the radical expansion of choice,” while the Left’s liberalism “celebrated the civil right to personal choice and self-definition, along with the state that secured this right by enforcing the law.” As Emba notes, both forms of liberalism foster “a headlong and depersonalized pursuit of individual freedom and security that demands no concern for the wants and needs of others, or for society as a whole.”
Thus we disconnect in terms both broad and intimate, struggling to equate our political autonomy and self-definition with the demands of empathy, neighborliness, and service. The fact that our urban and rural communities are so suspicious of each other suggests a degree of navel-gazing and self-consciousness that is deeply detrimental, if not tempered by a proper degree of rationality and generosity.
Fixing these problems will require more than a distrust of our political leaders’ schismatic rhetoric, instrumental in entrenching our divide though that rhetoric has been. Turning to the state for answers or blame is one of the reasons we’re in trouble in the first place. A healthy effort to “plug in”—to connect at the local level, to dialogue with our political “enemies,” and to engage in civic and philanthropic efforts—may be the best way to cut back on some of this rancor and polarization.
In the conclusion of Why Liberalism Failed, Deneen suggests that we need to foster local “counter-anticultures”: bastions of community, civic engagement, philanthropy, and religion to counteract our cultural and social vacuum. Levin recommends something similar in Fractured Republic, turning to Edmund Burke’s “little platoons” and the reinvigoration of associational life as a balm for widespread fragmentation.
This newest Pew poll suggests that the more deeply we know each other—and the more time we spend together—the less lonely and restless we will feel. That isn’t a shocking revelation, but it’s an important one nonetheless. Those who feel nourished and cared for by their communities will feel less cheated by the state and more empowered to confront the changes and dilemmas in their neighborhoods. It may be that by itself this can’t bridge our deep urban-rural divide, considering how widespread our resentment and political differences are. But I do think a community that feels self-sufficient and nourished is less likely to harbor feelings of resentment and suspicion toward those outside its borders: there’s less temptation towards discontent, and often a deeper awareness of the issues we share in common. Philadelphia, Pennsylvania, and Elizabethville, Pennsylvania, need the same things: committed citizens, generous philanthropists, passionate civic leaders, savvy planners and political leaders, strong local institutions, and vigorous community involvement. They often struggle with the same things, too: loneliness, despair, unemployment, fragmented families, weak civic and educational institutions, a lack of funds, poor urban planning, and so on.
While our national discourse champions rancorous politics, local associations and news celebrate self-empowerment, service, and communal ties. They emphasize every community’s desire to become the best version of itself. The more we can focus on these things, the better.
Barnes & Noble’s imminent demise has long been foretold—and reports of it are not necessarily exaggerated. The bookstore has been on a downward slide for years. In 2013, its CEO resigned amid the failure of the company’s Nook expansion. At the time, Idea Logical’s Mike Shatzkin alleged that Barnes & Noble would not recover; it could only hope to “make the slide into oblivion more gradual.”
Five years later, Barnes & Noble is still around. It’s now outlived Borders to become the last national bookstore competing with Amazon in the retail space. But survival does not necessarily mean flourishing: at the beginning of the month, the company’s stock plunged nearly 8 percent. According to the Guardian, the bookstore has lost $1 billion in value over the past five years.
“It’s depressing to imagine that more than 600 Barnes & Noble stores might simply disappear,” New York Times columnist David Leonhardt wrote a couple weeks ago. “But the death of Barnes & Noble is now plausible.”
Visits to Barnes & Noble were one of my favorite occasions as a child. Back then, there was something nostalgic yet stately about the bookstore. It seemed like a gem of grace and sophistication, a place not just to browse but to learn. I remember distinctive elements that still set the store apart in my mind: the hunter-green wallpaper, the woodcut and art deco feel of its literary decorations, the old-fashioned chairs and quiet corners suggesting a respect and reverence for the old and the classic. The store evoked the same “feel” I got when I handled an antique hardback. Back then, the bookstore’s coffee shop had a few tidy tables where patrons could sit, but there was no wifi. I would grab a stack of five or six books and read the first chapter of each one before deciding which I wanted to buy.
Then there was Barnes & Noble’s children’s section, with its old-fashioned Winnie the Pooh prints and large, open stage for readings and play. The children’s book section felt like a fantastical wonderland: a place of discovery, serendipity, and beauty. Even when I got too “old” for it, this was always the part I gravitated to.
Of course, it’s important for a store to update and improve its style and form. I’m not saying Barnes & Noble should have stayed in the ’90s and refused to renovate. The real problem is that the company never seemed to prize its own distinctive beauty, and always sought to make itself more like a technology store rather than emulating the success of its smaller, indie-bookstore counterparts.
Do you remember the Nook? The company is still selling them, though it has been ages since I’ve seen one in-store (and even longer since I’ve met somebody who owns one). Barnes & Noble jumped on the e-reader bandwagon in 2009, hoping to compete with the Kindle and the iPad. But while Amazon and Apple were obviously and entirely digital companies, and their e-readers fit their image and model, Barnes & Noble’s Nook pursuit always felt a bit dissonant and off-key. In many ways, the Nook was the beginning of the store’s end—because it signaled that the company would always try to play “catch up” with the newest fad or trend rather than focus on its own secret sauce: physical books.
Compared to the old Barnes & Noble of my childhood, the new bookstore that just opened in Ashburn, Virginia, is almost unrecognizable. It is large and minimalist, bare of decoration or color. There are only a few chairs scattered throughout, and little room to sit and read in the children’s section.
Instead, this new Barnes & Noble is half bookstore, half café: it features a souped-up Starbucks, a bar, and a kitchen offering items like charcuterie and avocado toast. The theory here appears to be that, if you’re going to sit and read a book or magazine without buying it, they’re at least going to make you pay a hefty sum for wine or a croissant. But as Eater notes, “entering the restaurant business, with its notoriously low profit margins and high rate of failure, is an unlikely Hail Mary for the nation’s largest bookstore chain.” Many of the people I have seen parked at tables inside this Barnes & Noble don’t have books or magazines with them; instead, they’re plugged into computers and headphones, teleworking or doing conference calls.
Although this Barnes & Noble still hosts readings and book signings according to its website, there’s no obvious advertising for them in the store—and it’s difficult to determine where they are held, considering the significant lack of seating. The children’s section boasts one table laden with Legos, with a few cushions surrounding it. There are no armchairs to sit and read a book with my toddler. Instead, we had to crouch on the ground next to a shelf of Dr. Seuss books in order to read together.
Perhaps it’s this—the table full of Legos but absence of chairs for reading—that best marks Barnes & Noble’s drastic transformation. For a long time now, the bookstore has seemed to emphasize everything but books: its puzzles and DVDs and records, gadgets and greeting cards and plush Harry Potter sorting hats. As one analyst told the Guardian, “The stores just look like an enormous Aladdin’s cave of all sorts of random products, including departments selling CDs and DVDs that are never crowded.”
It is sad that Barnes & Noble believes it must add all these other perks to get people to frequent its stores. It seems that, for a long time now, the company has ceased believing in the power of its own product. Perhaps this is what really drives customers away: why go to a store that doesn’t really seem to love or understand its own telos?
It makes sense that, in this absence of love or mission, customers would increasingly turn to the indie bookstores Barnes & Noble was once driving out of business. These smaller stores still get it: although they’ve also had to be inventive in order to survive, they’ve stayed true to their mission and their product. As Guardian reporter Edward Helmore put it, “innovation is where Barnes & Noble went wrong. Other big booksellers have tackled Amazon’s onslaught by doing precisely the opposite—going back to basics and putting the books first.”
If Barnes & Noble were still a place to pause, to savor, and to experientially luxuriate in the codex itself, I don’t think it would be struggling as much as it is. If the store had put greater emphasis on book clubs and special memberships, story hours and author signings (as many indie bookstores have), it would have given readers a space that celebrates the book, its creation and its consideration, as well as the membership of bookworms. If the store had put more emphasis on hiring and training book lovers who were passionate about their product, it might have wooed the suspicious and the hesitant to buy more books. If it had invested less money in gadgetry and more in its volumes, as well as in the style and form of its shopping and reading space, it could have successfully differentiated itself from Amazon and emphasized the things that set it apart: physical interactions with physical books (and physical people, for that matter).
But as it is, everything that Barnes & Noble does today is done as well or better by someone else. You could go to its café and get a Starbucks coffee and croissant—or you could go to the nearest indie coffee shop and get higher-quality versions of the same things. Nearly every café and coffee shop offers free wifi and space to work these days. And indie bookstores usually offer both new and old books at the same (or better) prices, while providing their customers with a greater sense of passion and reverence.
Barnes & Noble has long been a place of learning and love. But in its determination to innovate, it has forgotten its telos. Which means that it won’t survive much longer unless it can rediscover what made it special in the first place.
Chip and Joanna Gaines were the best thing to come out of cable TV in decades. Their show “Fixer Upper” was goofy, cute, and family-friendly. They showed that fixing up homes was not just about making money, but about reinvigorating one’s place, helping other families, and resurrecting the treasures of the past. In contrast to Home and Garden Television’s oft-bickering couples, Chip and Joanna joked together, supported each other, and put forth a united family front.
These days, Chip and Joanna aren’t HGTV stars anymore: they announced last year that the fifth season of their show would be their last. But the end of “Fixer Upper,” apart from sentimental reasons, doesn’t really matter—because it made the Gaineses into national celebrities. And while “Fixer Upper” may have departed, Chip and Joanna remain strongly present in the national consciousness. They’ve penned a bestselling biography, Chip has just written a memoir called Capital Gaines, and the couple puts out a seasonal magazine called “Magnolia Journal.”
Now, alongside the opening of their new restaurant in Waco, Joanna Gaines is releasing Magnolia Table, a cookbook full of recipes she’s concocted and collected over the years. Food and family have always been a significant focus of “Fixer Upper”—from the emphasis Chip and Joanna put on bringing pizza and the kids to their reno homes for impromptu picnics, to the scones and cookies Joanna always sets out for her guests during design meetings.
Joanna emphasizes the same values in her new cookbook, which she calls “a celebration of bringing people together.” It is less focused on party-throwing or having guests over, however, than it is on feeding and loving one’s family. “A home-cooked meal is the thing that connects us all the most,” Joanna writes in her introduction. Despite the frenzied rhythm of their familial life, the Gaineses still put a strong emphasis on family meals. Joanna shares stories of making biscuits and gravy on Saturday mornings, serving tortilla soup for a weeknight dinner, and prepping a loaf of banana bread for her kids when they return home from school. Cooking in the kitchen is Joanna’s time for respite and refocusing, for serving and loving her family.
In contrast to other hospitality gurus like Martha Stewart and Gwyneth Paltrow, Joanna paints herself as decidedly down-to-earth and middle class—less about fancy dinner parties and talismans of wealth, more about family values and simple pleasures. This is epitomized in Magnolia Table, which occasionally references pancetta and arugula, but abounds in middle-class American foods like casseroles and mac and cheese. Joanna balances out her more “fancy” dishes (like roasted asparagus with a red wine béarnaise sauce or fatayar) with classics I remember from my childhood: sour cream chicken enchiladas, BLT sandwiches, and lemon bars. She also lets her readers know that sometimes she feeds her kids Eggos for breakfast.
This fits perfectly with the Gaines ethos. Although they occasionally dabble in the more haut monde hipster fashions of the moment (midcentury furniture, earthy textiles, all the succulents), they are also—always—a Texas farm family. Chip will always tell awkward dad jokes and love biscuits and gravy. Joanna may wear Anthropologie dresses and bohemian jewelry, but she still loves driving on back country roads and eating good barbecue.
In many ways, the Gaineses’ success has stemmed from this fusion. They are successful ground-up entrepreneurs and passionate supporters of their hometown, a folksy farm family with a successful marriage and happy kids. But they are also trendsetters: as “Fixer Upper’s” design and philosophical ethos developed, Joanna managed to bring all the bourgeois hipsterdom of Kinfolk magazine’s hospitality ethic, eco-consciousness, and design minimalism into the American mainstream. My Idahoan, antique-loving parents and Birkenstock-wearing D.C. friends adore Chip and Joanna equally. The Gaineses are what a lot of Americans aspire to be: integrated, supportive, successful—all without coming across as stuffy, self-entitled, or detached.
The question now is whether Chip and Joanna can keep up this folksy, small-town image as their brand expands. Five years after the launch of “Fixer Upper,” the Gaineses aren’t just remodeling houses anymore. They own a giant design warehouse called The Silos, Magnolia Seed + Supply, a cupcake bakery, a restaurant called Magnolia Table, a Target brand called Hearth & Hand, and two bed and breakfasts (Hillcrest House and Magnolia House).
Will fans continue to see Chip and Joanna as personable and relatable now that they have become a corporate behemoth? Can we trust the Gaineses’ vision of home, family, and community when they themselves can’t seem to stop growing their empire?
It’s not that any of these enterprises are wrong or off-brand. Joanna’s new cookbook is perfectly in sync with her televised personality: down-to-earth and beautiful, focused on hospitality and family. But at the same time, Joanna feels somehow less Joanna these days, and more Martha Stewart. No matter how hard she tries to mask her corporate success with pictures of family picnics and homemade chocolate chip cookies, we know the Gaineses aren’t your average middle-class Texan family anymore. They’re a brand—and must embrace all the responsibilities that come with that success. After all, how can they do real estate, manage restaurants and bed and breakfasts, put together a national retail store brand, write books and magazine articles—and still, as Joanna writes in her latest magazine issue, take painting classes with their kids and wash dishes and care for farm animals and go out for relaxed dates on Friday night? Either the Gaineses take “work-life balance” to godlike levels, or they cannot and are not doing it all.
It seems more likely that they’ve embraced the latter than that they’ve accomplished the former. Ending their HGTV show this year was a sign that Chip and Joanna still believe in limits, and in protecting both their family life and their image as small-town people. It’s certain that many of the everyday workings of the Magnolia machine—from the restaurant to the Silos, from Hearth & Hand to magazine editorializing—are happening apart from the direct influence and oversight of Chip and Joanna.
In one sense, that’s a good thing: it means Chip and Joanna are still the couple we connected with in 2013, the couple that loves spending time with their kids and keeping family dinners. One hopes that, in delegating the corporate workings of their business, the Gaineses are erring on the side of doing less and not more.
On the other hand, it means that Magnolia may eventually lose some of its personal feel and become a less Gaines-y brand. The couple has thus far managed to keep their personal touch on much, if not all, of their brand’s output. They’ve been vulnerable and open in ways that have enabled their empire to still, somehow, feel small. But this celebrity roller coaster could catapult the Gaines into ever-greater spheres of monetary and cultural stardom, distancing them from their original ethos—and from the fans who have loved them for it. If they’re to preserve the characteristics that made them famous in the first place, they may have to let that roller coaster run on without them, growing ever more detached as it swells.
If they do, I won’t be disappointed; in fact, the less Magnolia grows from here on out, the more I’ll continue to love Chip and Jo-Jo.
Droves of Millennials are leaving Christianity behind, many to join America’s growing ranks of religious “nones.” But do these young people (and their compatriots from other generations) really understand what they are abandoning, in all its fullness and complexity?
That is the question Bonnie Kristian seeks to address in her new book A Flexible Faith, set to release next week. The book serves as a primer on Christianity’s considerable theological diversity, and covers issues as various (and controversial) as predestination, war, gender roles, baptism, and hell.
The Christian faith provides lots of room for thought, disagreement, and nuance, Kristian writes, all within the limits of orthodoxy. The faith Millennials think they know, via their own personal church experiences, may only provide half of the picture. A Flexible Faith, by presenting a thoughtful overview of faith disagreements within the church, seeks to tempt the doubting or frustrated to give Christianity another look—and to maybe consider switching denominations rather than leaving the faith altogether.
“A vibrant diversity within Christian orthodoxy—which is simply to say a range of different ways to faithfully follow Jesus—is a strength of our faith, not a weakness,” Kristian writes. “…We can get so stuck in our own little pool that we never notice the stream of orthodoxy is wide and deep and beautiful.”
Of course, suggesting that there are “many ways to follow Jesus” can be a controversial—and potentially heretical—statement. But Kristian is careful with her claim: she defines creedal Christianity and separates it from the realm of flexibility at the outset of her book. Some will still argue that Kristian’s denominational and doctrinal choices are too inclusive. But many of the issues she discusses are ones Christians tend to see as inhabiting gray areas: mysterious, open to nuance, and worth approaching humbly.
It’s important to note that A Flexible Faith is not meant to serve as some heady guide to biblical exegesis. Kristian’s chapters are, for the most part, quite short. She provides her readers with a summary of the doctrinal question involved, an overview of the most common opinions on that question, an introduction to a historical Christian who contributed important scholarship to that issue, a few discussion ideas, and a book list for those who would like to study the matter more. Kristian’s chapter on the equality of men and women, for instance, highlights complementarianism, egalitarianism, the legacy of Quaker Margaret Fell Fox, and a long list of books about women’s roles in the church. Kristian also includes a few Q&As with church and nonprofit leaders from various denominations, a fascinatingly diverse group of thinkers and activists. In some chapters, Kristian provides her own doctrinal stance as a safeguard against bias and a means of sharing her opinions. But for the most part, the book is a straightforward compilation without much editorializing: its tone, style, and structure are succinct and objective. Kristian’s concise journalistic writing stands out here as a bonus, offering as it does both a pithy and interesting condensation of these tricky doctrinal areas, and a very measured and gracious stance on issues that can be deeply controversial and contentious.
The book may seem deceptively simple to some readers. But it speaks to Kristian’s skill as a writer and scholar that she can so concisely and thoughtfully condense these issues.
“I don’t want to see Christians becoming nones because they’ve been falsely told there’s just one way to follow Jesus,” she writes. “That’s why I think there’s a lot of value to introducing Christians to our siblings and even distant cousins in the faith, particularly if that’s what it takes for some to remain in the family.”
But this also presents an assumption that many Christians will need to explore and grapple with. Although we know that the Scriptures and the Apostles’ Creed do not always provide easy answers to questions on the Book of Revelation or the nature of hell, it is still easy for us to reject or condemn the viewpoints of others. To what extent are we called to flexibility and empathy in our doctrinal choices—and to what extent should we stand firm?
This question becomes particularly important in reading Kristian’s chapter on homosexuality. Kristian argues that, though many Christians treat homosexuality as an issue of dogma (essential to the faith), it is in fact an issue of doctrine (open to flexibility and debate). After defining the traditional, celibate, and marriage-affirming positions that Christians hold on the issue, she closes with a petition for Christians to seek a “third way.”
“This is regularly cast as a two-sided topic debate with no middle ground,” she writes. “Everyone is either with us or against us, good or evil, friend or enemy.” She argues that “Christians can disagree here, too, and we must learn to do so without trying to kick our opponents out of the faith.”
Of course, a key difficulty here—as with other doctrinal issues in this book—is that, while homosexuality may not be covered in the Apostles’ Creed, it is discussed in the Bible. And the position one takes on this issue, as with many others, largely involves his or her willingness to adopt a more rigid or flexible interpretation of the Scriptures. Some issues, such as the role of women in the church, have obvious cultural and historical contexts that should be taken into account. Very few Christians today argue that women should have their heads covered during church services, for instance. But there are other Scriptures that are much harder to read loosely, to set in the “cultural context” box as opposed to the “eternal relevance” box. The more flexible we become, the more we must ask ourselves whether we are stretching the limits of orthodoxy to the breaking point. A key challenge in reading a book like Kristian’s lies in the question of how much we can or should test those limits. How “flexible” should our faith be?
It seems inevitable that some (especially those with strong attachments to their particular doctrinal beliefs) will be frustrated with or even offended by this volume. But I don’t think the reader has to agree with every one of Kristian’s doctrinal or denominational choices in order to benefit from A Flexible Faith. Its focus on diversity within the faith can and should prompt us to humility and a proper awe for the mysteries inherent within our religion. I cannot begin to describe the incredible spiritual wisdom I have gleaned from the writings of St. Augustine, Thomas Merton, C.S. Lewis, R.C. Sproul, John Calvin, G.K. Chesterton, Flannery O’Connor, Marilynne Robinson, John Piper, Russell Moore, Watchman Nee, Amy Carmichael, and others. Is it a coincidence that they each come from rather unique church contexts? Is it wrong for me to borrow from the treasures of so many different denominations to add to my own biblical and philosophical storehouse?
Perhaps it is necessary at times to draw lines in the sand, to stake out a doctrinal position and to stand firm in rejecting all other stances. But Kristian is right to argue that we should be extremely careful about how, when, and whether we make those choices. Because the lines we draw may not just separate brother from sister. They might drive some of our siblings out into the cold of disbelief.
The world is mourning for Alfie Evans, a British two-year-old who passed away this weekend at Liverpool’s Alder Hey Children’s Hospital.
Alfie had an extremely rare neurodegenerative disease, one which doctors decided was impossible to cure or reverse. After doctors determined it was best to take the boy off ventilation and allow him to “pass away peacefully,” a decision Alfie’s parents opposed, they fought for (and were granted) legal control of Alfie’s care and took him off life support.
Alfie’s parents, Kate James and Tom Evans, fought the hospital tooth and nail. After they procured Italian citizenship and a spot at Rome’s Bambino Gesu hospital (as well as the support of the Pope) for their son, they asked the courts to allow them to travel with Alfie to Italy, transferring him out of British care. On Wednesday, that petition was refused. For the past several days, the family, the hospital, and the world have waited and watched.
Alfie Evans defied expectations. Doctors suggested that Alfie would pass away immediately once taken off ventilator support. But the little toddler lived for nearly a week without that support—and during that time, drew the prayers of supporters from around the globe.
This case was, in many ways, a simple parental rights issue: if parents believe their son should get further treatment and have the means to do so, as Kate James and Tom Evans did, how could the court possibly refuse them the right to care for their own child? As Rod Dreher put it last week, “Shouldn’t the presumption be that a child’s best interest is to live, not to die — especially when there are resources being made available to pay for the child’s further treatment and care, at no cost to the British Crown?” Unless there were some hidden cruelty involved in James’s and Evans’s wishes (which there was not), how could the court obstruct their right to extend care for their son?
This is the most tragic, unjust, and angering aspect of Alfie Evans’s death. As Megan McArdle put it on Twitter this weekend:
Some parents demonstrate that they don’t care, and in that case, a court properly intervenes. But we should always err on the side of the parents and family, because none of us wants our fate decided by a disinterested expert. We want it decided by people who love us.
— Megan McArdle (@asymmetricinfo) April 27, 2018
And also because killing someone’s child is the worst thing you can do to a human being. The state should not do that to people who will have to live with the agony of watching their child killed in front of them by “disinterested experts”.
— Megan McArdle (@asymmetricinfo) April 27, 2018
The parental rights questions involved in Alfie Evans’s case have been strongly argued and thoroughly discussed by multiple sources. They have been the primary focus of U.S. conservatives interested in the case—understandably, and justifiably.
But the medical questions involved in this case, under the surface of its legal arguments, are much trickier. And while I don’t doubt that Alfie Evans’s parents should have been enabled to care for him, and that—from a legal perspective—this is a parental rights case, I also believe we must pay attention to those deeper questions of care and treatment. Because as the world of medicine expands and deepens, offering us options that heretofore never existed, these cases are likely to grow in complexity and quantity.
The legal arguments surrounding the hospital’s ability to act in loco parentis and assume control over Alfie’s care pointed out stark differences between what Alfie’s parents and his doctors believed about his condition. Doctors believed Alfie was dying. He had been on life support for over a year, and during that time had been slipping deeper and deeper into a semi-vegetative state, they said. Alfie’s parents, however, believed their toddler was only ill, and that they needed time to explore further treatment options.
Doctors saw no chance for recovery: the toddler’s neurodegenerative disease was unique, they said, with causes that were still undiagnosed. According to testing they had conducted over the past 16 months, Alfie would not respond to treatment options available to some others with neurodegenerative diseases. Furthermore, the hospital said his brain scans showed “catastrophic degradation of his brain tissue,” argued that further treatment was not only “futile” but also “unkind and inhumane,” and insisted on taking the toddler off ventilation.
Alfie’s parents, on the other hand, believed this was an assumption which should not go unchallenged—one which did not appreciate the mysteries and unknowns surrounding Alfie’s condition. They wanted to do more testing before giving up and hoped to give him another chance in Rome.
This is what makes Alfie’s case difficult: many trust the parents’ word over the doctors’, and have assumed ill motive on the hospital’s part. As Matthew Lee Anderson pointed out in a Medium blogpost, “The case … [has] been met by social conservatives (primarily, though not exclusively in the States) with outrage and accusations that the hospital and government is engaged in ‘state-directed murder.’”
Others, however, have taken the side of the doctors and the court, and assumed Alfie’s parents were blinded by their own grief. One palliative doctor from the UK noted that the sheer trauma of losing a child can push parents to (or past) the brink of despair:
“I have held fathers as they collapse in my arms, seen a mother biting her own arm in her grief, and wondered, over and over, at the vastness of the pain this world can inflict on its youngest, most undeserving and innocent. Indeed, Mr. Justice Hayden described this week ‘a father whose grief is unbounded and whose sadness, as I have witnessed in this court, has an almost primal quality to it.’”
We all have felt this case strongly. A child’s life was at stake. Parental grief was involved. Any of us who have borne, birthed, and loved a child feel something primal and instinctual within us recoil at the prospect of “pulling the plug” on our child’s life. It’s easy for us all to “put ourselves in their shoes,” and consider what we might or might not have done if Alfie were our son.
But amid the rhetorical passion of the moment and the heat of our own instincts and beliefs, we must remain humble. There are many things in this case which we cannot presume or know fully. We can only see darkly, in the half-light of the internet sources available to us.
It could be that the British government and Alder Hey hospital were on a power trip, and cared more about control than they did about Alfie Evans. That is, at least, what many parental rights advocates suggested last week. But, on the other hand, it is important to remember that Alfie was in Alder Hey’s care for over a year. The hospital’s doctors and nurses treated his condition for days, weeks, months at a time. They gave him medicine and various treatments for his epilepsy, observed his condition day in and day out. It seems presumptuous to assume that they had no relationship with or empathy for Alfie, that they were cruel humans who only wanted to see him suffer. Their judgment may have been deeply misguided, misdirected, or wrong. But that does not mean it wasn’t well-intentioned.
Which is why, as Anderson argues in his blogpost, I think it’s unfair to assume this was an instance of “state-directed murder.” As Anderson notes,
…[T]hat our social discourse is so driven by tribal sentiments and passions means that there is a higher responsibility upon everyone whipping the troops to ensure they have chosen a case that will hold up over the long term. This is particularly the case in the midst of such a tragic situation as this; the heart-wrenching upheaval of compassion makes critical reflection seem not simply callous but superfluous, and justifies whatever form of heightened rhetoric we have access to. But it is just when the emotions and rhetoric are at their highest temperatures that we stand most in need of careful distinctions and reflection.
The question that remains—the question that we cannot easily answer from afar—is whether the hospital took Alfie off life support because they believed he was dying, or whether it was because they deemed his life “not worth living.” There is an enormous difference between the two claims. The former is a painful, difficult arena of choice, and thus more open to the realm of compassion and nuance. The latter, however, is a rhetorically volatile and indignation-fomenting possibility. The doctors at Alder Hey hospital may have been deeply, woefully mistaken. But it does not necessarily follow that they intended to murder Alfie.
Let’s pretend Alfie’s parents were in full control of this situation. Their son had been on artificial life support for over a year. Over that time, his mental and physical health had continued to decline. Doctors told them Alfie was in a semi-vegetative state, beyond hope of cure. One must ask whether it would have been appropriate, then, for Alfie’s parents to stop artificial means of life support.
This is where the ethics of Alfie’s case gets so tricky. Who decides when to offer artificial life support, and when to remove it? Our increased scientific and medicinal control over life has complicated this issue enormously. Alfie would not have been alive last week if not for all the interventions the hospital had taken up to that point on his behalf. His parents’ only hope for improvement or cure lay in more artificial means and experimental treatments. The more our medical power grows to artificially sustain life or experiment with it, the harder and harder it will become to determine when (or if) it is right to stop such efforts.
Alfie’s tenacious hold on life over the past few days, despite the withdrawal of life support, only emphasizes further the mystery and complexity surrounding these questions—the inability of our fallible minds, despite expert advice and care, to know what is right or just or even likely. Perhaps getting transferred to Italy would have made no difference, and Alfie would have only passed away there, given more time. But perhaps not. How can we know?
I don’t pretend to have the answers to these questions. But I must ask them. Because I could have been Alfie’s mother this week, in an impossible and heartbreaking position. Maybe, someday, I will be in her shoes. Maybe you will be. God help us to make the right decisions.
On a recent trip home to Idaho, I talked to a farmer who’d known my great-grandfather. He told me that, long after his neighbors had traded in old-fashioned pasturing methods for confined animal feeding operations (CAFOs), my great-grandfather continued to graze his cows along the ditch banks and corners of his land. When asked why, Grandpa Walter replied that it made the animals happy and kept the land pretty.
These days, we know that old-fashioned pasturing methods encourage much more than aesthetic charm. A long feature published in New York Times Magazine this past week considers the numerous benefits pasture grazing (especially rotational grazing) has for the plants, soil, and climate of a place. It would appear that modernity is finally catching up to my great-grandpa—and that an agriculture that is beautiful and humane may actually, in a karmic or providential twist, be more vibrant and profitable long-term.
In his latest book, The Art of Loading Brush, Wendell Berry talks about the intuitive aspects of agrarianism: that there are many things agrarians do and uphold not for specific scientific reasons, but because they know in their bones that it’s “best.”
“I think that agrarianism had, and where it survives it still has, a sort of summary existence as a feeling—an instinct, an excitement, a passion, a tenderness—for the living earth and its creatures,” he writes in his introduction.
Chuck Marohn highlighted this same intuitive genius in ancient urban planning during his most recent podcast for Strong Towns. As a history lover and engineer, Marohn has observed patterns in urban planning that have been passed down through millennia, patterns that built a deep logic and beauty into the places they sculpted. “Human habitat is pretty ordinary,” he notes. “We need certain things, and those’ll be within a certain distance of each other. Buildings will be arranged in certain ways and will have certain attributes, because it makes places safer, and it makes places more social. It has all this ‘spooky wisdom’ built into it.”
“Spooky wisdom” is the term Marohn employs: “the idea in quantum mechanics, at least as it’s developed today, is that we know these things work—but we really don’t know why.” “We write equations out of our understanding of quantum mechanics,” he explains, “we can test those equations, they test out true—so clearly we’re onto something—but we don’t know why it works. …And what I’m suggesting is that the more I have studied and looked at human development patterns pre-modernity, the more I just find spooky wisdom. Things that work, and I can’t really explain or understand why.”
For millennia, humans have followed specific patterns passed down by their forebears without always knowing why. This is the essence of culture: the layers of belief and precedent, ritual and intuition that guide societal life and practice. As Maurice Telleen once put it, “A funny thing about cultures is that they produce people who understand more than they know. Sort of like osmosis.”
But in modernity, as Marohn notes in his podcast, we chose to dispense with precedent and tradition. We decided to distrust the “spooky wisdom” of the past—whether it had to do with old-fashioned agrarianism or dense walkability—and instead start from scratch, inventing our own way of doing things. Thus, freeways cut through the core of our cities, severing neighborhoods and communities. Suburbs sprang up around cosmopolitan centers, fashioning their own car-centric rhythms and culture. Farmers, meanwhile, were told to “get big or get out,” to trade diversity and sustainability for homogeneity and profit. Small to midscale farms steadily lost land and resources to their larger, industrialized counterparts.
Some of these seismic transformations came in response to new technologies, like the automobile and the combine. We refashioned industry around the “future” as we imagined it.
But there’s a reason cities were designed a certain way for centuries. There’s a reason, insofar as we’ve diverged from that pattern here in the U.S., that we have recently found ourselves gravitating back to it. If Marohn is right, then humans know—even if only on a deeply visceral, subconscious level—that they’re suited for a specific sort of habitat. And they’re going to seek out and crave that habitat, whether they find it in an old European city or in the walkable neighborhoods of small-town Michigan.
Similarly, we humans are beginning to react against our agricultural experimentations—and when we do, it’s often on an instinctual level. There’s a reason that, even though a lot of farms are large and industrialized, their appearances in TV commercials always reflect a small-town, mom-and-pop sort of charm. There’s a reason egg cartons at the grocery store portray happy, plump chickens walking through a meadow—even if the eggs contained therein were produced by caged, stressed birds. There’s a reason farmers I talk to about their polyculture operations are always stressing the smell of the air, the beauty of the land, and the happiness of their animals. These people know in their guts that these things matter. The osmosis of culture is still impacting their tastes, instincts, and desires. They’re still attuned to the “spooky wisdom” of the past.
We can still use newfound knowledge to make improvements or tweaks to old forms. I’m not saying all change is bad. Many farmers are using innovative new electric fencing to foster better grazing methods. The Land Institute is striving to breed perennial grain crops that will build sustainability and land health alongside the financial viability of farmers. Not all new things are unhealthy or unwise—but the best “new” things contain seeds of truth and wisdom that have been passed down for generations. They do not discard precedent.
In many ways, this is what conservatism is about: a respect for handed-down traditions and customs, even when we don’t entirely understand them. As Patrick Deneen writes in Why Liberalism Failed, liberalism was “a titanic wager that ancient norms of behavior could be lifted in the name of a new form of liberation and that conquering nature would supply the fuel to permit nearly infinite choices.” This sort of radicalized independence has infiltrated both left and right in our era. Whether we’re applauding the encroachments of big government or the unbridled license of free market capitalism, the root instincts involved are often the same: we’ve decided that scale and limits, norms and traditions, no longer matter. “Progress” is the only virtue worth pursuing.
In his “Ten Conservative Principles,” Russell Kirk warned that “the destroyers of custom demolish more than they know or desire.” Perhaps broken cities, stressed people, depleted soil, suffering animals, and contaminated water are telling us something we increasingly must listen to, whether we like it or not. Perhaps in our time, the nostalgia for vinyl records and craft beer will morph into something deeper and more philosophical: an understanding that the treasures of the past are treasures for a reason. As scientific knowledge “catches up” with spooky wisdom, we may come to understand and appreciate the thinking of the ancients in greater depth.
To quote T.S. Eliot, perhaps “the end of our exploring will be to arrive where we started, and know the place for the first time.”
As a kid, I listened to my dad discuss politics and philosophy with people from all walks of life. After church picnics or town hall meetings, while visiting friends or grabbing coffee at a local coffee shop, Dad would strike up conversations. Sometimes those conversations could veer into controversial territory: presidential politics, abortion, church doctrine.
I knew my dad had strong convictions about some of these issues. But those convictions never pushed him into bombast or bluster. Even when talking to folks who were heated arguers, he never lost his cool. He’d smile, say something encouraging or positive, ask a question or two. He walked away from every debate and discussion having encouraged peace, gentleness, and a sense of dignity. Watching him was a little like staring at a photo negative of a Fox News panel.
In college, when we studied Edmund Burke’s Reflections on the Revolution in France, I realized my father was the disciple of a rather ancient set of values. He was a gentleman: someone who still practiced the art of manners, and as such, he was both inheritor and promulgator of the classical conservative tradition. As Burke puts it in his classic work:
Nothing is more certain, than that our manners, our civilization, and all the good things which are connected with manners and with civilization, have, in this European world of ours, depended for ages upon two principles; and were indeed the result of both combined; I mean the spirit of a gentleman, and the spirit of religion.
Good manners and religious virtue extend far beyond discourse in coffee shops. But so often, this initial meeting ground is the arena in which our true character and the nature of our beliefs are fully revealed. And it seems that, as our society forsakes both the spirit of religion and the spirit of gentlemen, these initial meeting grounds are some of the first to suffer.
As professor and author Mark Mitchell has written for the Front Porch Republic:
Forms and limits are not welcomed in a culture that sees freedom as the highest good, a culture that fairly worships at the altar of individual choice. The history of the liberal project has been a steady and determined attempt to defy limits, to destroy forms, to expand the idea and practice of liberation to all spheres of existence. How can the idea of the gentleman, the essence of which necessarily depends on the propriety of limits, co-exist with the goals of liberalism?
At some point, I realized my dad’s discussion style was growing rather rare—especially, perhaps, in our technological age. While Facebook and Twitter aren’t to blame for our online belligerence and boorish behavior, they are ontologically geared toward hubris and navel-gazing. If I open Facebook, it immediately asks me “What’s on your mind?” and constantly encourages me to like, dislike, comment, or share: to actively express myself in one form or another.
Gentlemanlike conduct, on the other hand, requires humility. As Mitchell writes in another FPR post, “A sense of propriety, when properly formed and not merely a sense of personal dignity, requires an awareness of other people. A person with a well-developed sense of propriety makes other people feel at ease.” The gentleman, Mitchell writes, must also be amiable: “An amiable man is a good conversationalist who is interested in the people with whom he speaks. He is not self-absorbed nor is he so self-conscious that he refrains from engaging with others. …An amiable man is not a boor who cares only for the sound of his own voice.”
Our political moment—divided as it is—seems to have made a contest out of being controversial and belligerent. We’re always being dared by social media to share what we think, and the more inflammatory our thoughts, the more attention we get. This is exactly the sort of environment that gives rise to anger and division rather than empathy and peace.
All of this has been at the forefront of my mind amidst the heated debate surrounding Kevin Williamson’s hiring—then firing—by The Atlantic. There are many complicated facets to this controversy, and the best piece on the subject is (and will likely remain) Ross Douthat’s thoughtful Sunday column for the New York Times. Douthat, in his writing, exemplifies exactly the sort of decorum, gentleness, and dignity that my dad always used when engaging with political opponents. In so doing, he defended Williamson better than Williamson defended himself.
While I understand that “pushing people’s buttons” can be a sort of journalistic style, I would argue that it has done more damage than good in our political age. Commentators and politicians on left and right clamor to offend each other: they embrace the controversies of the Internet and throw their rhetorical barbs whenever possible—perhaps to get attention, perhaps in the name of free speech, but rarely (if ever) to actually persuade. Williamson has often said controversial things for dramatic effect (like, for instance, his suggestion in 2016 that struggling postindustrial communities “deserve to die”).
In a series of tweets published last week, Washington Post columnist Elizabeth Bruenig argued that this bluster and demagoguery has become a marketing model for the right, one that caters beautifully to their target audiences, but that translates badly for journalists who might want to reach more mainstream readers:
2. The feeling is mutual but the output is not. The reason is that rightwing media rewards and prizes a writer's ability to "trigger the libs." The more left-wingers you can personally upset and the more fury you can stoke, the better. Entire careers have been made this way.
— Elizabeth Bruenig (@ebruenig) April 8, 2018
I agree wholeheartedly with Bruenig, but I also don’t think this marketing model is used exclusively by the right. Take this tweet, published as the Williamson affair unfolded, by author Jill Filipovic:
And here is the thing we are reticent to say out loud: it is extrenelt difficult to find thoughtful, rational, non-racist, non-misogynist conservative Republicans to write for mainstream publications.
— Jill Filipovic (@JillFilipovic) April 5, 2018
According to Filipovic’s tweet (which was liked nearly 400 times), the Republican Party is almost entirely made up of irrational sexist bigots. This is hardly the sort of thoughtful, winsome language that’s likely to build fans and friends across the political aisle. It is the sort of thing, however, that is infinitely shareable on Twitter and likely to appeal to those within one’s own echo chamber.
Someone might read Douthat’s Sunday column and feel convicted, moved to sympathy, or eager to hear more. But few would listen to Laura Ingraham’s latest diatribe on Fox News, or read Filipovic’s tweets, and react that way.
A lot of this nasty back-and-forth is built around a dangerous whataboutism: the belief that, if X says nasty things about me and my political movement, I am entirely within my rights to lob insults back at him/her. Such reasoning is not new; it’s existed since Adam said the “woman [God] put here with me” made him sin.
But the virtues my dad practiced in conversation—gentleness, humility, quietness—require us to lay aside whataboutism. They mean we have to assume that our own words are not the most important, and that being right is not the end or culmination of conversation. Fostering peace in discussions online and off obligates us to consider that we may actually be wrong about something—and that, even if we are right, hurting or offending our opponent is not a prize worth seeking. To quote St. Paul:
Romans 12:16-21—Live in harmony with one another. Do not be haughty, but associate with the lowly. Never be wise in your own sight. Repay no one evil for evil, but give thought to do what is honorable in the sight of all. If possible, so far as it depends on you, live peaceably with all. Beloved, never avenge yourselves, but leave it to the wrath of God, for it is written, “Vengeance is mine, I will repay, says the Lord.” To the contrary, “if your enemy is hungry, feed him; if he is thirsty, give him something to drink; for by so doing you will heap burning coals on his head.” Do not be overcome by evil, but overcome evil with good.
Ephesians 4:1-3—I therefore, a prisoner for the Lord, urge you to walk in a manner worthy of the calling to which you have been called, with all humility and gentleness, with patience, bearing with one another in love, eager to maintain the unity of the Spirit in the bond of peace.
Philippians 2:3-4—Do nothing from selfish ambition or conceit, but in humility count others more significant than yourselves. Let each of you look not only to his own interests, but also to the interests of others.
Somehow, many of us right-leaning folks have come to see peaceable, gentle speech as “soft” or “weak.” We aspire more to Rush Limbaugh’s bluster than we do to Christ’s quiet example. And while I by no means expect a non-Christian to adhere to Paul’s admonitions, the many conservatives and Republicans who describe themselves as Christians should. No matter what the “PC crowd” says, we are called to a higher standard than whataboutism.
As we witness the collapse of manners online, it’s important to note that both right and left are culpable. But the responsibility for reforming manners and bringing back dignified conversation is ours; we cannot wait for others to make things better or to change the trajectory of our discourse. “So far as it depends on you,” Paul wrote, live peaceably with all.
Truth-telling is important. We must have courage in our discussions, and not back away from hard topics. “Gentleness” must not become an excuse for cowardly evasion. But in this particular day and age, we all too often label bombast and churlishness as “courage,” though nothing could be further from the truth. Rhetorical barbs and unkindnesses are just as cowardly as conversational evasion. They are both forms of self-protection, aimed more at elevating the self than at loving the others in our midst.
Don’t let Twitter or the television deceive you: manners take work. Kindness takes courage. And both are worth preserving and practicing in this fractured age.
American journalism is in a strange state these days. We all (readers, editors, and writers) know it. We all have been grappling with the seismic changes that have shaped media in recent years—technological, ideological, economic, and geographical. But we don’t all agree on what’s wrong with journalism and we can’t figure out what the profession ought to look like going forward.
This has become exceedingly clear over the last few days as a video montage of several local broadcast stations belonging to Sinclair Broadcast Group went viral. Sinclair owns nearly 200 television stations in the United States, more than any other media company, and is seeking to acquire more. In the words of NPR, the company has “long been criticized for pushing conservative coverage and commentary onto local airwaves.”
Much of this criticism remained out of the national spotlight, until news anchors were required by Sinclair to read a particular script on air, one that sought to rebuke the national news media and the spread of fake news:
I’m [we are] extremely proud of the quality, balanced journalism that [proper news brand name of local station] produces. But I’m [we are] concerned about the troubling trend of irresponsible, one sided news stories plaguing our country.
The sharing of biased and false news has become all too common on social media. More alarming, national media outlets are publishing these same fake stories without checking facts first. Unfortunately, some members of the national media are using their platforms to push their own personal bias and agenda to control ‘exactly what people think’ … This is extremely dangerous to our democracy.
We understand Truth is neither politically “left or right.” Our commitment to factual reporting is the foundation of our credibility, now more than ever.
Some of the statements in the above script seem obvious and uncontroversial. The sharing of biased or false news on social media has become a problem, as any who have followed Facebook’s 2016 election struggles can attest. Many news stories get published that are irresponsible and one-sided. This tendency toward bias isn’t necessarily a new phenomenon; it’s something journalists have struggled with since the profession began. But as news becomes a 24/7 cycle of hot takes and quick write-ups, it does seem to be growing ever more present in—and perhaps influential upon—Americans’ minds.
Nonetheless, a couple sentences in the above script are incendiary and concerning—particularly the bit that alleges “some members of the national media are using their platforms to push their own personal bias and agenda to control ‘exactly what people think.’” Statements such as these are two-edged swords, which condemn the accuser even as they are in the act of accusation: when you say that unnamed members of the national media are seeking to control what their consumers think, isn’t that a way of controlling what your own audience thinks or believes about the national media? “Some members” becomes an opaque, stereotyping blanket, spread indiscriminately over thousands of well-intentioned and hardworking journalists. What’s more, because this accusation is lobbed at the underlying intentions of the reporter instead of at specific actions or the quality of the product, it’s extremely difficult to verify, defend, or refute.
This sort of rhetoric is by no means new. It has lived on talk radio shows for years, employed by hosts such as Rush Limbaugh and Michael Savage to remind their listeners that there is no one who is trustworthy—outside of their own immaculately reliable programs, of course. The danger of this rhetoric is that it doesn’t urge listeners to “trust, but verify,” or even to verify then trust. Instead, it suggests that the national media are too cunning and crafty to be entertained or trusted at all.
There seem to be two primary camps of thought when it comes to the state of American news media. The first sees the nation’s leading news outlets—such as the New York Times, Washington Post, and Time—as credible and reliable sources of information in a dark political time (“democracy dies in darkness”). American journalism’s greatest problem in the minds of these critics lies in the rise of grassroots media that publish incendiary stories for the masses, such as Breitbart and Drudge Report. This camp disapproves of social media’s tendency to enclave humanity within ideological bubbles, but that disapproval primarily targets those whose political bubbles involve Russian fake news stories about Pope Francis and Hillary Clinton, and is less condemning towards its own online sociopolitical echo chamber.
The second camp allies itself with Donald Trump in seeing “fake news” as a proper descriptor for nearly all national media outlets: from cable networks like MSNBC and CNN, to the Gray Lady and the Los Angeles Times. To these skeptics, decentralized grassroots outlets offer more truth than their elitist counterparts in Washington, D.C. and New York City.
This second camp holds less elite credibility or support, but its complaints aren’t entirely unfounded. As FiveThirtyEight’s Nate Silver and Politico both reported last year, the national news media is largely clustered in coastal bubbles. “Concentrated heavily along the coasts, the bubble is both geographic and political,” Politico’s Jack Shafer and Tucker Doherty report. “If you’re a working journalist, odds aren’t just that you work in a pro-Clinton county—odds are that you reside in one of the nation’s most pro-Clinton counties. …So when your conservative friends use ‘media’ as a synonym for ‘coastal’ and ‘liberal,’ they’re not far off the mark.”
Is this consolidation resulting in poorer—or at least less balanced—reporting? It could be. In a piece published last week for The Week, Matthew Walther suggests that breathless national coverage of Donald Trump—on everything from “covfefe” to the Russia investigation to Stormy Daniels—has failed to offer readers any sense of perspective or scale. It’s not that these stories aren’t important; it’s that some are more important than others, but all get painted with the same intensely dramatic brush. It’s also that some (less sensational) stories get buried when they don’t easily feature Trump as the villain. As a result, many news consumers are tuning out, while many others are growing increasingly suspicious of the whole spectacle.
“It is just possible, I suppose, that members of my profession could have exercised their reasoning faculties to decide what in the administration was good, what was bad, what was unremarkable or indistinguishable…what was painfully idiotic, what was, perhaps, evil,” Walther writes. “We chose not to exercise this responsibility. Instead we decided to indulge in our live-action roleplaying fantasies about being brave selfless journos taking on a mean demagogue because we love the Constitution so much.”
In many ways, this media battle—left versus right, establishment versus populist, city versus rural—reflects our growing political divide. Only one arena of consensus appears to remain amid this vitriolic battle: most of us continue to cite our confidence in local news media.
The reasons for this are pretty obvious. Local news can get messy and divisive, but it’s usually quite practical. It’s grounded in fact and circumstance, to a degree the abstracted nature of our national political conversations cannot be. What’s more, a lot of local news stations and newspapers feel a deep sense of responsibility to their communities. They get regular feedback from readers and watchers, and spend time out on the street talking to people and reporting on stories. There is less separation between them and their audience than there might be between the average Politico correspondent and his or her readership. This helps grow a sense of trust and accountability.
But what happens as small to midsize newspapers and media stations go out of business, or look to groups like Sinclair for financial backing? It seems that the primary problem with Sinclair is not any attempt to influence local stories, but rather its effort to refocus our attention—once again—on the national and the partisan.
While all of us in the media struggle with bias—something we’ve struggled with for generations—we are also increasingly grappling with a lack of independence, the potential impact of which is far from obvious. It is depressing to think that all these news stations, whose audiences probably believe them to be independent, were handed scripts by sponsors with strong national partisan ties. In handing down those scripts, Sinclair threatened the one journalistic arena in which some confidence and hope remained.
Independence isn’t just a local problem, however. It’s a national one, too. Sinclair has become a monolithic force in the local broadcasting world because of the collapse of independently owned local and midsize institutions. At the national level, publications are looking to wealthy donors or owners (such as Jeff Bezos) to keep themselves afloat. The best such benefactors will prize their publications’ independence. Some, however, are likely to use their financial clout to suggest—or even enforce—a certain editorial line. (What’s more, the bubble-intensive nature of national news media makes them much more susceptible to self-policing and groupthink enforcement.)
Personal bias is almost impossible to eliminate. I think most Americans have accepted this, and understand that the news they read will be (at least somewhat) influenced by the context, background, and beliefs of the reporters and editors involved.
But what about media independence: the ability of a reporter or publication to step away from some partisan, publication, or company line? Will that also disappear?
Hopefully not. Hopefully the scrutiny of Sinclair will prompt us to ask deeper questions about the independence and fairness of multiple media outlets, not just the ones that suit our ideological proclivities. Hopefully it will urge us to seek out (and encourage) more thoughtful, practical, and local conversations. Hopefully it will give us a greater understanding of the perils of political and geographical bubbles, large and small.
Because if not, we could devolve into a “fake news” battle that never ends.
The 2018 farm bill is currently at a standstill as congressmen debate proposed changes to the bill’s SNAP (Supplemental Nutrition Assistance Program) provisions. SNAP is a huge and important welfare program, one that reminds us that the USDA and the farm bill are not focused exclusively on farms, but are also responsible for a bevy of other rural development issues (such as rural energy programs, the rural housing service, and rural utilities service).
But this pause in the farm bill process also gives us an opportunity to talk about what this bill does not usually do well: namely care for the nation’s small to midsize farmers and incentivize sustainable farming methodologies.
This deficit in focus and care is not new. The farm bill’s bias towards bigness has existed for decades now, and was doubly reinforced during the 1970s by USDA Secretary Earl Butz (the man who notoriously told farmers to “get big or get out”). Many of the agricultural revolutions we’ve seen over the past few decades—from small family farms to large-scale factory farms, from crop diversity to commoditized homogeneity—emerged most prominently in the 1970s and 1980s under Butz’s leadership at the USDA. At the time, our understanding of agriculture and its purposes was also shifting: what had formerly been understood as a local enterprise meant to feed local inhabitants was increasingly viewed as a global enterprise meant to foster trade relations and massive corn and soybean sales overseas.
That understanding still dominates agriculture, and has transformed its processes. Today’s farmers produce gluts of goods that largely do not feed Americans, but rather get exported abroad. International trade is not a bad thing: it’s important for diplomacy, economic security, and prosperity. But our emphasis on international commodity sales has increasingly distorted our local food markets, our care for the earth, and the sustainability of our production processes. What’s more, it has put an immense economic and psychological burden on American farmers: as President Donald Trump has proposed a variety of tariffs on countries such as China and has threatened to disband trade agreements with other nations like Canada and Mexico, farmers are the ones who primarily feel the heat and worry about their ability to survive. China buys upward of $14 billion in U.S. soybeans each year. Last year, it bought approximately 60 percent of the soybeans grown in the state of Iowa. What happens if those sales disappear under Trump’s leadership?
In the 1970s, Earl Butz assured American farmers that their surest plan of success was to plant from fencerow to fencerow—to invest all their vision, capital, and land in international trade.
But that promise has not paid reliable dividends. In the 1980s, largely in response to the expansion and homogenization Butz recommended, the U.S. underwent a severe farm crisis. Over the past several years, America has suffered similar slumps in the prices of corn, wheat, and other farm commodities, caused by a worldwide glut of grain. Farmers are now fearing a similar (if not more debilitating) farm crisis. The Wall Street Journal reported in early 2017 that some farms are shutting down in response, and warned that “the next few years could bring the biggest wave of farm closures since the 1980s.” The United States’ share of the global grain market is less than half what it was in the 1970s, and the Department of Agriculture estimated that farm incomes dropped 9 percent in 2017.
Over the past several decades, farmers who worry about making ends meet have turned to corn for a steady, assured source of income—not necessarily because the market is demanding more corn, but because government crop insurance and subsidies have assured them a steady return when things go badly. According to the National Sustainable Agriculture Coalition, subsidies provided farmers with 49 percent of their income in the year 2000—up from 13 percent in 1996.
In his book This Blessed Earth, journalist Ted Genoways noted that in 2014, American farmers set new records for corn and soybean production: they harvested 14 billion bushels of corn, and nearly 4 billion bushels of soybeans. “With so much production, roughly three-quarters of the harvest nationwide went directly into bins, as every farmer waited and prayed for rebounding prices,” he wrote. “They never came. Instead, prices continued to slump as yields continued to grow, and whispers spread about the possibility of another Farm Crisis.”
But as the 2018 farm bill sits in Congress, we are not talking about this looming possibility—not as much as we ought to, at least. The Wall Street Journal is one of the only large U.S. newspapers to have written about the perils in agriculture’s future if things continue this way. A lot of conservative and libertarian writers have recently called for farm bill reform, but their efforts are focused on spending cuts. That is by no means a bad focus, considering the glut of taxpayer dollars that currently go to America’s largest and wealthiest farms. But the problem is much bigger than that.
The ethos we’ve built around farming in the United States is broken. Rather than encouraging farmers to eliminate risk by diversifying their crops, pursuing sustainability, and embracing a proper scale, we’ve propped up a truly unsustainable style of production with subsidies and insurance programs. When things go sour, we tell farmers to just keep following the same formula, never asking whether that formula is malfunctioning, hoping in vain that new machinery or cover-cropping initiatives can band-aid over the problem.
As Wendell Berry wrote for The Atlantic in 2012, “Industrial agriculture characteristically proceeds by single solutions to single problems: If you want the most money from your land this year, grow the crops for which the market price is highest. Though the ground is sloping, kill the standing vegetation and use a no-till planter. For weed control, plant an herbicide-resistant crop variety and use more herbicide.” But these methodologies have led to a loss of biodiversity, rampant soil depletion, and devastating water pollution—issues that are plaguing communities across the United States. Faced with the pressure to produce more, dairies and chicken farms crowd their animals into cramped, stinking, and inhumane spaces.
Fixing these issues requires more than a few reforms. It requires transforming our entire mindset towards agriculture and its problems. It requires rethinking our fascination with quantity over quality, and coming to grips with the “cult of the colossal” and how it has transformed the world of farming. It requires particularizing our conversation according to geography and farm, thinking about long-term wholeness over short-term profits, and demonstrating a loyalty to local agricultural communities—their jobs, land, and families—over the profit of national and international agribusinesses.
As Wendell Berry put it in Orion Magazine, “We can no longer pretend that agriculture is a sort of economic machine with interchangeable parts, the same everywhere…. We are not farming in a specialist capsule or a professionalist department; we are farming in the world, in a webwork of dependences and influences probably more intricate than we will ever understand.”
The future of America’s farms depends on so much more than one bill. But fixing what’s broken will require that we start here and now with the resources we have—before it’s too late.
What is marriage for? As cohabitation and singleness are on the rise, we increasingly struggle to answer that question. In 2010, 39 percent of Americans said they believed marriage was becoming obsolete. Those who do marry often cite “love” and “companionship” as their primary reasons for doing so—but why go through all the work to plan an expensive wedding when cohabitation no longer bears the social stigma that it used to?
This is a question Andrew Cherlin is determined to answer in a recent article for The Atlantic. He looks specifically at the spike in same-sex marriages that followed the 2015 Supreme Court case Obergefell v. Hodges, and wonders why so many homosexual couples (many of whom had been living together for years) saw marriage as integral to their future happiness.
He suspects that the answer is less practical than it is symbolic. “For many people, regardless of sexual orientation, a wedding is no longer the first step into adulthood that it once was, but, often, the last,” he writes. “It is a celebration of all that two people have already done, unlike a traditional wedding, which was a celebration of what a couple would do in the future.”
In many ways, this feels like a mirror of countless Disney movie plots I absorbed as a kid. Marriage is the end of the story: as wedding bells chime, we are told that the married couple lives “happily ever after”—without ever understanding what happiness in this “ever after” might entail.
What’s more, marriage as a celebration of two people and their accomplishments is far more isolated and individualistic than marriages past. It is, in many ways, the culmination of our collective emphasis on the nuclear family, an emphasis that has grown since the Industrial Revolution and that has completely transformed the way we think about marriage and its meaning in society. Marriage used to be a much more practical, communal event. Emotion had little to do with it—in most ancient cultures, the couple themselves had little (if anything) to do with it.
In Christian societies, marriage became something more than mere contract: it was a covenant, wedded deeply to faith, virtue, and community. It was about more than two people and their caring for each other. The liturgical marriage vows (still occasionally used today) emphasize that the participants are “gathered here in the sight of God and in the presence of these witnesses.” Marriage was something to be witnessed—not merely for the sake of celebration, but because of its deeper meaning and purpose. That purpose was (and is) deeply communal: Christian households were meant to be part of a larger church community, one that the apostle Paul called a “body.” The church body was required to tend and care for the health and wholeness of all its members, to live in constant fellowship and care.
Although Aristotle suggested that the household (oikos) was the core and beginning of community, he never said the household was sufficient for human community or happiness. Instead, he argued that individuals cannot perform their proper functions outside of a larger community. Households were not to exist in isolation, but rather to band together in service, community, and virtue. Married couples and their children need the polis—just as much as the polis needs them.
Thus, the relationship of a married couple to their larger community (be it familial, spiritual, or neighborly) is reciprocal: without larger context and support, nuclear households do not have the support they need to flourish. But without the integration and involvement of smaller households, communities do not have the “hands and feet” they need to care for their own.
Today’s marriages are still, in many ways, contractual. Marriage guarantees certain legal rights and benefits. It involves the same need for witnesses, commitment, and legality. But modern marriage is also, often, a contract that comes with easy, well-delineated exit signs. Prenups have little to do with guaranteeing that a marriage lasts—quite the opposite. Today’s marriages are usually set up not to last.
Part of this contractual temporality stems from our larger lack of purpose within marriage—divorced as it is from spiritual virtue or communal meaning, marriages start (and end) with the same focus: on personal wellbeing and emotional happiness. We see this reflected in the weddings our society creates: they are less about community than they are about fun, entertainment, and intense personalization. Wedding magazines show us glamorous destination weddings and fancy elopements, instruct us on how to shorten “boring” ceremonies or write our own vows. As we truncate or completely cut out the communal and covenantal aspects of the wedding ceremony, we increasingly divorce marriage from its foundations of support.
While marriage’s decline may be tied to the withering of the spiritual roots underneath our culture, it’s important to recognize that Christian marriages are failing, too. This is because we have neglected the practical, communal aspects necessary to marriage’s success. We have harmfully emphasized the nuclear family—the “perfect” suburban household—and forgotten the importance of supportive, nurturing communities. We’ve been surprised and disappointed to see the divorce rate among Christians rise, even as our churches have grown into fragmented behemoths or frail and desolate islands. As we’ve embraced the individualism of our society, we have neglected the roots and support structures that make marriages last. We have forgotten what marriage is for: not just love, commitment, and devotion, but larger service and wholeness within a caring community. And without being able to tell Christians how marriage can last, or giving them the support they need to make it last, we’ve done little more than put millstones around the necks of young, naïve couples. Our message is an only slightly spiritualized version of our culture’s individualistic “happily ever after” wedding ceremony.
It takes a village to make (and keep) a marriage. Thus it makes sense that our isolated and fragmented society would increasingly make marriage itself seem obsolete and unnecessary. Without community—be it familial, church-oriented, or neighborhood-focused—marriages will continue to struggle. More and more young people will see their contractual nature and obligations as unnecessary.
Since this problem is cyclical and self-reinforcing—community needs committed households, and committed households need community—it can seem rather daunting and impossible to fix. But perhaps a greater focus on neighborliness and communal revival really can preserve the context and foundation necessary for marriage to survive. By helping Americans remember what marriage is for—by surrounding them with communities that can support them through good times and bad—we may, in fact, help them have a real “happily ever after.”
A lot of people are citing Conor Lamb’s Pennsylvania special election victory last Tuesday as a sign of turning tides in our political atmosphere. And it’s definitely a bolstering outcome for Democrats who are aiming to win back the House and Senate.
But the picture painted by Lamb’s win should not be one of unequivocal blue dominance in a red district—because Lamb wasn’t your typical blue politician (at least not in this day and age). The 33-year-old is a Marine veteran and Catholic. He personally (though not politically) opposes abortion, has called for changes to the Affordable Care Act, emphasized the importance of bipartisanship, and said he would oppose an assault weapons ban. On economic issues, Lamb is a union supporter who backs President Trump’s tariff plan.
As Bill Scher put it for Politico, “[Lamb] took some positions from out of each camp’s bucket, all while brandishing his assault rifle.” When Washington County’s newspaper endorsed Lamb, they called him “the kind of moderate, conciliatory figure that is needed in this tempestuous moment in our political life.”
It’s likely Lamb’s careful centrism will drive politicians and pundits in both political camps crazy. Already, he’s getting heat for not giving more vitriolic talking points on Trump. When NBC reporter Kasie Hunt tried to get Lamb to say something critical of the president during an interview, he flatly refused to take the bait:
Conor Lamb has no comment on whether President Trump is a stable commander in chief based on today’s events.
— Kasie Hunt (@kasie) March 13, 2018
One Guardian reporter said “Lamb has been almost painfully non-controversial.” In an age in which hating Trump is a badge of honor in the Democratic Party—and hating the Democrats usually serves as an equal badge in the Republican camp—Lamb seems sure to stir up controversy the moment he steps inside the Capitol building, if not before.
But in choosing not to align himself perfectly with one party or the other, Lamb actually did what politicians are supposed to do: he embraced a more gray and a more diverse platform, one that appeals to the interests and beliefs of the voters in his district. Lamb represents an old-school style of union Democrat who allows for flexibility and realism. That means he may not pass an ideological purity test—but it also means he appealed to the Trump Democrats and more moderate conservatives in his district. That ability to look past red and blue issues and instead offer a localized realism is probably the best way Democrats can defeat Republicans in future elections—because it’s specific and generous, rather than polarizing and absolutist. And Republicans right now aren’t doing a great job offering that sort of integrity and localism.
Consider the politician Lamb is replacing: incumbent Republican congressman Tim Murphy, who stepped down from office after the Pittsburgh Post-Gazette revealed that he tried to talk his mistress into getting an abortion. Murphy was supposed to be a strong pro-life candidate and principled conservative. Turns out that was merely a façade.
Republicans chose “Jurassic-era conservative” Rick Saccone to run in Murphy’s stead. Saccone boasted he was “Trump before Trump was Trump,” and cut attack ads trying to defeat Lamb (who, in the meantime, was out campaigning door-to-door). He (like many Republicans of late) tried to run on anger, and he lost. Even Republicans didn’t turn out to vote for him.
Lamb, by contrast, offered voters the moral consistency they needed post-Murphy: he’s a Catholic who opposes abortion on principle, even if he doesn’t think it should be outlawed. He emphasizes “common ground” in a political arena that’s grown both vitriolic and stagnant. And like Trump, he says he will work for the unemployed and hurting workers of his district who have been ignored by the white-collar bigwigs in Washington. It’s not hard to see why a distant and seemingly angry Republican would lose to someone like that.
In an article for the New York Times published Wednesday, Tom Ferrick compared Lamb to Pennsylvania’s own Senator Bob Casey, who beat conservative Rick Santorum in a U.S. Senate race back in 2006. “In another era, Mr. Casey would be called a centrist,” Ferrick writes. “Now, he’s too conservative for progressives and too liberal for conservatives. The only people who like him are the voters.”
The biggest concern here is that Lamb will be pressured to toe the party line in Washington, and to put the interests of his constituents second to the demands of his party. But that’s exactly the attitude that has brought us a schismatic and extremely polarized Washington, one in which no one is willing to compromise or converse, and in which every new bill is a line drawn in the sand.
Some Democrats are saying Lamb’s centrism is the way forward for all races in the fall—but that hardly seems like a winning strategy either. Lamb ran for the House as a representative of Pennsylvania’s 18th district. One would hope a politician running for office in an extremely progressive, cosmopolitan district would run a different campaign—not merely for utilitarian reasons, but because they recognize the local spirit and desires of their constituents.
The point of Lamb’s win is that we should not try to impose some nationalized process or pattern on politicians who are meant to support and represent their districts. America is a smorgasbord of political proclivities and passions. Lamb should not have run like Hillary Clinton, whose cosmopolitan elitism made her seem distant and apathetic toward many within Rust Belt America (her cringe-worthy “basket of deplorables” comment serves as one example of such an attitude).
Meanwhile, Lamb’s win offers an important lesson to Republicans who don’t want to lose Congress in the coming months: don’t run like Trump—on vitriol, polarization, and aggrandizement. Don’t offer more Tim Murphys, who say one thing and do another. A politician with character, nuance, and love for his community can still appeal to voters, even (perhaps especially) those discontented with the Washington status quo.
Regardless of the impression left by the schismatic political debates we watch on TV, a lot of people still prefer their politics to be respectful, nuanced, and local. Or at least, they used to. I think many more could be wooed to such a stance if only they had politicians of caliber to vote for.
You don’t have to read the news. But you should.
The latter is easier said than done though. Following the news has gotten increasingly complicated, depressing, and confusing over the past decade. It’s become difficult to separate the wheat from the chaff. (How important are the Oscars in the scheme of things? Do I have to follow the latest news on Trump’s alleged affair with Stormy Daniels? Is it important to know Jeff Bezos’ net worth?)
It’s difficult to determine the best methods for absorbing and analyzing news as well: we wonder whether reading the newspaper, listening to podcasts, scrolling through Twitter and Facebook feeds, catching up with Hugh Hewitt, or watching Wolf Blitzer will give us the most interesting, diverse, balanced, and thoughtful range of information.
We wonder whether we are getting the whole picture. (Perhaps if we watch both Fox and MSNBC, we’ll find the truth somewhere in the middle?) We wonder what journalists and what outlets are most dedicated to giving us news without the spin. And as we survey news feeds increasingly filled with vitriol, bias, and bombast, at least some of us are tempted to tune out everything altogether—all the anger, confusion, and scandal.
Distrust of and frustration with the media are nothing new. According to Gallup, we trusted newspapers most in 1979, when 51 percent of Americans said they had a “great deal” of confidence in them. But according to FiveThirtyEight’s Clare Malone, the numbers have gone downhill since then—and in 2016, only 20 percent of Americans said they trusted newspapers.
First, it’s important to note that the burden for fixing this problem falls largely on the media itself: news outlets and their staffs should strive to write in thoughtful ways that transcend stereotype and cynicism. Those of us who write opinion news have an even greater responsibility in this battle. The more we vilify “the other side” or use absolutist, black-and-white terms, the more likely we are to obliterate opportunities for civil discourse and bipartisan agreement.
But readers are also responsible for how and whether they absorb this news. We must all figure out the balance between tuning out and being too tuned in.
Fostering an empathetic and civil public process requires the soft-hearted and sensitive to remain part of the conversation, to push through the vitriol and bombast with words of gentleness and persuasion. Sadly, these are exactly the sort of people most likely to be turned off by our current news cycle, and the most likely to tune it all out.
The problem is that shutting out the shouting match entirely, while it may help with peace of mind, can also have political, cultural, and social implications. It’s most important that we are invested in local news and philanthropic efforts—but what if we turn away from regional and national news altogether? Those who live in more comfortable and privileged corners of America are, in a sense, off the hook: they no longer have to read about the plight of the unemployed or the nation’s staggering (and growing) opioid crisis. They can turn a blind eye to our nation’s foster care emergency. And they can pretend that churches and associations throughout America are doing just fine, because theirs is.
Tuning out—unless it’s coupled with a very intentional effort to plug in physically and locally—can have detrimental implications.
While several thoughtful and smart people have argued in the past that the news does greater harm than good in society (Richard Weaver being perhaps one of the most prominent of those voices), I do not think we can ignore the benefits it could and should bring to our populace—especially in a world in which, like it or not, folks just aren’t reading heady philosophical or political books as much as they used to. Our introduction to the vital ideas and issues that will shape our nation and make us better citizens is most likely to come through the news. In just the past few months, columns by Ross Douthat and Andrew Sullivan, news articles at Vox and the New Republic, and podcasts by Christianity Today and Strong Towns have offered me thoughtful, considerate, and opinion-shifting ideas and facts. Sometimes tuning in can make us better.
But of course, staying tuned in to the wrong outlets can also have a nasty effect on our political discourse and states of mind. As one journalist told me, many of our news outlets are more focused on “narratives” than they are on stories these days. Talking heads make sure we know “the bad guys are still just as bad and the good guys still just as good,” and we emerge from an hour of news coverage more enraged than when we turned on the television.
Because of this pattern of titillation and fragmentation, Weaver compared modern news media to the Platonic cave: “the multitude sits with eyes fixed upon the wall where shadows rise and fall. The wall of shadows is the world of the ephemeral and trivial.” While the philosopher seeks to draw multitudes out of the cave and into the real world, “the function of the modern journalist is to keep the focus of the multitude upon the wall.” In this cave, Weaver wrote, the media propagates a “sickly metaphysical dream. The ultimate source of evaluation ceases to be the dream of beauty and truth and becomes that of psychopathia, of fragmentation, of disharmony and nonbeing.”
These are strong words—words that journalists should take to heart and news consumers should consider carefully. But they should not prevent us from seeking to understand the sharp and poignant issues shaping our world and nation. While some of the best authorities on our modern struggle will be found in books, there is also excellent journalism out there worth reading. The trick is finding it, amidst the 24/7 news cycle and growth of tabloid journalism.
Journalists and readers alike should ask themselves whether the pieces they write or read are asking the right questions, proffering a long-term view (not just a quick hot take), and sharing the opinions of smart and thoughtful people. Stories should offer a realistic and clear understanding of both sides of a debate—even if an op-ed favors one viewpoint, it should still strive to offer a fair picture of the opposing view. And all of us could probably do a better job of steering clear of any news medium—radio, print, television, or social media—that devolves into a shouting match or bombastic rhetoric.
News needn’t keep us trapped in the cave. But staying away from news shouldn’t mean holing up in a literal cave, away from any media influence. There’s a balance we can strive to achieve. It’s true that finding nuggets of real gold amidst the trash heap of modern news media will take some work. But I believe, in the long run, that we can be all the better for it.
A lot of folks don’t read as much as they used to. In 2014, Pew reported that nearly 25 percent of American adults hadn’t read a single book in the past year. And “the number of non-book readers,” The Atlantic noted, “has nearly tripled since 1978.”
What about tech-obsessed young people? Many have their doubts that Millennials can keep reading alive—what with stories assuring us that “The Bachelor” is the new book club, and reports that the young spend more time on their phones than any other age group.
But there are some signs that younger generations are rediscovering a love of reading. According to a 2017 Pew poll, 18- to 29-year-olds are those most likely to have read a book in any format over the past year. It’s likely that school accounts for much of this reading—as Forbes reporter Neil Howe notes, “millennials are far more likely than older adults to say [their reading is] for a specific purpose, such as work, school, or research”—but Pew also found that young people are “equally likely to read ‘for pleasure’ or ‘to keep up with current events.’”
This is a heartening trend, but reading often remains a lonely pursuit, and the habit might easily fall off post-college if not for some fellowship and accountability.
Recent studies have found that many Millennials—even those who are smart, successful, and digitally connected—are struggling with loneliness. A General Social Survey published last year found that the number of Americans with no close friends has tripled since 1985. In a 2011 study, 86 percent of Millennials reported feeling lonely and depressed. It seems that, even as our Facebook friend circles expand into the hundreds and even thousands, our real-life circles of comrades and kindred spirits are steadily dwindling.
Perhaps our growing love of books can be a solution to this difficulty. Many associate book clubs with an older generation of readers, but a revival of these focused, communal gatherings could stem the tide of isolation that threatens us in our digitalized and displaced time.
One friend of mine, Kira, has seen firsthand how book clubs can foster community and friendship. She has started four book clubs since college: one at Harvard, one in Pittsburgh, and two in Arlington, Virginia. While her first club consisted of college students and their significant others, moving to Pittsburgh presented a different challenge. Kira wanted to create a book club that was explicitly place-focused. “I joined a couple of running clubs and made some friends there, but the majority of my [book club] members came through an announcement I put up on Nextdoor,” Kira recalls. “From that posting, I got about 15 interested women and on a regular basis had between six and 10 women show up for each club.” Even though Kira and her husband have since moved to Arlington, her Pittsburgh book club is still going strong in her absence.
In Arlington, Kira has created two book clubs: one with a group of old college friends who like to discuss more theological books, and one neighborhood club akin to her Pittsburgh group, also built through the Nextdoor app. “Right now, my main focus is on building friendships,” she says.
Kira chooses the books for her group, and generally likes to prepare a set of questions. She notes that, as members grow more comfortable with each other, conversation becomes relaxed and organic. Her groups have read classics like The Brothers Karamazov, and more recently works like Atul Gawande’s Being Mortal. In the past, she’s served coffee and tea with a treat like scones or apple bread. Recently, she’s started serving dinner and wine as part of the gathering.
But regardless of format or genre, she says “there’s something almost magical” about the book club’s ability to foster community: “In the midst of discussing a plot point or character trait, we fall into deep friendship.” She points to a P.G. Wodehouse quote: “There is no surer foundation for a beautiful friendship than a mutual taste in literature.”
I’ve also found reading literature in community—whether in a book club, with a sibling, friend, parent, or spouse—to be an incredibly comforting and enriching experience. My fellow readers often stretch and challenge my presuppositions, forcing me to look twice at characters I may dislike or ideas I may be uncomfortable with. They point out beautiful details and quotes I might otherwise have overlooked. And our discussions often transcend the page, touching on larger life issues we may be dealing with. One friend and I put together an email “book club” after college because we were too physically distant to meet regularly. But via email, we read Ernest Hemingway together and discussed life challenges and successes. My husband and I usually try to read at least one book together per year—aloud on the couch after our toddler is in bed, or via Audible in the car. We’ve introduced each other to childhood favorites, read books about theology and philosophy, and indulged in fun novels together.
A new blessing I’ve discovered is my sister-in-law’s book club, which meets every other week over tea and cookies (or fresh bread, olives, and cheese—we’re flexible). It’s become a safe space, a way to unwind and rest amidst the craziness and loneliness of life. New friendships have blossomed, hearts and stomachs have been nourished, minds have grown and been challenged. The group is diverse—we don’t always like the same book, or agree on all the ideas therein—but that’s only sharpened and brightened our discussions.
Starting a book club might seem frightening to some. It’s not always easy to think of book ideas, or to take the time to prepare questions and food. One way to ease the burden associated with hosting is to rotate the home in which friends meet, and let members take turns picking a book and preparing questions. (Meeting in a public space like a coffee shop or restaurant can also ease that burden, but it can take some of the intimacy out of the gathering and make it a little more expensive for members to participate.) Clubs can also be thematic in nature—working their way through the classics, tackling a philosophical or political question, or focusing on newer works of literature, for instance—in order to make picking books a bit easier.
It’s difficult, as we enter new stages of our life or new spheres of work and living, to keep community alive. Churches and running clubs, dog parks and libraries, school events and town hall meetings, can all help us meet and mingle. But there’s something special about a close, small gathering of readers, all eager to deepen their knowledge of the world, the written word, and each other. Such gatherings seem uniquely suited to combat our collective disillusionment—and foster joy, hope, and camaraderie in their place.
I didn’t need another thing to take care of. Life was already too busy. I could barely keep up with my daily to-do list as it was.
That’s what I told myself as I contemplated (for the umpteenth time) making a sourdough starter. A sourdough starter is a living thing, and functions much like a house plant: it requires daily tending and feeding, and must be watched for signs of health and wholeness. Friends who have embraced their sourdough journeys often upload pictures of gorgeous brown loaves to Instagram; even Michael Pollan posts photos of his own (and others’) crusty homemade artisan bread on his Twitter profile.
But if today brought a sourdough starter, tomorrow would bring homemade yogurt and kombucha, I told myself. If I started down this path, my life would soon be overrun by the tedious tending of fermented and homemade things.
Is that possibility really so bad? Why does tending a sourdough starter seem so annoying? Pondering this question, I realized it had little to do with the ritual itself—which only takes five minutes—and everything to do with that lingering, underlying sense that it was inconvenient and that there were “other (more important) things I should be doing.”
I was thinking about all this as I read Tim Wu’s Sunday article in The New York Times about the “Tyranny of Convenience.” As Wu writes, in America today, the supremacy of convenience has reshaped our lives in monumental ways:
Convenience has the ability to make other options unthinkable. Once you have used a washing machine, laundering clothes by hand seems irrational, even if it might be cheaper. After you have experienced streaming television, waiting to see a show at a prescribed hour seems silly, even a little undignified. To resist convenience — not to own a cellphone, not to use Google — has come to require a special kind of dedication that is often taken for eccentricity, if not fanaticism.
… But we err in presuming convenience is always good, for it has a complex relationship with other ideals that we hold dear. Though understood and promoted as an instrument of liberation, convenience has a dark side. With its promise of smooth, effortless efficiency, it threatens to erase the sort of struggles and challenges that help give meaning to life. Created to free us, it can become a constraint on what we are willing to do, and thus in a subtle way it can enslave us.
Our fetish for ease seems to stem from promises fed to us during and after the Industrial Revolution. Prior to the rise of mechanization, manual labor and back-breaking chores were daily realities. Vacuum cleaners, dishwashers, canned food—all these things promised to free us from our bondage to the quotidian. “By saving time and eliminating drudgery, [convenience] would create the possibility of leisure,” writes Wu. “And with leisure would come the possibility of devoting time to learning, hobbies or whatever else might really matter to us. Convenience would make available to the general population the kind of freedom for self-cultivation once available only to the aristocracy. In this way convenience would also be the great leveler.”
But as technology sought to free us from more and more of the tasks that otherwise “chained” us to reality and diligence, liberation itself became ever more illusory. Our mediums of emancipation became founts of distraction. In the vacuum created by convenience, we turned to the television and the internet. Now, when the conveyer belt of ease breaks down—when our washer and dryer stop working, or our internet is down, for instance—we find that convenience has actually enslaved us. Without machines to do our laundry or keep us company, we’re at a loss.
Thus can our craving for convenience easily deceive us, luring us into a false sense of freedom, distracting us from the things that make us better human beings. We so easily forget that sometimes—often—the best things in life take hard work. Playing a musical instrument, learning a sport, tending a garden, building a table: all these things are difficult. But they’re also good—for our souls, minds, and bodies.
“Today’s cult of convenience fails to acknowledge that difficulty is a constitutive feature of human experience,” Wu writes. “Convenience is all destination and no journey. But climbing a mountain is different from taking the tram to the top, even if you end up at the same place. We are becoming people who care mainly or only about outcomes. We are at risk of making most of our life experiences a series of trolley rides.”
As Alexander Langlands puts it in his new book Cræft, “We’re increasingly constrained by computers and a pixelated abridgement of reality that serves only to make us blind to the truly infinite complexity of the natural world. Most critically, our physical movements have been almost entirely removed as a factor in our own existence. Now all we seem to do is press buttons.”
Which brings me back to sourdough starter. There’s a reason I wanted to add this ritual to my daily routine: it’s a creative act that saves money, serves my family, and creates a thing of beauty. It’s healthier than store-bought bread, and making it is a discipline with scientific and culinary properties that are fascinating and complex. It’s a craft that, in our convenience-driven culture, could all too easily go extinct.
But I’m also drawn to sourdough because I’ve learned that the best parts of my life are without a doubt the most “inconvenient”: they require time, focused attention, and loving care. All the things that bind me to place—the toddler running around, the Irish Setter begging me to play fetch with him, the myriad house plants needing watering, the little seedlings waiting to be planted in our vegetable garden—are also the things that light up my life, bringing vibrancy and joy and meaning to every day. I’m proud of them, eager for them to flourish and remain healthy and vibrant.
If we’re honest with ourselves, we must admit that we are also “inconvenient” beings: anxious for time and affection and comfort, eager to be loved and focused on. It is an act of deep hypocrisy to treat the world around us as something that we ourselves are loath to be: easily consumed, quickly disposed of.
In a consumer culture, we’ve forgotten what it means to create—and we’ve forgotten just how liberating and rewarding the act of creation is. It is time-consuming and difficult to sew a quilt, plant a tree in the backyard, or bring a meal to a family in need. But creative and manual experiences allow virtue, beauty, and sustainability to flourish in our daily lives. Every moment—no matter how inconvenient or time-consuming—is worth it. We just have to change our focus: from ease to meaning, and from entertainment to joy.
The past two years have shattered Mark Zuckerberg’s faith in social media, or humanity, or both.
At least, that’s one of the primary takeaways from Wired magazine’s feature last week on Facebook and fake news, written by Nicholas Thompson and Fred Vogelstein. In it, Thompson and Vogelstein consider Facebook’s many slip-ups and struggles in publishing and curating news stories over the past two years—from allegedly disfavoring conservative coverage to allowing Russian bots to influence news feeds during the 2016 presidential election.
Throughout the story, one theme remains strong: Facebook’s employees and CEO are only now beginning to understand their responsibility to the world and the dangerous power of the platform they’ve created. Zuckerberg, like many in our enlightened and progressive age, believes in human perfectibility. He believes in circumstantial error, not sin. He believes (or at least believed) in humankind’s ability to craft the perfect scenarios and platforms in which to break down all malice and misunderstanding, while not necessarily believing that evil is endemic to the human condition.
Which means that, when Facebook came under fire for the spread of misinformation and fake news in early 2017, he didn’t respond by noting the flaws in his global empire or its users; he doubled down on the hopeful perfectibility of his platform and its users:
Amid sweeping remarks about “building a global community,” [Zuckerberg] emphasized the need to keep people informed and to knock out false news and clickbait. Brown and others at Facebook saw the manifesto as a sign that Zuckerberg understood the company’s profound civic responsibilities. Others saw the document as blandly grandiose, showcasing Zuckerberg’s tendency to suggest that the answer to nearly any problem is for people to use Facebook more.
Of course, Zuckerberg’s hopes for his creation are understandable—none of us want to acknowledge the inherent flaws in something we’ve dedicated our lives to curating and crafting. And it’s understandable—even laudable, in a way—that Zuckerberg would refuse to let the confusion and vitriol of the 2016 general election impact his faith in humanity.
But Facebook-as-global-community isn’t working. Since creating the social media network, Zuckerberg has always emphasized his hope that it would foster greater empathy and understanding around the world: that it would enable us to understand each other better, to bridge racial and partisan divides, and to infuse our interactions with greater temperance.
Instead, the opposite has happened: Facebook, via human choice and algorithmic tendency, has fostered ideological and political bubbles amongst its users, not global community and rapport. We join groups filled with like-minded people, like pages that give us the news we want to hear, and unfollow or unfriend those who annoy us or make us uncomfortable. Based on the posts we spend the most time liking, commenting on, and watching, Facebook gives us more of the same, rather than encouraging us to expand our ideological repertoires.
These tendencies are extremely difficult to circumvent. We can follow accounts that challenge our beliefs, keep listening to the friends who annoy us (and engage them via comments and messages when we find it appropriate)—but there remains a degree of disconnection between us and this virtual content. That disconnection makes it easy to tune out, or to respond with more anger or bombast than we might employ in personal conversation. Additionally, we too often indulge in clickbait on Facebook or other online sites rather than forcing ourselves to absorb more thoughtful, well-crafted content. Facebook’s algorithm recognizes this tendency—and passively fosters it. Thompson and Vogelstein write:
People like Alex Hardiman, the head of Facebook news products and an alum of The New York Times, started to recognize that Facebook had long helped to create an economic system that rewarded publishers for sensationalism, not accuracy or depth. “If we just reward content based on raw clicks and engagement, we might actually see content that is increasingly sensationalist, clickbaity, polarizing, and divisive,” she says. A social network that rewards only clicks, not subscriptions, is like a dating service that encourages one-night stands but not marriages.
To some extent, Facebook cannot control or circumvent this tendency. It is the result of our sinful natures, our ability to shape a product according to our own flawed image. Facebook isn’t innocent in the way it curates and shapes our news feeds, but its creators and curators will always find it hard to balance between excess and defect in their attempts to foster a virtuous online community. Do too much, and they risk playing an alarmingly intrusive role in people’s lives and mental formation. Do too little, and they encourage an anarchic atmosphere of disinformation, vitriol, and offensive content.
Besides, many will ask (and rightfully so) what right Facebook has to determine what is and isn’t propagated on its site. It is meant to be a platform for connection, not a social policeman. Facebook didn’t force its users to absorb the Russian-propagated fake news of 2016. “Sure, you could put stories into people’s feeds that contradicted their political viewpoints, but people would turn away from them,” note Thompson and Vogelstein. “The problem, as Anker puts it, ‘is not Facebook. It’s humans.’”
Michael Brendan Dougherty pointed this out in an article for National Review last week, one that astutely contrasted Thompson and Vogelstein’s piece with an op-ed Ross Douthat had just written for The New York Times in which he argued that we should ban pornography. While the first (well-received) article suggests that Facebook (and we ourselves) can and should curate an online atmosphere of virtue and respect, the second was received with anger and disdain, as even many who respected the Wired article argued that—at least when it comes to porn—we should be able to do what we want. This ironic contradiction, Dougherty argues, stems from the same fantastical attitude toward online life:
Our culture-makers seem to believe in a neatly cleaved human nature. In one realm, we can expect ourselves to act as angels, and do the disinterested thing. In another, perhaps to let off some steam, we must give the Devil his due.
But perhaps the defenders of porn should consider that the common purveyors and sharers of fake news across social media are also engaged in a form of self-abuse, combined with titillation, and fantasy life. They no more believe that Barack Obama is running guns to ISIS than that the surgically enhanced 30-year-old woman in a plaid skirt is a very bad Catholic-school girl. It’s just a reality they prefer to envision. One where they can gaze into a backlit screen, click around, and imagine they aren’t wasting their lives clicking around on a backlit screen.
We love the internet because it enables us to craft the world we want, instead of forcing us to confront the world as it is. We can doctor our Instagram selfies, visit news sites that foster our less temperate and virtuous passions, follow the Facebook users who will puff up our self-esteem, dabble in the darker delights of the internet—all without ever asking ourselves whether it’s “right” or “wrong.”
But at some point, we must confront the world we’ve made online—because whether we like it or not, it will infect and alter our physical reality.
Zuckerberg has been immersed in that world longer (and on a deeper level) than most of us—which means he’s reached this moment of crisis and self-confrontation earlier than most of us. Thompson and Vogelstein write that “people who know him say that Zuckerberg has truly been altered in the crucible of the past several months. …‘This whole year has massively changed his personal techno-optimism,’ says an executive at the company. ‘It has made him much more paranoid about the ways that people could abuse the thing that he built.’”
In recent months, the authors write, Facebook has decided it’s “putting a car in reverse that had been driving at full speed in one direction for 14 years.” Although Zuckerberg has always intended “to create another internet, or perhaps another world, inside of Facebook, and to get people to use it as much as possible,” he’s now encouraging changes to the platform, which, he believes, will make people use Facebook less. Perhaps he’s realizing he needs to design for fallibility, addiction, and bombast—not for perfectibility, innocence, and self-control.
But Zuckerberg cannot edit human sin out of Facebook’s algorithms. We are ultimately responsible for our own choices, online and off, which means we all need to have our own Zuckerberg moments in coming days. That means we must realize that the internet does not, in fact, bring out our best selves, and must be approached with temperance and caution. We must confront the beasts we’ve created online: our own miniaturized Facebook Frankenstein monsters, whether they be unkind profile posts, ruined friendships, malicious blog posts, or toxic news feeds. If we don’t, we risk deeper consequences than fake news and nasty election cycles. We risk giving up real community and real virtue for fantastical and fake counterfeits.
Despite the unexpected political events of the last year, conservative historian and scholar Lee Edwards isn’t afraid for the future of conservatism. He has worked in Washington, D.C. for decades, observing the highs and lows within the movement, and he assures me: we’ve seen worse.
In the 1950s and ’60s, the Republican Party was run by eastern elites, and conservatism was an intellectual movement without much clout or voice in the halls of the nation’s capital. Communist fervor was overtaking Europe, eating into the policies and rhetoric of many within Washington. And, although Russell Kirk’s The Conservative Mind was published in 1953 and National Review’s founding followed two years later, conservatism’s revival in the political and popular sphere was just beginning to gather steam.
It was at this pivotal moment that Edwards began lending his voice and talents to the fledgling conservative cause. The son of Willard Edwards, a noted political reporter for the Chicago Tribune, young Edwards grew up surrounded by political powerhouses in Washington, D.C. Richard Nixon and Senator Joe McCarthy were frequent guests in his childhood home.
“My first impression of Joe—he encouraged you to call him by his first name—was that of a shoulder-squeezing, joke-telling politician who drank but not any more than the average Irishman,” Edwards writes in his new book, Just Right: A Life In Pursuit of Liberty. “He was serious about one thing—communism.”
So was Edwards’s father: from the late 1940s through the mid-’50s, Willard Edwards worked extensively with ex-communists and anticommunists in Washington. Through this network, he met Freda Utley, author of The China Story, who escaped from the Soviet Union with her son Jon in 1936. Jon’s father, Arcadi Berdichevsky, was arrested and executed during the Soviet purges of the 1930s.
Jon Utley remembers meeting Edwards when Utley was still a teenager, long before he began his own career within Washington’s conservative world. Now publisher of The American Conservative and a commentator for various publications, Utley remembers that era as foreboding and frustrating for conservatives. When he finished college, Utley feared the communists would win the Cold War. He left for South America, where he worked in insurance and finance before becoming a foreign correspondent. “When I came back in 1970, people like Lee Edwards had stayed and fought,” he recalled, adding that anticommunist prospects seemed brighter with the emergence of Ronald Reagan and other likeminded conservatives.
For Edwards, the anticommunism of his father’s generation became real in Europe in 1956, with the Hungarian uprising against the Soviet yoke and the plight of young anticommunists in Budapest. He had gravitated to Europe after graduating from Duke University and was living in Paris, where he hoped to pen the next great American novel. Instead, the events in Hungary pulled him back to his father’s cause.
Edwards heard the pleas of freedom fighters on his Paris radio, begging for help from America and its allies, and he was infuriated by the passivity of the Eisenhower administration. “As the number of fallen freedom fighters passed two thousand and tens of thousands of Hungarians fled their once-again-communist country, I took an oath,” he writes. “I resolved that for the rest of my life, wherever I was, whatever I was, I would help those who resisted communism however I could.”
Edwards abandoned his novel-writing dream and moved back to Washington, D.C. There he joined the D.C. Young Republicans and became press secretary for Maryland Senator John Marshall Butler.
In 1960, Edwards helped found Young Americans for Freedom, a group that embraced the fusionism of the burgeoning New Right. Its members wove libertarian and anticommunist themes into the group’s vision, building a platform that emphasized economist Friedrich Hayek’s philosophy and a strong foreign policy alongside traditional conservative values. “We were rebels against the zeitgeist,” Edwards recalls with a smile.
He served as editor for YAF’s monthly magazine, New Guard, for the next few years, and acquired a new job with a Washington public relations firm. In 1963, he became news director for the National Draft Goldwater Committee, which emerged in anticipation of Senator Barry Goldwater’s expected presidential run the following year.
Goldwater’s rise in the Republican Party was deeply important to the New Right, and his book The Conscience of a Conservative (ghostwritten by Brent Bozell) had a considerable impact on Edwards. In those early years, before JFK’s assassination, there was a good deal of excitement amongst Goldwater’s followers. He had, after all, a strong populist following with grassroots conservatives across the nation. His principled conservatism and staunch anticommunism appealed to many.
But with President Kennedy’s assassination, Goldwater knew he couldn’t win the presidency. Friends and advisers urged him to run anyway—not to win but to galvanize conservatives, especially young conservatives such as Edwards, who had been inspired and encouraged by Goldwater for years.
It was not an easy campaign for Goldwater—or for Edwards. While his populist following was passionate, the GOP candidate encountered vehement media hostility. Due to his strong foreign policy messaging, many suggested that Goldwater was dangerously belligerent. Lyndon Johnson described him as a “ranting raving demagogue who wants to tear down society.” Fact magazine published a special edition just before the election arguing that the senator was mentally unfit to be president.
Edwards looks back on many of these events with a sense of déjà vu, especially following the events of last year’s presidential election. While Goldwater and Donald Trump aren’t at all similar—“Barry Goldwater was the product of a movement,” notes Edwards, “whereas Trump isn’t that, he’s a businessman”—both tapped into the populist fervor of the “forgotten America” many eastern elites disdain or forget about. Goldwater’s “Forgotten American,” Ronald Reagan’s 1980 “Moral Majority,” and the Tea Party all had one thing in common: they were grassroots movements, galvanized by rhetoric that was anti-establishment in many ways.
“To our amazement, Trump understood and applied that, and won an impossible victory,” Edwards says. “I believe the populist movement is always there and engaged in this debate. They will be there in 2020 and beyond.”
Goldwater’s loss brought dark days to the conservative movement in Washington and throughout the country. “It would’ve been easy to give up,” Edwards says. “Our ideas were defeated.” Writer and intellectual Frank Meyer brought hope to Goldwater’s supporters, however, by writing in National Review that 27 million voters—those who had supported Goldwater in the presidential election—could form a solid conservative coalition.
From that hope, Edwards says, the movement struggled on, forming new alliances and organizations, working to tap into the grassroots fervor that Goldwater had encouraged. “Liberals were saying that was the end [for conservatives],” Edwards says. But in reality, as Utley noted during our interview, “Goldwater’s campaign led to the conservative resurgence. It was the start.”
Edwards went on to work with Bill Buckley, Milton Friedman, Richard Viguerie, and other prominent figures of the New Right. He started his own public relations firm and wrote for Reader’s Digest and Conservative Digest. But perhaps most notably, Edwards began chronicling the lives of the many renowned conservatives he had worked with, including Reagan, Buckley, and Goldwater. Through this body of work, he became known as a leading historian of American conservatism.
Edwards did not stop working within the anticommunist movement. But whereas many of his fellow anticommunists dedicated their energies to fighting terrorism after 9/11, he worked to create Washington’s Victims of Communism Memorial, which he cites as the greatest accomplishment of his career. After years of fundraising and bureaucratic red tape, the 10-foot Goddess of Democracy statue was dedicated by President George W. Bush in 2007, surrounded by champions of the anticommunist cause and survivors of communism. Edwards saw it as the fulfillment of his Paris vow, made so many years before.
Utley sees Edwards’s anticommunist work as the most important of his career. “My son asks me, ‘What was the big deal? The communists collapsed like a house of cards,’” he says. “But it didn’t look that way in those days.”
That isn’t to say that times aren’t still challenging, albeit in different ways. Edwards is troubled by a Millennial generation that, according to a poll conducted by his memorial foundation, is 50 percent more comfortable with socialism than with capitalism. He has also seen, as a professor at Catholic University, increasing distractibility and shortening attention spans among his students. America’s youth have grown up in a different culture, one that encourages attitudes of entitlement and instant gratification. Trump’s tweets, he suggests, are the perfect symbol of this culture. Such an environment can make it difficult to instill the ideals of Russell Kirk and Edmund Burke, and to revive and renew a movement inspired as much by virtue and philosophy as by history and policy.
Yet despite this, Edwards remains hopeful. “I’m an optimist. That’s how I’ve managed to survive in this town,” he says. Edwards sees many “young, brilliant, charismatic politicians” in the political arena, citing Mike Lee, Nikki Haley, Mike Pence, and Ben Sasse. He’s observed the healthy activity of groups such as the Intercollegiate Studies Institute, the Fund for American Studies, YAF, and Students for Liberty, all working to bolster conservatism’s future. He’s also observed the birth and growth of conservative colleges such as Hillsdale College, The King’s College, and others. All in all, he says, conservatism is “not a dying movement, but a clamoring one.”
Edwards’s new autobiography serves to highlight both the perils and excitements of the New Right’s early years—chronicling the frustrations of the Goldwater campaign, the fears and struggles of the Lyndon Johnson and Jimmy Carter years, and the exhilarating hope of Reagan’s presidency. But perhaps most of all, Just Right encapsulates the tenacity of a movement that has refused to die, despite adversity—largely due to the work of men such as Edwards, who have remained faithful even in those inevitable times of hardship and defeat.
Our world is full of wrath and bombast: bickering politicians, Facebook quarrels, tempestuous wars, cultural schisms. Amid this vitriol, it can be easy to lose hope, to look for peace and find none.
Fortunately, history can offer comfort and conviction. It reminds us that we’ve been here before: experienced similar fractures and divisions, committed comparable sins against humanity. History, too, can show us shining examples of hope in turmoil, proffering up individuals and communities that stand out like beacons amidst the pitch-black darkness of their time.
Philip Britts was one such beacon. He was a farmer, pastor, and poet during World War II, a steward of place and soil, a caring father and husband, and a zealous advocate for peace and community. Born in 1917 in Devon, England, Philip felt an early calling to the earth and its stewardship—a calling that stands out in one of his earliest poems, penned when he was 20 years old:
I stood in flowers, knee high,
Dreaming of gentleness,
Dreams, in the promise of a shining sky
That I should make a garden from a wilderness;
I would subdue the soil and make it chaste,
Making the desert bear, the useless good,
With my own strength I would redeem the waste,
Would grow the lily where the thistle stood.
…Had I not dreamed so long,
Not dreamed of so much beauty, or such grace,
Mayhap I could have trod a quieter path,
With other men, in a green, quieter place.
Those final lines are practically prophetic: Philip Britts would go on to live a life of exile and hardship, one that would send him to an early grave at age 31. Water at the Roots, a short new book from Plough Quarterly (the publishing community of the Bruderhof), stands as a testament to his remarkable life, the fruit of his mere three decades on the earth, and the vision that his poetry, essays, and sermons inspired.
Philip graduated from the University of Bristol with a degree in horticulture in 1939, and became a fellow of the Royal Horticultural Society. He married his childhood sweetheart, Joan, shortly thereafter. Philip was a staunch pacifist, and believed that his Christian faith barred him from military action. But as German troops marched into Poland that September and Winston Churchill called his countrymen to action, pacifism became an increasingly unpopular position in the United Kingdom.
It was around this time that Philip and his wife learned of the Bruderhof, a community of Christian believers founded in Germany in 1920, who dedicated themselves wholeheartedly to Jesus’s words from the Sermon on the Mount and to the example of the early church in Acts. They held property in common, and worked together to provide for all members of the community. Hitler had exiled the community from Germany in 1937, and they had fled to Cotswold, England, resuming their communal life there.
After journeying to meet the Bruderhof and observe their work, Philip and Joan were determined to join them. They sold their house, left behind family and friends, and moved to Cotswold in 1939, becoming official members of the Bruderhof the following year. Philip worked with the other Bruderhof men in the fields and vegetable gardens: ploughing, harrowing, seeding, and harvesting. The poetry he wrote at the time contains a tangible joy and zeal, a sense of calling found and fulfilled.
But this period of tranquility was short-lived. As World War II progressed, the English government declared that Germans and Austrians age 16 and older had to appear before special tribunals, and many were placed in internment camps. If the Bruderhof wanted to remain together, they could not stay in England. They began searching for a new home country, one that would allow them exemption from military service and freedom of religion. Through the help of local Mennonites, the Bruderhof found an abandoned farm in Primavera, Paraguay, and were granted exit permits to leave England. They sailed across the Atlantic and toward an unknown future.
Their transplanting was not easy. The community struggled to find the best and most adaptable plants to grow in this new soil and climate, and suffered from malnourishment and insufficient food at various times. Under such circumstances, Philip’s knowledge of horticulture became invaluable. He determined new crops for their community, working in both the field and the lab to help provide for the Bruderhof. During this time, Philip also wrote a couple of pieces for agricultural magazines overseas. In “How Shall We Farm?” he observed that a large gap existed “between what the farmer knows to be right, and what a competitive economy forces him to do”:
…Organic farming, by itself, as an isolated factor, can never reach its full significance. It stands for the great truth that agriculture can never be stable and permanent until man learns and obeys the laws of fertility, the cycle that includes the decay of the old and the release of the new, or, to put it biblically, to “subdue and replenish.”
But it can only realize its full meaning in the context of an organic life. Man’s relationship to the land must be true and just, but this is only possible when his relationship to his fellow man is true and just and organic. This includes the relationship of all the activities of man—of industry within agriculture, of science with art, between the sexes, and above all between man’s life and his spiritual life.
Philip didn’t stop writing poetry, despite the adversity and frenzied activities of his days. Many of these poems reflect an inner stillness and poise, a deep spiritual undercurrent, that navigated his actions and thoughts throughout this time—and that give rise to the title of this book, Water at the Roots. “We are like plants in the tropical sunlight,” he writes in one sermon. “The fierceness of the sun is like the judgment of God, revealing everything, burning, purifying. Here faith is like water at the roots. If we have faith, we can face the sun, we can turn the heat and the light into life-giving fruits, into love.”
In many ways, Philip’s writings remind me of Wendell Berry’s: so observant of nature and its rhythms, full of awe and gratitude at creation’s mysteries and intricacies. He, too, warns of the dangers of the machine and humanity’s belief in inevitable, perfect progress (“Is not this the poison of the age, the belief of man in man?” he writes in one essay). He urges all of humanity towards deeper communion with land, plant, and animal—a sort of spiritual or virtuous farming, a stewardship of the soul, that draws us into reconciliation with man and earth. It’s remarkable that these two men, divided by time, place, and circumstance, could draw such similar inspiration from the created world. It’s also true that their vocation calls forth awe and aspiration more than most, wedded as it is with nature’s rhythms and a sort of intentional gentleness.
Philip’s fiery faith—passionate, even radical, in its disposition—also reminded me of Dietrich Bonhoeffer, a German pastor and theologian who died in Nazi Germany. Bonhoeffer drew similar allusions in his writings—especially The Cost of Discipleship—to the personal and material costs of our faith, arguing that it calls us to extreme devotion and surrender. Bonhoeffer’s Life Together lays out a similar vision of Christian community and hospitality, one that complements the Bruderhof’s profound devotion to each other.
It is hard, in modern America, to imagine such extreme devotion to community. Our capitalistic society lauds charity and philanthropy, but looks askance at the Bruderhof’s radical sharing of wealth. It promotes farming—but old-fashioned rhythms of agriculture, such as those employed by the Amish or the Bruderhof, are viewed as reactionary, if not silly. We’re constantly marching forward, at least in our own minds, toward greater wealth, knowledge, innovation, and power. We have no time to look backwards and to consider what we might have lost along the way.
Philip travelled to Brazil in 1945 with a group of horticulturists to find new crop possibilities for the Bruderhof community. There, he contracted a fungal infection that spread throughout his body, and caused his death two years later. He was only 31 years old, and left behind a wife and four children (one still in the womb). He’d just recently been chosen to serve as a new pastor within the Primavera community, and everyone was stunned and devastated by his loss. His wife wrote at the time that “one must not go down under a flood of sorrow. Ever since we have known each other, we always felt that we had been led together by God.”
Philip had hoped to write a book about agriculture for his son called Written in the Soil, which he never got to complete. But Water at the Roots compiles his poetry, essays, and sermons for the rest of us. It’s a welcome retreat from the frenetic writings of our time, the constant invitations to anger and despair. A few lines of poetry, in particular, stick in my head:
There are so many songs that need to be sung.
There are so many beautiful things that await
The sensitive hand to pick them up
From this strange din of busy living.
(Philip wrote that when he was 19 years old.)
Not all of us will join a Christian community like Philip’s. And not all of us will work the land, as he did. But thankfully, Philip’s writing contains nuggets of wisdom and conviction for all of us: reminders to be still, to love others, to steward our possessions well. We may not be members of the Bruderhof, but perhaps we can all strive to live a little more like Philip Britts. The world would be better for it.
Are some towns better off dead?
That’s the argument National Review’s Kevin D. Williamson and Reason’s Nick Gillespie have made in the past, after considering rural blight and the devastating dearth of jobs for poorer Americans in those areas.
But a new report, recently released by the U.S. Partnership on Mobility from Poverty, responds with a more holistic diagnosis. The partnership was formed a year and a half ago with funding from the Bill and Melinda Gates Foundation, and aims to find new ways for the government and philanthropy to assist the poor.
In their report, the partnership suggests that many poor people are “stuck”—geographically, economically, and socially. And geography is the main factor behind the latter two forms of stasis, as City Lab’s Michael Anft notes in his analysis of the report: “When it comes to being poor in America, geography is still destiny. Regions of chronic intergenerational poverty, shaped by the structural inequities that are part of the nation’s history, have remained stubbornly resistant to change.”
“Place matters,” the report authors write. “The community where a child grows up greatly influences her or his opportunities for upward mobility. Comparing children in the same family who move from a low-opportunity to a high-opportunity area makes this clear. Children who move at age 12 fare significantly better than their older siblings. Children who move at age 6 or younger fare the best.”
Thus, the report aims to provide the poor with avenues out of geographically destitute areas and into “opportunity communities”—a strategy that hearkens back to Williamson’s and Gillespie’s urging that stagnant towns be allowed to die.
But thankfully, the report adds some nuance that is absent from those other diagnoses. Its authors don’t believe struggling towns and cities should be left to crumble and rot: while we should prioritize the needs of those stuck within their borders, we ought to try and revitalize these places if we can.
“For every person to live in a safe community that offers the opportunities fundamental to mobility, we must revitalize historically distressed communities, preserve and increase affordable housing in newly restored communities, and expand access to opportunity-rich communities and institutions for people living in low-mobility areas,” they write. “We must pursue all three approaches together.”
The authors call for an “intensive place-conscious” strategy that combines revitalization and affordable housing with access to the aforementioned “opportunity communities.” While we want to renew and reinvigorate struggling areas, it’s also important to acknowledge the toll a broken neighborhood takes on its youngest members. Thus, the report suggests prioritizing safe and stable housing for high-need, low-income families with young children.
This balance matters because we cannot only consider the needs of the mobile. We must also acknowledge the impact their out-migration has on those left behind. Much of rural America, especially, is growing “older, poorer, and less educated,” as Sarah Jones recently put it at the New Republic.
In her article, Jones talks to Pennsylvania State University professor Ann Tickamyer about the idea that geography is destiny, that poverty is an inescapable condition within certain regions of the United States. And while Tickamyer agrees with the statement to some extent, she adds this important caveat: “Any time you make descriptions about what the problems in rural places are, and what people should do, you’re generalizing way beyond what is reasonable. The Mississippi Delta is really different from central Appalachia and the Texas borderlands.”
Tickamyer also points out that what we leave behind may not be all bad, as we go out in search of greater gains:
The poorer you are the more you depend on a safety net that is more likely to be made up of your relatives and friends, family, community than of whatever the official safety net is. So if you are poor, sporadically employed or unemployed with kids, who provides the child care? Who helps out when you run out of money to purchase groceries or need an emergency car repair or whatever? It’s going to be the people who you are connected to in your community and in your family.
Some movers will find new forms of support within opportunity communities. Others will leave such support behind. A D.C. drugstore clerk once told a friend of mine that he had to help a mother who had come in with her obviously sick and unhappy child because she didn’t have anyone to watch the baby while she picked up her prescription. This is the reality for many in isolating and stratified cities: while these areas may offer financial boons, they make connection and rapport much more difficult to establish.
Leaving isn’t always the answer. So how do we help those who choose to stay, who choose familial and communal support over economic relief? This is where it becomes incredibly important to build “strong towns” and revitalize neighborhoods, to not just urge people to leave, but to take up the vital work of placemaking. Comparing this concept to that of “homemaking” makes the vision clear: we must foster an ordered place, steward its resources wisely, and ensure that it is safe and comfortable for all those who reside within it. It’s a vision that cannot be achieved without determined placemakers: community and civic leaders, philanthropists, businesspeople, and politicians who are ready and willing to dedicate themselves to their place.
This vision already exists, to some extent, in many of America’s struggling communities: I know an MIT graduate who left his job with Microsoft to return home to small-town Oregon—to one of the poorest counties in the United States—in order to “give back.” He’s helped develop STEM programs at the local community college, worked to develop greater, more affordable broadband connectivity in the community, and provided a multitude of jobs to local workers through his farm. I’ve also met a mayor who is determined to revitalize his small town and bring in new businesses. He’s rescuing it from stasis and decline through his dedicated volunteerism and work on urban revitalization. And there are a multitude of young college graduates I’ve met who have turned down D.C.’s appealing paychecks and glamor and instead returned to their hometowns—to family farm jobs, ministry work, non-profit initiatives, small-town law offices, and more.
I’m beginning to see that there’s a lot of promise in the returners: those who go out from their hometowns, learn vital skills (and perhaps earn what they can’t at home), and then return with a mind to give back, to grow, and to steward. It isn’t a perfect or a full answer. But for many of America’s struggling towns, it may be a start.
Why has cooking become so hard? Some would say it’s not: throwing together paninis or spaghetti for a weeknight dinner isn’t all that complicated. But our world increasingly sees cooking as a nigh-impossible feat of time and skill. We’re surrounded by takeout eateries, “fast casual” restaurants, and grocery stores packed with frozen meals. Blue Apron and HelloFresh promise gourmet dinners without the fuss. Rachael Ray offers her readers 30-minute meals, while Pioneer Woman promises to get the job done in 16 minutes. We want to magically transmogrify the vegetables on our cutting boards into something tantalizing—but if there’s chopping, measuring, seasoning, and (perhaps above all) waiting to be done, we quickly become disillusioned with the whole idea of from-scratch cooking.
Speed and ease define our world—a world in which people often spend hours commuting to and from work, juggling multiple jobs, or ushering kids from house to daycare to school to sports practice. In such a rushed place, there is very little room for cooking, traditionally understood.
Considering all this, Elizabeth G. Dunn is tired of chefs and cookbook authors promising “easy” from-scratch cooking to the harried, the worn out, and the stressed. While many recipes promise simplicity and speed, most of them are far from effortless, she argues in The Atlantic:
… [N]one of this is actually easy. Not the one-minute pie dough or the quick kale chips or the idiot-proof Massaman curry, every last ounce of which is made from scratch, from ingredients that are sourced and bought and lugged home and washed, peeled, chopped, mixed, and cooked. Meanwhile, technology has made appetizing, affordable cooking alternatives easier and easier to come by. …[T]onight, I can order excellent pad thai from my phone in under a minute. Or, I can find a recipe for “easy” pad thai, run—literally, run—to the grocery store at lunch, hope that grocery store sells fish sauce, then spend 40 minutes making the dish and 20 minutes cleaning up. The decision to cook from scratch may have many virtues, but ease is not one of them. Despite what we’re told, cooking the way so many Americans aspire to do it today is never fast, and rarely easy compared to all the other options available for feeding ourselves.
Dunn quickly clarifies—she likes cooking from scratch. But with a full-time job and a young son, there rarely (if ever) seems time to make the sorts of meals she sees in Bon Appetit. “It’s not that the best way to feed yourself is always the fastest one—it almost always isn’t,” she writes. “…But I think we should talk more realistically about what’s involved in from-scratch cooking, the sacrifices it entails, and the fact that little of the complexity offered by today’s published recipes is really essential to cooking a delicious meal.”
Cooking is hard when you don’t have time for it and carving out that time does indeed require sacrifices. But the question is: are we really too busy to cook? Dunn is right to some extent: meals rarely (if ever) take a mere 15 minutes to put together. But when one considers the breadth and length of a 12-hour day, it seems a bit silly to think we should be putting less than half an hour into the sustenance of our minds and bodies. It makes sense that cooking food—creating the nutrients that nourish and sustain us through all the commutes, work tasks, classes, and soccer practices we might face in a day—should require some time and effort.
Food experts have voiced similar complaints about Americans’ meal budgets: we spend less than 10 percent of our household budgets on food, and only 5.5 percent of that budget on food prepared at home. We spend less of our cash on food than any other country—about half as much as French households do. Cooking is not a priority to us where money or time is concerned. We’ve put cooking on the back burner (excuse the pun), replacing it with other concerns.
And yet the average U.S. adult spends over an hour on his phone, and checks it about 80 times throughout the day. That average adult also spends four and a half hours watching shows and movies. All in all, we spend about 10 hours a day staring at screens.
Surely we have time to chop and sauté some vegetables. We could boil some rice and pan-fry some chicken, even while an episode of that must-watch Netflix show plays in the background. The biggest drawback to cooking in our modern world often arises from long commutes, when workers arrive home frazzled and exhausted, with big appetites and little inspiration. But even here, there is room to cook: my mother-in-law taught me to make omelets and toast for dinner when it’s late and everybody is tired. It’s not a fancy solution by any means, but it gets the job done—and it’s much cheaper than takeout. Another solution I learned growing up was to make freezer meals: cooking a huge batch of soup or chili on the weekend, freezing half, and then pulling it out during a busy week.
From-scratch cooking isn’t impossible in our world, but its rhythms and processes are slower and harder than we’ve been conditioned to appreciate. We don’t like waiting for things. We don’t like 10 steps where one might do. And we don’t like undue labor when someone else could do the “hard” stuff for us.
But perhaps this is why we need cooking: not because it fits perfectly with our frantic and frenzied lives, but because it is so contrary to the other rhythms and patterns that characterize our days. Cooking is the perfect time-consuming ritual to remind us of what’s real: the hunger in our bellies, the crisp clack of a knife as it slices through a potato, the sizzle of olive oil in a pan. It’s a communal activity that enables us to spend time together, to draw even young children into the kitchen and teach them important skills. My two-year-old can’t do much in the kitchen, but she can help me peel garlic, squeeze limes, or stir cornbread batter. She loves sitting on the kitchen counter, listening to music, and watching as I cook.
Today’s Americans may need easier recipes and culinary methods in order to cook from home. But perhaps the problem is not that we need these things, but rather that we’ve been conditioned to expect them in our instant-gratification culture. The answer to our cooking dilemma might not be a new five-minute meal cookbook or GrubHub app. It might instead be a deepening encouragement to slow things down, to embrace a little difficulty in the kitchen, and to set aside the distractions (technological and otherwise) that prevent us from feeding our bodies as we ought.
Ever since Donald Trump first announced he was running for president, we haven’t been able to look away. He dominated headlines that week in June 2015, and has dominated them every week since.
At first, reports on the Trump campaign were half-humorous, half-incredulous. Some writers disapproved of his proposals; some found his rhetoric alarming. Some responded enthusiastically to his apparent outsider status and populist platform. But his dominance seemed so unlikely that many regarded him with bemusement rather than consternation.
As time went on and Trump continued to dominate the national conversation, media stories veered into the realm of the alarmed, astonished, and panicked. Trump’s prominence in the national conversation never dimmed; it morphed. As if in a funhouse mirror, the way reporters and newspapers perceived Trump mutated from fascinating and silly to frightening and grotesque. The amused stories of summer 2015 became the fearful articles of post-election 2016.
Regardless of tone or outlet, however, one thing has been constant: Trump has controlled our headlines and our public discourse for a few years now, and we have to give him credit for that dominance, at least to some extent. He’s given us flashy quotes, crazy tweets, and meme-able expressions. In 2016, he made the debates funnier, the headlines punchier, the boring business of politics flashier. From insulting Mika Brzezinski last year to recently declaring his nuclear button bigger than North Korea’s, Trump has always managed to give voice to the ludicrous and offensive, something that keeps him front and center.
Of course, he’s also kept the national conversation infantile, brutish, and nasty. He’s distracted from real policy issues, as well as from the presidency’s true meaning and intended role. But for Trump, media dominance seems like a largely successful strategy. Even if he becomes one of the most disliked presidents in American history, no one can deny that he’ll also become one of the most notorious.
Kevin Williamson recently described the second year of Trump’s presidency as “Trump: Season 2.” And if you think about it, Trump’s time in the spotlight has been much like a reality TV show: characterized by addictive controversy, incendiary rhetoric on all sides, and our own inability to look away from the spectacle. Anyone who’s watched an episode of “The Bachelor,” “Survivor,” or “Top Chef” knows that drama and intrigue keep viewers glued. If we think something crazy and ridiculous is on the immediate horizon, we won’t look away or turn the channel. In the same way, the palace intrigue style of journalism that’s been common throughout Trump’s first year—a style that seems to dominate Michael Wolff’s new book Fire and Fury—focuses on feeding our appetites for gossip and outrage, keeping viewers tuned in until next week.
Of course, when it comes to “The Bachelor” and “The Apprentice,” we can choose to ignore the spectacle. We can choose not to turn on the TV in the first place. But what do we do when Trump’s belligerence threatens nuclear war and diplomatic relations with key allies abroad? What’s the wise and measured stance to take in the face of a presidential sitcom that never seems to end—but rather seems to dangerously escalate?
As a wealthy and prominent New Yorker and as a reality TV star, Trump is used to shock and attention. He craves the latter, it seems, above all else. This, to some extent, is what we get for making such a person president.
But much of this reality TV drama is also the fault of our media and its methodology. Our 24/7 news cycle challenges journalists to garner clicks, to keep people fixated, to always be on top of the “next thing.” A news story back in the day would take at least 24 hours to go to press—and could dominate discourse for weeks afterwards. Now, news breaks in mere minutes, and the hot takes are expected instantaneously. We never want to slow down the cycle too much, lest we lose the attention of the ever-distracted public and forfeit their eyeballs and clicks to slow reads or uninteresting story angles.
Trump fascinates all Americans, it seems: we hate him or love him, fear him or idolize him. That encourages journalists to make every story Trump-centric, to feed the love or hate or fear via their reporting. Some of this reporting is a bit self-conscious or self-aggrandizing—“Trump versus the media” has become a navel-gazing war that most journalistic outlets are all too happy to fixate on, and it’s been profitable for many of them. But amid the back and forth, the endless and breathless coverage of Trump’s latest sensational comment (or two or five), we often forget to step back and consider the bigger picture. We forget, to some extent, what’s real. The tax bill is real. Congress’s inability to get health care reform passed is real. Trump’s incendiary tweets regarding North Korea are, unfortunately, all too real. But Michael Wolff’s collection of gossipy intra-White House drama? That’s “Trump: Season 2.” And it’s a distraction from the issues we should really be focusing on.
It’s perhaps telling that, in America today, we are making our presidencies into sitcoms rather than making sitcoms about our presidencies. It’s a trend that cannot possibly lead to ethical, wise, or prudent governance. So while it’s impossible to entirely look away, to refuse to read anything about Trump’s latest action or tweet, perhaps we can be more mindful of what we look at or click on. Who writes these stories, how, and what sort of tone they strike: all these things serve to encourage or dissuade our real-life reality TV drama. We’re the viewers and the voters, after all: we should be able to tell our media, and our president, what we want to watch.