This article was supported by a grant from the Richard H. Driehaus Foundation.
It seems fatuous to argue, especially in a healthy economy, that the upper middle class faces overwhelming financial insecurities. After all, U.S. stocks have entered the longest bull market ever recorded, the labor market has markedly improved, and small business optimism is at a level unseen since the early 1980s. It appears that happy days are here again. But this halcyon period—marked by invigorating statistics—still hasn’t prevented even upper-middle-class Americans from feeling discontent. For countless families, especially in thriving metro regions, a six-figure salary fails to deliver economic security. Their sense of vulnerability is real, not imagined.
What defines the upper middle class? According to the Pew Research Center, middle-class households, as of 2010, had incomes ranging from $35,294 to $105,881. In 2016, U.S. Census Bureau data showed that the median household income was $59,039. Based on Census findings from that year, the highest-earning households below the top 5 percent ($224,251 and upward) had incomes ranging from $74,878 to $121,018. Taken together, these findings suggest that a household income anywhere from $75,000 to $200,000 could qualify as upper middle class.
A six-figure income should bring long-term stability. But members of the upper middle class find themselves prisoners of voluntary yet inescapable costs. A multi-generational phenomenon has unfolded, its roots traceable to the economic slowdown of the early 2000s and the subsequent Great Recession. There is a feeling of anxiety among Baby Boomers who cannot retire, Gen Xers saddled with expensive mortgages and child care costs, and Millennials paralyzed by insurmountable student debt. Data cannot measure emotion. The sense of unease is palpable despite the economy’s booming conditions.
A helpful cultural reference point is HBO’s Divorce, which concluded its second season earlier this year. The comedy-drama focuses on the angst and dysfunction of a middle-aged divorced couple in Hastings-on-Hudson, an idyllic town in New York’s prosperous Westchester County. Frances DuFresne, played by Sarah Jessica Parker, quits her day job in the city to open an art gallery. Her ex-husband, played by Thomas Haden Church, is a former Wall Street executive now struggling as a contractor. The estranged couple, raising two children, are undeniably upper-middle class. Their professional background, cultural tastes, and suburban lifestyle personify affluence. But their financial insecurity, mainly the result of career choices, remains a theme throughout the series. The DuFresnes’ social circles remind them that their economic position, while favorable, is vulnerable compared to the higher earners inhabiting their bucolic suburb.
The characters portrayed in Divorce exemplify a modern reality: many upper-middle-class households are high earning but asset poor. In 2015, Quartz’s Allison Schrager illustrated how “America’s upper middle class have almost no emergency cushion and are woefully unprepared for retirement.” Reviewing Federal Reserve data, Schrager showed the precarious financial position of upper-middle-class individuals aged 40 to 55 with household incomes ranging from $50,000 to $100,000. The data indicated that this income bracket had fewer assets than ever (assets exclude a house, car, or business, but include retirement funds). As Schrager noted, even a high earner who worked for many years typically had only $70,000 in financial assets. Approximately 25 percent of upper-middle-class 40- to 55-year-olds, meanwhile, had less than $17,500 in financial assets.
Such findings suggest that seemingly high earners are living paycheck-to-paycheck. While Federal Reserve data has since found that median family income grew 10 percent between 2013 and 2016, a disproportionate number of upper-income Americans still cannot retire. In addition to their own financial woes, they must support their elderly parents, which involves innumerable costs. Overwhelming debt has become a vicious trap.
In one Brookings Institution study, researchers reported that nearly one quarter of households earning $100,000 to $150,000 a year claim to be unable to pull together $2,000 in a month to pay bills. Sustained economic growth has not repaired this cycle of debt. According to Deutsche Bank economist Torsten Slok, Americans have more debt relative to cash than at any time since 1962. The 2018 Northwestern Mutual Planning and Progress Study found that the average American’s personal debt (independent of home mortgages) now exceeds $38,000. Stock market growth and rising home prices have not altered this trend.
In a Washington Post report last year, Todd C. Frankel demonstrated how modern life adds up for an upper-middle-class family. Frankel reported on a couple in suburban Atlanta with a combined income of $180,000, an indisputably high earning level. But financial uncertainty arises from a mortgage, three children, day care costs, and the prospect of college tuition. “I don’t feel wealthy,” the wife, a tax manager, told Frankel. “I don’t have a bunch of money stashed away anywhere.” While the 2017 tax reform bill brought relief for many Americans, limits on state and local tax deductions have further engendered economic unease.
In her new book Squeezed, Alissa Quart captures how middle-class American families are struggling to attain the standard of living once enjoyed by their parents. And in an important chapter on the upper middle class, she profiles “life at the bottom of the top.” Quart argues that higher earners, like most Americans, contend with income disparity and the extreme wealth enveloping metro regions. In the San Francisco Bay Area, for instance, upper-middle-class families go broke hiring tutors and maintaining lifestyles that permit their children to compete with their wealthiest peers. The parents, working professionals, are emotionally ravaged by endless costs. They discover few perks in geographical serendipity, graduate degrees, or traditionally high-earning professions like law.
Quart reveals how the legal profession has induced economic stress since the 2008 recession. In the past decade, law firms and corporations have hired fewer lawyers. Yet for lawyers just entering the profession, student debt is a crippling part of their lives. As Quart notes, student debt at the average law school increased from $95,000 to about $112,000 in 2014. It is difficult to fathom how simple steps in life—getting married, buying a home, starting a family—are financially possible with such debt levels. But the struggle transcends age. Quart profiles a 59-year-old Mississippi lawyer who, following health setbacks, was ultimately “pushed out” by her employer. Life continued at its indifferent pace. The mother still had to pay for her son’s college tuition during her initial medical leave. “This is a vastly different life from what I expected to be having at this age,” she told Quart. “The six-figure salaries and benefits are long gone.”
The upper middle class’s discontent also transcends political ideology. A seemingly high-earning Republican household in suburban Cleveland confronts expenses similar to a high-earning Democratic household in suburban Philadelphia. These are people who tune out the minute-by-minute plot twists of the Trump presidency. If anything, they are streaming Netflix or watching HGTV for a nurturing distraction. Their daily focus is on remaining financially viable.
Aspirations prove costly regardless of geography. A four-year degree at a public college, for example, costs nearly twice as much as it did in 1996. Exorbitant college debt now dictates the financial future of Baby Boomers, Gen Xers, and Millennials. Boomers, at the peak of their earnings, postpone retirement and support children with student loans. Gen Xers, nearing the height of their careers, remain broke due to years of paying off higher education debt. Millennials, still young in their professional lives, primarily work to pay off monthly federal and private student loan bills. Credit cards are a necessary prescription for each generation’s economic survival. In 2017, the nation’s total credit card debt was over $1 trillion.
Economic insecurity is not limited to higher education. The cost of health care has also doubled since the 1990s. Obamacare only accelerated the costs incurred by households. The Journal of the American Medical Association has reported studies suggesting that the consolidation of medical practices actually “drives up costs.” Obamacare hastened the swallowing of regional hospitals by larger health care systems. This merger frenzy has empowered hospital systems to negotiate with insurance companies. But the mergers have increased costs, eliminated competition, and created barriers to care. The upper middle class, like so many others, are absorbing the costs of this transformed landscape. Rising premiums only add to their financial burden.
Of course, the upper middle class is in a better position than most Americans. In Dream Hoarders, Richard V. Reeves correctly unveiled how they are collectively removed from the socio-economics of the nation’s majority. Their economic outcomes remain favorable compared to the struggles of countless working-class Americans. But a sizable number of higher earning households are not “opportunity hoarding.” There is a cost to working parents ensuring their children have better lives than their own. In the booming 2010s, this segment of the population expected to reach the better place they’d anticipated during the booming 1990s. Yet their diplomas did not translate into liquid cash. Upper-middle-class families, while affluent and well connected, have been met with empty pockets and unfulfilled dreams in this brave new economy.
Charles F. McElwee III is a writer based in northeastern Pennsylvania. He’s written for The American Conservative, City Journal, The Atlantic, National Review, and the Weekly Standard, among others.
South Philadelphia’s rooftops offer a stunning panorama of a city balancing its past and future. A sea of row homes, occasionally broken by old school buildings and church steeples, flows northward toward Center City, where towers of glistening glass spread symmetrically within the confines of the Schuylkill and Delaware rivers.
The 60-story Comcast Technology Center, nearing completion, is the tallest skyscraper, its structure reminiscent of a nineties-era cellphone. Nearby is Liberty Place, comprising two office towers that imitate New York City’s Chrysler Building. In the mid-1980s, One and Two Liberty Place broke the “gentlemen’s agreement” that Center City’s buildings should not exceed the height of the statue of William Penn atop City Hall. Three decades later, City Hall—a gigantic Second-Empire pile once described by Walt Whitman as “silent, weird, beautiful”—is overshadowed by Philadelphia’s contemporary architectural landscape.
The occupants of City Hall, especially in the 1990s, played a pivotal role in Philadelphia’s resurgence. At the time, the city was confronting skyrocketing crime, stark population loss, shuttered manufacturers, fleeing employers, a public housing crisis, and the prospect of bankruptcy. Ed Rendell inherited this perfect storm when he became Philadelphia’s 96th mayor in January 1992. Rendell’s first term, often credited with reinvigorating the city, was the focus of Buzz Bissinger’s 1997 masterpiece, A Prayer for the City.
It’s now been two decades since Bissinger chronicled the spectrum of feelings that a city conjures for its mayor and constituents—and revisiting the landmark book reminds us how dramatically Philadelphia has transformed since those dark days of Rendell’s first term.
Bissinger, a Pulitzer Prize-winning journalist widely known as the author of the West Texas saga Friday Night Lights, composed a story that reveals the urban experience with clarity and empathy. In the anthology of books on American cities, Robert Caro is celebrated for The Power Broker, his magisterial biography of New York urban planner Robert Moses. But A Prayer for the City should be remembered as the book that best describes a city on the brink of salvation or perdition.
A Prayer for the City movingly describes the boundless optimism generated by mundane moments, the tensions aroused by the slightest provocations, and the heartbreak and despair induced by the unknown. Bissinger successfully documents the American city with all its intensity, violence, decay, and political intrigue. He reveals an urban culture from a period when only negative trends prevailed.
Bissinger’s main characters are Rendell, the city’s beleaguered yet resilient guardian, and David L. Cohen, his affable and brilliant chief of staff. Rendell, a larger-than-life figure, enjoyed a political comeback with the 1991 mayoral election. In the 1980s, Rendell hoped to leverage his years as district attorney by running for governor. In 1986, Rendell was crushed by Bob Casey, Sr. in Pennsylvania’s Democratic gubernatorial primary. A year later, Rendell ran against Philadelphia Mayor W. Wilson Goode, but the outcome remained the same. In Bissinger’s telling, Neil Oxman, a political consultant, recalls that somehow, in 1990, Rendell “woke up…and said, ‘I’m going to do this the right way.’” Rendell prevailed in the 1991 Democratic mayoral primary. Then his general-election opponent, the legendary former Mayor Frank Rizzo, died suddenly over the summer. Rendell coasted to victory that November.
When Cohen was named Rendell’s campaign manager in 1990, he was working at one of the city’s top law firms. Despite their contrasting personalities, a loyal friendship commenced between the two men. A meticulous professional, Cohen was “known for the way he learned the nuances of group insurance by reading some five thousand pages on it.” He was also “known for the way he personally inspected every piece of mail his secretaries typed up for him, even the envelopes.”
Rendell, meanwhile, was often a canvas of emotions ranging from charm and irreverence to rage and gloom. But neither Cohen’s efficiency nor Rendell’s charisma could prepare them for City Hall’s chaos. “To be the mayor of an American city meant facing potential tragedy twenty-four hours a day,” writes Bissinger.
Post-election developments, whether anticipated or unexpected, left the Rendell-Cohen duo with limited time to save Philadelphia from its seemingly endless problems. In early 1992, a computer model forecast that Philadelphia faced a $1.246 billion budget gap over the following five years if the city didn’t take action.
Rendell went public with the city’s dire fiscal situation. Candor proved an advantageous strategy, with the Rendell administration ultimately winning significant concessions from Philadelphia’s omnipotent unions. Bissinger, who had open access to City Hall for four years, reveals the backstage negotiations that subsequently protected Philadelphia from financial collapse, scored bipartisan praise and media accolades, and ensured Rendell’s political future.
As mayor, Rendell embraced 1990s-era revitalization strategies. He favored the entertainment approach common among big cities. A new convention center, the construction of luxury hotels, and the opening of major-chain restaurants like Hard Rock Café heralded hope for Center City’s future. But citywide challenges often overshadowed Rendell’s efforts to improve downtown, promote tourism, fund the arts, and attract suburban visitors.
Philadelphia’s post-industrial trends, far from unique, proved overwhelming. Rendell delicately navigated the city’s demographic complexities as a peacemaker, hoping to subdue or maintain support from aggrieved politicians and resentful residents. The statistics fueled the tension. In the early 1990s, Philadelphia’s poverty rate was 20 percent, its high school dropout rate was 40 percent, and only 32 percent of the region’s jobs were within city limits. Philadelphia was no longer a “Workshop of the World,” where the working class thrived in a neighborhood-centric city with sprawling mills and factories. As Bissinger recounts, the city had “become the Manufacturing Mausoleum of the World.”
Bissinger features a supporting cast alongside Rendell and Cohen, unveiling city characters who represent the cycle of urban life. He profiles Jim Mangan, a welder who must regroup when the federal government closes Philadelphia’s Navy Yard. Mike McGovern, an assistant district attorney proud of his city, grapples with its realities, and aggressively pursues justice against violent criminals. Linda Morrison, a libertarian reformer, meets defeat in her pursuit of city living. Fifi Mazzccua, a loving matriarch, finds comfort in her church despite family tragedy in the city’s “Badlands.”
In the background, Rendell and Cohen aggressively respond to Clinton-era policies that risked placing the city on life support. “Forget all the good things I’ve done; Philadelphia is dying,” Rendell says during a conference call. “It’s happened a lot more slowly since I took office, but we’re dying.”
Philadelphia has transformed since Bissinger’s mid-90s portrait. City Hall now enjoys the fruits of an expanding higher-education sector, growing hospital systems, start-up companies, and new real estate development. Millennials flock to neighborhoods that only a decade ago were riddled with crime and blight. North Philadelphia’s Francisville, for example, is a haven for bike-riding residents living in newly built apartment buildings. As the U.S. Census notes, as of 2017, Philadelphia’s population has risen eleven years in a row. An expanding job market, with suburban companies opening Center City offices, continues to attract new residents.
The city’s Navy Yard, which dates to 1776, is now a mixed-use campus with over 11,000 employees working in the office, industrial, and R&D sectors. Penn’s Landing, along the Delaware River, is being redeveloped to connect the waterfront to neighborhoods by capping a portion of I-95 with a 12-acre park. Even the Divine Lorraine Hotel, long a source of wonder for its deteriorating majesty, is enjoying a $44 million rehabilitation.
Across from 30th Street Station, Philadelphia’s central transportation hub, the Schuylkill Yards project will transform 14 acres of barren, Drexel University-owned land into 7 million square feet of commercial buildings, retail space, and research labs. The 20-year, $3.5 billion project is part of West Philadelphia’s rapid gentrification. As WHYY Plan Philly’s Jim Saksa recently reported, in 1995 the average home in the University City neighborhood sold for about $80,000. Homes just west of the University of Pennsylvania’s campus are now valued at nearly $500,000.
Philadelphia’s 10-year property-tax abatement played an important role in the city’s population growth and construction boom. The measure, introduced in 2000, exempted new or renovated residential properties from taxation on improvements for a decade. The abatement created a construction frenzy, reshaping the character of neighborhoods like Northern Liberties and Fishtown. But the policy, while economically critical, also has downsides. In many cases, developers replaced historic properties with cheap, stucco-clad apartment buildings. Property taxes, in turn, increased for existing residents already dealing with a burdensome wage tax. For many families, the city’s struggling public schools have made private school tuition an additional unavoidable expense.
The city has changed since the Rendell years, but negative socioeconomic trends linger. Philadelphia continues to have one of the nation’s poorest populations, and it trails the growth of other U.S. cities. In 2017, the Brookings Institution released a report addressing how Philadelphia can excel in a global economy and serve its local population. While Brookings acknowledged improvements in Philadelphia’s entrepreneurial ecosystem, it also warned that the city ranked lower in innovation and growth measures compared to its metro peers.
And so, Philadelphia, despite its renaissance, still wrestles with cognitive dissonance. The city embraces its future, but history and tradition restrain its forward pace. The city attracts new employers, investment, and residents, but the statistics remain sobering. The city even pursues efforts at reform, but machine-style politics—Lincoln Steffens famously called Philadelphia “corrupt and contented”—inhibit lasting improvements.
Bissinger showed how a big city’s economic, cultural, and political complexities can make its imperfections an inescapable reality. He captured how Rendell, as mayor, triangulated Philadelphia’s political ecosystem to restore the city’s future. Looking back, it’s indisputable that the city’s comeback began with Rendell’s administration. His efforts, largely successful, delivered career success. Following his chairmanship of the Democratic National Committee, Rendell served two terms as Governor of Pennsylvania. Cohen, who remained his chief of staff until 1997, is now Senior Executive Vice President at Comcast.
Would Rendell thrive in Philadelphia’s current political landscape? He was a Democratic mayor in a Democratic city, but a political leader capable of working with all sides. Of course, there are not many Republicans in Philadelphia. Registered Democrats presently outnumber Republicans by 7 to 1. This majority has its origins in the 1950s, when Joseph Clark broke up the GOP machine that had ruled the city for decades. The Democratic City Committee has ruled Philadelphia since that time, perpetuating the very machine politics that Clark hoped to extinguish after World War II.
But questions are now arising about the establishment’s lasting power. In May, progressive Democrat Elizabeth Fiedler, a former reporter, pulled off an upset primary victory in the South Philadelphia legislative seat held by retiring state Rep. Bill Keller. Left-wing, grassroots-based groups played a pivotal role in Fiedler’s win. Based on national trends, it’s unlikely that candidates like Fiedler would replicate the spirit of moderation and compromise favored by Rendell.
Will the old machine withstand this left-wing ascendance? Corruption does not help its cause. Between 2000 and 2015, nearly 40 Philadelphia politicians found themselves under investigation. In recent years, the city’s district attorney went to prison for bribery, a long-time Congressman was sentenced to prison for influence peddling, and five state legislators were caught trading favors or laundering money. The Traffic Court, meanwhile, dissolved after a ticket-fixing scandal, and the FBI raided the City Council majority leader’s office.
For now, the construction boom continues, the Democratic machine endures, and city leaders celebrate favorable trends. It’s a city that retains an almost chiaroscuro quality, finding enlightenment through investment and population growth, but also darkness as a one-party system maintains its tribal grip over the city. Bissinger captures these qualities in A Prayer for the City, which is not only Philadelphia’s story, but also a history of the American city. Reading the book twenty years later, one can draw an encouraging conclusion. Despite the challenges, Philadelphia’s prayers were largely answered. As Bissinger shows, the path to resurrection is long and perilous, but it can be successfully navigated, particularly by a gifted politician like Rendell.
Charles F. McElwee III works in the economic development sector in northeastern Pennsylvania.
Weekday mornings in early June promise a sweet, almost dreamy period, one that heralds pleasant weather, whispers the possibilities of summer, and forever reminds us that the school year is over. The post-Memorial Day stretch permits a few more languid days without air conditioning, and so windows remain open for birds to fill bedrooms with their songs at daybreak. A cool, almost misty air enters the homes as early commuters drive by to begin another shift.
For children, the first weeks of June are filled with joy, a sense of finality, and the excitement of entering a new grade and unknown future. Life has yet to burden innocent minds with high expectations and unrealistic aspirations. Childhood allows the embrace of days soaked in sunshine and chlorine. This feeling is eternal, one of the few sentiments in American culture that have withstood the passage of time. And so it was on June 5, 1968, when children woke up, overwhelmed with anticipation and impatience as they entered their final stretch of academic confinement. But that Wednesday morning was different, with the nation learning that Robert F. Kennedy had been shot. He died within 24 hours. Fifty years later, we have yet to fully digest the emotional and cultural consequences of Kennedy’s assassination.
Kerry Kennedy, just eight years old, woke up that morning and turned on the television to watch Bugs Bunny. She had accompanied three of her ten siblings to California for her father’s final stretch before the state’s crucial primary. Her parents had headed to Los Angeles’ Ambassador Hotel the previous evening to await the returns. It was through the television that she learned what happened: a news flash interrupted the cartoon with the news about her father. By late evening, Kerry, along with her fellow Americans, learned that Kennedy had died. It was a cruel encore of just five years before, when the 35th president was killed in Dallas. An 8-mm film forever burned John F. Kennedy’s final drive into Americans’ disbelieving eyes. Now his brother, who had just won the Democratic primary in America’s largest state, met a similar fate.
The images of bedlam from the hotel’s kitchen floor are permanently stored in America’s repository of tragic history. Kennedy, a figure even more complex than his older brother, lay mortally wounded with rosary beads stuffed into his hand. There was “a kind of sweet accepting smile on his face,” recalled the journalist and friend Pete Hamill, “as if he knew it would all end this way.” His death, just months after the Rev. Martin Luther King’s assassination, occurred in a year already shattered by the carnage of Vietnam, cultural change, urban riots, and a vicious political realignment. On the morning of June 6th, America’s children entered school knowing summer was imminent, but also carrying that strange, empty feeling when a death occurs or tragedy strikes. Coverage of Kennedy’s death saturated televisions, reminding children that they were living in a nation that would always carry the weight of that shocking event.
Revisiting Bobby Kennedy’s assassination a half century later is to relive the feelings of that period. The children of the late 1960s are now middle-aged adults, well into their baby boomer years and perhaps even retired. But it’s difficult for that generation, or even their descendants, to look back at that decade without reflecting on the cataclysmic deaths of two brothers, one who occupied the White House, the other who could have changed the outcome of the 1968 presidential election. The passing decades have faded much of the Kennedy family’s mythology, one that Americans long nurtured because they associated Jack and Bobby with happier times. “Try to think of the era without them and see if you can do it. It’s impossible, really,” wrote Chris Matthews in his latest book, Bobby Kennedy: A Raging Spirit. “When Jack Kennedy was president in those upbeat years of the early 1960s, then again when Bobby ran for president, the special Kennedy atmosphere captured the day. There was a spring in the country’s step, an excitement that could also, to those threatened, mean trouble.”
A sense of elation commenced with Bobby’s late entrance into the Democratic primary. Kennedy previously told reporters he would “not be a candidate for president under any foreseeable circumstances.” But on March 12, Minnesota’s Sen. Eugene McCarthy, the darling of the upwardly mobile and antiwar left, humiliated President Lyndon Johnson by winning 42 percent of the popular vote in New Hampshire’s primary. Kennedy, who had agonized over his entrance into the race, knew Johnson’s performance opened the way for his campaign. America’s position on the home front and especially abroad reminded him that his long-time rival, Johnson, was in a position beyond political redemption. Just days after McCarthy’s overperformance in New Hampshire, Kennedy announced his presidential bid from the Caucus Room of the Old Senate Office Building. He made his announcement at the same age his brother had been, and from the same room where Jack declared his run in 1960.
America had dramatically changed since John F. Kennedy seduced voters with the promises of the New Frontier. A young family, the campaign jingles, the embrace of television, and the prospect of America’s first Catholic president injected a sense of patriotic adrenaline into the 1960 campaign. There were “high hopes” for Jack and a sense of cultural validation for Catholics who remembered Al Smith’s failed presidential bid in 1928. In 1960, the Everly Brothers and Bobby Darin crooned through the radio, Harper Lee’s To Kill a Mockingbird proved a national sensation, and Americans flocked to movies like Spartacus in magnificent downtown theaters.
But the frivolity and innocence, however illusory, were shattered on November 22, 1963. Kennedy’s assassination violently shifted America’s cultural fault lines. One afternoon accelerated the nation’s sociological maladies, intensified its political divisions, and evaporated its black-and-white contentment. Americans proceeded on a Technicolor path of disruption, one that had transformed the nation by the time of Bobby’s announcement on March 16, 1968. It was that year when The Doors and Cream blasted from transistor radios, John Updike’s Couples landed on the cover of Time, and 2001: A Space Odyssey played in new suburban cinemas. The country had experienced a dervish frenzy, and Bobby was fully aware of his nation’s turbulent course.
The country was rocked by young students protesting a worsening war in Vietnam. Racial tension exploded and riots destroyed urban neighborhoods. America’s political evolution forever altered its electoral geography. Bobby was embarking on a remarkable campaign that challenged the incumbent president, a man he despised for many years. But the source of this strife stemmed from the White House years of Bobby’s brother. “While he defined his vision more concretely and compellingly than Jack had—from ending a disastrous war and addressing the crisis in the cities to removing a sadly out-of-touch president—he failed to point out that the war, the festering ghettos, and Lyndon Johnson were all part of Jack Kennedy’s legacy,” wrote Larry Tye in his biography of Bobby.
For the 1968 primary, Kennedy metamorphosed into a liberal figure with an economic populist message. Kennedy’s belated entry turned into an audacious crusade, with the candidate addressing racial injustice, income inequality, and the failure of Vietnam. He balanced this message with themes touching upon free enterprise and law and order. Kennedy hoped to appeal to minorities and working-class whites. He quickly became a messianic figure, and the press embellished his New Democrat image. By late March, Johnson announced that he would not seek reelection during a televised address. Through his departure, Johnson worked to maintain control of the party machine by supporting Hubert Humphrey, his devoted Vice President. But in the following weeks, Kennedy built momentum as he challenged McCarthy in states like Indiana and Nebraska. His performance in both states, where anti-Catholic sentiments lingered, testified to Kennedy’s favorable electoral position.
On April 4, Kennedy learned that the Rev. King had been assassinated. He broke the news of the civil rights leader’s death to a crowd in a black neighborhood in Indianapolis. His words helped spare Indianapolis from the riots that erupted in cities across the country, riots that left nearly 40 people dead and over 2,000 injured. King’s assassination served as an unsettling reminder to Kennedy’s family, friends, campaign aides, and traveling press. During Kennedy’s first campaign stop in Kansas, the press corps had stopped at a restaurant where the legendary columnist Jimmy Breslin asked, “Do you think this guy has the stuff to go all the way?”
“Yes, of course he has the stuff to go all the way,” replied Newsweek’s John J. Lindsay. “But he’s not going to go all the way. The reason is that somebody is going to shoot him. I know it and you know it. Just as sure as we’re sitting here somebody is going to shoot him. He’s out there now waiting for him. And, please God, I don’t think we’ll have a country after it.”
Despite what happened in 1963, the Secret Service still did not protect presidential and vice presidential candidates and nominees, providing none during the 1964 election or the 1968 primaries. But all the signs were there that Kennedy needed protection. The frenzied crowds increased in size, taking a physical toll on the candidate. In one instance, “he was pulled so hard that he tumbled into the car door, splitting his lip and breaking a front tooth that required capping,” writes Tye. “He ended up on a regimen of vitamins and antibiotics to fight fatigue and infection…For most politicians, the challenge was to attract crowds; for Bobby, it was to survive them.” In California, just 82 days after his announcement, Kennedy met the fate that so many feared.
Bobby Kennedy was a complicated figure from a family that continues to engage America’s imagination. In his autobiography, the novelist Philip Roth, who recently passed away, reflected on Kennedy’s assassination:
He was by no means a political figure constructed on anything other than the human scale, and so, the night of his assassination and for days afterward, one felt witness to the violent cutting down not of a monumental force for justice and social change like King or the powerful embodiment of a people’s massive misfortunes or a titan of religious potency but rather of a rival—of a vital, imperfect, high-strung, egotistical, rivalrous, talented brother, who could be just as nasty as he was decent. The murder of a boyish politician of forty-two, a man so nakedly ambitious and virile, was a crime against ordinary human hope as well as against the claims of robust, independent appetite and, coming after the murders of President Kennedy at forty-six and Martin Luther King at thirty-nine, evoked the simplest, most familiar forms of despair.
For those schoolchildren and their parents in June 1968, Kennedy’s campaign offered a sense of nostalgia. They remembered the exuberance of his brother’s campaign, the optimism of his administration, and the possibilities of the 1960s. For the nation’s large ethnic Catholic voting bloc, another Kennedy reminded them of that feeling of validation in the 1960 election. Of course, it had been a tumultuous decade for these voters. They lived in cities that had precipitously declined since JFK’s campaign visits in 1960. Railroad stations ended passenger service, theaters closed, factories shuttered, and new highways offered an exodus to suburbia. As Catholics, they prayed for the conversion of Russia, adapted to Vatican II reforms, and adjusted to new parishes in the developing outskirts. Young draftees were shipped off to a catastrophic war, which only intensified their feelings of disillusionment. Their disenchantment raised questions about their sustained support for Democrats. Kennedy may have proved formidable for Nixon in the general election, but the Catholic vote was increasingly up for grabs.
Pat Buchanan understood this electoral opportunity for Republicans. In a 1971 memo, Buchanan argued that Catholics were the largest bloc of available Democratic voters for the GOP: “The fellows who join the K.of C. (Knights of Columbus), who make mass and communion every morning, who go on retreats, who join the Holy Name society, who fight against abortion in their legislatures, who send their kids to Catholic schools, who work on assembly lines and live in Polish, Irish, Italian and Catholic communities or who have headed to the suburbs—these are the majority of Catholics; they are where our voters are.”
In subsequent presidential elections, Catholic voters swung between Democrats and Republicans. Their electoral preferences were driven by the issues of the moment and often by location. The geographical divide of our politics has only intensified, and the 2016 presidential election encapsulated this trend. Voters in Appalachia and the Rust Belt overwhelmingly supported Donald Trump that year; many had previously supported Obama in both 2008 and 2012. In 1968, these voters likely would have appreciated Kennedy’s campaign message. But the tragedy of the nation is now a loss of optimism, the belief that tomorrow will be a better day. Americans are overwhelmed by ideological tension and socio-economic angst. The prosperity enjoyed by large metropolitan regions has not spilled over into the heartland. There is no nostalgia for 1968 because countless Americans understand that the nation has failed to address income inequality, job displacement, urban decline, and mass poverty. It was so long ago, but America did lose its innocence on November 22, 1963. Bobby Kennedy’s death in 1968 served as a reminder that it would never return.
Charles F. McElwee III is a writer based in northeastern Pennsylvania. Follow him on Twitter at @CFMcElwee.
Voices from the Rust Belt, ed. Anne Trubek, Belt Publishing, 256 pages.
They call the wide expanse from the Northeast to the Upper Midwest the Rust Belt. The region’s geographical lines, subject to perpetual debate, include cities and towns built by the natural resources that fueled America’s industrial revolution. Their population centers, far from their peak, sit atop Appalachian plateaus, straddle sleepy rivers, and command turbulent lakes.
The Rust Belt largely owes its ambiguous designation to Walter Mondale. During the 1984 presidential campaign, the Democratic candidate railed against Ronald Reagan’s trade position in a speech to steelworkers in Cleveland. The president had lifted quotas on steel imports, which negatively impacted an already imperiled industry. “Reagan’s policies are turning our industrial Midwest into a rust bowl,” declared Mondale.
Mondale’s reference transformed into “Rust Belt,” a term the press has since bestowed on the Northern and Midwestern regions of the country experiencing the maladies of a post-industrial economy. There is no field guidebook dictating what qualifies as the Rust Belt. From Lawrence, Mass. and Trenton, N.J. in the Northeast to Flint, Mich. and Dayton, Ohio in the Midwest, reporters have filed stories that typically chronicle the anguish of these once prosperous cities. Deployments into heartland communities almost always follow a storyline of decline and despair. The subjects profiled for these pieces are found in the daytime bars, fast food lines, and the least desirable neighborhoods of communities created and destroyed by coal or steel. Dispatches from this American safari only intensified when reporters and pundits probed how the “unthinkable” occurred on election night in November 2016.
A new anthology, Voices From the Rust Belt, takes a different approach. The book owes its publication to Donald Trump’s victory, but editor Anne Trubek, founder of Belt Publishing and Belt magazine, has curated an anthology that challenges the narrative common in post-election media portrayals. In Voices, Trubek has provided a civil platform for various writers to celebrate, lament, reminisce, and report their experiences in cities like Buffalo, Cleveland, Akron, and Detroit.
Trubek calls for us to “resist the urge to make of this place a static, incomplete cliché, a talking point, or a polling data set.” “We have created not only income inequality but also narrative inequality in this nation: some stories are told over and over while others are passed over, muted,” she writes. This book is an important companion for understanding the Rust Belt experience, one that offers an authenticity often lacking in what amount to anthropological hit pieces.
The book’s essays confront the themes driving modern political disillusionment. There are stories involving poverty, violence, corruption, racial tension, gentrification, heroin, and environmental devastation. The writers recount their observations from communities broken by technological advances, globalization, and unfettered free trade.
There was a period in America when Rust Belt cities accelerated America’s economic supremacy. They thrived during the nation’s “special century,” a term coined by economist Robert Gordon to describe a period of unrivaled ingenuity and economic growth between 1870 and 1970. In The Rise and Fall of American Growth, Gordon argues that the period after the Civil War, through the Gilded Age and World Wars, and up until Nixon’s first term comprised a unique revolution: “No other era in human history…combined so many elements in which the standard of living increased as quickly and in which the human condition was transformed so completely.”
Rust Belt cities propelled this transformation. But they now suffer the consequences of an economy that has experienced the weakest expansion of the post-World War II period. For these cities, stagnation translates into relocation for better opportunities or an improved quality of life. In Voices, G.M. Donley, a resident of Ohio’s Cleveland Heights, writes that the Rust Belt “has proven to be an ideal laboratory for exploring the practical problems associated with a society habituated to easy mobility, specifically what results when mobility outruns population growth.” In struggling cities, the number of abandoned properties increases as people “filter” out of undesirable housing. A city’s overall property values, meanwhile, decrease when an oversupplied market outpaces regional demand. In addition to a housing oversupply, Donley writes that the flight to a “Better Place” results in abandoned retail space, an overburdened tax base, persistent segregation, and the use of better schools as real-estate marketing.
In another essay, TAC contributor Jason Segedy poignantly reminisces about his hometown of Akron in the 1970s and 1980s. Segedy’s childhood marked the twilight of the city’s dominant tire industry. In 1982, Akron built its last passenger-car tire, and 90 percent of the industry’s jobs disappeared within two years. The rubber sector’s collapse marked the end of an era, one that heralded strikes, layoffs, and mass unemployment. Segedy recalls that the period was “followed by its echoes and repercussions—economic dislocation, outmigration, poverty, and abandonment—as well as the more intangible psychological detritus: the pains from the phantom limb long after the amputation, the vertiginous sensation of watching someone (or something) die.” “When the mythology of your hometown no longer stands up to scrutiny,” writes Segedy, “it can be jarring and disorienting. It can even be heartbreaking.”
Heartbreak is a theme that links together various parts of the Rust Belt. Generations of people who grew up in the region’s cities vividly remember the sights and sounds of industry and commerce. Their childhoods were spent in well-kept rowhomes, apartments, duplexes, and bungalows. Their neighborhoods, often ethnically divided, were within walking distance of a church and perhaps a short drive to the factory or mill. Their grandparents or great-grandparents had arrived in these neighborhoods a century before. The ancestors’ settlement didn’t mark the attainment of some lofty American Dream. Instead, they humbly sought survival in the nation’s growing mass-production economy. In the early 20th century, they were assimilated by industries and the labor movement. Many ultimately improved their family’s position within America’s economic bracket.
Generations of the Rust Belt’s young men disproportionately served in combat. When they returned, they either worked in the factories or earned a college degree through the G.I. Bill. College graduates typically came back as accountants, attorneys, bankers, and businessmen. If they sought opportunities elsewhere, they maintained strong familial ties to their hometown. The typical Rust Belt city, while lacking the vitality enjoyed between the World Wars, remained stable and a favorable place to raise families. But municipal government corruption, accompanied by policy failures at the state and federal level, gradually eroded many cities—particularly their economic, civic, and cultural infrastructure. From the 1950s through the 1970s, interstate highways sliced urban grids and redevelopment projects erased residential neighborhoods. Suburban malls proliferated, extinguishing the neon signs of downtown businesses. Theaters shuttered, churches closed, and neighborhood schools were consolidated into bloated school districts. And then the factories, mines, and mills closed.
The 1980s and 1990s marked an era of sustained decline. An interconnected economy created billions of new consumers worldwide—but devastated the Rust Belt’s cities. Technological advances made outsourcing possible, and the factories subsequently closed. Distribution centers sprouted from the factory dust, supporting working-class families on lower wages. Immigration policies, frozen in 1965, resulted in new migration into affordable urban neighborhoods. Cities, burdened with a declining tax base, were often ill-prepared to absorb this rapid demographic change. An urban cycle of life and death, one which began in New York City after World War II, continued westward into the Rust Belt. While large metro areas have since enjoyed salvation, the Rust Belt’s interior cities remain in purgatory.
But the Rust Belt isn’t limited to declining post-industrial cities, making the region far more complex than what is presented in media portrayals. In an essay on Detroit’s suburbs, James D. Griffioen observes that “One of the reasons metro Detroiters get so upset when journalists and photographers represent Detroit as a city of ruins is the reality that there are millions of people living in safe, well-kept neighborhoods in dozens of prosperous suburban communities.” The Rust Belt’s thriving communities outside cities like Detroit are overlooked in favor of snapshots depicting dilapidated homes, used syringes in city parks, and the ruined majesty of shuttered factories. And yet millions of middle-class families live in affluent suburban hamlets often minutes from these urban areas. It is their story that is frequently ignored.
They’re from families who started in urban neighborhoods. Their households are headed by baby boomers who grew up when the city enjoyed a happier existence, one that was rough yet safe and content—a city with vice but less violence.
The baby boomers were commonly the first in their families to attend college, relocating to suburban neighborhoods of 1990s-era homes with those ubiquitous arches and big windows. They earned six-figure salaries and worked in local law or accounting firms. Their millennial children attended highly ranked public or parochial schools, played in youth sports leagues, and had an overall comfortable upbringing.
The baby boomers’ elderly parents, meanwhile, usually remained in the old neighborhood, living in a multi-generational homestead and supported by a healthy pension. The neighborhood slowly witnessed demographic change. The old family home needed frequent repairs, but the baby boomer children were there to pay for upkeep or look after their parents. When the parents’ health declined or when the neighborhood grew unsafe, the baby boomers placed them in overpriced nursing homes and commenced their gradual path to financial insecurity.
Then the Great Recession hit around 2007, just in time for boomers to see their parents pass away and children matriculate into college. The urban family homestead sat on the market for years, its value plummeting in a neighborhood overwhelmed by decline. While the city’s downtown may have experienced somewhat of a renaissance, the residential neighborhoods were overwhelmed by blight and crime. When the house finally sold, the baby boomers pocketed under $50,000, leaving little after paying nursing home bills. They also supported their millennial children with college tuition, taking out loans to subsidize bills that exceeded $50,000 each year. This yearly tithe to an increasingly bureaucratic higher-education system paralleled the recession wiping out the value of 401(k)s and Individual Retirement Accounts.
When the millennial children of the boomers graduated, they returned home seeking employment. They came home to an economy that offered underemployment, temp jobs, and internships. They found themselves meeting metrics in offices designed with open-desk plans, grossly underpaid and immersed in the tenets of corporate social responsibility. They faced a future far different from those who entered the workforce in the 1980s.
Their aging parents, meanwhile, confronted a world they didn’t expect in the 1990s. In 2018, they remain burdened with debt, and retirement remains a distant prospect. Since the Affordable Care Act’s passage in 2010, their health insurance premiums have skyrocketed while their options for medical care have narrowed. The Obama-era legislation accelerated the consolidation of healthcare systems, which swallowed smaller regional hospitals. This, in turn, hurt local attorneys and accountants when the new behemoths brought legal and accounting work in-house.
This story could serve as the plot for a modern sequel to American Beauty. But it is the true story of the Rust Belt, which also exemplifies what is occurring in Middle America.
The economic instability corroding the middle class hastened a political response. Middle-class families in the Rust Belt resented how the media reported their frustrations. This assured Trump’s victory in 2016, and it is why Rust Belt voters continue to support him. These middle-class voters, disillusioned by the Bush years, supported Obama in 2008 and even 2012. The swing of Obama voters to Trump played a pivotal role in the presidential election. As the New York Times recently reported, about a third of the 650 counties that supported Obama twice flipped to Trump. Many of these counties were in Michigan, Ohio, Wisconsin, and Pennsylvania. They comprised families impacted by the policy failures of the past three decades.
Although Trump was elected president, the Rust Belt’s disillusionment will only continue. Voters, regardless of party affiliation, are still digesting the consequences of globalization, technological change, and ineffective immigration policies. Heirs to the New Deal consensus, they reject central tenets held by both Democrats and Republicans. They witness America’s income inequality, understanding that inflation-adjusted middle-class wages have remained frozen for nearly forty years. They’re also cognizant of the nation’s crisis of wealth, the alarming shortfall in what households own once debts are subtracted from total assets. In a recent paper, academics Christina Gibson-Davis and Christine Percheski studied this phenomenon. Writing for the Times, Gibson-Davis and Percheski found that about a third of families with children in 2013 had only debt and no wealth. This partially explains a United Way report finding that over 40 percent of U.S. households cannot pay for today’s middle-class basics like childcare and cell phones, a reminder that positive economic numbers can overshadow unsettling trends.
Voices from the Rust Belt manages to tell the Rust Belt’s complicated socio-economic story in a comprehensive way, through writers actually living in the region. As Jacqueline Marino from Youngstown, Ohio, explains, hometowns “are the places where we learn to feel love and hate and the spectrum of other meaningful emotions.” The Rust Belt’s many communities, grasping for life or enjoying a revival, deserve further study along the lines of this necessary manual. For too many years, America’s elected and unelected elite suffered from what Charles Dickens called “telescopic philanthropy,” or the habit of focusing on global causes and neglecting the struggles within their own country.
The Rust Belt deserves continued reporting, but in a style that appreciates its pain, struggles, and complexity. Trubek’s Voices offers a helpful primer as reporters continue their safaris into the heartland.
Charles F. McElwee III is a writer based in northeastern Pennsylvania. Follow him on Twitter at @CFMcElwee.
What do November’s midterm elections portend for Democrats and Republicans? This week, all eyes are on Pennsylvania’s 18th Congressional District as a potential bellwether.
In this Tuesday’s special election, Democrats hope a victory will signal their momentum to win back the U.S. House this year. Yet this week’s contest does not necessarily offer a clear vision of the immediate future. In this period of political tumult, the 18th District instead provides a historical lesson, reminding us that voters and electoral trends are far more complex than a “Trump base” or “Blue Wave” revolt. As James Broussard, director of Lebanon Valley College’s Center for Political History, observed, “It’s appropriate that so much national attention is focused on this race, because a couple of 18th District races had a historic impact in the past.” He noted that the district “is where both John Heinz and Rick Santorum, two very different Pennsylvania Republicans, got their start in national politics.”
The district’s boundaries have been reconfigured over the decades, and are set for a more dramatic redrawing in the upcoming general election. But today, as in the past, they principally encompass Pittsburgh’s southern suburbs and rural portions west and east of the city. The area remains a demographic reflection of 1970s America—an assortment of tony hamlets, working-class towns, and rustic areas largely unchanged with the passage of time. This part of western Pennsylvania, while categorically post-industrial, has not experienced the dramatic level of socio-economic change impacting regions further east beyond the Susquehanna River.
The district currently registers 70,000 more Democrats than Republicans, but for the last fifteen years it was represented by GOP Rep. Tim Murphy, whose scandalous downfall in late 2017 resulted in this special election. If the Democratic candidate, Conor Lamb, prevails over Republican state Rep. Rick Saccone, the conventional wisdom holds that the district’s voters—who flocked to Donald Trump in 2016—are rejecting the policies of Trump’s White House. While a level of disillusionment with Trump certainly exists, many voters are exercising their independent judgment about the candidates, and in Pennsylvania, ticket splitting is still a frequent electoral phenomenon. Ideological agnosticism lingers, with voters who supported Barack Obama in 2008 having favored Trump over Hillary Clinton in 2016.
Rick Santorum, who represented the area from 1991 until entering the U.S. Senate in 1995, also conveyed working class frustrations as a competitive presidential candidate in the 2012 Republican primary. Santorum’s 2014 book, Blue Collar Conservatives: Recommitting to an America That Works, later influenced Trump’s own campaign message. The 18th District arguably served as Santorum’s political terrarium, where he cultivated his understanding of blue-collar voters.
Among Santorum’s leading critics during his 1994 Senate run was Teresa Heinz Kerry, the widow of another product of the 18th District. John Heinz represented the area in Congress from 1971 until his statewide Senate victory in 1976. (That campaign also marked Santorum’s own entry into the political arena, as a Penn State student campaigning for Heinz.)
Heinz, who was a moderate Republican, embodied a different period in American politics, one devoid of the ideological instability corroding both parties. His success in the 18th District, bipartisan support as a senator, and rise as a national political figure, offer a historical perspective worth revisiting in today’s hostile environment.
* * *
An heir to a food-processing empire, Heinz had a background that was patrician by any American standard. Born in 1938 in Pittsburgh, he grew up in San Francisco after his parents divorced. An only child, Heinz spent summers in Pennsylvania with his father, chairman of the H.J. Heinz Company. His education included boarding school at Exeter, a history degree from Yale, and an MBA from Harvard. He met his future wife, Teresa, during a summer spent in Geneva. After college, he returned to Pittsburgh to work for his family’s company, married, and had three sons. An intellectual, philanthropist, and arts enthusiast, Heinz taught at Carnegie Mellon University, endowed community endeavors, and collected paintings that were later exhibited at the National Gallery of Art.
Throughout the 1960s, Heinz became active in state Republican politics. His exposure to politics began in 1964, when he briefly worked for Sen. Hugh Scott, another moderate Republican whom he would go on to succeed in 1976. In 1971, Heinz announced his candidacy for the 18th Congressional District after the sudden death of Republican Rep. Robert J. Corbett. He established his campaign headquarters in Sharpsburg, where his great-grandfather had begun packing foodstuffs in the late 1860s. During the special election, Heinz advocated views that resonated with a cross-section of voters shaped by the New Deal era. He supported withdrawal from Vietnam and called for a federal income tax cut for families earning less than $12,000 a year. Democrats significantly outnumbered Republicans in the district, but Heinz won by a large margin and, at 33, became the youngest Republican member of Congress.
When asked to comment on his first 100 days in office, Heinz said that he “attempted to wear no label, neither ‘liberal’ nor ‘conservative’ nor ‘pro-labor’ nor ‘pro-management.’” Maintaining his commitment to pragmatism, Heinz stated that he “acted in each case on the basis of what I believe is right for my constituents, for our state and for our country.” It was this sentiment that made Heinz politically popular in Pennsylvania. He personified a civility that resonated with colleagues and voters from both parties. This approach proved favorable when Sen. Scott announced his retirement in 1975.
In the 1976 Republican Senate primary, Heinz defeated Arlen Specter, Philadelphia’s former District Attorney and later his colleague in the Senate. In the general election, Heinz narrowly defeated Rep. Bill Green III, a powerful Democrat from Philadelphia.
Heinz’s three Senate terms were marked by his moderation. His interests ranged from environmental protection and regulatory reform to improving elderly care and protecting the nation’s interests in global commerce. Heinz expressed views that triggered support or opposition from both parties. One of his abiding interests was guarding Pennsylvania’s economy as the post-industrial transition imperiled Pittsburgh’s steel industry and the state’s once prosperous communities. Heinz was also vocal on trade and antitrust issues that have been revived by the Trump presidency. As a senator in 1984, he criticized the Reagan administration’s rejection of tariffs and quotas for the steel industry. In the mid-1970s, Heinz chaired the House Republican Task Force on Antitrust and Regulatory Reform.
Mark DeSantis, a Pittsburgh-based entrepreneur and adjunct faculty member of Carnegie Mellon’s Heinz College, worked on Heinz’s staff in the late 1980s. DeSantis lamented the absence of moderating forces in contemporary politics, observing that polarization is occurring in Pennsylvania. Present discourse is shaped by battle lines and angry rhetoric. Recalling Heinz’s approach, DeSantis said the Senator was “about getting things done.”
According to DeSantis, Heinz “had the courage of his convictions. When he believed in something, he stuck by it, even if it made him unpopular.” DeSantis remembers Heinz’s commitment to serving citizens. “He was very sensitive about Pennsylvania being treated fairly,” DeSantis said, “and making sure that the state was not blindsided by legislation” that would adversely impact the state’s economy or its citizens. “His kind is so desperately needed right now.”
DeSantis recounted Heinz’s friendship with Daniel Patrick Moynihan, New York’s longtime Democratic Senator. Both Heinz and Moynihan were fiercely intellectual, possessed encyclopedic knowledge, and held diverse political views. “Those two gentlemen personified, in my opinion, what democracy should be,” said DeSantis. They engaged in “thoughtful, sharp debate on the issues with respect for each other’s opinions and an acknowledgment that the whole point of the debate was to advance the cause of this country and its citizens.”
The way Heinz and Moynihan conducted their careers marks a significant departure from the present congressional ecosystem. DeSantis notes that the senators didn’t turn their debates into “bitter personal disagreement or the belief that the opposing party was trying to destroy the country.” This level of civility existed in Congress even during the 1980s, but slowly faded with the proliferation of cable news, public scandals like Sen. Gary Hart’s alleged affair, Newt Gingrich’s Republican Revolution of 1994, Clinton’s White House tumult, and finally the foreign and economic policy failures of the 2000s. These events paralleled the rise of social media, the fracture of civic and religious institutions, the decline of countless communities, and the ideological revolts in both parties.
Heinz’s career tragically ended in April 1991 when his chartered plane collided in midair with a helicopter in suburban Philadelphia, killing the senator and six other people. Heinz’s untimely death at 52 shocked Pennsylvania and the nation. He was a top figure in national politics, widely regarded as a future presidential contender. Had Heinz lived, he could have challenged Clinton’s ascendance and served as an influential and moderating voice within his party. Harris Wofford, a leading Democratic figure, won the special election over former Governor Dick Thornburgh to serve out Heinz’s term. Wofford was then defeated in 1994 by Santorum.
* * *
The 18th District, which under new boundaries is slated to disappear by Pennsylvania’s May primary, has trended Republican for several election cycles. Democrats hope that a victory will signify redemption following their party’s failures in 2016. But the district’s electorate, like any voting base, is more complex than what is commonly presented. Lamb, the Democratic candidate, is running as a conservative sympathetic toward Trump’s views. A former Marine and assistant U.S. Attorney, he hails from a Democratic dynasty in Pittsburgh-based Allegheny County. His grandfather, Thomas F. Lamb, served as the state Senate’s Majority Leader in the early 1970s, just as Heinz began holding elected office.
Lamb’s Republican opponent, Saccone, entered the state House as part of the Tea Party wave in 2010. A retired Air Force counterintelligence officer, Saccone has claimed that he was “Trump before Trump was Trump.” But Lamb’s campaign seeks to establish that a Blue Dog-style Democrat can still win traditional Democratic voters. If Lamb wins, it remains to be seen how the national Democratic apparatus will receive an understated and conservative member of their party. Democrats face an ongoing identity crisis, one which will become even more prominent if Lamb defeats Saccone.
If Heinz were alive, DeSantis believes he would be disappointed by the bitterness, and lack of substance, of this special election. Heinz’s effectiveness stemmed from bipartisanship and the ability to deliver for his constituents.
Of course, polarization has always existed, and has shaped gerrymandered maps, including the state’s newest districts, which were recently approved by the elected Democratic majority of the Pennsylvania Supreme Court. But if anything is to be learned in 2018, perhaps it’s that voters yearn for some level of civility. Regardless of Tuesday’s outcome, the 18th District’s voters remind us that it is possible for politicians to hold nuanced views that don’t easily fit into either party’s purist platforms. It was Heinz’s mastery of this non-ideological approach that endeared him to his first constituency and eventually all of Pennsylvania.
Charles F. McElwee III is a writer based in northeastern Pennsylvania. Follow him on Twitter @CFMcElwee.
What is the Irish Catholic experience? The question is harder to answer in an age of acceleration and fragmentation. After all, filtered photos, snaps, and Twitter characters now dictate countless daily lives. A pixelated existence leaves no room for oral histories, familial traditions, and religious customs. The dispersal of Irish Catholic hamlets to suburbia, along with the closure or demographic transformation of parishes, has further erased remnants of this once identifiable cultural tribe.
For those endowed with repositories of childhood memories, to be Irish Catholic, at least in America, is to absorb and accept certain practices and mannerisms. It is to weather the biting wit that checks pride or folly, to receive left-handed compliments from unimpressed relatives, and to obey the sanctity of multi-generational grudges.
To be Irish Catholic is to remember those who suppress scandalous stories, sympathize with the downtrodden, and nurse a quiet suspicion of success. It is to witness the peculiar dynamics of an officious mother and her bachelor son. It is to bear the scars of Catholicism while also finding consolation in its saints and prayers. And it is to nurture a primordial melancholy, either finding comfort in sadness or forgoing sobriety to numb the pain. Alcohol is inescapably part of this DNA, as the Irish-American journalist Pete Hamill wrote in his memoir: “The culture of drink endures because it offers so many rewards: confidence for the shy, clarity for the uncertain, solace to the wounded and lonely, and above all, the elusive promises of friendship and love.”
Millennials will likely be the last generation to fully comprehend such tribal qualities. The Irish Catholic experience peaked during the Second Vatican Council, but has slowly faded with the death of older relatives, the changed cultural makeup of urban neighborhoods, the dissolution of cash-strapped and scandal-ridden parishes, and an overall indifference towards tradition in this modern era.
Irish Millennials can still point out the Sacred Heart and the Infant of Prague in their grandparents’ homes, recall the meaning of Forty Hours’ Devotion, and recite rebel songs played at parish festivals. As children, they heard U2, The Cranberries, and The Corrs on the radio, perhaps inspiring curiosity about their heritage. But as adults, many Millennials have come to believe that ancestral appreciation is just another contemporary casualty, one that can be digitally embalmed by Google. For this isn’t the first time that the Irish have “exchange[d] themselves for the future,” as Breandan Mac Suibhne reminds us.
Mac Suibhne’s book, The End of Outrage, studies the Irish habit of ambivalently accepting the present while willfully forgetting the past. In America, the Irish nurtured a post-Famine identity for over a century. Immigrants arrived in steel, mining, and factory towns where they preserved elements of their culture through parishes, fraternal organizations, and saloons.
Subsequent generations inherited these social bonds. They embraced their heritage while also fostering the perception of an Ireland frozen in the past. In the mid-1800s, many Irish arrivals purged their previous lives of sorrow, shame, and heartbreak. This left their descendants with distorted ideas of a land unchanged by time. Mac Suibhne, well versed in this transatlantic catharsis, investigates the haunting silence that enveloped Irish Catholicism in one isolated part of Ireland. Through a panoramic exploration of County Donegal, he recounts how rural Ireland adjusted after the Potato Famine in the 1840s.
A Centenary College historian and National Endowment for the Humanities fellow, Mac Suibhne paints an evocative canvas of clashing tribes and morally opaque characters. He resurrects Donegal as it existed in the 19th century—a self-contained region confronting change and loss in a rapidly industrializing world.
Donegal is located in northwest Ulster, a culturally orphaned and geographically remote region endowed with dramatic landscapes. The county’s rugged coastlines, daunting mountains, and lush valleys remain almost unnaturally beautiful. Donegal’s geological composition perpetuates a sense of the mystical between its people and their land.
It’s from here that Mac Suibhne hails. He charts post-Famine adjustment in his birthplace, Beagh, a small community in southwest Donegal. This townland was settled long before the Famine, when families subsisted on oats and lived on land unsuitable for cultivation. In the mid-19th century, the region was “steeped to the lips in poverty,” with class division, mass starvation, merciless landlords, and an oppressive English regime creating unrest. Such conditions incited violence from a secret society, their acts of “outrage” serving as the basis for Mac Suibhne’s study.
This enigmatic and oath-bound society, the Molly Maguires, followed a long tradition in Ireland’s countryside. In the 1840s, the Mollies were the latest in a succession of retributive rural societies like the Whiteboys and Ribbonmen. The Molly Maguires were especially prevalent in Donegal, where they committed disruptive or violent acts against tenants, farmers, landlords, and authority figures.
While the Mollies’ conduct is widely documented, the origin of their name remains a mystery. Stories range from an evicted old widow named Molly Maguire to the practice of mummery, with men dressed in costume or disguised as women. Regardless, this agrarian practice pervaded west Donegal, with Mollies intimidating those who evicted tenants, mistreated the elderly, or overcharged the poor. The Mollies’ presence paralleled Donegal’s transition from communal farms to squared agriculture. The “squaring” of farms particularly transformed the countryside, forcing families to fend for themselves in a region often grappling with poverty, oppression, hunger, disease, and death.
But Donegal’s struggles were not solely a product of the Famine. The county, especially in the west, escaped the worst of the potato blight and ended up with the ninth-lowest mortality rate in Ireland. Mac Suibhne explains that the impoverished class, accustomed to weather-related crop failures and support from relief agencies, was better prepared when the Famine peaked and food prices soared in the late 1840s. When the Famine ended, Molly Maguire activity slowly declined in west Donegal. Mac Suibhne recounts that following the hunger, “tensions emerged in the Molly Maguires when one element sought to curtail such activity, while another sought, unsuccessfully, to expand it.”
But the Mollies remained active in Beagh, where they targeted James Gallagher, the townland’s largest landowner. After centuries of oppressive serfdom under the Protestant gentry, the Irish believed possessing land was sacrosanct. The fact that Gallagher acquired land under dishonorable circumstances enraged the Mollies. Among those from whom he acquired it were neighbors who gave up their land in exchange for passage to America, partial forgiveness of debts, or much-needed whiskey. When Gallagher set out to evict subtenant families on his new properties, the Mollies responded to this injustice with a threat. In 1856, the Mollies raided Gallagher’s house, leaving a letter warning “your royal highness” to “relinquish your idea of dispossessing people.” Aware that Gallagher’s father was in the poorhouse, the letter also advised Beagh’s preeminent landowner to be a better son.
Donegal was experiencing social and cultural change in the 1850s, as the county’s isolation failed to insulate its people from a global industrial age. Perhaps this contributed to the Mollies’ weakening grip on the countryside. When Gallagher ignored their threatening letter, the Mollies resolved to assault the landowner and his wife. The prospect of murder terrified Patrick McGlynn, a schoolteacher who authored the letter. McGlynn turned informer to protect himself, with his betrayal resulting in the arrest of two dozen alleged Mollies.
While Mac Suibhne illuminates west Donegal families, unlocks Molly Maguire mysteries, and investigates McGlynn’s betrayal, he also provides a cultural tour de force of life in rural Ireland. Through painstaking detail and research, Mac Suibhne manages to reconstruct life as it existed in Donegal during the mid-19th century. His book movingly portrays a people forgoing their mystical past in order to accept an uncertain present. And Catholicism was part of this transition. Mac Suibhne demonstrates how the Church conquered Donegal’s souls.
Donegal’s Irish blended Catholicism with pagan rituals. As the late novelist John McGahern observed, the Irish “went about their sensible pagan lives as they had done for centuries, seeing it as just another of the fictions that they’d been forced to kowtow to, like all the others since the time of the Druids.” According to Mac Suibhne, the rural Irish of the mid-1800s “vibrated between two cosmologies, one ancestral or fairy and the other Christian. Central to the non-Christian system were ritualized gatherings around fires or wells, often on dates determined by solar or lunar cycles.” While Donegal’s people were devoutly Catholic, Mac Suibhne writes that “they were remarkably indifferent to the requirements of their Church.”
Missing Sunday Mass was a mortal sin, but only a quarter of parishioners in towns like Ardara and Inishkeel showed up at the chapels. Weekly Mass was for the well-off parishioners who could afford sacramental fees. Avaricious priests, reared on comfortable farms, presided over these parishes. For Donegal’s peasants, the Church did not figure prominently in daily life. This changed in the 1850s, when Dublin’s archbishop deployed shock troops to discipline this unruly flock.
The Redemptorist Order of Ireland carried out missions in Donegal to discipline the poorer Catholics, regularize the practice of the sacraments, and empower priests as the moral arbiters of their spiritual and temporal lives. The Redemptorists targeted children to be “more firmly oriented towards chapel and the norms of the ‘better classes.’” One priest, ironically named Father Furniss, wrote a children’s book about Hell to instill fear, ensure obedience, and solidify the sacramental devotion of Donegal’s youth. The missions worked, with Furniss recalling one extremely wet winter morning when “children came in crowds soaked with the rain. A simple crucifix was held up before them, and at the very sight of it, there was a universal screaming and shouting through the whole church.”
The Redemptorists reared a generation terrified of sin and disciplined in the Church’s teachings. One mission succeeded in the recantation of Ribbon Society members, with one leading priest writing that “members of this society…came to us in masses and abjured their membership and accepted the Holy Sacraments. Many swore on my mission cross.”
Through the Church’s discipline and England’s command, Donegal’s rural Catholics experienced rapid change after the Famine. In just one example, the clergy’s control of the national school system increased Catholics’ literacy levels. Mac Suibhne writes that the “Catholic poor themselves, abandoning the old and particular and adopting the new, becoming English-speaking and literate, and keeping holy the Sabbath day, now appeared less exotic to their rulers…” Ironically, Mac Suibhne observes that the long-term consequences “within the broader Catholic community—not least by the removal of the poorest of the poor—had allowed the smallholders of the west of Ireland to stand front and centre in a national opposition that having effectively dispensed with the landlords now aimed to semi-extract the country from the United Kingdom.”
As Donegal modernized—slowly losing its mysticism, ancient customs, and native tongue—scores of impoverished countrymen left their homeland and sailed across the Atlantic. They departed Ulster for Philadelphia along a shipping link established under William Penn, the same route that, during the colonial era, had carried Scots-Irish Presbyterians to fight Native Americans on Pennsylvania’s frontier. Donegal’s rural Catholics, by contrast, arrived en masse to work in the booming anthracite coal mines of northeastern Pennsylvania. The mines, which fueled America’s industrial advances and enriched its leading titans, required cheap and expendable labor. Donegal’s Irish, particularly from the west, fulfilled this demand.
In the 1840s and 1850s, Donegal’s Irish settled in the deeply forested coal fields of northern Schuylkill County, southern Luzerne County, and western Carbon County. They inhabited shanties in remote outposts and developing towns like Girardville, Mahanoy City, Shenandoah, and Summit Hill. The burgeoning commercial center for this region was Hazleton, the highest point for settlement in Pennsylvania. In Hazleton, Gaelic-speaking immigrants lived south of the main road in a neighborhood known as Donegal Hill. They formed Catholic missions in Hazleton and its surrounding patch towns. Their lives centered around the mining experience, fraternal organizations, and their growing parishes. The largest church, St. Gabriel’s, was established in 1855 by Philadelphia Bishop John Neumann, who was later canonized a saint.
There were distinct similarities between Donegal and the Appalachian region where its emigrants arrived. Donegal’s Irish left one isolated area only to be greeted by more isolation in America. Both regions were endowed with forbidding geography, hazardous weather, and breathtaking scenery. The immigrants’ property struggles continued as well, not because of squared farms but because of company-owned shanties in makeshift coal towns. The Irish assembled small Catholic missions to practice their religion in a region otherwise hostile toward their cultural identity. In Donegal, they faced the oppression of English rule. In the coal towns, they suffered the prejudice of English and Welsh coal operators.
The Irish laborers’ relationship with these operators bordered on serfdom. This relationship formed at an early age, with breaker boys satisfying a high demand for child labor. Mining life also engendered permanent uncertainty. As company housing tenants, families risked eviction without forewarning. Coal companies required miners to purchase all goods and mining supplies through their stores, which created permanent indebtedness. The companies also exploited the miners’ ethnic divisions and language barriers within their labor hierarchy.
The Coal and Iron Police, a private force created by the Pennsylvania General Assembly and employed by the coal companies, monitored all town movement and activity. The mining conditions, meanwhile, were abysmal, and each day was a life-threatening descent underground. Families anticipated injury or death, and it was common for a miner to be mangled by a coal breaker. The company typically placed the body in a sack and delivered it to the deceased miner’s family.
The mining experience created a collective sense of inferiority. Alcohol consumption offered solace; taverns proliferated throughout the region. While alcohol offered distraction and Mass provided spiritual comfort, families still struggled to support themselves in this new land. Their lives were arguably worse and less stable than the region they left behind. The medieval conditions precipitated a response, with Irish laborers becoming politically active and reviving the Molly Maguires’ agrarian violence in the mine patches.
Mac Suibhne addresses at length Pennsylvania’s Molly Maguire saga, which unfolded between the 1850s and 1870s. “The Molly Maguire troubles,” writes Mac Suibhne, “involved resistance to anti-Catholic and anti-Irish discrimination in the nativist rage of the 1850s; opposition to the draft in the Civil War of the early 1860s, when wages were high in the coalfields and poor men did not want to fight the rich men’s war; and a determination, from the mid-1860s through the mid-1870s, to maintain wage levels and improve conditions by organizing unions.”
Throughout the region, the Mollies carried out threats and acts of violence against the mine bosses, ranging from beatings to murder. Their activities paralleled attempts to organize labor, but the Coal and Iron Police responded to such futile efforts with their own violence. The Ancient Order of Hibernians, meanwhile, was an oath-bound fraternal organization that provided support to impoverished Catholics in the mine patches. The AOH was accused of serving as the cover organization for the Mollies, and the Church ultimately threatened all members with excommunication. The alliance between the Church and the coal companies played a critical role in breaking up the Mollies.
The cataclysmic moment arrived when Franklin Gowen, the omnipotent president of the Philadelphia and Reading Railroad Company, hired undercover Pinkerton detectives to infiltrate the Mollies. One of the detectives, the Irish-born James McParland, gained access to their innermost circle and appeared as a surprise witness at showcase trials. The nationally covered murder trials, heard by plainly prejudiced German juries, resulted in the hanging of twenty men between 1877 and 1879. On June 21, 1877, known as “Black Thursday,” ten of these men were hanged in the Schuylkill County and Carbon County prisons.
Many of the accused had west Donegal connections. Mac Suibhne recounts a few of their stories. Tom Fisher, an AOH delegate and township tax collector who was hanged in 1878, offered a dying declaration. “On the night that he was alleged to have killed a man, he had ‘a few social drinks’ with a fellow named Boyle and another named Breslin in Cornelius T. McHugh’s saloon in Summit Hill and he then had a few more in the barroom of Jimmy Sweeney’s Hotel,” Mac Suibhne writes. “And at no stage that night did he converse, ‘in English or in Irish,’ with a Mulhearn or an O’Donnell or leave Summit Hill for Tamaqua.” Mac Suibhne concludes that, “But for the place-names Fisher might have been talking of Ardara.”
The Black Thursday hangings vindicated the Church, which had sided with the mine bosses. Philadelphia Archbishop James Wood, a convert, complied with Gowen’s request to condemn the Mollies. Priests escorted the condemned men to the scaffold as they muttered prayers, requested forgiveness, and held crucifixes. They spent the final minutes of their lives regretting that they had not heeded the Church’s teachings on secret societies. Their deaths incited fear among the Irish. While labor unrest ultimately intensified across the region, the Mollies’ story simply faded out of shame. Only through historians was the saga revisited in the 20th century. Even now, the Molly Maguires, and indeed their very existence, remain a source of contentious debate among academics, descendants, genealogists, and regional historians. Mac Suibhne’s book is an important addition to the canon. He offers a unique perspective as a descendant of the Donegal Irish in both countries.
Mac Suibhne’s book is a sweeping historical tale of those Donegal Irish “who saw it all.” He writes of William McNelis, born in 1837, who in later years recalled “cures and charms” and told stories of the “fairies and ghostly lights.” Mac Suibhne also writes about Nahor McHugh, born in 1820, who saw the Famine, the change to squared farms and English-speaking homes, and his own children leave for Pennsylvania’s mines. “Nabor McHugh died in April 1904,” writes Mac Suibhne, “and he was doubtless as keenly mourned in Hazleton, Pennsylvania, as in Beagh, for most of his children were there.” Through the 20th century, Hazleton’s Donegal Hill residents retained elements of the culture created during McHugh’s lifetime.
The neighborhood’s parish, St. Gabriel’s, still stunningly dominates Hazleton’s landscape. Built in the French Gothic Revival style, St. Gabriel’s was designed by a Hazleton architect to resemble St. Patrick’s Cathedral in Manhattan.
For most of its existence, St. Gabriel’s was the city’s Irish parish. When entering St. Gabriel’s today, the glass choir loft door depicts St. Patrick, illustrative of the parish’s cultural history. But St. Gabriel’s is now principally a Latino parish, its side altar honoring the Dominican Republic’s patron saint and its priests holding Sunday Mass in Spanish.
Donegal Hill remained mostly Irish through the 1990s, but in recent years Hazleton has been transformed by Dominican immigration. While some Dominican residents hail directly from the island, most are second- or third-generation Americans arriving from New York City or northern New Jersey.
In this globalized period of technological upheaval, the descendants of Donegal’s Irish immigrants in communities like Hazleton are experiencing the latest cycle in a long pattern of vanishing memories. Their neighborhoods face economic decline and rapid demographic change, with older residents relocating to suburbs or retirement communities, or moving in with family.
Hazleton, which enjoys a revived downtown and a stable regional economy, has fared better than the older towns “over the mountain” in Schuylkill County, which many Irish families left long ago. In these communities, the parishes are closed, blight overwhelms residential blocks, and an opioid crisis persists. The dispatches of national reporters to these towns only deepen low community morale. This region inspired novelist John O’Hara’s stories and produced the Swing Era’s Dorsey Brothers, but media outlets disregard this past to file quick hits about “Trump Country.”
If anything, the Irish Catholic experience, particularly in northeastern Pennsylvania, is one that lingers under centuries of accumulated sorrow. Those endowed with this emotional inheritance maintain their pride, but they also understand their people’s struggle to overcome cultural and economic injustice. Irish immigrants fled Donegal for a better life in the anthracite coal region. But today’s Donegal is far healthier than this corner of America; in 2017, National Geographic named it the “Coolest Place on the Planet.” In the Age of Trump, Pennsylvania’s former mining towns are destinations for reporters’ insensitive commentary. If those reporters stayed a little longer, perhaps studying the region’s history or dropping by cemeteries with Gaelic inscriptions, they would leave with a better sense of this unique place.
The End of Outrage is a historical companion to understanding the Irish Catholic experience not only in Donegal, but also in northeastern Pennsylvania. If a monument could encapsulate the saga that Mac Suibhne presents, it is the grave of Condy Breslin at St. Gabriel’s Cemetery in Hazleton. Breslin, a Donegal-born miner who died of miner’s asthma in 1880, is buried beneath a tombstone with the following inscription:
Forty years I worked with pick and drill,
Down in the mines against my will,
The Coal King’s slave, but now it’s passed
Thanks be to God, I am free at last.
Our news feeds this year have been saturated with dispatches from Donald Trump’s America. Media outlets deployed journalists to regions besieged by rapid globalization, technological advances, and demographic change, where they attempted to explain the anger that inspired voters to embrace a figure like Trump. Their written pieces created a kind of niche journalism, presenting stories of what our nation failed to address with its transitioning economy. A number of these articles were accurate and empathetic portrayals of struggling communities, but too many of them patronized their subjects and reeked of condescension.
It was typical for journalists to decamp to down-and-out bars, where they’d score quotes of rage, racism, and radicalism from afternoon drinkers. Then it was on to boarded-up downtowns, shabby diners, and blighted street corners to paint a picture of human decay. Yes, there is a darker side to many of these post-industrial cities and towns. It’s also true that economic discontent often triggers political extremism. But too many of these accounts read like the field notes of colonial administrators visiting an uncivilized hinterland. Working-class regions are far more culturally fractious, economically diverse, and politically complex than what the press has presented.
The accumulating literature has also left political questions unanswered. As president, Trump happens to be leader of the GOP, but his past defies party allegiance. His stalwart supporters, likewise, are less devoted to the Republican Party than to the commander-in-chief. So what does this betoken for Republicans? What does conservatism even mean for the working class in post-industrial America?
The answers to these questions lie outside the typical conservative canon. An alternative starting point is an essay by legendary journalist Pete Hamill in New York magazine. In 1969, Hamill composed one of the most prescient pieces about working-class alienation ever written. He described the working class as standing “somewhere in the economy between the poor—most of whom are the aged, the sick and those unemployable women and children who live on welfare—and the semi-professionals and professionals who earn their way with talents or skills acquired through education.”
Hamill profiled residents of New York’s boroughs, typically disgruntled ethnic Catholics who resented dwindling finances, signs of disrespect, rising crime, and politicians’ indifference to their plight. He observed how the “information explosion” affected this slighted and tribal group. “Television has made an enormous impact on them, and because of the nature of that medium—its preference for the politics of theatre, its seeming inability to ever explain what is happening behind the photographed image—much of their understanding of what happens is superficial.” As a growing medium, television intensified racial tension and ignored the common socio-economic ailments shared by working-class families of all backgrounds. Urban rioting only heightened this divide. Hamill observed that it was “almost impossible to suggest any sort of black-white working-class coalition.”
Hamill reported that working-class whites directed their anger toward New York City’s Mayor John Lindsay, the liberal Republican who presided over a metropolis reeling from fiscal distress and skyrocketing criminal activity. They felt overlooked by their mayor, vulnerable in their neighborhoods, insecure in their jobs, and doubtful about their futures. “The working-class white man feels trapped and, even worse, in a society that purports to be democratic, ignored,” Hamill wrote. He warned of a brewing revolt, one that would ominously involve the use of guns. He concluded that this aggrieved group was simply “in revolt against taxes, joyless work, the double standards and short memories of professional politicians, hypocrisy and what he considers the debasement of the American dream.”
Hamill’s words, nearly 50 years old, could have been lifted from an analysis of today’s working-class grievances. His essay reminds us that the present revolt is not necessarily driven by ideology or party allegiance. The working class that Hamill described flocked to Republicans like Richard Nixon in 1968 and Ronald Reagan in 1980, but it also appreciated Gary Hart’s message ahead of the 1988 election, which sounded the alarm over how the global economy would affect blue-collar workers. Hart’s message had cross-party appeal, but his famously scandalous downfall in 1987 ruined any chance he might have had to be president. In 1992, Bill Clinton brought the working class back to the Democrats from George H.W. Bush. These voters then rallied around George W. Bush after September 11 and ensured his re-election in 2004. Disillusionment over the Iraq war and the economic crash inspired hope in Obama’s 2008 campaign. Eight years later, they embraced Trump after feeling alienated by Obama and rejecting Hillary Clinton’s vapid “Stronger Together” message.
The period between 2001 and 2009 in particular tainted the conservative movement’s purity, with the mammoth growth of the national security state, expanded entitlements, an overcommitment to warfare in the Middle East, and a catastrophic economic crash requiring government intervention. In that short time, the productivity gains enjoyed by all Americans in the 1990s disappeared. What followed was a sobering reality. As John G. Fernald of the Federal Reserve Bank of San Francisco and Charles I. Jones of Stanford wrote in 2014, “Growth in educational attainment, developed-economy R&D intensity and population are all likely to be slower in the future than in the past.”
Sluggish growth contributed to the country’s ongoing labor challenges, ranging from stagnant wages to declining workforce participation. In an important essay for National Affairs this fall, Eli Lehrer and Catherine Moyer wrote about the consequences of this phenomenon, particularly how technological progress in a transitioning economy has impacted men. They took note of the scores of men “who are unemployed, unengaged with civil society, uninvolved in family life, and, therefore, finding little meaning in their lives.” And they provided an unsettling statistical comparison. In 1948, 86.7 percent of men were either employed or searching for employment; that dropped to 69.9 percent in January 2017, reflecting a multi-generational pattern of declining labor force participation among men.
The New York Times’ David Leonhardt recently presented a similar historical comparison for the nation’s economic viability. Leonhardt observed, “By 2019, a prime measure of the economy’s health—gross domestic product per working adult—will likely have recovered less in the 12 years since the crisis began than it did during the 12 years since the Great Depression.” Massive mobilization during World War II provides some context, but America has clearly suffered from a declining standard of living. Technological advances transformed how we communicate and process information, but that didn’t necessarily translate into an improved quality of life. This was especially the case for working-class Americans.
In The Retreat of Western Liberalism, Edward Luce, the Financial Times’ Washington columnist, provides a brilliant historical and political synopsis of how populism thrives under these circumstances. His book reminds us how a prevailing tenet of post-war conservatism—limited government—rings hollow with working-class voters today, and how the “populist right only began to do really well at the ballot box after they began to steal the left’s clothes.” His explanation is worth quoting at length:
In each case, including Donald Trump, populists broke with the centre-right orthodoxy to argue in favour of a government safety net. This is what the old left used to promise and largely delivered (you might say over-delivered). It was the implicit bargain of modern Western democracy. In most countries, including the U.S., it took the form of social insurance. The link between the duties of citizenship and the right to draw benefits was a form of social contract. Even in relatively generous Sweden, future retirees must work for fifteen years before they are eligible to draw a pension. It was an unfortunate coincidence that immigration started to surge just as the value of these benefits began to erode. That was a double whammy. The same governments that were cutting welfare payments were also allowing recent arrivals to join the system. It offended people’s sense of fairness. “You cannot cut entitlement spending and simultaneously widen access to them,” says Francis Fukuyama. “Sooner or later, something has to give.”
Luce’s summary explains blue-collar frustrations. Contrary to prevailing arguments, the working class is not part of the far right. These voters resented the failures of Obamacare, for example, but were also willing to accept some form of universal health insurance. Obamacare’s passage was a boon for grassroots advocacy organizations, especially post-Citizens United. But the foundation for those organizations—the Tea Party movement—testified to the weakened condition of the GOP’s political infrastructure. This vulnerability allowed Trump to thrive in 2016, defeating 16 opponents in the primary and railing against the previous Republican president. The long-term challenge for Republicans is that Trump courted votes from traditionally Democratic regions. Moving forward, Republicans must identify and support the salient issues confronting these working-class voters. The party must extract the core elements of Trump’s campaign message without inheriting the darker elements of Trumpism. And that means, first and foremost, Republicans must accept that the séance with Ronald Reagan should end.
On this new working-class right, government is sometimes the answer, even if it currently presents a poor image of itself. The federal bureaucracy exploded after September 11, inevitably creating redundancy and waste, while existing departments and agencies failed to deliver meaningful reforms for post-industrial regions. And the Trump administration has yet to show how it will improve the government’s responsiveness. The opioid epidemic rages, municipalities and school districts face fiscal collapse, roads and bridges crumble, and crime, poverty, and blight corrode community stability. Trump referenced this “American carnage” in his widely panned inaugural address, but that carnage exists, and the working class needs to see a substantive governmental response to it. That means a real bonding endeavor—like the New Deal or New Frontier—that secures a sustainable voting coalition, one that sometimes looks to the government for help.
A working-class conservatism also means creating a labor movement within the Republican Party. Perhaps the days of demonizing unions have passed. Many unions became a counterproductive force as the 20th century progressed, of course, but in today’s global economy, labor-reform experimentation at the state level could improve the startlingly low morale of blue-collar workers. Jonathan Rauch made the conservative case for unions in The Atlantic earlier this year. He highlighted how one in three private-sector workers belonged to a union in the 1950s, a share that has since fallen to less than 6 percent. “If 2016 taught us anything,” Rauch wrote, “it was that miserable workers are angry voters, and angry voters are more than capable of lashing out against trade, immigration, free markets, and for that matter liberal democracy itself.”
Exasperation over immigration and trade policies should also be acknowledged. When it comes to U.S. immigration policy, Republicans have avoided supporting restrictive measures, fearing the possible alienation of donors, interest groups, and minority voters. But the immigration system is frozen in place, and it’s time to acknowledge that the Immigration and Nationality Act of 1965, which opened the door to low-income migrants, created more burdens than opportunities. Massive chain migration from poor developing countries has had a detrimental impact on post-industrial regions, which are ill-equipped to process rapid demographic change and no longer house mass-production industries that can support a low-skilled population. It’s not an act of extremism to support legislative reforms like the RAISE Act proposed by Republican Senators Tom Cotton and David Perdue. Switching to an immigration regime similar to Canada’s merit-based system is a practical approach to supporting higher-skilled migration while easing the burden on America’s poorer regions.
It’s also possible to support free trade while acknowledging its unforeseen consequences. The average working-class voter may not understand the complexity of trade agreements like NAFTA, but he did witness firsthand how his region declined after the deal went into effect in 1994. Small manufacturers closed shop in too many U.S. cities and towns. The idea of a living wage disappeared for working-class families. Republicans should remember that they can support future trade agreements that secure America’s global economic position while also ensuring that working-class regions stand to see real benefits.
Conservatism is evolving from its post-World War II origins. Republicans witnessed last year how Trump’s voting base had shifted towards expanded health care access, crackdowns on monopolistic sectors, tax increases for the wealthy, and a prudent foreign policy. They maintain a traditional conservative disposition on social issues, but their economic outlook is more in alignment with the New Deal era than the Republican Revolution of 1994. Preserving this coalition will require an inclusive approach. Working-class regions are geographically and culturally diverse, ranging from Appalachian villages to poor city neighborhoods. Republicans must compassionately respond to white senior citizens who cannot retire, Latino small business owners who struggle to finance their dreams, and African-American families in failing urban school districts.
“The working class earns its living with its hands or its backs; its members do not exist on welfare payments; they do not live in abject, swinish poverty, nor in safe, remote suburban comfort,” wrote Hamill in his New York piece. The core of his observation remains timeless, but many of those hands and backs have gone idle as hope for their economic prospects slips away. It’s difficult to forecast whether the working class will even stick with Trump (a question that may be answered by the tax cut deal). But what is clear is that the administration must respond to the needs and aspirations of the working class. It must enact policies that address economic and social needs, from border security to job security. It cannot take their support for granted, the fatal mistake made by the Democratic Party. The working class’s flirtations with Republicans will continue only if the GOP offers them something in return.
Charles F. McElwee III works in the economic development sector in northeastern Pennsylvania. Follow him on Twitter at @CFMcElwee.
The Schuylkill Expressway, a clogged four-lane artery into Center City Philadelphia, requires emotional stamina and automotive endurance. The eastbound lanes, crammed between wooded hills and the Schuylkill River, present a stop-and-go orchestra dictated by traffic volume, weather, and sudden exits. Drivers slowly pass trains, the ubiquitous Action News ZooBalloon, and remnants of Philadelphia’s industrial past before the city’s skyline emerges beneath the arched Girard Bridge.
The expressway leads to a stunning structural and arboreal vista of gleaming glass and water. Boathouse Row, 19th-century gingerbread houses that designate allegiance to university rowing clubs, charmingly commands the languid river. On warmer afternoons, rowers furiously keep pace with bikers along Kelly Drive, an alternative route for wearied commuters. Just beyond the boat houses is the Fairmount Water Works, an elegant reminder of the city’s history of municipal innovation. Dominating the entire scene, stoic and majestic above Fairmount Park and beneath rising skyscrapers, is the Philadelphia Museum of Art.
The Art Museum, a yellow-hued neoclassical masterpiece, overlooks a flag-lined linear boulevard that leads to City Hall. This impressive stretch, the Benjamin Franklin Parkway, marks its centenary this year. From the air, the Parkway resembles a Parisian thoroughfare of monuments, buildings, and trees. But a closer inspection leads to apprehension. A century after its creation, the Parkway appears unfinished, an assembled boulevard of pedestrian confusion, architectural contrasts, and impaired urban scale. Understanding how such a prominent city feature came to be—and suffered setbacks as the urban landscape was overtaken by cars—may point to ways in which the Benjamin Franklin Parkway can achieve its promise as a great public space.
The Parkway’s completion was a multi-generational endeavor. In 1858, Philadelphia’s City Council presented a plan to run boulevards between the city’s center and its growing suburbs. This proposal paralleled the city’s relentless explosion in population, industry, and commerce. Between 1850 and 1860, Philadelphia witnessed a 367 percent increase in population—from 121,376 to 565,529 residents in a single decade—a leap owing largely to the 1854 consolidation of the city with surrounding Philadelphia County, and swelled by immigrants, principally from Ireland’s Ulster and Connacht. When these Irish arrivals reached Philadelphia’s ports, they typically settled in the city’s neighborhoods to work in textile factories, locomotive works, or for the Pennsylvania Railroad. But many Irish continued north, following the Schuylkill River toward its headwaters to work in the anthracite coal mines.
Irish migration necessitated the creation of parishes. In 1864, Napoleon Le Brun, a Philadelphia architect, designed an edifice that served as an everlasting tribute to this period. Completed during the Civil War, the Cathedral of Saints Peter and Paul remains a brownstone masterpiece, a gracefully proportioned basilica overlooking what was then Logan Square. The square was named after James Logan, the Irish colonial secretary under William Penn who established shipping links between Philadelphia and Ulster. The cathedral was built under Archbishop James Wood, a Catholic convert from a prominent Philadelphia family. Wood presided over the diocese while forging an alliance with the industries that employed Irish immigrants.
The Cathedral towered over a poorer immigrant neighborhood known for fairs and special events. Through the late 19th century, industrial, commercial, and residential development expanded outward from Logan Square. In 1876, the Academy of Natural Sciences, the oldest institution of its kind in the Americas, opened its doors on 19th Street during the nation’s centennial. The structure, initially designed to resemble a Gothic cathedral, opened in a neighborhood that, for all its growth, remained on the city’s outskirts. The neighborhood’s future remained frozen until an urban planning movement enraptured the nation’s industrial cities in the early 1890s.
The City Beautiful movement, introduced at the World’s Columbian Exposition in 1893, popularized multi-building neoclassical and Beaux-Arts revival projects throughout the Progressive Era. At a time when corruption corroded civic institutions and industry polluted air and water, the City Beautiful movement served as an aesthetic-conscious response to the Gilded Age. In his book on the Parkway, the historian David Brownlee explains that the planning movement was a “model of an orderly, classical metropolis, crisscrossed by boulevards and dominated by stately groups of public buildings.” Cities like Philadelphia worked to improve their urban presentation. Despite the nation’s unparalleled economic growth, these cities still lagged behind London, Paris, and Vienna.
In the early 1890s, the city council reviewed the first proposal for what became the Parkway. A citizen-led petition accompanied the proposal, which called for a 160-foot-wide road linking Center City to Fairmount Park, Philadelphia’s largest municipal park. The proposed boulevard would cut through a dense neighborhood, cross Logan Square, and terminate at the still-incomplete City Hall. The multi-tiered Second Empire structure, the nation’s largest municipal building, was finally completed in 1901. Its ornate tower, crowned by a William Penn statue, remained the tallest building in Philadelphia until the Liberty Place skyscrapers broke the tradition in the mid-1980s.
Prudent urban planning followed a sustained period of municipal corruption. Philadelphia’s Republican Party had controlled the city since the Civil War era, and by the turn of the century it operated City Hall like New York’s Tammany Hall, its power extending to police stations placed adjacent to the party’s ward clubs. One observer called City Hall an “unholy alliance” among the “‘best people,’ representing the city’s financial, industrial and commercial interests, and a tightly-knit machine made up of grafters, gamblers and goons, whose political philosophy [was] based on the simple formula ‘what is in it for me?’”
The reporter Lincoln Steffens famously called Philadelphia “corrupt and contented,” but the city experienced a renaissance in architecture and planning despite this reputation. Reform simply expressed itself through brick and mortar. By 1907, ground was broken for what was presented as the “Fairmount Parkway.” Its planners, commissioned by the Fairmount Park Art Association, called for “a direct, dignified and interesting approach from the heart of the business and administrative quarter of the city, through the region of educational activities grouped around Logan Square, to the artistic center to be developed around the Fairmount Plaza, at the entrance to Philadelphia’s largest and most beautiful park.”
The project commenced through a series of corrupt mayoral administrations. For a decade, the construction, which cleared over a thousand structures, slowly moved toward completion amid political fights for reform. But the Parkway owed its existence to the long-standing machine. In his book, Brownlee wrote that “an invincible alliance…between powerful citizens and corrupt politicians” ensured the Parkway’s opening.
In 1917, following years of demolition projects and public debate, the Fairmount Park Commissioners adopted a formal design for the Parkway. The design was presented by Jacques Gréber, a French architect who specialized in landscape design and city planning. Gréber was a leading proponent of the City Beautiful movement, and his urban planning in Ottawa later transformed the modern layout of Canada’s capital city.
Gréber’s Parkway plan called for two linear segments running between Fairmount Park, through Logan Square as the central anchor, and narrowly toward City Hall. In 1918, over a decade after the groundbreaking, Philadelphia’s Evening Public Ledger triumphantly declared that the “Uninterrupted Parkway at last leads from City Hall to Fairmount’s entrance.”
What followed were a series of building projects designed to match the intended grandeur of the Parkway. Gréber modeled the Parkway after the Champs-Élysées in Paris, and he hoped visitors would behold a thoroughfare of monumental buildings, arresting statues and fountains, and bountiful greenery. Following World War I, Gréber stated, “I am glad to say that, if by this work the city of Paris may be enabled to bring its sister in America the inspiration of what makes Paris so attractive to visitors, it will be the first opportunity of Paris to pay a little of the great debt of thankfulness for what Philadelphia and its citizens have done for France during the last three years.”
Following World War I, the Parkway physically embodied the prosperity enjoyed by American cities. Major construction projects commenced during the Roaring Twenties, including the Insurance Company of North America building, the Free Library of Philadelphia, the Fidelity Mutual Life Insurance Company building, and the Rodin Museum. The succession of buildings personified the ideals intended by Beaux-Arts and neoclassical architecture. The Parkway itself became Philadelphia’s center for culture, a boulevard of elegance in a city sprawling with steeples, smoke stacks, and row homes.
By the Great Depression, parts of the Parkway resembled a Parisian boulevard. Logan Square was reconfigured as a circle to imitate the Place de la Concorde, a major public square in Paris. The Swann Memorial Fountain was placed at the circle’s center, a sculptural tribute to the Schuylkill River, Delaware River, and Wissahickon Creek. The Family Court building and the Free Library were painstakingly modeled after the Hôtel de Crillon and the Hôtel de la Marine. Overlooking the circle, they created a structural gateway to the Parkway, which slowly fulfilled Gréber’s architectural and planning vision.
Completing the postcard was the Philadelphia Museum of Art, an ongoing project that created a barrier between Fairmount Park and the Parkway. Designed as a Greek Revival temple, the structure stood over its sister buildings on the Parkway, creating an Acropolis for Philadelphia. Through World War II, one could ascend the Art Museum’s steps to behold the Parkway’s ongoing development. From the Franklin Institute to the School Administration Building, the Parkway—renamed after Benjamin Franklin in 1937—seemed to hold limitless possibilities. But the Depression slowed its continued development. By mid-century, the Parkway appeared unfinished, an imperfect arrangement of Gréber’s Parisian dream.
In the 1960s, the celebrated urbanist Jane Jacobs panned the Parkway. Jacobs was a leading skeptic of the City Beautiful movement, labeling its planning approach an unrealistic theory that destroyed the natural order of urban neighborhoods. “The library has no business being out here and neither do the Art Museum or the Franklin Institute,” she said. In The Death and Life of Great American Cities, Jacobs observed that Logan Circle was “discouraging to reach on foot,” noting that it was “mainly an elegant amenity for those speeding by” and “gets a trickle of population on fine days.”
Edmund Bacon, the esteemed urban planner and long-time executive director of Philadelphia’s City Planning Commission, disagreed with Jacobs’s assessment. Although Bacon steadfastly defended the Parkway, his planning vision disrupted Gréber’s intentions from early in his career. Bacon’s 1932 thesis at Cornell called for “A Civic Center for Philadelphia,” which included replacing City Hall with two small civic buildings, along with a tree-lined promenade that awkwardly met the Parkway.
Bacon’s plans arguably hastened the Parkway’s decline. In the 1950s and 1960s, Philadelphia embraced ill-advised urban renewal projects as the city attempted to respond to ascendant automobile culture and the accompanying flight to the suburbs. More cars made the Parkway nearly inaccessible to pedestrians, as it became less of an urban boulevard and more of a route for escaping downtown. Bacon attempted to “rescue” the city by planning the Vine Street Expressway, a gross interruption of the Parkway’s streetscape that created gaping holes and pedestrian inaccessibility. Such planning also failed to stall Philadelphia’s industrial and population decline. Between 1960 and 2000, the city lost nearly 500,000 people.
By the time Ed Rendell became mayor in the 1990s, the Parkway was an urban wilderness. Lifeless office buildings—modernist slabs of concrete—disrupted the scenery. The Park Towne Apartments awkwardly stood near the Art Museum, a lonesome structure reached by car, not foot. Chain restaurants were more common than cafes and bars. In Hidden City, the urban designer Greg Meckstroth called the Parkway “a veritable museum ghetto that often becomes desolate at night.” He concluded that it appeared to be “condemned to sour monotony.”
The Parkway endures as a symbol of disappointment in Philadelphia. In a 2014 piece, Gregory Heller—a biographer of Bacon—wrote that the “Parkway looks good on the Fourth of July and during the [Philadelphia] marathon. But otherwise it’s an underutilized, poorly planned, highway wasteland of an urban space.” Heller argued that Philadelphians “should face up to the fact that as much as we love the view from the museum steps, and the boulevard of flags on the evening news, the Parkway is a disaster and the city would be better if it were never built.”
During the spring and summer months, the Parkway transitions into a landlocked pit for festivals and concerts. Temporary security fences and portable bathrooms vandalize a frustrating boulevard of unfulfilled promises. In a recent Philadelphia Inquirer column, the architecture critic Inga Saffron wrote that this year’s “anniversary events, which…focus on high-toned cultural offerings, are a painful reminder that we still haven’t figured out what the Parkway should be.”
Even as the Parkway struggles, Philadelphia is experiencing a renaissance. Over the past decade, the city has witnessed a boom in commercial and residential development, game-changing investments by universities, and the gentrification of neighborhoods that had recently suffered from crime and blight. The Fairmount neighborhood, near the Art Museum, has turned into a thriving residential quarter and a popular nightlife destination. A controversial property-tax abatement program, spearheaded by the city and embraced by developers, contributed to this rapid change. Philadelphia, a city with an uncertain future in the 1990s, continues to enjoy an upward trend in population growth, business investment, and neighborhood-level improvements.
Philadelphia’s leading planners still debate how to turn the Parkway into the European-style street envisioned a century ago. The city is leading a study to properly manage festivities and for-profit events. Frequent street closures and disruptive concerts spoil the revival occurring in the Parkway’s surrounding neighborhoods. The city’s study would join numerous plans previously intended to address the Parkway’s flaws.
In 1999, Paul Levy, president and CEO of Center City District, called for more pedestrian-friendly park space, apartment buildings, and cultural venues. Levy’s radical proposals languished through the decade. In 2013, however, Philadelphia’s park officials released “More Park, Less Way,” which prescribed ideas to attract people to the Parkway. In recent years, the Parkway experienced minor improvements, including reconfigured sidewalks, expanded plazas on the north side of Logan Square, and the opening of a serene art museum, the Barnes Foundation. But the Parkway lingers as a lonely pocket in a lively setting, a major city artery with cultural institutions more likely accessed by taking Uber than walking.
In 1998, Buzz Bissinger wrote A Prayer for the City, a magnificent profile of the Rendell years and Philadelphia’s fight for survival during that period. Nearly two decades later, Philadelphia’s prayers appear answered. A city that ignited America’s industrial rise seems ready for the future. But the Parkway, designed to signify Philadelphia’s place in the world, remains in purgatory, awaiting special dispensation. The Parkway, along with Logan Circle, has evolved significantly since the Cathedral opened its doors to immigrants. For now, it remains a place to daydream. It could use a prayer, or a lucky penny tossed into Logan Circle’s fountain.
When responding to his administration’s darkest hours or confronting steadfast opponents, President Donald Trump turns to his social media arsenal, deploying an artillery of tweets to shock and awe the political, cultural, and broadcasting elite. But Twitter offers only temporary relief, a virtual discharge while confined to the lonely White House. For Trump, hitting the road is the ultimate release, and it’s before admiring and unwavering crowds that the president campaigns for his preferred platform and finds validation. Nearly a year after his stunning victory, Trump’s rallies remain an unchanged ritual, comprised of a congregation celebrating the president’s populist agenda with all its solemnities.
But Trump didn’t invent this exhibition or emotional approach. It was Morton Downey, Jr., a bombastic provocateur, who transformed our media culture through his nationally syndicated television show in the late 1980s. This autumn marks the thirtieth anniversary of The Morton Downey Jr. Show’s debut on New York’s WWOR-TV, and while the short-lived program ended with scandal and fleeing advertisers, it still established a rhetorical style replicated by talking heads, politicians, and ultimately, a winning presidential candidate.
Downey, who died in 2001, presided over the original Trump rally. His show launched during the final stretch of Reagan’s presidency, a period of cataclysmic change in the media realm that was only realized in hindsight.
The era’s shifting cultural, political, and economic fault lines slowly cracked the foundations of established institutions, whether it was print publications, network news, or political parties. And yet it remained a simpler time, when households digested current events in local newspapers, tuned into Tom Brokaw or Dan Rather on countertop televisions over dinner, and understood the clear ideological sympathies of Republicans and Democrats. These households, often working class, were located in small to mid-sized cities and aging postwar suburbs that were only beginning to witness the rapid changes wrought by technology and globalization. Downey intuitively understood the budding frustrations of these households, and he tapped into that resentment before a fawning television studio audience.
Downey’s show arrived just months after Gary Hart’s ignominious downfall as the presumptive Democratic nominee in the 1988 presidential election. Hart, a U.S. senator from Colorado, was considered a formidable candidate against then-Vice President George H.W. Bush, but rumors of adultery and a newspaper’s stakeout of his Capitol Hill home created a media frenzy that ended his campaign. The frantic coverage of this scandal was a “Franz Ferdinand” moment, one which completely transformed the confluence of American media, politics, and culture just as the Austro-Hungarian Archduke’s assassination in 1914 changed the course of geopolitical history.
If there is a Guns of August-like book that brilliantly captures this period, it’s Matt Bai’s 2014 account of the Hart scandal, All the Truth Is Out. “Within about a decade or so of Gary Hart’s spectacular collapse,” wrote Bai, “new technologies—first the satellite revolution that made twenty-four-hour news possible, and then the advent of the Internet and the rapid spread of broadband technology—would obliterate this old order, creating a new world of instantaneous and borderless information.” In a cruel twist of irony, Hart spoke on the campaign trail about how America should prepare for the 21st century’s inevitable technological upheaval. What Hart failed to anticipate, however, was how a figure like Downey would trigger Americans’ emotional response to these changes.
Although raised in a world of privilege, Downey was always an outsider. Born in Los Angeles in 1932, Downey was the oldest son of Morton Downey, a famed Irish tenor during the Depression era (“When Irish Eyes Are Smiling”), and Barbara Bennett, an actress and dancer.
Dysfunction, marital drama, and violence marked Downey’s formative years. His alcoholic mother abandoned the family for a B-movie cowboy actor, and his father instilled fear with his ferocious temper. But the Downey family, for all their faults, still enjoyed the trappings of celebrity. Before they were shipped to Connecticut to be raised by their paternal grandparents, Downey and his siblings grew up in opulent homes and counted the Kennedy family as neighbors in Hyannis Port. By age 10, Downey was sent to military school, and his educational quest continued with a degree from New York University, a J.D. from a correspondence school, and a doctorate from an unaccredited college in California.
Downey inherited his father’s love for show business, but Morton Downey, Sr. responded to his son’s interest with stunning cruelty. “The name Downey opened doors,” Downey recalled in a 1988 interview with People. “But they slammed just as fast because that same name controlled them.” Downey’s father was perversely determined to keep his son out of Hollywood, going as far as arranging for his son’s imprisonment for bouncing a check. Downey served 61 days in a California jail. His father finally apologized many years later.
Downey persevered in spite of his father’s opposition, and his career trajectory included working as a singer, disc jockey, lobbyist, co-founder of the American Basketball Association, and eventually a radio host in a half dozen cities. He was also a campaign aide, first working for both John and Robert Kennedy’s presidential runs before switching sides, joining the National Right to Life Organization, and later running for president on the American Independent party ticket in 1980.
As a radio host, Downey deployed an evocative style that drew listeners to his show while also imperiling his standing with station bosses. In 1984, Downey was fired from a Sacramento station, fortuitously leaving his seat to be filled by a former Kansas City Royals ticket salesman named Rush Limbaugh. It was not until 1987, at age 55, that Downey got his lucky break with WWOR, which had relocated to a studio in Secaucus, New Jersey.
Known by natives as “WOR,” WWOR-TV’s popular programming originally reached audiences in the New York metro region, New Jersey, Philadelphia and its suburbs, and pockets of Pennsylvania. The station’s most popular host was Joe Franklin, who interviewed a comically eclectic mix of celebrities, faded stars, and unknown performers. It was Bob Pittman, a producer and co-founder of MTV, who arranged Downey’s debut in what was then a superstation with 20 million people picking up WWOR’s signal nationwide.
Downey’s show developed a loyal cult following. His confrontational persona, distinguished by an intolerance for excessive political correctness, especially resonated with a large segment of the increasingly post-industrial Mid-Atlantic.
His audience, whether admiring baby boomers or members of Generation X, typically lived in the cities and towns grappling with the stagnating wages, limited job prospects, rising crime, demographic changes, and technological upheaval of a newly emerging global economy. Their families lived in bungalows on Staten Island, rowhomes in northeast Philadelphia, ranches in Levittown, and duplexes throughout Pennsylvania’s anthracite coal region and Lehigh Valley.
The typical audience member was a young male from a third- or fourth-generation ethnic Catholic family, residing in a neighborhood showing the early symptoms of urban decline. His homemaking grandmother had listened in her youth to Morton Downey, Sr. crooning on the Victrola, his grandfather recalled Father Coughlin’s radio diatribes before World War II, and he or his parents had watched TV host Joe Pyne’s combative interviews on WDEL-TV in the 1960s. Downey communicated in a way that clicked easily with this demographic, which understood the language of conflict in working-class urban America.
“The element that makes Mort work is that he is truly not a conservative; he is a populist,” said Pittman in a 1988 interview. “If you watch him, he is sometimes on the liberal side of the issue. He bounces around. What he’s really interested in is the little man’s opinion: that’s really what he is representing.” Noting Downey’s television style, Pittman concluded that, “Downey doesn’t take an intellectual approach to things, but an emotional approach.”
This approach manifested itself in Downey’s television optics, which hooked the viewer and rallied the crowd. Each episode opened with a stadium-style entrance: Downey, kibitzing with a pack of crew members backstage, slowly approached the audience. A pulsating bass played and the adoring crowd cheered as the grinning Downey greeted individual audience members, slapping high-fives with men and kissing or hugging women. Typically attired in a Ralph Lauren oxford-cloth shirt, regimental tie, khakis, and loafers, Downey faced the camera on-stage, announcing the controversial theme for that evening. Downey then cued the intro video, a memorable theme depicting the host wearing boxing gloves, making strange faces, and sporting his pearly-white smile amidst a dizzying montage of newspaper headlines, military tanks, women’s legs, TV sets, and electric guitars.
The first episode aired on October 19, 1987, known as Black Monday, when the Roaring Eighties ended with a global stock market crash. The show became an instant hit—it was nationally syndicated by 1988—but critics were unimpressed with Downey’s approach. “The Morton Downey Jr. Show is, beneath the surface, a show about bullies,” wrote David Bianculli, the New York Post’s TV critic. “Downey, with the power of TV, becomes Head Bully, and encourages members of his audience to ‘get involved’—the same way the guy with the torch encouraged the mob around Frankenstein’s castle to get involved.” When the cameras rolled, Downey was “ready to fight and win,” wrote New York’s David Blum. “Facing him are his fans. They want blood, and Mort is going to give it to them.”
The chain-smoking host (he estimated smoking an average of 80 cigarettes a day) ambushed his guests on episodes that addressed political incorrectness, divorce, the death penalty, race relations, affirmative action, Communism, and Vietnam. He invited audience members to approach his “Loud Mouths,” two podiums sporting the show’s open-mouth logo. Guests or audience members stated their opinions, but their remarks were often interrupted by Downey calling them “pablum-puking liberals” and barking “zip it” or “shut up.” His shows even sparked physical confrontations among guests, with CORE chairman Roy Innis once knocking the Reverend Al Sharpton to the ground. Downey was once forced to break up a potential stage brawl in Harlem’s Apollo Theatre.
Downey unsettled critics and advertisers, but audiences craved his brash populism. He presented a conservative-leaning viewpoint, yet his positions remained ideologically complex and fit neatly into neither party. His sympathies were always with the working class, whether he was supporting government-funded healthcare for the elderly or discussing the perils of illegal immigration.
On his own TV show, On the Record, Alan Dershowitz asked Downey how he could appeal to this frustrated group of people. “Because I was frustrated most of my life,” he responded. “I’m able to come out there and present my frustration which I now recognize are the frustrations that every American goes through at one time or another in their life.” Downey told the Associated Press that Americans saw him as a “loud-mouth who gets in trouble like they do, who’s had problems like they had, someone that they can identify with a lot more than someone who’s squeaky clean.”
For all his faults, friends and foes alike acknowledged that Downey’s crude media persona didn’t match the kind and humble man they knew off camera. He was simply a performance artist who enraptured the country, but his show carried an expiration date. By 1989, advertisers had fled what was labeled “trash TV” or “tabloid TV.” Downey attempted a gentler show, but off-stage antics hastened its demise. In April of that year, Downey claimed he was assaulted by a group of neo-Nazi skinheads, but his high-profile story didn’t add up. The incident was widely dismissed as a hoax, and by September his producers and distributor had announced the show’s cancellation.
Fast forward three decades and we now have a president who makes Downey’s show seem relevant and fresh. The parallels between Downey and Trump are stunning. Both were privileged sons of complicated men who persevered in their respective fields, embraced show business, nurtured growing audiences with their working-class language, encountered business failures in Atlantic City’s casino industry, and created onstage spectacles that captivated the nation’s attention. Both cultivated a base of aggrieved Republicans and Democrats who once cheered at Downey’s crude insults and now laugh at Trump’s stinging tweets.
In 1989, Downey taped an interview with Trump, who discussed America’s trade imbalance with Japan. “It’s a huge problem, and it’s a problem that’s going to get worse,” Trump said. “And they’re laughing at us.” Downey also asked Trump if he resented press criticism. “I think I used to resent it, I’m not sure I do anymore,” Trump responded. “Somehow, over a period of time, you become harder to this. You also realize it means less. It’s yesterday’s news. They take a shot at you in the newspaper, some reporter doesn’t like something for his own personal reason, so they take a shot. I find now—10 years ago I used to say ‘boy how could they do that, it’s wrong, it’s unfair,’ I find now that it really doesn’t matter.”
Trump is a cultural product of the late 1980s, and Downey deserves posthumous credit for creating the media ecosystem that led to this administration. The country continues to rapidly change, but the issues remain the same. The modern working-class Americans who appreciate Downey or Trump fail to click with either political party. They’re just navigating the frantic pace of a global economy that often appears to present little hope for their future. In 1987, their outlet for populist release was Downey’s show. Now it’s Trump’s presidency. Downey is long gone, but his legacy has evolved into a new political frontier whose boundaries have yet to be defined.
Charles F. McElwee III works in the economic development sector in northeastern Pennsylvania. Follow him on Twitter at @cfmcelwee
The great 20th-century American novelist F. Scott Fitzgerald often evoked the fleeting passage of time. He described the final weeks of summer in This Side of Paradise as that “sad season of life without growth.” On a particularly warm summer afternoon, The Great Gatsby’s Daisy Buchanan lamented her boredom. “‘What’ll we do with ourselves this afternoon,’ cried Daisy, ‘and the day after that, and the next thirty years?’” Her friend Jordan Baker dismissed Daisy’s morbidity, reminding her that, “Life starts all over again when it gets crisp in the fall.”
The mood in Washington easily corresponds with Fitzgerald’s melancholic theme. Congress returns to Capitol Hill this autumn hardly rejuvenated. President Donald Trump, in survival mode without a substantive legislative accomplishment, will market tax reform. Speaker Paul Ryan and Senate Majority Leader Mitch McConnell, estranged allies of the White House, will pursue their own tax reform plan while also addressing the devastating effects of Hurricane Harvey. Healthcare legislation appears terminal, and unforeseen developments in foreign affairs could disrupt everything. And so Trump, along with a Congress controlled by his own party, begins the season without a clear legislative agenda or ideological path.
With Congress back in session, Union Station’s concourse will swell with commuters, congested restaurants will turn diners away, and the Trump International Hotel will maintain its distinction as this administration’s crossroads for lobbyists, policy makers, tourists, and loyal officials.
The Trump Hotel operates in the Old Post Office building, a stunning Romanesque Revival structure completed during the gilded 1890s. Its granite façade, crowned with a domineering clock tower, occupies a full city block on Pennsylvania Avenue. In the 1970s, the building faced an uncertain future on a deteriorating stretch of the street that also included the White House. It was Daniel Patrick Moynihan—esteemed policy wonk, indefatigable egghead, and later U.S. senator from New York—who crusaded to save the avenue he deemed the “main street for both the city and the nation.” Moynihan’s lobbying endeavors resulted in Congress’ creation of the Pennsylvania Avenue Development Corporation, which renewed the thoroughfare and saved the Old Post Office from the wrecking ball.
As meetings and dinners commence beneath the hotel’s chandeliers, perhaps Trump’s acolytes should revisit the man credited with saving their place of congregation. After all, it was Moynihan who gave Hillary Clinton his blessing to run for his Senate seat in 2000—a decision that indirectly led to this presidency. But the senator and his wife Liz were known for their reservations about the Clintons. In 2008, just a few years after Senator Moynihan’s death, Liz endorsed Barack Obama over Clinton, arguing that Moynihan would have approved the candidate’s work to “rekindle hope in our young as he encourages them to participate in the political process.” Liz, who served as her husband’s campaign manager, was also dismayed by Clinton’s hostility toward Obama’s candidacy.
We know the outcome of Clinton’s story, and the former candidate will exhaustively remind us of that story as she launches her autopsy tour in the coming weeks. But could the Trump administration, members of Congress, and pundits learn from her predecessor’s legacy? As we navigate the uncertain contours of this political era, could Moynihan’s voluminous writings, contradictory policy positions, legislative tactics, and astute synopsis of socio-economic problems offer a mild prescription for our present paralysis?
* * *
Daniel Patrick Moynihan was born an outsider. The son of a journalist and a homemaker, he grew up in a dysfunctional Irish Catholic family afflicted by the working-class issues that fuel populism. When Moynihan’s alcoholic father abandoned the family, his childhood came to feature the unhappy experiences of shining shoes, hopping from apartment to apartment, and witnessing mass poverty in pre-war New York. Moynihan’s sociological diagnoses stemmed from this youth.
“From the wild Irish slums of the 19th-century Eastern seaboard to the riot-torn suburbs of Los Angeles,” wrote Moynihan, “there is one unmistakable lesson in American history: a community that allows a large number of young men to grow up in broken families, dominated by women, never acquiring any stable relationship to male authority, never acquiring any set of rational expectations about the future—that community asks for and gets chaos.”
When Moynihan issued this controversial warning, he could demonstrate that he succeeded in spite of exposure to this ecosystem. Moynihan graduated valedictorian of his high school class, worked on the piers of the Hudson River, earned his college degree from Tufts University on the G.I. Bill, became a Fulbright fellow, and completed his doctorate. He served as an advisor to both Democrats and Republicans—including the Kennedy, Johnson, Nixon, and Ford administrations—but he never wavered in his allegiance to Franklin Delano Roosevelt’s New Deal coalition. Moynihan believed that a person’s political affiliation was determined by the year he was born, and as a product of 1927, he considered himself in alignment with FDR’s Democratic Party.
But it remains difficult to decipher Moynihan’s ideological fidelity. He espoused liberalism, yet neoconservatives, neoliberals, and traditional conservatives considered him a member of their respective tribes. We are left with his innumerable writings, and they show a scholar whose views easily melded with those shared by Nathan Glazer, James Q. Wilson, Irving Kristol, Norman Podhoretz, and William F. Buckley. Moynihan was a creature of civility, and he shared this perspective during a period when conservatism ascended and liberalism retreated. Nearly 15 years after Moynihan’s passing, we find ourselves in a political epoch where both ends of the ideological spectrum appear in flux. During these years, blunders in Iraq and Afghanistan—accompanied by economic collapse, technological upheaval, rapid demographic change, and unrepentant globalization—have engendered disillusionment, socio-economic instability, and ultimately populist backlash.
Although Moynihan sometimes seemed contrarian for the sake of it, his work remains a brilliant exercise in prescience, providing borderline biblical wisdom for our present challenges in domestic and foreign affairs. If Moynihan were alive for the present upheaval, he would see his most acute societal warnings confirmed. He would understand why Trump is president, for he criticized both parties’ political failures for decades.
The Trump administration and Congress should take note of Moynihan’s prophecy. In 1992, he wrote that U.S. healthcare suffered from “Baumol’s disease,” commenting that the system was “plagued by cumulative and persistent rises in costs at a rate which normally exceeded to a significant degree the corresponding rate of increase for commodities generally.” Moynihan’s invocation of this economic phenomenon is hardly dated, and if Congress revisits healthcare reform, rising costs remain a fundamental problem. Yet as Senate Finance Committee chairman during Bill Clinton’s health care pitch in 1993-1994, Moynihan curiously declared that there was no crisis, playing an important role in derailing the universal health care endeavor led by Hillary Clinton and senior advisor Ira Magaziner. But at the time, Moynihan questioned Clinton’s “intentionally obscure” proposal and shared the belief that the overall plan was too coercive.
On Obamacare, the perpetually skeptical statesman would have supported some type of reform. Moynihan would also champion welfare reform, rejecting the idea of permanent governmental dependence. The New Deal always guided Moynihan, and he was a beneficiary of that period, yet he was hostile toward the excesses of welfare that sprouted from Lyndon B. Johnson’s Great Society. Regardless, Moynihan still expressed concern for how reform could affect the poor and elderly. He once told his aide, Lawrence O’Donnell, that he could easily reform health care by deleting “65 and older” from the legislation limiting the Medicare plan to seniors.
Moynihan also presented a case for tax reform. In a 1981 newsletter, he wrote about the “‘bracket creep’ that has increased the real rate of taxes for people who have had no real increase in income.” Nearly four decades and six administrations later, the statement remains timeless. “A tax system must not only be fair, it must be seen to be fair,” he declared. “It is useful to keep in mind that our country was founded in a revolution that started over unfair taxation.”
Moynihan’s commentary rarely seems dated. His positions would reassure or dishearten both sides on today’s most pressing debates. The senator opposed NAFTA in 1994, complaining that “the United States government still doesn’t seem prepared to address the needs of those workers who will lose their jobs as a result of this agreement.” But at the same time, he supported lax immigration laws, arguing that enforcement would not discourage immigrants from coming to the U.S. “So, take down the walls and you make sure this migration will continue,” he said.
Moynihan also understood the perils of drugs and their corrosive impact. His observation in 1998 still rings true for the countless communities grappling with a heroin crisis: “Since the desire of man to alter his state of consciousness is as old as human history, and technology continues to provide a breathtaking array of drugs capable of producing everything from oblivion to nirvana, I think it safe to assume that we may never win a ‘war’ against drugs.”
Perhaps the closest modern-day comparison to Moynihan is former U.S. Sen. Jim Webb, whose moderate to conservative viewpoints didn’t quite fit in either political party. Webb himself confirmed Moynihan’s influence in a 2008 interview with the New Yorker. “My prototype really is Daniel Patrick Moynihan, who was able to combine an academic, intellectual career and a government career, really put new ideas out there, and think,” Webb said.
In 1993, Moynihan wrote an American Scholar essay, “Defining Deviancy Down,” in which he argued that American society was increasingly lowering its standards for tolerating bad behavior. He lamented eroding families, rising crime, and a general permissiveness toward traditionally unacceptable behavior. Nearly a quarter-century later, it seems quite clear that what was once abnormal is now normalized. This is most evident in the post-industrial communities ravaged by socio-economic maladies. Moynihan understood these problems and offered a substantive intellectual take on the issues that resonate in these cities and towns.
In what often seems like a dark age, it is worth revisiting Moynihan’s work. An ideal start is Daniel Patrick Moynihan: A Portrait in Letters of An American Visionary, a hefty volume of the senator’s political writing, memorandums, and letters. Few lawmakers could compete with Moynihan’s wide-ranging intellectual interests, but perhaps they will find solace, or even inspiration, as they navigate our ambivalent future. But for now, Congress returns to Trump’s Washington as boats against the current, rowing toward an irresolvable future.
Charles F. McElwee III works in the economic development sector in northeastern Pennsylvania.
In a more uneventful year, the centennial of John F. Kennedy’s birth would attract media-induced introspection. But the dervish news cycle prohibits reflecting upon such benign, yet relevant, political moments. As we breathlessly keep pace with Twitter feeds, chyrons, and emailed newsletters, Kennedy’s 100th birthday appears confined to Time Life retrospectives in grocery checkout lines. Perhaps this fate was unavoidable. A cluttered Kennedy canon, accumulated over years of painstaking probes, stripped the 35th president’s life of the cultural relevance it once held.
But this frenzied present deserves some historical clarity. In a poignant commencement address this year, Peggy Noonan lamented the absence of historical perspective among young reporters. “They have seen the movie and not read the book. They’ve heard the sound bite but not read the speech,” said Noonan. “Their understanding of history, even recent history, is therefore superficial.”
And so Kennedy’s prosperous upbringing, inevitable political rise, and tragic death prove worthy of revisiting during this era of socio-economic and political upheavals. His life, and the period over which he presided, could help us interpret the fractured fault lines of this current cultural moment. Unfortunately, there is no series that fully captures the kingdom, the power, and the glory of the Kennedy hierarchy (as Robert Caro did with his multi-volume The Years of Lyndon Johnson). But the latest addition to Kennedy’s biographical canon, The Road to Camelot, is an important companion to understanding the presidential campaign process and the coalitions that produce surprising, yet razor-thin, electoral outcomes.
Written by veteran journalists Thomas Oliphant and Curtis Wilkie, the book entertains without injecting political gossip and analyzes without risking historical groupthink or indulgent revisionism. It demonstrates how Kennedy’s victory in 1960 was the culmination of years of preparation and grassroots recruitment. Under the initial oversight of Kennedy’s father, Joseph, the family’s inner circle commanded what became an unconventional path. Following the Democratic convention in 1956 and through the 1960 election, the Kennedy campaign tirelessly navigated a volatile and evolving electoral map. During this five-year quest for national office, Oliphant and Wilkie write, Kennedy’s non-stop “traveling, speaking, maneuvering, and writing enabled his ‘positives’ (fresh, vigorous, new thinking, Catholic, activist) to keep his ‘negatives’ (inexperienced, unknown risk, bigger government, Catholic) at bay.”
When lamenting the present electoral landscape, pundits often blame reforms that disempowered the professionals and hacks of both political parties. But Oliphant and Wilkie remind us that rebellion against party norms is nothing new to campaigning. Kennedy disdained how the Democratic machine operated, and after his first congressional victory in 1946, he aggressively worked outside the party’s suffocating confines. With the help of his brother Robert and a pack of loyal comrades, Kennedy engineered his rise as a national political figure through grassroots organization (“Kennedy secretaries”), innovative marketing, and policy triangulation. Kennedy’s cozy relationship with the press helped this endeavor. The family’s fortune and fame attracted regular coverage, and the frequent magazine profiles and photo spreads made the candidate’s rise seem almost pre-destined. Regardless, Kennedy was an energetic campaigner, having made over 140 appearances across the country in 1957 alone.
The Road to Camelot shows how the mythology of Kennedy, masterfully crafted at the time, concealed the sins that would imperil any modern campaign. The authors are clearly fond of their subject, but they present Kennedy’s odyssey without clouding the reality of dirty tricks, reckless behavior, and extramarital affairs. Kennedy successfully hid these darker compartments thanks to an adoring press. This turned out much to Vice President Richard Nixon’s misfortune. Unlike his opponent, Nixon encountered a series of unlucky and widely reported developments throughout 1960. During a critical campaign stretch, an infected knee confined Nixon to bed rest. At one point, even his own boss, President Dwight Eisenhower, humiliated him at a White House press conference. When asked if he could give an example of any major idea Nixon contributed to his presidency, Eisenhower responded, “If you give me a week, I might think of one. I don’t remember.”
The campaign’s broader mythology also wounded Nixon. Countless profiles recount Nixon’s cadaver-like appearance at the first presidential debate, but the initial reaction to both candidates’ performances was more measured. As New York Times columnist James Reston concluded that September, the debate “did not make or break either candidate.”
Nixon could never compete with Kennedy’s telegenic persona, a gift less endowed than painfully crafted through constant practice and diaphragm exercises. Kennedy’s advertising team also embraced the embryonic stages of the modern Information Age. Their tactics seem primitive and innocent compared to the algorithms, data mining, and psychological profiles compiled by today’s campaigns. But the campy jingles, pamphlets, and television commercials were innovative measures that raised high hopes for Kennedy’s electoral success. After all, this was the pre-smartphone age, a time when the critic Dwight Macdonald could soberly make the following declaration: “As smoking gives us something to do with our hands when we aren’t using them, Time gives us something to do with our minds when we aren’t thinking.”
* * *
The Road to Camelot also reminds us that Kennedy understood how Catholicism could derail his electoral chances. But Kennedy refused to downplay this factor, embracing his Catholic identity throughout the campaign trail. A top advisor, Walt Rostow, recalled Kennedy’s decision to barnstorm the West Virginia primary. Campaign officials feared a punishing defeat, but Kennedy understood that confronting a state with historically anti-Catholic pockets was necessary. “I have no right to go before the Democratic convention and claim to be a candidate if I can only win primaries in states with 25 percent or more Catholics,” Kennedy said. “I must go in there. And I am going in.”
The authors devote generous coverage to Kennedy’s Catholicism, showing how he gained national popularity without religion disqualifying his candidacy. Beginning in 1956, Kennedy delegated his long-time aide, Ted Sorensen, to research the Catholic vote. Sorensen’s findings resulted in the “Bailey Report.” Misleadingly attributed to John Bailey, a party boss from Connecticut, Sorensen’s report explored the viability of a Catholic presidency. Before the report’s prescient findings, political operatives had concluded that Al Smith’s 1928 defeat spelled the end of placing a Catholic on a presidential ticket. But Sorensen concluded that there was “such a thing as a ‘Catholic’ vote, whereby a high proportion of Catholics of all ages, residences, occupations and economic status vote for a well-known Catholic candidate or a ticket with special Catholic appeal.”
The book’s exploration of ethno-religious politics is an especially powerful subject to revisit following the 2016 campaign. The Democratic platform is now beholden to identity politics. Its polarizing coalition-building ignores the ethnic Catholics who built the party in Northern and Midwestern cities during the twentieth century. Amazingly, it was only a half century ago that pastors, columnists, and politicians could openly incite anti-Catholic bigotry. When the Protestant minister Norman Vincent Peale railed against Kennedy’s Catholicism and questioned his national allegiance, the candidate responded by invoking the death of his brother Joseph P. Kennedy Jr. in service during World War II.
Oliphant and Wilkie reveal that Kennedy’s 1960 victory was no mandate. Outside of his home state of Massachusetts, more Americans voted for Nixon than Kennedy. His triumph was made possible by the Catholic turnout in states like New York, Pennsylvania, Michigan, and Illinois. For Catholics in urban and semi-rural regions, Kennedy’s election was a tribal benediction. The candidate himself was less important than what his candidacy signified. In row homes, duplexes, and shanties, Kennedy quickly earned his real estate on living room walls alongside boxers, union bosses, and the Sacred Heart.
Should we be surprised that the descendants of this coalition voted for Donald Trump? Exit polling indicated that Trump carried the Catholic vote, with Pew reporting a 7-point winning margin. Barack Obama previously won the Catholic vote in 2008 and 2012. While cultural issues certainly played a role in this shift, a closer inspection indicates where Catholic voters switched sides.
The growing literature on the plight of Appalachia deflects attention from the struggles of dwindling ethnic Catholic communities. For so many ethnic hamlets in Michigan or Pennsylvania, a middle-class existence either evaporated or never arrived. Their cash-strapped local governments fail to provide adequate services. Rapid demographic change overwhelms their school districts and neighborhoods. Manufacturers announce layoffs and retail malls close.
The collective effect erodes any semblance of social cohesion. In these communities, individuals live in the same neighborhoods where their immigrant Catholic families arrived a century before. But the jobs are gone and their parish is consolidated or closed. These Democratic strongholds supported the labor movement, disproportionately served in the First and Second World Wars, and rallied around Kennedy in the hope of cultural vindication. This coalition remembers when Kennedy famously asked what Americans could do for their country. After reading this masterful study, one cannot help but ask what the current Democratic party has done for working-class Catholics.
Charles F. McElwee III works in the economic development sector in northeastern Pennsylvania.
The Past and Future City: How Historic Preservation is Reviving America’s Communities, Stephanie Meeks, Island Press, 352 pages.
In the mid-1960s, San Francisco enjoyed a blue-collar existence. It remained a city where Sicilian fishermen caught salmon, painted boats, and repaired nets along the pier. Middle-class families contentedly resided in ethnic neighborhoods that could easily blend into any East Coast city. Joe DiMaggio, long retired from the New York Yankees, carried out an enigmatic existence running a restaurant along Fisherman’s Wharf.
DiMaggio’s retreat to San Francisco inspired legendary journalist Gay Talese to visit the baseball legend and his home terrain in 1966. When Esquire published Talese’s classic piece, “The Silent Season of a Hero,” San Francisco was already distinguishing itself as a tourist attraction and haven for the hippie counterculture. But the city had yet to experience the shifting socio-economic fault lines that now make his portrayal seem more like a time capsule than a timeless cinematic presentation.
In today’s San Francisco, the idea of enjoying a reasonable financial existence is foreign to the descendants of immigrants who flocked to the Bay Area in the late 19th and early 20th centuries. Google’s private buses shuttle tech bros down to Silicon Valley in a city where the unrepentant pace of gentrification and technology are erasing any sense of historical identity. To live a middle-class life in San Francisco is now a form of augmented reality. The city’s transformation, and the economic maladies that inevitably arose, is one of the subjects of Stephanie Meeks’ The Past and Future City, a manual describing the role of historic preservation in America’s cities.
Meeks explores how gentrifying neighborhoods imperil heritage businesses and displace middle-class and immigrant families—disregarding the prescriptions of diversity from Jane Jacobs’ influential 1961 book, The Death and Life of Great American Cities. According to Meeks, commercial rents in San Francisco have risen 250 percent since 1999. In 2014 alone, an estimated 13,000 businesses closed, including 4,000 that had operated for over five years. While San Francisco has responded by creating a Legacy Business Preservation Fund, which provides grants to businesses and nonprofits that have a historical impact on local neighborhoods, the city has yet to resolve the problem of balancing gentrification with making the city financially habitable for all residents.
Meeks, who co-authored the book with Kevin Murphy, oversees the National Trust for Historic Preservation. During her tenure, the organization has worked to revitalize communities and create programs like National Treasures, which identifies threatened historic places and aggressively works to save them. Meeks admirably reminds readers that preservation is an issue far more complex than just rescuing old buildings. Instead, saving places is about “defin[ing] a community so that future generations can know their past, feel a connection to those who came before, and build a foundation for the future.”
The book is frequently a lamentation of the casualties of gentrification. As major cities celebrate a renaissance of new commercial and residential development, long-time residents mourn the loss of storied bars, diners, and stores. To express grief over a closed coffee shop or record store is understandable, but it is now fashionable to grow nostalgic over the squalor that once defined today’s thriving neighborhoods. Many baby boomers, harkening back to the ‘70s and ‘80s, habitually romanticize the stomach-churning sense of apprehension one once felt walking to a row-home apartment or passing seedy bars and theaters. In hindsight, this atmosphere allegedly sparked creativity.
Publishers have embraced a literary niche of memoirs mourning how gentrification sterilized the grime and extinguished the last vestiges of baby boomer youth. Mayor Abe Beame’s New York is usually the stage for these autobiographical expressions of sorrow. Whether it’s Patti Smith’s Just Kids or James Wolcott’s Lucking Out, we’re told that art, love, and happiness thrived in a downtown Manhattan that featured crumbling tenement buildings, lurking muggers, and drug addicts, along with a municipal balance sheet full of red ink.
None of these books would exist if that metropolitan Dark Age had endured. The luxuries of safety, stability, and success allow these writers to regret whatever blight and decay wasn’t preserved. What stands out among this canon is Anatole Broyard’s Kafka Was the Rage, a reminiscence of New York just before the nadir of the 1970s. Broyard, who wrote for the New York Times, movingly described Greenwich Village as it existed immediately following World War II, when returning soldiers—armed with the G.I. Bill—flocked to the neighborhood to learn, create, and debate in a bohemian ecosystem.
“Though much of the Village was shabby, I didn’t mind,” recounted Broyard. “I thought all character was a form of shabbiness, a wearing away of surfaces. I saw this shabbiness as our version of ruins, the relic of a short history. The sadness of the buildings was literature. I was twenty-six, and sadness was a stimulant, even an aphrodisiac.”
Of course, to be young and comfortably enjoy artistic pursuits in today’s Village is a fantasy. But should we condemn the unforeseen consequences of gentrification? Concerns about its impact on city neighborhoods date back to the 1960s. Losing a small business or facing an apartment eviction are tragic outcomes, but do native, poorer residents really lament cleaner and safer streets?
Meeks cites studies from the University of Washington and Columbia University that found poorer residents less inclined to flee a neighborhood undergoing gentrification or experiencing higher rents. “The most plausible interpretation,” wrote Columbia’s Lance Freeman and Frank Braconi, “may be the simplest: As neighborhoods gentrify, they also improve in many ways that may be appreciated by their disadvantaged residents as by their more affluent ones.”
Arguably, it’s through gentrification that preservation prevails in city neighborhoods. An investor is more inclined to convert an aging corner store into a microbrewery if the neighborhood sheds its unpleasant past. In American cities lucky enough to experience gentrification, the survival of historic buildings often depends on a neighborhood’s transformation.
Unfortunately, a healthy city doesn’t always translate into municipal leaders protecting historic buildings and neighborhoods. Instead, the mutually dependent forces of globalization and technology threaten our traditional concept of cultural or architectural heritage. In San Francisco or New York, the explosion of private global wealth since the 1990s created a demand for premium housing. For Manhattan, this radical development resulted in the construction of narrow high rises that puncture the island’s elegant skyline. But this dramatic change also expanded the boundaries of New York’s prosperity. Without gentrification, the blocks of structurally unique housing in Harlem, Park Slope, or Fort Greene would have fallen into disrepair.
Meanwhile, in Philadelphia, a generous tax abatement program—combined with flawed preservation policies—changed the landscape of Center City and its surrounding neighborhoods. During the past decade, the city experienced a demolition bonanza. In neighborhoods of historic housing like Powelton Village, developers “have discovered they can make a tidy sum simply by replacing one of these old houses with a stucco-clad apartment building and then cramming it with students,” observed the Philadelphia Inquirer’s architecture critic, Inga Saffron.
Preservationists have protested how Philadelphia’s demolition permits are approved. In the past year, two historic structures were razed for projects that never materialized. Currently, the preservation fight is focused on Center City’s Jeweler’s Row, a charming set of commercial properties on a block laid out in 1799. The historic storefronts, slated for demolition, would be replaced by a massive condo tower. Ill-advised demolition projects continue throughout Philadelphia, a city whose future depends on an influx, however temporary, of millennials and students.
Cities like Philadelphia risk losing what shaped their identities, but their economic perseverance and favorable location at least provide prospects for preservation-minded investment projects. In The Past and Future City, Meeks’ greatest shortcoming is her failure to address how preservation can succeed in smaller Rust Belt cities confronting economic decline, urban blight, and rapid demographic and cultural change. After all, any of these post-industrial hubs would welcome the fruits of gentrification.
In recent months, field producers for cable news networks have flocked to these cities, attempting to document how Donald Trump triumphed in this so-called “flyover country.” Arriving like colonial administrators, they snap photos of dilapidated buildings and shuttered factories before heading to the nearest bar or fast-food restaurant to interview natives about how they endure in places overcome by urban decline.
Revitalization and reinvestment shouldn’t be confined to coastal cities. At a time when New York and Philadelphia regret what gentrification wrought, smaller urban pockets aggressively search for ways to make such blessings occur. Unfortunately, these communities do not enjoy the socio-economic advantages that allowed larger cities to overcome urban decay.
In countless smaller cities, multi-generational families reside in neighborhoods overwhelmed by blight and dysfunction. In 1982, George L. Kelling and James Q. Wilson famously attributed urban decay to modern mobility patterns and the retreat of police authority as vagrancy laws fell into disuse. They noted that before World War II, “City dwellers—because of money costs, transportation difficulties, familial and church connections—could rarely move away from neighborhood problems.” This forced residents to reclaim normalcy on their city streets.
We have returned to that pre-World War II trend, but without the institutions that once held these communities together. Stagnant wages leave many no alternative but to live in neighborhoods overwhelmed by drugs, gangs, and blight. Rapid demographic and economic change—accompanied by a fragmenting civic culture—makes it difficult to apply the lessons of historic preservation in these cities. Magnificent old churches are shuttered or demolished. Stunning commercial or industrial buildings—the legacy of architects trained in New York or Paris—fall into disrepair. The fading vitality of a city’s physical surroundings becomes emotionally paralyzing. As Pete Hamill reminds us: “Nostalgia is genuine—you mourn things that actually happened.”
If applied properly, practical attempts at preservation could protect architectural treasures, build community morale, and improve overall neighborhood-level sociability. But gentrification may never arrive in these communities. While Meeks discusses the challenges of gentrification in major cities, she does not explain how post-industrial regions can embrace preservation without economic development as part of the cycle of urban revitalization.
For Meeks, a critical preservation tool is flexible or mixed-use zoning, which permits historic structures to be readily converted through adaptive reuse. But this could easily backfire in smaller cities. As Nicole Stelle Garnett wrote in her brilliant work on land use policy, Ordering the City, mixed-use zoning “might lead some low-income entrepreneurs to establish the types of businesses equated with urban decay.” In other words, a city may develop a comprehensive plan that encourages mixed-use zoning, but the resulting pawn shops, used-car lots, and check-cashing stores would destroy the character of whichever neighborhood was targeted for preservation.
Smaller cities and towns boast architectural, cultural, or industrial legacies that could unite communities, encourage preservation, and, if properly marketed, spur revitalization. But preservation transcends saving an old office building, factory, or church. It’s no longer decay that risks deconstructing our urban landscapes. Empty pews, folded newspapers, and defunct social clubs remind us that the foundation of America’s civic culture is collapsing.
As we retreat to a parallel existence through pixelated screens, our society becomes disengaged from the pillars of heritage that created communities, sustained neighborhoods, and brought families together. A steeple, marquee, or bank entrance tells a story. How we preserve that story remains to be seen. At the very least, repairing our civic culture could save many struggling cities and towns that still nurse an acute sense of culture and history. But gentrification is an unavoidable step in embracing the neighborhoods of our forefathers. It’s only then that we can come to love these cities as William Faulkner loved his native Mississippi—in spite of, not because.
Charles F. McElwee III works in the economic development sector in northeastern Pennsylvania.
Writing in his memoir about the literary avant-garde of Paris and New York in the 1950s and 1960s, Richard Seaver lamented the imperfections of memory. The late publisher and editor cautioned that “Time is not kind to the harried mind.” Seaver explained that he “tried to tell these tales as they were, accurately and fairly, and if I have erred or memory has betrayed, blame it on those Irish genes my wife has encountered over the years.”
It’s a safe assumption that the once prominent American writer John O’Hara would never issue such an apologia. Irish genes predispose their recipients to hyperbole and embellishment, but O’Hara wasn’t a carrier of this ancestral foible. His meticulous devotion to detail produced novels and short stories that captured the cultural temperament and consumer preferences of America between the 1880s and World War II. He didn’t want us to forget either, as his epitaph suggested: “Better than anyone else, he told the truth about his time. He was a professional. He wrote honestly and well.”
O’Hara died in 1970, but the imperfections of literary appreciation—accompanied by the unremitting passage of time—have sentenced the author’s work to a cruel purgatory of faded memory. The latest attempt to revive O’Hara’s legacy comes through the Library of America, which recently released a volume of sixty stories. The collection, compiled by the New York Times’ Charles McGrath, reminds us that O’Hara remains the Bard of the American Short Story.
Once known as the “master of the fancied slight,” O’Hara has suffered a posthumous assault for his grudges, arrogance, and alcohol-induced temper. His contemporaries in mid-20th century Manhattan acknowledged his talents but denied him absolution. And so O’Hara’s penance lingers with each passing decade, with no revival restoring his invaluable and boundless literary contributions.
As critics frantically search for historical and literary clarity during this newly christened Trump era, John O’Hara’s work is a necessary start. His stories provide a factual account of the period of unparalleled growth during the late 19th and early 20th centuries. At a time dominated by Snapchat, where people express without reflection, O’Hara offers a curative window into how people actually think, feel, and experience the extraordinary and ordinary of everyday life.
Born in 1905, O’Hara didn’t nurture his anthropological sense of place on the tension-soaked streets of some coastal Eastern metropolis. His brilliant gift for observation and ear for dialogue germinated instead in Pottsville, a small city tucked away in the anthracite coal region of Schuylkill County, Pennsylvania. It’s impossible to understand O’Hara without understanding his hometown.
Pottsville was the commercial center of the region’s southern coal field. Approaching O’Hara’s hometown on Route 61, one can recognize the important role this city once played during the height of the Industrial Revolution. Its grid of dramatic hills supports blocks of pre-World War II architecture, structures dressed up in the ornate styles that only Gilded Age wealth could build. Overlooking the city of steeples and Federalist townhomes is the Schuylkill County Courthouse, a massive Queen Anne edifice that conceals the darkest chapter in the region’s history.
Located behind the courthouse is the county prison, a castle-like brownstone dating to 1851. It was here, on June 21, 1877, that six men were executed for their alleged acts of violence against mine bosses. The men were believed to be members of the Molly Maguires, an oath-bound Irish secret society with agrarian roots in Ulster.
During the 1860s and ’70s, Irish laborers protested the medieval conditions of the mines and company towns. A series of heinous crimes were committed against Welsh and English mine bosses, and the acts were subsequently tied to these Irish laborers. The powerful Philadelphia & Reading Railroad, which controlled Schuylkill County’s mines and railroads, hired a Pinkerton detective to uncover the alleged Mollies’ crimes. Twenty Irishmen were placed on trial and convicted by mostly German-speaking juries. The hanging of six men in Schuylkill County was part of the largest mass execution in Pennsylvania history.
Historians continue to debate the very existence of the Molly Maguires. The historian Kevin Kenny wrote that the Mollies “have been depicted in every imaginable way, from sociopaths and terrorists at one end of the spectrum to innocent victims and proletarian revolutionaries at the other.” Regardless, any association with the Mollies or Ancient Order of Hibernians risked excommunication from the Roman Catholic Church. Archbishop James F. Wood, a convert from a patrician Philadelphia family, frightened the region’s Ulster and Connaught immigrants with this threat, and so the very mention of the Mollies remained verboten through the early 20th century.
The Mollies’ legacy still haunted the region during O’Hara’s youth, a time when he digested the stark and latent cultural complexities of his surroundings. A devoted social historian, O’Hara recounted the ethno-religious tensions in his “Gibbsville” stories, the fictional name he pinned on his hometown in honor of The New Yorker’s Wolcott Gibbs. In just one example, “Afternoon Waltz,” O’Hara depicts the unspoken strain between an Irish-Catholic maid and her Welsh-Methodist host family. In spite of the divisions that the Mollies created among the Irish, O’Hara wrote that the “non-Mollies were loosely united with the Mollies by the common enemy, the Welsh Protestants, and when Sarah Lundy first went to work for the Evans family neither party could guess how the arrangement would work out.”
When not deciphering the socio-economic fault lines of his native geography, O’Hara revealed the endless flaws and disappointments of characters in Hollywood, Philadelphia, and New York. But even setting aside his undervalued novels, O’Hara’s richest contribution remains the “Gibbsville” stories, a number of which can be found in the Library of America collection. Although the stories bracket a unique period of American society, O’Hara’s themes remain timeless through his evocative character portrayals. The stories grapple with money and class, with O’Hara exposing the resentment, perversion, insecurity, and cruelty of humanity as it reaches, retains, or loses its aspirations.
O’Hara’s stories were not clever literary inventions, but accurate depictions from his own memory and experience. He was raised in a “lace-curtain” Irish-Catholic family, growing up in a spacious rowhome—across from the Yuengling Brewery—on Pottsville’s fashionable Mahantongo Street. The O’Haras enjoyed a better place in the city’s social hierarchy than fellow members of their tribe, but their position remained tenuous. His father, Dr. Patrick O’Hara, was known as “the Irish doctor,” a founder of the city’s Catholic hospital, and a legendary surgeon for victims of mining accidents.
The Molly Maguire episode, bouts of labor unrest, and a growing Democratic Party induced a lingering anti-Catholicism and prejudice against the Irish. The Klan-fueled defeat of Al Smith in the 1928 presidential election only amplified that prejudice. But the O’Haras enjoyed their place in society, and whatever subtle hostility existed was checked by their financial security.
As a boy, O’Hara enjoyed all the material possessions of a young country squire. If anything, O’Hara matured in a city where groups were segregated by socio-economic standing. But a simple stroll along Centre Street exposed O’Hara to the WASP elite, Pennsylvania Dutch merchants, poor southern and eastern European immigrants, and the struggling working-class members of his own tribe. It was through Pottsville that O’Hara learned how financial predestination determined not only where a person lived, but which college he attended and clubs he joined. The alternative was a life of factory jobs, too many children, and eternal economic insecurity.
As a young man, O’Hara was considered a snob, and a preference for riding boots didn’t help his reputation. But his social standing and material preferences didn’t discourage the boy O’Hara from playing with all children, regardless of background. O’Hara demonstrated this “spurious democracy” in his 1934 masterpiece, Appointment in Samarra.
The novel, set in Gibbsville in 1930, follows the downfall of Julian English over a three-day period during Christmas. O’Hara occasionally flashes back to Julian’s earlier years, such as when the sons of the wealthy and poor would play a ball game. O’Hara delineated what was left unsaid during ball games: “you did not talk about jail, because of Walt’s father; nor about drunken men, because there was a saloon-keeper’s son; nor about the Catholics, because the motorman’s son and one bookkeeper’s son were Catholic. Julian also was not allowed to mention the name of any doctor.”
O’Hara refined his craft while driving his father to house calls in the patch towns. While waiting in the car, he read F. Scott Fitzgerald, Sinclair Lewis, and Booth Tarkington. An early immersion in realistic character development, along with careful listening to how people spoke, inspired stories like the autobiographical “The Doctor’s Son.”
In life and fiction, O’Hara was paralyzed by his preoccupation with class. This yearning for recognition stemmed from a heavy dose of misfortune and self-destruction in early life. As a teenager, he floundered in prep school after prep school, until finally excelling and becoming valedictorian at Niagara University Prep. Approaching the academic finish line, O’Hara thought he might fill a slot at his beloved Yale. But a drunken night on graduation eve resulted in expulsion and shattered dreams. O’Hara’s misfortune continued with his father’s early death. Among Dr. O’Hara’s last words were “poor John” and a warning that his son had a spot on his lung. The father died without a will, leaving his family with almost no money.
After a faltering newspaper career, O’Hara’s fortunes slowly turned in New York, where he befriended members of a famed literary collective, the Algonquin Round Table. He is best remembered for his long and turbulent career with The New Yorker, which ran most of his short stories. But O’Hara’s fame never made him feel fully accepted into the club. O’Hara’s character flaws were indisputable, and yet his Irish Catholicism—no matter how hard he worked to drop it—was likely the biggest reason he only achieved partial membership in the Protestant-dominated establishment. When reporting for Time, O’Hara commented that his editor, Henry Luce, always gave him “that Protestant look.” He expressed this bitterness in his writing as well: “We’re Micks, we’re non-assimilable, we Micks.”
At a time when a large proportion of the population once again seems to feel alienated from the establishment, the characters of Gibbsville and beyond remind us that geographical and class divisions have long been part of the American story. And that is why it’s time to read O’Hara again.
Charles F. McElwee III works in the economic development sector in northeastern Pennsylvania.
It’s been more than a quarter-century since Harry “Rabbit” Angstrom, the protagonist of John Updike’s sweeping quartet of middle-class life in America, died in the final novel of the series, Rabbit at Rest. In Rabbit, Updike presented an everyman who inelegantly navigated the political, social, and economic coordinates of his time. A glance at a newspaper headline, an overheard song on the radio, the survey of a changing neighborhood—these were the plot elements that directed Rabbit’s dysfunctional march into modern times. Revisiting Updike’s Rabbit novels is a rendezvous with prescience, for no collection of postwar fiction could help us better understand how working-class populism—in the form of Donald Trump—prevailed on Election Day 2016.
In the course of four decades (1959–1989), we read how Rabbit travels through life in a fictionalized Brewer (Reading), Pa. We view him as a jaded ex-high school basketball star, an instigator of dysfunction, an inheritor bound for excess, and finally, a middle-aged man overcome with nostalgia. Rabbit’s life parallels the political and social milieu of postwar America, whether it’s rebellion against conformity in the 1950s (Rabbit, Run), racial conflict and cultural anarchy in the 1960s (Rabbit Redux), financial excess in the late 1970s (Rabbit Is Rich), or uncertainty about the country’s future in the late 1980s (Rabbit at Rest).
We learn how Rabbit’s political thoughts evolve throughout Updike’s tetralogy, a time capsule of how Americans responded to the current events of their time. Rabbit and his Diamond (Berks) County ilk are conservative Democrats, products of the New Deal who support entitlements, defend the war in Vietnam, possess an unbending patriotism, question their country’s economic future, and nurture a working-class intuition. “I’m a conservative,” Rabbit proclaims in Rabbit Redux. “I voted for Hubert Humphrey.”
Although Rabbit supported Humphrey in 1968, he later has a “Reagan Democrat” conversion, voting for George H.W. Bush in the final novel. If anything, he’s the fictional embodiment of a political prototype, a cross-party coalition infuriated by the loss of what communities like Brewer once symbolized: economic prosperity and a shot at a stable middle-class American life. The Rabbit novels could serve as the fictional companion to any social-policy book by Charles Murray. The realism of Updike’s characters and plot lines is a tribute to his understanding of this durable voting bloc, one that determined Hillary Clinton’s fate.
Updike’s insight stems from his childhood, rather than some removed anthropological immersion. A native son of Berks County, Updike was a prodigy of the region he fondly profiled throughout his literary work. As Adam Begley wrote in his biography of Updike, he was “very much preoccupied with Berks County (Plowville, Shillington, Reading), both in his stories and in his novels; it’s not too much of a stretch to say that he lived there most mornings.” Updike’s knowledge produced not only groundbreaking fiction, but an accurate portrayal of his home base.
A short drive from Philadelphia, Berks County is a magnificent vista of sprawling lowland underlain by rich soil and limestone. Its valleys contain meandering streams, tributaries of the Schuylkill River, which once anchored a canal system that delivered anthracite coal from the state’s northeastern cities and towns to Philadelphia. The broad line of the Blue Mountain ridge is a constant presence, while gently sloping hills bound the county’s southern tier.
The urban focal point is Reading, the county seat and Pennsylvania’s fifth largest city. From a distance, the city is an impressive layout of turreted row homes and Romanesque churches, their colorful brick and stonework amplified or shadowed by Mount Penn. In the city’s center is the courthouse, an Art Deco structure strikingly reminiscent of Los Angeles City Hall. A bridge on Reading’s Penn Street links the downtown to its leafy suburbs, towns like Wyomissing or West Reading that could easily blend into New York’s Westchester County.
Berks County is also the cultural pulse of the Pennsylvania Dutch, a group of German-speaking immigrants who settled in the south-central and eastern portions of William Penn’s colony in the early 18th century. They settled not only in farming communities, but also in Reading, which was far more ethnically homogenous and Protestant compared to most urban centers. The passage of time never diluted their heritage, a tenacious combination of conservatism, family tradition, geographical contentment, and Protestant sensibility. As a political force, they tended to deliver the county to Republican candidates at the presidential level, while supporting Democratic candidates for lower offices. Their support for labor often resulted in perplexing electoral outcomes. During the Depression, Reading was the nation’s only city whose council was completely controlled by the Socialist Party.
It’s upon closer inspection that one can discern the socioeconomic dynamics of Updike’s literary protectorate. Reading’s impressive architecture fails to conceal the city’s many challenges, from strained municipal resources and corrosive blight to rampant crime and drug use. Once a hub for railroads and textiles, Reading now confronts the realities of urban decline. During Updike’s youth, flashing theater marquees, lively nightclubs, and bustling hotels defined “The Pretzel City.” But this reputation had faded when Updike described his hometown in the 1970s and 1980s. By 2011, Reading had earned the unfortunate distinction of being the country’s poorest city, with U.S. Census Bureau data showing that its population of 88,000 had the largest share of residents living in poverty.
A few weeks before the election, the Wall Street Journal’s Bob Davis and Gary Fields profiled Berks County’s fraying social institutions in their “Great Unraveling” series. They noted the county’s 30 percent decline in manufacturing jobs, compounded by a 6 percent decrease in inflation-adjusted median income since 1995. They also noted the changing demographics, with Latinos comprising a majority of Reading’s population and the white working class moving into its sprawling suburbs.
In April, Donald Trump and Bernie Sanders carried Berks County in their respective parties’ primaries. At a Sanders rally in Reading before the primary, a New York Times reporter asked a retired steel worker whom he’d support in a Trump-Clinton matchup. He responded, “I would probably go for Donald Trump.” A drive along Interstate 78 in the succeeding months clearly signaled where the county’s political preferences were heading. Countless Trump signs adorned the barns and equipment of farms lining the highway. Trump ultimately won Berks County, significantly outperforming Romney’s victory in 2012 and nearly reaching Barack Obama’s winning margin in 2008.
If Updike’s novels taught us anything, it’s that the Trump coalition is the consequence of the budding frustrations found in the Rabbit novels. The problems that Rabbit encountered were decades in the making. As a political force, the coalition’s populist strand just happened to reach its explosive point at the polls this November. Its voters represent an economic bracket burdened by financial insecurity, negatively impacted by trade deals, and resentful over current immigration policies. They’re also voters, like Rabbit, who still constitute the largest demographic group in the U.S. But as David Frum wrote earlier this year, this group of “noncollege-educated people of European descent, are not only earning less than a generation ago, they are marrying less, raising more children outside marriage, taking more drugs, and dying earlier.”
In places like Berks County, Trump’s supporters personified a labor movement, comprising Democrats and Republicans who were devoid of ideology and believed Hillary Clinton represented the policymakers who had hollowed out the state’s working class. They’re voters who drive through their hometowns, see the structural carcasses that housed steel mills and textile factories, and lament the shuttered churches and dilapidated homes that once provided spiritual and physical shelter for their ancestors. They’re witnesses to how communities like Reading have suffered from decades of job loss and rising poverty. They’re the reason Trump became the first Republican to win Pennsylvania since 1988.
Updike passed away only a week after Barack Obama’s presidential inauguration in 2009. The author’s departure from this world, where he sought to “give the mundane its beautiful due,” robbed him of the chance to observe the economic tumult, technological change, and political gridlock that ultimately led to Trump’s presidential victory. We’re left wondering how this chronicler of post-industrial America would interpret the Obama years and the subsequent populist revolt. Would Updike have created a fictional character, like Rabbit, who would listlessly traverse the unrelenting speed of automation and globalization, and the resulting erosion of working-class life? In a year of unforeseen events and surprising political developments, this time Rabbit’s fellow Pennsylvanians didn’t run—they voted.
Charles F. McElwee III works in the government affairs sector in Harrisburg.