Our news feeds this year have been saturated with dispatches from Donald Trump’s America. Media outlets deployed journalists to regions besieged by rapid globalization, technological advances, and demographic change, where they attempted to explain the anger that inspired voters to embrace a figure like Trump. Their written pieces created a kind of niche journalism, presenting stories of what our nation failed to address with its transitioning economy. A number of these articles were accurate and empathetic portrayals of struggling communities, but too many of them patronized their subjects and reeked of condescension.
It was typical for journalists to decamp to down-and-out bars, where they'd collect quotes of rage, racism, and radicalism from afternoon drinkers. Then it was on to boarded-up downtowns, shabby diners, and blighted street corners to paint a picture of human decay. Yes, there is a darker side to many of these post-industrial cities and towns. It’s also true that economic discontent often triggers political extremism. But too many of these accounts read like the field notes of colonial administrators visiting an uncivilized hinterland. Working-class regions are far more culturally fractious, economically diverse, and politically complex than what has been presented in the press.
The accumulating literature has also left political questions unanswered. As president, Trump happens to be leader of the GOP, but his past defies party allegiance. Likewise, if anything, his stalwart supporters are less devoted to the Republican Party than to the commander-in-chief. So what does this betoken for Republicans? What does conservatism even mean for the working class in post-industrial America?
The answers to these questions lie outside the typical conservative canon. An alternative starting point is an essay by legendary journalist Pete Hamill in New York magazine. In 1969, Hamill composed one of the most prescient pieces about working-class alienation ever written. He described the working class as standing “somewhere in the economy between the poor—most of whom are the aged, the sick and those unemployable women and children who live on welfare—and the semi-professionals and professionals who earn their way with talents or skills acquired through education.”
Hamill profiled residents of New York’s boroughs, typically disgruntled ethnic Catholics who resented dwindling finances, signs of disrespect, rising crime, and politicians’ indifference to their plight. He observed how the “information explosion” affected this slighted and tribal group. “Television has made an enormous impact on them, and because of the nature of that medium—its preference for the politics of theatre, its seeming inability to ever explain what is happening behind the photographed image—much of their understanding of what happens is superficial.” As a growing medium, television intensified racial tension and ignored the common socio-economic ailments shared by working-class families of all backgrounds. Urban rioting only heightened this divide. Hamill observed that it was “almost impossible to suggest any sort of black-white working-class coalition.”
Hamill reported that working-class whites directed their anger toward New York City’s Mayor John Lindsay, the liberal Republican who presided over a metropolis reeling from fiscal distress and skyrocketing criminal activity. They felt overlooked by their mayor, vulnerable in their neighborhoods, insecure in their jobs, and doubtful about their futures. “The working-class white man feels trapped and, even worse, in a society that purports to be democratic, ignored,” Hamill wrote. He warned of a brewing revolt, one that would ominously involve the use of guns. He concluded that this aggrieved group was simply “in revolt against taxes, joyless work, the double standards and short memories of professional politicians, hypocrisy and what he considers the debasement of the American dream.”
Hamill’s words, nearly 50 years old, could have been lifted from an analysis of today’s working-class grievances. His essay reminds us that the present revolt is not necessarily driven by ideology or party allegiance. The working class that Hamill described flocked to Republicans like Richard Nixon in 1968 and Ronald Reagan in 1980, but they also appreciated Gary Hart’s message in the 1988 race, which sounded the alarm over how the global economy would affect blue-collar workers. Hart’s campaign attracted cross-party appeal, but his famously scandalous downfall in 1987 ruined any chance he might have had to be president. In 1992, Bill Clinton brought the working class back to the Democrats from George H.W. Bush. They then rallied around George W. Bush after September 11 and ensured his re-election in 2004. Disillusionment over the Iraq war and the economic crash inspired hope in Obama’s 2008 campaign. Eight years later, they embraced Trump after feeling alienated by Obama and rejecting Hillary Clinton’s vapid “Stronger Together” message.
The period between 2001 and 2009 in particular tainted the conservative movement’s purity, with the mammoth growth of the national security state, expanded entitlements, an overcommitment to warfare in the Middle East, and a catastrophic economic crash requiring government intervention. In that short time, the productivity gains enjoyed by all Americans in the 1990s disappeared. What followed was a sobering reality. As John G. Fernald of the Federal Reserve Bank of San Francisco and Charles I. Jones of Stanford wrote in 2014, “Growth in educational attainment, developed-economy R&D intensity and population are all likely to be slower in the future than in the past.”
Sluggish growth contributed to the country’s ongoing labor challenges, ranging from stagnant wages to declining workforce participation. In an important essay for National Affairs this fall, Eli Lehrer and Catherine Moyer wrote about the consequences of this phenomenon, particularly how technological progress in a transitioning economy has impacted men. They took note of the scores of men “who are unemployed, unengaged with civil society, uninvolved in family life, and, therefore, finding little meaning in their lives.” And they provided an unsettling statistical comparison. In 1948, 86.7 percent of men were either employed or searching for employment; that dropped to 69.9 percent in January 2017, reflecting a multi-generational pattern of declining labor force participation among men.
The New York Times’ David Leonhardt recently presented a similar historical comparison for the nation’s economic viability. Leonhardt observed, “By 2019, a prime measure of the economy’s health—gross domestic product per working adult—will likely have recovered less in the 12 years since the crisis began than it did during the 12 years since the Great Depression.” Massive mobilization during World War II provides some context, but America has clearly suffered from a declining standard of living. Technological advances transformed how we communicate and process information, but that didn’t necessarily translate into an improved quality of life. This was especially the case for working-class Americans.
In The Retreat of Western Liberalism, Edward Luce, the Financial Times’ Washington columnist, provides a brilliant historical and political synopsis of how populism thrives under these circumstances. His book reminds us how a prevailing tenet of post-war conservatism—limited government—rings hollow with working-class voters today, and how the “populist right only began to do really well at the ballot box after they began to steal the left’s clothes.” His explanation is worth quoting at length:
In each case, including Donald Trump, populists broke with the centre-right orthodoxy to argue in favour of a government safety net. This is what the old left used to promise and largely delivered (you might say over-delivered). It was the implicit bargain of modern Western democracy. In most countries, including the U.S., it took the form of social insurance. The link between the duties of citizenship and the right to draw benefits was a form of social contract. Even in relatively generous Sweden, future retirees must work for fifteen years before they are eligible to draw a pension. It was an unfortunate coincidence that immigration started to surge just as the value of these benefits began to erode. That was a double whammy. The same governments that were cutting welfare payments were also allowing recent arrivals to join the system. It offended people’s sense of fairness. “You cannot cut entitlement spending and simultaneously widen access to them,” says Francis Fukuyama. “Sooner or later, something has to give.”
Luce’s summary explains blue-collar frustrations. Contrary to prevailing arguments, the working class aren’t members of the far right. They resented the failures of Obamacare, for example, but were also willing to accept some form of universal health insurance. Obamacare’s passage was a boon for grassroots advocacy organizations, especially post-Citizens United. But the foundation for those organizations—the Tea Party movement—testified to the weakened condition of the GOP’s political infrastructure. This vulnerability allowed Trump to thrive in 2016, defeating 16 opponents in the primary and railing against the previous Republican president. The long-term challenge for Republicans is that Trump courted votes from traditionally Democratic regions. Moving forward, Republicans must identify and support the salient issues confronting these working-class voters. The party must extract the core elements of Trump’s campaign message without inheriting the darker elements of Trumpism. And that means, first and foremost, Republicans must accept that the séance with Ronald Reagan should end.
On this new working-class right, government is sometimes the answer, even if it currently projects a poor image. The federal bureaucracy exploded after September 11, inevitably creating redundancy and waste, while existing departments and agencies failed to deliver meaningful reforms for post-industrial regions. And the Trump administration has yet to show how it will improve the government’s responsiveness. The opioid epidemic rages, municipalities and school districts face fiscal collapse, roads and bridges crumble, and crime, poverty, and blight corrode community stability. Trump referenced this “American carnage” in his widely panned inaugural address, but that carnage exists and the working class needs to see a substantive governmental response to it. That means a real bonding endeavor—like the New Deal or New Frontier—that secures a sustainable voting coalition, one that sometimes looks to the government for help.
A working-class conservatism also means creating a labor movement within the Republican Party. Perhaps the days of demonizing unions have passed. Of course, many unions became a counterproductive force as the 20th century progressed, but in today’s global economy, labor reform experimentation at the state level can improve the startlingly low morale of blue-collar workers. Jonathan Rauch made the conservative case for unions in The Atlantic earlier this year. He highlighted how one in three private-sector workers belonged to unions in the 1950s, a number that today has fallen to less than 6 percent. “If 2016 taught us anything,” Rauch wrote, “it was that miserable workers are angry voters, and angry voters are more than capable of lashing out against trade, immigration, free markets, and for that matter liberal democracy itself.”
Exasperation over immigration and trade policies should also be acknowledged. When it comes to U.S. immigration policy, Republicans have avoided supporting restrictive measures, fearing the possible alienation of donors, interest groups, and minority voters. But the immigration system is frozen in time, and it’s time to acknowledge that the Immigration and Nationality Act of 1965, which opened the door to low-income migrants, created more burdens than opportunities. Massive chain migration from poor developing countries has had a detrimental impact on post-industrial regions, which are ill-equipped to process rapid demographic change and no longer house mass-production industries that can support a low-skilled population. It’s not an act of extremism to support legislative reforms like the RAISE Act proposed by Republican Senators Tom Cotton and David Perdue. Switching to an immigration regime similar to Canada’s merit-based system is a practical approach to supporting higher-skilled migration while easing the burden on America’s poorer regions.
It’s also possible to support free trade while acknowledging its unforeseen consequences. The average working-class voter may not understand the complexity of trade agreements like NAFTA, but he witnessed firsthand how his region has declined since the deal went into effect in 1994. Small manufacturers closed shop in too many U.S. cities and towns. The idea of a living wage disappeared for working-class families. Republicans should remember that they can support future trade agreements that secure America’s global economic position while also ensuring that working-class regions stand to see real benefits.
Conservatism is evolving from its post-World War II origins. Republicans witnessed last year how Trump’s voting base had shifted towards expanded health care access, crackdowns on monopolistic sectors, tax increases for the wealthy, and a prudent foreign policy. They maintain a traditional conservative disposition on social issues, but their economic outlook is more in alignment with the New Deal era than the Republican Revolution of 1994. Preserving this coalition will require an inclusive approach. Working-class regions are geographically and culturally diverse, ranging from Appalachian villages to poor city neighborhoods. Republicans must compassionately respond to white senior citizens who cannot retire, Latino small business owners who struggle to finance their dreams, and African-American families in failing urban school districts.
“The working class earns its living with its hands or its backs; its members do not exist on welfare payments; they do not live in abject, swinish poverty, nor in safe, remote suburban comfort,” wrote Hamill in his New York piece. The core of his observation remains timeless, but many of those hands and backs have gone idle as hope for their economic prospects slips away. It’s difficult to forecast whether the working class will even stick with Trump (a question that may be answered by the tax cut deal). But what is clear is that the administration must respond to the needs and aspirations of the working class. It must enact policies that address economic and social needs, from border security to job security. It cannot take their support for granted, the fatal mistake made by the Democratic Party. The working class’s flirtations with Republicans will continue only if the GOP offers them something in return.
Charles F. McElwee III works in the economic development sector in northeastern Pennsylvania. Follow him on Twitter at @CFMcElwee.
The Schuylkill Expressway, a clogged four-lane artery into Center City Philadelphia, requires emotional stamina and automotive endurance. The eastbound lanes, crammed between wooded hills and the Schuylkill River, present a stop-and-go orchestra dictated by traffic volume, weather, and sudden exits. Drivers slowly pass trains, the ubiquitous Action News ZooBalloon, and remnants of Philadelphia’s industrial past before the city’s skyline emerges beneath the arched Girard Bridge.
The expressway leads to a stunning structural and arboreal vista of gleaming glass and water. Boathouse Row, 19th-century gingerbread houses that designate allegiance to university rowing clubs, charmingly commands the languid river. On warmer afternoons, rowers furiously keep pace with bikers along Kelly Drive, an alternative route for wearied commuters. Just beyond the boat houses is the Fairmount Water Works, an elegant reminder of the city’s history of municipal innovation. Dominating the entire scene, stoic and majestic above Fairmount Park and beneath rising skyscrapers, is the Philadelphia Museum of Art.
The Art Museum, a yellow-hued neoclassical masterpiece, overlooks a flag-lined linear boulevard that leads to City Hall. This impressive stretch, the Benjamin Franklin Parkway, marks its centenary this year. From the air, the Parkway resembles a Parisian thoroughfare of monuments, buildings, and trees. But a closer inspection leads to apprehension. A century after its creation, the Parkway appears unfinished, an assembled boulevard of pedestrian confusion, architectural contrasts, and impaired urban scale. Understanding how such a prominent city feature came to be—and suffered setbacks as the urban landscape was overtaken by cars—may point to ways in which the Benjamin Franklin Parkway can achieve its promise as a great public space.
The Parkway’s completion was a multi-generational endeavor. In 1858, Philadelphia’s City Council presented a plan to run boulevards between the city’s center and its growing suburbs. This proposal paralleled the city’s relentless explosion in population, industry, and commerce. Between 1850 and 1860, Philadelphia witnessed a 367 percent increase in population—in just one decade, going from 121,376 to 565,529 residents. Immigrants drove this staggering increase, principally from Ireland’s Ulster and Connacht. When this Irish influx arrived at Philadelphia’s ports, they typically settled in the city’s neighborhoods to work for textiles factories, locomotive works, or the Pennsylvania Railroad. But many Irish continued north, following the Schuylkill River’s headwaters to work in the anthracite coal mines.
Irish migration necessitated the creation of parishes. In 1864, Napoleon Le Brun, a Philadelphia architect, designed an edifice that served as an everlasting tribute to this period. Completed during the Civil War, the Cathedral of Saints Peter and Paul remains a brownstone masterpiece, a gracefully proportioned basilica overseeing what was Logan Square. The square was named after James Logan, the Irish colonial secretary under William Penn who established shipping links between Philadelphia and Ulster. The cathedral was built under Archbishop James Wood, a Catholic convert from a prominent Philadelphia family. Wood presided over the diocese while forging an alliance with the industries that employed Irish immigrants.
The Cathedral towered over a poorer immigrant neighborhood known for fairs and special events. Through the late 19th century, industrial, commercial, and residential development expanded outward from Logan Square. In 1876, the Academy of Natural Sciences, the oldest institution of its kind in the Americas, opened its doors on 19th Street during the nation’s centennial. The structure, initially designed to resemble a Gothic cathedral, opened in a neighborhood that, for all its growth, remained on the city’s outskirts. The neighborhood’s future remained frozen until an urban planning movement enraptured the nation’s industrial cities in the early 1890s.
The City Beautiful movement, introduced at the World’s Columbian Exposition in 1893, popularized multi-building neoclassical and Beaux-Arts revival projects throughout the Progressive Era. At a time when corruption corroded civic institutions and industry polluted air and water, the City Beautiful movement served as an aesthetic-conscious response to the Gilded Age. In his book on the Parkway, the historian David Brownlee explains that the planning movement was a “model of an orderly, classical metropolis, crisscrossed by boulevards and dominated by stately groups of public buildings.” Cities like Philadelphia worked to improve their urban presentation. Despite the nation’s unparalleled economic growth, these cities still lagged behind London, Paris, and Vienna.
In the early 1890s, the city council reviewed the first proposal for what became the Parkway. A citizen-led petition accompanied the proposal, which called for a 160-foot wide road linking Center City to Fairmount Park, Philadelphia’s largest municipal park. The proposed boulevard would cut through a dense neighborhood, Logan Square, and ultimately, the still incomplete City Hall. The multi-tiered Second Empire structure, the nation’s largest municipal building, was finally completed in 1901. Its ornate tower, crowned by a William Penn statue, remained the tallest building in Philadelphia until the Liberty Place skyscrapers broke the tradition in the mid-1980s.
Prudent urban planning followed a sustained period of city corruption. By the turn of the century, Philadelphia’s Republican party had controlled the city since the Civil War era. The Republicans operated City Hall like New York’s Tammany Hall, its power extending to police stations that were placed adjacent to the party’s ward clubs. One observer called City Hall an “unholy alliance” among the “‘best people,’ representing the city’s financial, industrial and commercial interests, and a tightly-knit machine made up of grafters, gamblers and goons, whose political philosophy [was] based on the simple formula ‘what is in it for me?'”
The reporter Lincoln Steffens famously called Philadelphia “corrupt and contented,” but the city experienced a renaissance in architecture and planning despite this reputation. Reform simply expressed itself through brick and mortar. By 1907, ground was broken for what was presented as the “Fairmount Parkway.” Its planners, commissioned by the Fairmount Park Art Association, called for “a direct, dignified and interesting approach from the heart of the business and administrative quarter of the city, through the region of educational activities grouped around Logan Square, to the artistic center to be developed around the Fairmount Plaza, at the entrance to Philadelphia’s largest and most beautiful park.”
The project commenced through a series of corrupt mayoral administrations. For a decade, the construction, which cleared over a thousand structures, slowly moved toward completion amid political fights for reform. But the Parkway owed its existence to the long-standing machine. In his book, Brownlee wrote that “an invincible alliance…between powerful citizens and corrupt politicians” ensured the Parkway’s opening.
In 1917, following years of demolition projects and public debate, the Fairmount Park Commissioners adopted a formal design for the Parkway. The design was presented by Jacques Gréber, a French architect who specialized in landscape design and city planning. Gréber was a leading proponent of the City Beautiful movement, whose urban planning in Ottawa later transformed the modern layout of Canada’s capital city.
Gréber’s Parkway plan called for two linear segments running between Fairmount Park, through Logan Square as the central anchor, and narrowly toward City Hall. In 1918, over a decade after the groundbreaking, Philadelphia’s Evening Public Ledger triumphantly declared that the “Uninterrupted Parkway at last leads from City Hall to Fairmount’s entrance.”
What followed were a series of building projects designed to match the intended grandeur of the Parkway. Gréber modeled the Parkway after the Champs-Élysées in Paris, and he hoped visitors would behold a thoroughfare of monumental buildings, arresting statues and fountains, and bountiful greenery. Following World War I, Gréber stated, “I am glad to say that, if by this work the city of Paris may be enabled to bring its sister in America the inspiration of what makes Paris so attractive to visitors, it will be the first opportunity of Paris to pay a little of the great debt of thankfulness for what Philadelphia and its citizens have done for France during the last three years.”
Following World War I, the Parkway physically embodied the prosperity enjoyed by American cities. Major construction projects commenced during the Roaring Twenties, including the Insurance Company of North America building, the Free Library of Philadelphia, the Fidelity Mutual Life Insurance Company building, and the Rodin Museum. The succession of buildings exemplified the ideals of Beaux-Arts and neoclassical architecture. The Parkway itself became Philadelphia’s center for culture, a boulevard of elegance in a city sprawling with steeples, smoke stacks, and row homes.
By the Great Depression, parts of the Parkway resembled a Parisian boulevard. Logan Square was reconfigured as a circle to imitate the Place de la Concorde, a major public square in Paris. The Swann Memorial Fountain was placed as the circle’s centerpiece, a sculptural tribute to the Schuylkill River, Delaware River, and Wissahickon Creek. The Family Court building and the Free Library were painstakingly modeled after the Hôtel de Crillon and Hôtel de la Marine. Overlooking the circle, they created a structural gateway to the Parkway, which slowly fulfilled Gréber’s architectural and planning vision.
Completing the postcard was the Philadelphia Museum of Art, an ongoing project that created a barrier between Fairmount Park and the Parkway. Designed as a Greek Revival temple, the structure stood over its fraternal buildings on the Parkway, creating an Acropolis for Philadelphia. Through World War II, one could ascend the Art Museum’s steps to behold the Parkway’s ongoing development. From the Franklin Institute to the School Administration Building, the Parkway—renamed after Benjamin Franklin in 1937—seemed to hold limitless possibilities. But the Depression slowed its continued development. By mid-century, the Parkway appeared unfinished, an imperfect arrangement of Gréber’s Parisian dream.
In the 1960s, the celebrated urbanist Jane Jacobs panned the Parkway. Jacobs was a leading skeptic of the City Beautiful movement, labeling its planning approach an unrealistic theory that destroyed the natural order of urban neighborhoods. “The library has no business being out here and neither do the Art Museum or the Franklin Institute,” she said. In The Death and Life of Great American Cities, Jacobs observed that Logan Circle was “discouraging to reach on foot,” noting that it was “mainly an elegant amenity for those speeding by” and “gets a trickle of population on fine days.”
Edmund Bacon, the esteemed urban planner and long-time executive director of Philadelphia’s City Planning Commission, disagreed with Jacobs’s assessment. Although Bacon steadfastly defended the Parkway, his planning vision disrupted Gréber’s intentions from early on. Bacon’s 1932 thesis at Cornell called for “A Civic Center for Philadelphia,” which included the replacement of City Hall with two small civic buildings, along with a tree-lined promenade that awkwardly met the Parkway.
Bacon’s plans arguably hastened the Parkway’s decline. In the 1950s and 1960s, Philadelphia embraced ill-advised urban renewal projects. The city attempted to respond to the ascendant automobile culture and the accompanying flight to the suburbs. More cars made the Parkway nearly inaccessible to pedestrians, as it became less of an urban boulevard and more of a route for escaping downtown. Bacon attempted to “rescue” the city by planning the Vine Street Expressway, a gross interruption of the Parkway’s streetscape that created gaping holes and pedestrian inaccessibility. Such planning also failed to stall Philadelphia’s industrial and population decline. Between 1960 and 2000, the city lost nearly 500,000 people.
By the time Ed Rendell became mayor in the 1990s, the Parkway was an urban wilderness. Lifeless office buildings—modernist slabs of concrete—disrupted the scenery. The Park Towne Apartments awkwardly stood near the Art Museum, a lonesome structure reached by car, not foot. Chain restaurants were more common than cafes and bars. In Hidden City, the urban designer Greg Meckstroth called the Parkway “a veritable museum ghetto that often becomes desolate at night.” He concluded that it appeared to be “condemned to sour monotony.”
The Parkway endures as a symbol of disappointment in Philadelphia. In a 2014 piece, Gregory Heller—a biographer of Bacon—wrote that the “Parkway looks good on the Fourth of July and during the [Philadelphia] marathon. But otherwise it’s an underutilized, poorly planned, highway wasteland of an urban space.” Heller argued that Philadelphians “should face up to the fact that as much as we love the view from the museum steps, and the boulevard of flags on the evening news, the Parkway is a disaster and the city would be better if it were never built.”
During the spring and summer months, the Parkway transitions into a landlocked pit for festivals and concerts. Temporary security fences and portable bathrooms mar a frustrating boulevard of unfulfilled promises. In a recent Philadelphia Inquirer column, the architecture critic Inga Saffron wrote that this year’s “anniversary events, which…focus on high-toned cultural offerings, are a painful reminder that we still haven’t figured out what the Parkway should be.”
Even as the Parkway struggles, Philadelphia is experiencing a renaissance. Over the past decade, the city witnessed a boom in commercial and residential development, game-changing investments made by universities, and gentrifying neighborhoods that had recently suffered from crime and blight. The Fairmount neighborhood, near the Art Museum, has turned into a thriving residential quarter and a popular nightlife destination. A controversial property-tax abatement program, spearheaded by the city and embraced by developers, contributed to this rapid change. But Philadelphia, a city with an ambivalent future in the 1990s, continues to enjoy an upward trend in population growth, business investment, and neighborhood-level improvements.
Philadelphia’s leading planners still debate how to turn the Parkway into the European-style street envisioned a century ago. The city is leading a study to properly manage festivities and for-profit events. Frequent street closures and disruptive concerts spoil the revival occurring in the Parkway’s surrounding neighborhoods. The city’s study would join numerous plans previously intended to address the Parkway’s flaws.
In 1999, Paul Levy, president and CEO of Center City District, called for more pedestrian-friendly park space, apartment buildings, and cultural venues. Levy’s radical proposals languished through the decade. In 2013, however, Philadelphia’s park officials released “More Park, Less Way,” which prescribed ideas to attract people to the Parkway. In recent years, the Parkway experienced minor improvements, including reconfigured sidewalks, expanded plazas on the north side of Logan Square, and the opening of a serene art museum, the Barnes Foundation. But the Parkway lingers as a lonely pocket in a lively setting, a major city artery with cultural institutions more likely accessed by taking Uber than walking.
In 1998, Buzz Bissinger wrote A Prayer for the City, a magnificent profile of the Rendell years and Philadelphia’s fight for survival during that period. Nearly two decades later, Philadelphia’s prayers appear answered. A city that ignited America’s industrial rise seems ready for the future. But the Parkway, designed to signify Philadelphia’s place in the world, remains in purgatory. It awaits special dispensation. The Parkway, along with Logan Circle, has evolved significantly since the Cathedral opened its doors to immigrants. For now, it remains a destination for daydreaming. It could use a prayer, or a lucky penny tossed into Logan Circle’s fountain.
When responding to his administration’s darkest hours or confronting steadfast opponents, President Donald Trump turns to his social media arsenal, deploying an artillery of tweets to shock and awe the political, cultural, and broadcasting elite. But Twitter offers only temporary relief, a virtual discharge while confined to the lonely White House. For Trump, hitting the road is the ultimate release, and it’s before admiring and unwavering crowds that the president campaigns for his preferred platform and finds validation. Nearly a year after his stunning victory, Trump’s rallies remain an unchanged ritual: a congregation celebrating the president’s populist agenda with all its solemnities.
But Trump didn’t invent this exhibition or emotional approach. It was Morton Downey, Jr., a bombastic provocateur, who transformed our media culture through his nationally syndicated television show in the late 1980s. This autumn marks the thirtieth anniversary of The Morton Downey Jr. Show’s debut on New York’s WWOR-TV, and while the short-lived program ended with scandal and fleeing advertisers, it still established a rhetorical style replicated by talking heads, politicians, and ultimately, a winning presidential candidate.
Downey, who died in 2001, presided over the original Trump rally. His show launched during the final stretch of Reagan’s presidency, a period of cataclysmic change in the media realm that was only realized in hindsight.
The era’s shifting cultural, political, and economic fault lines slowly cracked the foundations of established institutions, whether it was print publications, network news, or political parties. And yet it remained a simpler time, when households digested current events in local newspapers, tuned into Tom Brokaw or Dan Rather on countertop televisions over dinner, and understood the clear ideological sympathies of Republicans and Democrats. These households, often working class, were located in small to mid-sized cities and aging postwar suburbs that were only beginning to witness the rapid changes wrought by technology and globalization. Downey intuitively understood the budding frustrations of these households, and he tapped into that resentment before a fawning television studio audience.
Downey’s show arrived just months after Gary Hart’s ignominious downfall as the presumptive Democratic nominee in the 1988 presidential election. Hart, a U.S. senator from Colorado, was considered a formidable candidate against then-Vice President George H.W. Bush, but rumors of adultery and a newspaper’s stakeout of his Capitol Hill home created a media frenzy that ended his campaign. The frantic coverage of this scandal was a “Franz Ferdinand” moment, one which completely transformed the confluence of American media, politics, and culture just as the Austro-Hungarian Archduke’s assassination in 1914 changed the course of geopolitical history.
If there is a Guns of August-like book that brilliantly captures this period, it’s Matt Bai’s 2014 account of the Hart scandal, All the Truth Is Out. “Within about a decade or so of Gary Hart’s spectacular collapse,” wrote Bai, “new technologies—first the satellite revolution that made twenty-four-hour news possible, and then the advent of the Internet and the rapid spread of broadband technology—would obliterate this old order, creating a new world of instantaneous and borderless information.” In a cruel twist of irony, Hart spoke on the campaign trail about how America should prepare for the 21st century’s inevitable technological upheaval. What Hart failed to anticipate, however, was how a figure like Downey would trigger Americans’ emotional response to these changes.
Although raised in a world of privilege, Downey was always an outsider. Born in Los Angeles in 1932, Downey was the oldest son of Morton Downey, a famed Irish tenor during the Depression era (“When Irish Eyes Are Smiling”), and Barbara Bennett, an actress and dancer.
Dysfunction, marital drama, and violence marked Downey’s formative years. His alcoholic mother abandoned the family for a B-movie cowboy actor, and his father instilled fear with his ferocious temper. But the Downey family, for all their faults, still enjoyed the trappings of celebrity. Before they were shipped to Connecticut to be raised by their paternal grandparents, Downey and his siblings grew up in opulent homes and counted the Kennedy family as neighbors in Hyannis Port. At age 10, Downey was sent to military school, and his educational quest continued with a degree from New York University, a J.D. from a correspondence school, and a doctorate from an unaccredited college in California.
Downey inherited his father’s love for show business, but Morton Downey, Sr. responded to his son’s interest with stunning cruelty. “The name Downey opened doors,” Downey recalled in a 1988 interview with People. “But they slammed just as fast because that same name controlled them.” Downey’s father was perversely determined to keep his son out of Hollywood, going as far as arranging for his son’s imprisonment for bouncing a check. Downey served 61 days in a California jail. His father finally apologized many years later.
Downey persevered in spite of his father’s opposition, and his career trajectory included working as a singer, disc jockey, lobbyist, co-founder of the American Basketball Association, and eventually a radio host in a half dozen cities. He was also a campaign aide, first working for both John and Robert Kennedy’s presidential runs before switching sides, joining the National Right to Life Organization, and later running for president on the American Independent party ticket in 1980.
As a radio host, Downey drew listeners with his provocative style while also imperiling his allotted airtime with station bosses. In 1984, Downey was fired from a Sacramento station, fortuitously leaving his seat to be filled by a former Kansas City Royals ticket salesman named Rush Limbaugh. It was not until 1987, at age 55, that Downey got his lucky break with WWOR, which had relocated to a studio in Secaucus, New Jersey.
Known by natives as “WOR,” WWOR-TV’s popular programming originally reached audiences in the New York metro region, New Jersey, Philadelphia and its suburbs, and pockets of Pennsylvania. The station’s most popular host was Joe Franklin, who interviewed a comically eclectic mix of celebrities, faded stars, and unknown performers. It was Bob Pittman, a producer and co-founder of MTV, who arranged Downey’s debut on what was by then a superstation, with 20 million people picking up WWOR’s signal nationwide.
Downey’s show developed a loyal cult following. His confrontational persona, distinguished by an intolerance for excessive political correctness, especially resonated with a large segment of the increasingly post-industrial Mid-Atlantic.
His audience, admiring baby boomers or members of Generation X, typically lived in the cities and towns grappling with the stagnating wages, job limitations, rising crime, demographic changes, and technological upheaval occurring in a newly emerging global economy. Their families lived in bungalows on Staten Island, rowhomes in northeast Philadelphia, ranches in Levittown, and duplexes throughout Pennsylvania’s anthracite coal region and Lehigh Valley.
The typical audience member was a young male from a third- or fourth-generation ethnic Catholic family, residing in a neighborhood showing the early marks of urban decline. His grandmother, a homemaker, had listened as a girl to Morton Downey, Sr. crooning on the Victrola; his grandfather recalled Father Coughlin’s radio diatribes before World War II; and he or his parents had watched TV host Joe Pyne’s combative interviews on WDEL-TV in the 1960s. Downey communicated in a way that easily clicked with this demographic, which understood the language of conflict in working-class urban America.
“The element that makes Mort work is that he is truly not a conservative; he is a populist,” said Pittman in a 1988 interview. “If you watch him, he is sometimes on the liberal side of the issue. He bounces around. What he’s really interested in is the little man’s opinion: that’s really what he is representing.” Noting Downey’s television style, Pittman concluded that “Downey doesn’t take an intellectual approach to things, but an emotional approach.”
This approach manifested itself in Downey’s television optics, which hooked the viewer and rallied the crowd. Each episode opened like an athlete’s field entrance: Downey, kibitzing with a pack of crew members backstage, slowly approached the audience. A pulsating bass played and the adoring crowd cheered as the grinning Downey approached individual audience members, slapping high-fives with men and kissing or hugging women. Typically attired in a Ralph Lauren oxford cloth shirt, regimental tie, khakis, and loafers, Downey faced the camera on-stage, announcing the controversial theme for that evening. Downey then cued the intro video, with the memorable theme depicting the host wearing boxing gloves, making strange faces, and sporting his pearly-white smile amidst a dizzying montage of newspaper headlines, military tanks, women’s legs, TV sets, and electric guitars.
The first episode aired on October 19, 1987, known as Black Monday, when the Roaring Eighties ended with a global stock market crash. The show became an instant hit—it was nationally syndicated by 1988—but critics were unimpressed with Downey’s approach. “The Morton Downey Jr. Show is, beneath the surface, a show about bullies,” wrote David Bianculli, the New York Post’s TV critic. “Downey, with the power of TV, becomes Head Bully, and encourages members of his audience to ‘get involved’—the same way the guy with the torch encouraged the mob around Frankenstein’s castle to get involved.” When the cameras rolled, Downey was “ready to fight and win,” wrote New York’s David Blum. “Facing him are his fans. They want blood, and Mort is going to give it to them.”
The chain-smoking host (he estimated smoking an average of 80 cigarettes a day) ambushed his guests on episodes that addressed political incorrectness, divorce, the death penalty, race relations, affirmative action, Communism, and Vietnam. He invited audience members to approach his “Loud Mouths,” two podiums sporting the show’s open-mouth logo. Guests or audience members stated opinions, but their remarks were often interrupted by Downey calling them “pablum puking liberals” and barking “zip it” or “shut up.” His shows even sparked physical confrontations among guests, with CORE chairman Roy Innis once knocking the Reverend Al Sharpton to the ground. Downey was forced to break up a potential stage brawl at Harlem’s Apollo Theatre.
Downey unsettled critics and advertisers, but audiences craved his brash populism. He presented a conservative-leaning viewpoint, yet his positions remained ideologically complex and never fit neatly into either party. His sympathies were always with the working class, whether he was supporting government-funded healthcare for the elderly or discussing the perils of illegal immigration.
On his own TV show, On the Record, Alan Dershowitz asked Downey how he could appeal to this frustrated group of people. “Because I was frustrated most of my life,” he responded. “I’m able to come out there and present my frustration which I now recognize are the frustrations that every American goes through at one time or another in their life.” Downey told the Associated Press that Americans saw him as a “loud-mouth who gets in trouble like they do, who’s had problems like they had, someone that they can identify with a lot more than someone who’s squeaky clean.”
For all his faults, friends and foes alike acknowledged that Downey’s crude media persona didn’t match the kind and humble man they knew off camera. He was simply a performance artist who enraptured the country, but his show quickly faced an expiration date. By 1989, advertisers fled what was labeled “trash TV” or “tabloid TV.” Downey attempted a gentler show, but off-stage antics hastened its demise. In April of that year, Downey claimed he was assaulted by a group of neo-Nazi skinheads, but his high-profile story didn’t add up. The incident was widely dismissed as a hoax, and by September his producers and distributor announced the show’s cancellation.
Fast forward three decades, and we now have a president who makes Downey’s show seem relevant and fresh. The parallels between Downey and Trump are stunning. Both were privileged sons of complicated men; both persevered through adversity in their respective fields, embraced show business, nurtured growing audiences with their working-class language, suffered business failures in Atlantic City’s casino industry, and created onstage spectacles that captivated the nation’s attention. Both cultivated a base of aggrieved Republicans and Democrats who once cheered at Downey’s crude insults and now laugh at Trump’s stinging tweets.
In 1989, Downey taped an interview with Trump, who discussed America’s trade imbalance with Japan. “It’s a huge problem, and it’s a problem that’s going to get worse,” Trump said. “And they’re laughing at us.” Downey also asked Trump if he resented press criticism. “I think I used to resent it, I’m not sure I do anymore,” Trump responded. “Somehow, over a period of time, you become harder to this. You also realize it means less. It’s yesterday’s news. They take a shot at you in the newspaper, some reporter doesn’t like something for his own personal reason, so they take a shot. I find now—10 years ago I used to say ‘boy how could they do that, it’s wrong, it’s unfair,’ I find now that it really doesn’t matter.”
Trump is a cultural product of the late 1980s, and Downey deserves posthumous credit for creating the media ecosystem that led to this administration. The country continues to rapidly change, but the issues remain the same. The modern working-class Americans who appreciate Downey or Trump fail to click with either political party. They’re just navigating the frantic pace of a global economy that often appears to offer little hope for their future. In 1987, their outlet for populist release was Downey’s show. Now it’s Trump’s presidency. Downey is long gone, but his legacy has evolved into a new political frontier whose boundaries have yet to be defined.
The great 20th-century American novelist F. Scott Fitzgerald often evoked the fleeting passage of time. He described the final weeks of summer in This Side of Paradise as that “sad season of life without growth.” On a particularly warm summer afternoon, The Great Gatsby’s Daisy Buchanan lamented her boredom. “‘What’ll we do with ourselves this afternoon,’ cried Daisy, ‘and the day after that, and the next thirty years?’” Her friend Jordan Baker dismissed Daisy’s morbidity, reminding her that, “Life starts all over again when it gets crisp in the fall.”
The mood in Washington easily corresponds with Fitzgerald’s melancholic theme. Congress returns to Capitol Hill this autumn hardly rejuvenated. President Donald Trump, in survival mode without a substantive legislative accomplishment, will market tax reform. Speaker Paul Ryan and Senate Majority Leader Mitch McConnell, estranged allies of the White House, will pursue their own tax reform plan while also addressing the devastating effects of Hurricane Harvey. Healthcare legislation appears terminal, and unforeseen developments in foreign affairs could disrupt everything. And so Trump, along with a Congress controlled by his own party, begins the season without a clear legislative agenda or ideological path.
With Congress back in session, Union Station’s concourse will swell with commuters, congested restaurants will turn diners away, and the Trump International Hotel will maintain its distinction as this administration’s crossroads for lobbyists, policy makers, tourists, and loyal officials.
The Trump Hotel operates in the Old Post Office building, a stunning Romanesque Revival structure completed during the gilded 1890s. Its granite façade, crowned with a domineering clock tower, occupies a full city block on Pennsylvania Avenue. In the 1970s, the building faced an uncertain future on a deteriorating stretch of the street that also included the White House. It was Daniel Patrick Moynihan—esteemed policy wonk, indefatigable egghead, and later U.S. senator from New York—who crusaded to save the avenue he deemed the “main street for both the city and the nation.” Moynihan’s lobbying endeavors resulted in Congress’ creation of the Pennsylvania Avenue Development Corporation, which renewed the thoroughfare and saved the Old Post Office from the wrecking ball.
As meetings and dinners commence beneath the hotel’s chandeliers, perhaps Trump’s acolytes should revisit the man credited with saving their place of congregation. After all, it was Moynihan who gave Hillary Clinton his blessing to run for his Senate seat in 2000—a decision that indirectly led to this presidency. But the senator and his wife Liz were known for their reservations about the Clintons. In 2008, just a few years after Senator Moynihan’s death, Liz endorsed Barack Obama over Clinton, arguing that Moynihan would have approved the candidate’s work to “rekindle hope in our young as he encourages them to participate in the political process.” Liz, who served as her husband’s campaign manager, was also dismayed by Clinton’s hostility toward Obama’s candidacy.
We know the outcome of Clinton’s story, and the former candidate will exhaustively remind us of that story as she launches her autopsy tour in the coming weeks. But could the Trump administration, members of Congress, and pundits learn from her predecessor’s legacy? As we navigate the uncertain contours of this political era, could Moynihan’s voluminous writings, contradictory policy positions, legislative tactics, and astute synopsis of socio-economic problems offer a mild prescription for our present paralysis?
* * *
Daniel Patrick Moynihan was born an outsider. The son of a journalist and a homemaker, Moynihan grew up in a dysfunctional Irish Catholic family affected by the working-class issues that fuel populism. After Moynihan’s alcoholic father abandoned the family, his childhood featured the unhappy experiences of shining shoes, hopping from apartment to apartment, and witnessing mass poverty in pre-war New York. Moynihan’s sociological diagnoses stemmed from this youth.
“From the wild Irish slums of the 19th-century Eastern seaboard to the riot-torn suburbs of Los Angeles,” wrote Moynihan, “there is one unmistakable lesson in American history: a community that allows a large number of young men to grow up in broken families, dominated by women, never acquiring any stable relationship to male authority, never acquiring any set of rational expectations about the future—that community asks for and gets chaos.”
When Moynihan issued this controversial warning, he could point to his own success in spite of exposure to that very ecosystem. Moynihan graduated valedictorian of his high school class, worked on the piers of the Hudson River, earned his college degree from Tufts University on the G.I. Bill, became a Fulbright fellow, and completed his doctorate. He served as an advisor to both Democrats and Republicans—including the Kennedy, Johnson, Nixon, and Ford administrations—but he never wavered in his allegiance to Franklin Delano Roosevelt’s New Deal coalition. Moynihan believed that a person’s political affiliation was determined by the year he was born, and as a product of 1927, he considered himself in alignment with FDR’s Democratic Party.
But it remains difficult to decipher Moynihan’s ideological fidelity. He espoused liberalism, yet neoconservatives, neoliberals, and traditional conservatives considered him a member of their respective tribes. We are left with his innumerable writings, and they show a scholar whose views easily melded with those shared by Nathan Glazer, James Q. Wilson, Irving Kristol, Norman Podhoretz, and William F. Buckley. Moynihan was a creature of civility, and he shared this perspective during a period when conservatism ascended and liberalism retreated. Nearly 15 years after Moynihan’s passing, we find ourselves in a political epoch where both ends of the ideological spectrum appear in flux. During these years, blunders in Iraq and Afghanistan—accompanied by economic collapse, technological upheaval, rapid demographic change, and unrepentant globalization—have engendered disillusionment, socio-economic instability, and ultimately populist backlash.
Although Moynihan sometimes seemed contrarian for the sake of it, his work remains a brilliant exercise in prescience, providing borderline biblical wisdom for our present challenges in domestic and foreign affairs. If Moynihan were alive for the present upheaval, he would see his most acute societal warnings confirmed. He would understand why Trump is president, for he criticized both parties’ political failures for decades.
The Trump administration and Congress should take note of Moynihan’s prophecy. In 1992, he wrote that U.S. healthcare suffered from “Baumol’s disease,” commenting that the system was “plagued by cumulative and persistent rises in costs at a rate which normally exceeded to a significant degree the corresponding rate of increase for commodities generally.” Moynihan’s invocation of this economic phenomenon is hardly dated, and if Congress revisits healthcare reform, rising costs remain a fundamental problem. Yet as the Senate Finance Committee chair during Bill Clinton’s health care pitch in 1993-1994, Moynihan curiously declared that there was no crisis, playing an important role in derailing the universal health-care endeavor led by Hillary Clinton and senior advisor Ira Magaziner. But at the time, Moynihan questioned Clinton’s “intentionally obscure” proposal and shared the belief that the overall plan was too coercive.
On Obamacare, the perpetually skeptical statesman would have supported some type of reform. Moynihan would also have championed welfare reform, rejecting the idea of permanent governmental dependence. The New Deal always guided Moynihan, and he was a beneficiary of that period, yet he was hostile toward the excesses of welfare that sprouted from Lyndon B. Johnson’s Great Society. Regardless, Moynihan still expressed concern about how reform could affect the poor and elderly. He once told his aide, Lawrence O’Donnell, that he could easily reform health care by deleting “65 and older” from the legislation limiting the Medicare plan to seniors.
Moynihan also presented a case for tax reform. In a 1981 newsletter, he wrote about the “‘bracket creep’ that has increased the real rate of taxes for people who have had no real increase in income.” Nearly four decades and six administrations later, the statement remains timeless. “A tax system must not only be fair, it must be seen to be fair,” he declared. “It is useful to keep in mind that our country was founded in a revolution that started over unfair taxation.”
Moynihan’s commentary rarely seems dated. His positions would reassure or dishearten both sides on today’s most pressing debates. The senator opposed NAFTA in 1994, complaining that “the United States government still doesn’t seem prepared to address the needs of those workers who will lose their jobs as a result of this agreement.” But at the same time, he supported lax immigration laws, arguing that enforcement would not discourage immigrants from coming to the U.S. “So, take down the walls and you make sure this migration will continue,” he said.
Moynihan also understood the perils of drugs and their corrosive impact. His observation in 1998 still rings true for the countless communities grappling with a heroin crisis: “Since the desire of man to alter his state of consciousness is as old as human history, and technology continues to provide a breathtaking array of drugs capable of producing everything from oblivion to nirvana, I think it safe to assume that we may never win a ‘war’ against drugs.”
Perhaps the closest modern-day comparison to Moynihan is former U.S. Sen. Jim Webb, whose moderate to conservative viewpoints didn’t quite fit in either political party. Webb himself confirmed Moynihan’s influence in a 2008 interview with the New Yorker. “My prototype really is Daniel Patrick Moynihan, who was able to combine an academic, intellectual career and a government career, really put new ideas out there, and think,” Webb said.
In 1993, Moynihan wrote an American Scholar essay, “Defining Deviancy Down,” in which he argued that American society was increasingly lowering its standards for tolerating bad behavior. He lamented eroding families, rising crime, and a general permissiveness toward traditionally unacceptable behavior. Nearly a quarter-century later, it seems quite clear that what was once abnormal is now normalized. This is most evident in the post-industrial communities ravaged by socio-economic maladies. Moynihan understood these problems and offered a substantive intellectual take on the issues that resonate in these cities and towns.
In what often seems like a dark age, it is worth revisiting Moynihan’s work. An ideal start is Daniel Patrick Moynihan: A Portrait in Letters of an American Visionary, a hefty volume of the senator’s political writing, memorandums, and letters. Few lawmakers could compete with Moynihan’s wide-ranging intellectual interests, but perhaps they will find solace, or even inspiration, as they navigate our ambivalent future. But for now, Congress returns to Trump’s Washington as boats against the current, rowing toward an irresolvable future.
In a more uneventful year, the centennial of John F. Kennedy’s birth would attract media-induced introspection. But the dervish news cycle prohibits reflecting upon such benign, yet relevant, political moments. As we breathlessly keep pace with Twitter feeds, chyrons, and emailed newsletters, Kennedy’s 100th birthday appears confined to Time Life retrospectives in grocery checkout lines. Perhaps this fate was unavoidable. A cluttered Kennedy canon, accumulated over years of painstaking probes, stripped the 35th president’s life of the cultural relevance it once represented.
But this frenzied present deserves some historical clarity. In a poignant commencement address this year, Peggy Noonan lamented the absence of historical perspective among young reporters. “They have seen the movie and not read the book. They’ve heard the sound bite but not read the speech,” said Noonan. “Their understanding of history, even recent history, is therefore superficial.”
And so Kennedy’s prosperous upbringing, inevitable political rise, and tragic death prove worthy of revisiting during this era of socio-economic and political upheavals. His life, and the period over which he presided, could help us interpret the fractured fault lines of this current cultural moment. Unfortunately, there is no series that fully captures the kingdom, the power, and the glory of the Kennedy hierarchy (as Robert Caro did with his multi-volume The Years of Lyndon Johnson). But the latest addition to Kennedy’s biographical canon, The Road to Camelot, is an important companion to understanding the presidential campaign process and the coalitions that produce surprising, yet razor-thin, electoral outcomes.
Veteran journalists Thomas Oliphant and Curtis Wilkie entertain without injecting political gossip, and analyze without risking historical groupthink or indulgent revisionism. The book demonstrates how Kennedy’s victory in 1960 was the culmination of years of preparation and grassroots recruitment. Under the initial oversight of Kennedy’s father, Joseph, the family’s inner circle commanded what became an unconventional path. Following the Democratic convention in 1956 and through the 1960 election, the Kennedy campaign tirelessly navigated a volatile and evolving electoral map. During this five-year quest for national office, Oliphant and Wilkie write, Kennedy’s non-stop “traveling, speaking, maneuvering, and writing enabled his ‘positives’ (fresh, vigorous, new thinking, Catholic, activist) to keep his ‘negatives’ (inexperienced, unknown risk, bigger government, Catholic) at bay.”
When lamenting the present electoral landscape, pundits often blame reforms that disempowered the professionals and hacks of both political parties. But Oliphant and Wilkie remind us that rebellion against party norms is nothing new to campaigning. Kennedy disdained how the Democratic machine operated, and after his first Congressional victory in 1946, he aggressively worked outside the party’s suffocating confines. With the help of his brother, Robert, and a pack of loyal comrades, Kennedy maneuvered his rise as a national political figure through grassroots organization (“Kennedy secretaries”), innovative marketing, and policy triangulation. Kennedy’s cozy relationship with the press helped this endeavor. The Kennedy family’s fortune and fame attracted regular coverage, and the frequent magazine profiles and photo spreads made the candidate’s rise seem almost pre-destined. Regardless, Kennedy was an energetic campaigner, having made over 140 appearances across the country in 1957 alone.
The Road to Camelot shows how the mythology of Kennedy, masterfully crafted at the time, concealed the sins that would imperil any modern campaign. The authors are clearly fond of their subject, but they present Kennedy’s odyssey without clouding the reality of dirty tricks, reckless behavior, and extramarital affairs. Kennedy successfully hid these darker compartments thanks to an adoring press. Much of this worked to Vice President Richard Nixon’s misfortune. Unlike his opponent, Nixon encountered a series of unlucky and widely reported developments throughout 1960. During a critical campaign stretch, an infected knee confined Nixon to bed rest. At one point, even his own boss, President Dwight Eisenhower, humiliated him at a White House press conference. When asked if he could give an example of any major idea Nixon contributed to his presidency, Eisenhower responded, “If you give me a week, I might think of one. I don’t remember.”
The campaign’s overall mythology wounded Nixon. Countless profiles recount Nixon’s cadaver-like appearance at the first presidential debate, but the initial reaction to both candidates’ performances was more measured. As New York Times columnist James Reston concluded that September, the debate “did not make or break either candidate.”
Nixon could never compete with Kennedy’s telegenic persona, a gift less endowed than painfully crafted through constant practice and diaphragm exercises. Kennedy’s advertising team also embraced the embryonic stages of the modern Information Age. Their tactics seem primitive and innocent compared to the algorithms, data mining, and psychological profiles culled by today’s campaigns. But the campy jingles, pamphlets, and television commercials were innovative measures that raised high hopes for Kennedy’s electoral success. After all, this was the pre-smartphone age, a time when the critic Dwight Macdonald could soberly make the following declaration: “As smoking gives us something to do with our hands when we aren’t using them, Time gives us something to do with our minds when we aren’t thinking.”
* * *
The Road to Camelot also reminds us that Kennedy understood how Catholicism could derail his electoral chances. But Kennedy refused to downplay this factor, embracing his Catholic identity on the campaign trail. A top advisor, Walt Rostow, recalled Kennedy’s decision to barnstorm the West Virginia primary. Campaign officials feared a punishing defeat, but Kennedy understood that confronting a state with historically anti-Catholic pockets was necessary. “I have no right to go before the Democratic convention and claim to be a candidate if I can only win primaries in states with 25 percent or more Catholics,” Kennedy said. “I must go in there. And I am going in.”
The authors devote generous coverage to Kennedy’s Catholicism, showing how he gained national popularity without religion disqualifying his candidacy. Beginning in 1956, Kennedy delegated his long-time aide, Ted Sorensen, to research the Catholic vote. Sorensen’s findings resulted in the “Bailey Report.” Misleadingly attributed to John Bailey, a party boss from Connecticut, Sorensen’s report explored the viability of a Catholic presidency. Before the report’s prescient findings, political operatives had concluded that Al Smith’s 1928 defeat spelled the end of placing a Catholic on a presidential ticket. But Sorensen concluded that there was “such a thing as a ‘Catholic’ vote, whereby a high proportion of Catholics of all ages, residences, occupations and economic status vote for a well-known Catholic candidate or a ticket with special Catholic appeal.”
The book’s exploration of ethno-religious politics is an especially powerful subject to revisit following the 2016 campaign. The Democratic platform is now beholden to identity politics. The party’s polarizing coalition-building ignores the ethnic Catholics who built the party in Northern and Midwestern cities during the twentieth century. Amazingly, it was only a half century ago that pastors, columnists, and politicians could openly incite anti-Catholic bigotry. When the Protestant minister Norman Vincent Peale railed against Kennedy’s Catholicism and questioned his national allegiance, the candidate responded by invoking the death of his brother Joseph P. Kennedy Jr. in service during World War II.
Oliphant and Wilkie reveal that Kennedy’s 1960 victory was no mandate. Outside of his home state of Massachusetts, more Americans voted for Nixon than Kennedy. His triumph was made possible by the Catholic turnout in states like New York, Pennsylvania, Michigan, and Illinois. For Catholics in urban and semi-rural regions, Kennedy’s election was a tribal benediction. The candidate himself was less important than what his candidacy signified. In row homes, duplexes, and shanties, Kennedy quickly earned his real estate on living room walls alongside boxers, union bosses, and the Sacred Heart.
Should we be surprised that the descendants of this coalition voted for Donald Trump? Exit polling indicated that Trump carried the Catholic vote, with Pew reporting a 7-point winning margin. Barack Obama previously won the Catholic vote in 2008 and 2012. While cultural issues certainly played a role in this shift, a closer inspection indicates where Catholic voters switched sides.
The growing literature on the plight of Appalachia deflects attention from the struggles of dwindling ethnic Catholic communities. For so many ethnic hamlets in Michigan or Pennsylvania, a middle-class existence either evaporated or never arrived. Their cash-strapped local governments fail to provide adequate services. Rapid demographic change overwhelms their school districts and neighborhoods. Manufacturers announce layoffs and retail malls close.
The collective effect erodes any semblance of social cohesion. In these communities, individuals live in the same neighborhoods where their immigrant Catholic families arrived a century before. But the jobs are gone and their parish is consolidated or closed. These Democratic strongholds supported the labor movement, disproportionately served in the First and Second World Wars, and rallied around Kennedy in the hope of cultural vindication. This coalition remembers when Kennedy famously asked what Americans could do for their country. After reading this masterful study, one cannot help but ask what the current Democratic party has done for working-class Catholics.
Charles F. McElwee III works in the economic development sector in northeastern Pennsylvania.
The Past and Future City: How Historic Preservation is Reviving America’s Communities, Stephanie Meeks, Island Press, 352 pages.
In the mid-1960s, San Francisco enjoyed a blue-collar existence. It remained a city where Sicilian fishermen caught salmon, painted boats, and repaired nets along the pier. Middle-class families contentedly resided in ethnic neighborhoods that could easily blend into any East Coast city. Joe DiMaggio, long retired from the New York Yankees, carried out an enigmatic existence running a restaurant along Fisherman’s Wharf.
DiMaggio’s retreat to San Francisco inspired legendary journalist Gay Talese to visit the baseball legend and his home terrain in 1966. When Esquire published Talese’s classic piece, “The Silent Season of a Hero,” San Francisco was already distinguishing itself as a tourist attraction and haven for the hippie counterculture. But the city had yet to experience the shifting socio-economic fault lines that now make his portrayal seem more like a time capsule than a timeless cinematic presentation.
In today’s San Francisco, the idea of enjoying a reasonable financial existence is foreign to the descendants of immigrants who flocked to the Bay Area in the late 19th and early 20th centuries. Google’s private buses shuttle tech bros down to Silicon Valley in a city where the unrelenting pace of gentrification and technology is erasing any sense of historical identity. To live a middle-class life in San Francisco is now a form of augmented reality. The city’s transformation, and the economic maladies that inevitably arose, is one of the subjects of Stephanie Meeks’ The Past and Future City, a manual describing the role of historic preservation in America’s cities.
Meeks explores how gentrifying neighborhoods imperil heritage businesses and displace middle-class and immigrant families—disregarding the prescriptions of diversity from Jane Jacobs’ influential 1961 book, The Death and Life of Great American Cities. According to Meeks, commercial rents in San Francisco have risen 250 percent since 1999. In 2014 alone, an estimated 13,000 businesses closed, including 4,000 that had operated for over five years. While San Francisco has responded by creating a Legacy Business Preservation Fund, which provides grants to businesses and nonprofits that have a historical impact on local neighborhoods, the city has yet to resolve the problem of balancing gentrification with making the city financially habitable for all residents.
Meeks, who co-authored the book with Kevin Murphy, oversees the National Trust for Historic Preservation. During her tenure, the organization has worked to revitalize communities and create programs like National Treasures, which identifies threatened historic places and aggressively works to save them. Meeks admirably reminds readers that preservation is an issue far more complex than just rescuing old buildings. Instead, saving places is about “defin[ing] a community so that future generations can know their past, feel a connection to those who came before, and build a foundation for the future.”
The book is frequently a lamentation of the casualties of gentrification. As major cities celebrate a renaissance of new commercial and residential development, long-time residents mourn the loss of storied bars, diners, and stores. To express grief over a closed coffee shop or record store is understandable, but it is now fashionable to grow nostalgic over the squalor that once defined today’s thriving neighborhoods. Many baby boomers, harkening back to the ‘70s and ‘80s, habitually romanticize the stomach-churning sense of apprehension one once felt walking to a row-home apartment or passing seedy bars and theaters. In hindsight, this atmosphere allegedly sparked creativity.
Publishers have embraced a literary niche of memoirs mourning how gentrification sterilized the grime and extinguished the last vestiges of baby boomer youth. Mayor Abe Beame’s New York is usually the stage for these autobiographical expressions of sorrow. Whether it’s Patti Smith’s Just Kids or James Wolcott’s Lucking Out, we’re told that art, love, and happiness thrived in a downtown Manhattan that featured crumbling tenement buildings, lurking muggers, and drug addicts, along with a municipal balance sheet full of red ink.
None of these books would exist if that metropolitan Dark Age had endured. The luxuries of safety, stability, and success allow these writers to regret whatever blight and decay wasn’t preserved. What stands out among this canon is Anatole Broyard’s Kafka Was the Rage, a reminiscence of New York just before the nadir of the 1970s. Broyard, who wrote for the New York Times, movingly described Greenwich Village as it existed immediately following World War II, when returning soldiers—armed with the G.I. Bill—flocked to the neighborhood to learn, create, and debate in a bohemian ecosystem.
“Though much of the Village was shabby, I didn’t mind,” recounted Broyard. “I thought all character was a form of shabbiness, a wearing away of surfaces. I saw this shabbiness as our version of ruins, the relic of a short history. The sadness of the buildings was literature. I was twenty-six, and sadness was a stimulant, even an aphrodisiac.”
Of course, to be young and comfortably enjoy artistic pursuits in today’s Village is a fantasy. But should we condemn the unforeseen consequences of gentrification? Concerns about its impact on city neighborhoods date back to the 1960s. Losing a small business or facing an apartment eviction are tragic outcomes, but do native, poorer residents really lament cleaner and safer streets?
Meeks cites studies from the University of Washington and Columbia University that found poorer residents less inclined to flee a neighborhood undergoing gentrification or experiencing higher rents. “The most plausible interpretation,” wrote Columbia’s Lance Freeman and Frank Braconi, “may be the simplest: As neighborhoods gentrify, they also improve in many ways that may be appreciated as much by their disadvantaged residents as by their more affluent ones.”
Arguably, it’s through gentrification that preservation prevails in city neighborhoods. An investor is more inclined to restore an aging corner store into a microbrewery if the neighborhood sheds its unpleasant past. In American cities lucky enough to experience gentrification, the survival of historic buildings often depends on a neighborhood’s transformation.
Unfortunately, a healthy city doesn’t always translate into municipal leaders protecting historic buildings and neighborhoods. Instead, the mutually dependent forces of globalization and technology threaten our traditional concept of cultural or architectural heritage. In San Francisco or New York, the explosion of private global wealth since the 1990s created a demand for premium housing. For Manhattan, this radical development resulted in the construction of narrow high-rises that puncture the island’s elegant skyline. But this dramatic change also expanded the boundaries of New York’s prosperity. Without gentrification, the blocks of structurally unique housing in Harlem, Park Slope, or Fort Greene would have fallen into disrepair.
Meanwhile, in Philadelphia, a generous tax abatement program—combined with flawed preservation policies—changed the landscape of Center City and its surrounding neighborhoods. During the past decade, the city experienced a demolition bonanza. In neighborhoods of historic housing like Powelton Village, developers “have discovered they can make a tidy sum simply by replacing one of these old houses with a stucco-clad apartment building and then cramming it with students,” observed the Philadelphia Inquirer’s architecture critic, Inga Saffron.
Preservationists have protested how Philadelphia’s demolition permits are approved. In the past year, two historic structures were razed for projects that never materialized. Currently, the preservation fight is focused on Center City’s Jeweler’s Row, a charming set of commercial properties on a block laid out in 1799. The historic storefronts, slated for demolition, would be replaced by a massive condo tower. Ill-advised demolition projects continue throughout Philadelphia, its future depending on an influx, however temporary, of millennials and students.
Cities like Philadelphia risk losing what shaped their identities, but their economic perseverance and favorable location at least provide prospects for preservation-minded investment projects. In The Past and Future City, Meeks’ greatest shortcoming is her failure to address how preservation can succeed in smaller Rust Belt cities confronting economic decline, urban blight, and rapid demographic and cultural change. After all, any of these post-industrial hubs would welcome the fruits of gentrification.
In recent months, field producers for cable news networks have flocked to these cities, attempting to document how Donald Trump triumphed in this so-called “flyover country.” Arriving like colonial administrators, they snap photos of dilapidated buildings and shuttered factories before heading to the nearest bar or fast-food restaurant to interview natives about how they endure in places overcome by urban decline.
Revitalization and reinvestment shouldn’t be confined to coastal cities. At a time when New York and Philadelphia regret what gentrification wrought, smaller urban pockets aggressively search for ways to make such blessings occur. Unfortunately, these communities do not enjoy the socio-economic advantages that allowed larger cities to overcome urban decay.
In countless smaller cities, multi-generational families reside in neighborhoods overwhelmed by blight and dysfunction. In 1982, George L. Kelling and James Q. Wilson famously attributed urban decay to modern mobility patterns and the retreat of police authority over vagrancy. They noted that before World War II, “City dwellers—because of money costs, transportation difficulties, familial and church connections—could rarely move away from neighborhood problems.” This forced residents to reclaim normalcy on their city streets.
We have returned to that pre-World War II trend, but without the institutions that once held these communities together. Stagnant wages leave many no alternative but to live in neighborhoods overwhelmed by drugs, gangs, and blight. Rapid demographic and economic change—accompanied by a fragmenting civic culture—make it difficult to apply the lessons of historic preservation in these cities. Magnificent old churches are shuttered or demolished. Stunning commercial or industrial buildings—the legacy of architects trained in New York or Paris—fall into disrepair. The fading vitality of a city’s physical surroundings becomes emotionally paralyzing. As Pete Hamill reminds us: “Nostalgia is genuine—you mourn things that actually happened.”
If applied properly, practical attempts at preservation could protect architectural treasures, build community morale, and improve overall neighborhood-level sociability. But gentrification may never arrive in these communities. While Meeks discusses the challenges of gentrification in major cities, she does not explain how post-industrial regions can embrace preservation without economic development as part of the cycle of urban revitalization.
For Meeks, a critical preservation tool is flexible or mixed-use zoning, which permits historic structures to be readily converted through adaptive reuse. But this could easily backfire in smaller cities. As Nicole Stelle Garnett wrote in her brilliant work on land use policy, Ordering the City, mixed-use zoning “might lead some low-income entrepreneurs to establish the types of businesses equated with urban decay.” In other words, a city may develop a comprehensive plan that encourages mixed-use zoning, but the resulting pawn shops, used-car lots, and check-cashing stores would destroy the character of whichever neighborhood was targeted for preservation.
Smaller cities and towns boast architectural, cultural, or industrial legacies that could unite communities, encourage preservation, and, if properly marketed, spur revitalization. But preservation transcends saving an old office building, factory, or church. It’s no longer decay that risks deconstructing our urban landscapes. Empty pews, folded newspapers, and defunct social clubs remind us that the foundation of America’s civic culture is collapsing.
As we retreat to a parallel existence through pixelated screens, our society becomes disengaged from the pillars of heritage that created communities, sustained neighborhoods, and brought families together. A steeple, marquee, or bank entrance tells a story. How we preserve that story remains to be seen. At the very least, repairing our civic culture could save many struggling cities and towns that still nurse an acute sense of culture and history. But gentrification is an unavoidable step in embracing the neighborhoods of our forefathers. It’s only then that we can come to love these cities as William Faulkner loved his native Mississippi—in spite of, not because.
This article was made possible with support from the Richard H. Driehaus Foundation.
Writing in his memoir about the literary avant-garde of Paris and New York in the 1950s and 1960s, Richard Seaver lamented the imperfections of memory. The late publisher and editor cautioned that “Time is not kind to the harried mind.” Seaver explained that he “tried to tell these tales as they were, accurately and fairly, and if I have erred or memory has betrayed, blame it on those Irish genes my wife has encountered over the years.”
It’s a safe assumption that the once prominent American writer John O’Hara would never issue such an apologia. Irish genes predispose their recipients to hyperbole and embellishment, but O’Hara wasn’t a carrier of this ancestral foible. His meticulous devotion to detail produced novels and short stories that captured the cultural temperament and consumer preferences of America between the 1880s and World War II. He didn’t want us to forget either, as his epitaph suggested: “Better than anyone else, he told the truth about his time. He was a professional. He wrote honestly and well.”
O’Hara died in 1970, but the imperfections of literary appreciation—accompanied by the unremitting passage of time—have sentenced the author’s work to a cruel purgatory of faded memory. The latest attempt to revive O’Hara’s legacy comes through the Library of America, which recently released a volume of sixty stories. The collection, compiled by the New York Times’ Charles McGrath, reminds us that O’Hara remains the Bard of the American Short Story.
Once known as the “master of the fancied slight,” O’Hara has suffered a posthumous assault for his grudges, arrogance, and alcohol-induced temper. His contemporaries in mid-20th century Manhattan acknowledged his talents but denied him absolution. And so O’Hara’s penance lingers with each passing decade, with no revival restoring his invaluable and boundless literary contributions.
As critics frantically search for historical and literary clarity during this newly christened Trump era, John O’Hara’s work is a necessary start. His stories provide a factual account of the period of unparalleled growth during the late 19th and early 20th centuries. At a time dominated by Snapchat, where people express themselves without reflection, O’Hara offers a corrective glimpse into how people actually think, feel, and experience the extraordinary and ordinary of everyday life.
Born in 1905, O’Hara did not nurture his anthropological sense of place on the tension-soaked streets of some coastal Eastern metropolis. His brilliant gift for observation and ear for dialogue germinated instead in Pottsville, a small city tucked away in the anthracite coal region of Schuylkill County, Pennsylvania. It’s impossible to understand O’Hara without understanding his hometown.
Pottsville was the commercial center of the region’s southern coal field. Approaching O’Hara’s hometown on Route 61, one can recognize the important role this city once played during the height of the Industrial Revolution. Its street grid, draped over dramatic hills, supports blocks of pre-World War II architecture, structures dressed up in the ornate styles that only Gilded Age wealth could build. Overlooking the city of steeples and Federalist townhomes is the Schuylkill County Courthouse, a massive Queen Anne edifice that conceals the darkest chapter in the region’s history.
Located behind the courthouse is the county prison, a brownstone castle façade dating back to 1851. It was here, on June 21, 1877, that six men were executed for their alleged acts of violence against mine bosses. The men were believed to be members of the Molly Maguires, an oath-bound Irish secret society with agrarian roots in Ulster.
During the 1860s and ’70s, Irish laborers protested the medieval conditions of the mines and company towns. A series of heinous crimes were committed against Welsh and English mine bosses, and the acts were subsequently tied to these Irish laborers. The powerful Philadelphia & Reading Railroad, which controlled Schuylkill County’s mines and railroads, hired a Pinkerton detective to uncover the alleged Mollies’ crimes. Twenty Irishmen were placed on trial and convicted by mostly German-speaking juries. The hanging of six men in Schuylkill County was part of the largest mass execution in Pennsylvania history.
Historians continue to debate the very existence of the Molly Maguires. The historian Kevin Kenny wrote that the Mollies “have been depicted in every imaginable way, from sociopaths and terrorists at one end of the spectrum to innocent victims and proletarian revolutionaries at the other.” Regardless, any association with the Mollies or Ancient Order of Hibernians risked excommunication from the Roman Catholic Church. Archbishop James F. Wood, a convert from a patrician Philadelphia family, frightened the region’s Ulster and Connaught immigrants with this threat, and so the very mention of the Mollies remained verboten through the early 20th century.
The Mollies’ legacy still haunted the region during O’Hara’s youth, a time when he digested the stark and latent cultural complexities of his surroundings. A devout social historian, O’Hara recounted the ethno-religious tensions in his “Gibbsville” stories, the fictional name he pinned on his hometown in honor of The New Yorker’s Wolcott Gibbs. In just one example, “Afternoon Waltz,” O’Hara depicts the unspoken strain between an Irish-Catholic maid and her Welsh-Methodist host family. In spite of the divisions that the Mollies created among the Irish, O’Hara wrote that the “non-Mollies were loosely united with the Mollies by the common enemy, the Welsh Protestants, and when Sarah Lundy first went to work for the Evans family neither party could guess how the arrangement would work out.”
When not deciphering the socio-economic fault lines of his native geography, O’Hara revealed the endless flaws and disappointments of characters in Hollywood, Philadelphia, and New York. But even setting aside O’Hara’s undervalued novels, his richest contribution remains the “Gibbsville” stories, a number of which can be found in the Library of America collection. Although the stories bracket a unique period of American society, O’Hara’s themes remain timeless through his evocative character portrayals. The stories grapple with money and class, with O’Hara exposing the resentment, perversion, insecurity, and cruelty of humanity as it reaches, retains, or loses its aspirations.
O’Hara’s stories were not clever literary inventions, but accurate depictions from his own memory and experience. He was raised in a “lace-curtain” Irish-Catholic family, growing up in a spacious rowhome—across from the Yuengling Brewery—on Pottsville’s fashionable Mahantongo Street. The O’Haras enjoyed a better place in the city’s social hierarchy than fellow members of their tribe, but their position remained tenuous. His father, Dr. Patrick O’Hara, was known as “the Irish doctor,” a founder of the city’s Catholic hospital, and a legendary surgeon for victims of mining accidents.
The Molly Maguire episode, bouts of labor unrest, and a growing Democratic Party induced a lingering anti-Catholicism and prejudice against the Irish. The Klan-fueled defeat of Al Smith in the 1928 presidential election only amplified this prejudice. But the O’Haras enjoyed their place in society, and whatever subtle hostility existed was checked by their financial security.
As a boy, O’Hara enjoyed all the material possessions of a young country squire. If anything, O’Hara matured in a city where groups were segregated by socio-economic standing. But a simple stroll along Centre Street exposed O’Hara to the WASP elite, Pennsylvania Dutch merchants, poor southern and eastern European immigrants, and the struggling working-class members of his own tribe. It was through Pottsville that O’Hara learned how financial predestination determined not only where a person lived, but which college he attended and clubs he joined. The alternative was a life of factory jobs, too many children, and eternal economic insecurity.
As a young man, O’Hara was considered a snob, and a preference for riding boots didn’t help his reputation. But his social standing and material preferences didn’t discourage the boy O’Hara from playing with all children, regardless of background. O’Hara demonstrated this “spurious democracy” in his 1934 masterpiece, Appointment in Samarra.
The novel, set in Gibbsville in 1930, follows the downfall of Julian English over a three-day period during Christmas. O’Hara occasionally flashes back to Julian’s earlier years, such as when the sons of the wealthy and poor would play a ball game. O’Hara delineated what was left unsaid during ball games: “you did not talk about jail, because of Walt’s father; nor about drunken men, because there was a saloon-keeper’s son; nor about the Catholics, because the motorman’s son and one bookkeeper’s son were Catholic. Julian also was not allowed to mention the name of any doctor.”
O’Hara refined his craft when driving his father to house calls in the patch towns. While waiting in the car, he read F. Scott Fitzgerald, Sinclair Lewis, and Booth Tarkington. An early immersion in realistic character development, along with careful listening to how people spoke, inspired stories like the autobiographical “The Doctor’s Son.”
In life and fiction, O’Hara was paralyzed by his preoccupation with class. This yearning for recognition stemmed from a heavy dose of misfortune and self-destruction in early life. As a teenager, he floundered in prep school after prep school, until finally excelling and becoming valedictorian at Niagara University Prep. Approaching the academic finish line, O’Hara thought he might fill a slot at his beloved Yale. But a drunken night on graduation eve resulted in expulsion and shattered dreams. O’Hara’s misfortune continued with his father’s early death. Among Dr. O’Hara’s last words were “poor John,” and a warning that his son had a spot on his lung. The father died without a will, leaving his family with almost no money.
After a faltering newspaper career, O’Hara’s misfortune slowly turned in New York, where he befriended members of a famed literary collective, the Algonquin Round Table. He is best remembered for his long and turbulent career with The New Yorker, which ran most of his short stories. But O’Hara’s fame never made him feel fully accepted into the club. O’Hara’s character flaws were indisputable, and yet his Irish Catholicism—no matter how hard he worked to drop it—was likely the biggest reason he only achieved partial membership in the Protestant-dominated establishment. When reporting for Time, O’Hara commented that his editor, Henry Luce, always gave him “that Protestant look.” He expressed this bitterness in his writing as well: “We’re Micks, we’re non-assimilable, we Micks.”
At a time when a large proportion of the population once again seems to feel alienated from the establishment, the characters of Gibbsville and beyond remind us that geographical and class divisions have long been part of the American story. And that is why it’s time to read O’Hara again.
It’s been more than a quarter-century since Harry “Rabbit” Angstrom, the protagonist of John Updike’s sweeping quartet of middle-class life in America, died in the final novel of the series, Rabbit at Rest. In Rabbit, Updike presented an everyman who inelegantly navigated the political, social, and economic coordinates of his time. A glance at a newspaper headline, a song overheard on the radio, the survey of a changing neighborhood—these were the plot elements that directed Rabbit’s dysfunctional march into modern times. Revisiting Updike’s Rabbit novels is a rendezvous with prescience, for no collection of postwar fiction could help us better understand how working-class populism—in the form of Donald Trump—prevailed on Election Day 2016.
In the course of four decades (1959–1989), we read how Rabbit travels through life in a fictionalized Brewer (Reading), Pa. We view him as a jaded ex-high school basketball star, an instigator of dysfunction, an inheritor bound for excess, and finally, a middle-aged man overcome with nostalgia. Rabbit’s life parallels the political and social milieu of postwar America, whether it’s rebellion against conformity in the 1950s (Rabbit, Run), racial conflict and cultural anarchy in the 1960s (Rabbit Redux), financial excess in the late 1970s (Rabbit Is Rich), or uncertainty about the country’s future in the late 1980s (Rabbit at Rest).
We learn how Rabbit’s political thoughts evolve throughout Updike’s tetralogy, a time capsule of how Americans responded to the current events of their time. Rabbit and his Diamond (Berks) County ilk are conservative Democrats, products of the New Deal who support entitlements, defend Vietnam, possess an unbending patriotism, question their country’s economic future, and nurture a working-class intuition. “I’m a conservative,” Rabbit proclaims in Rabbit Redux. “I voted for Hubert Humphrey.”
Although Rabbit supported Humphrey in 1968, he later has a “Reagan Democrat” conversion, voting for George H.W. Bush in the final novel. If anything, he’s the fictional embodiment of a political prototype, a cross-party coalition infuriated by the loss of what communities like Brewer once symbolized: economic prosperity and a shot at a stable middle-class American life. The Rabbit novels could serve as the fictional companion to any social-policy book by Charles Murray. The realism of Updike’s characters and plot lines is a tribute to his understanding of this durable voting bloc, one that determined Hillary Clinton’s fate.
Updike’s insight stems from his childhood, rather than some removed anthropological immersion. A native son of Berks County, Updike was the product of a region he fondly profiled throughout his literary work. As Adam Begley wrote in his biography of Updike, he was “very much preoccupied with Berks County (Plowville, Shillington, Reading), both in his stories and in his novels; it’s not too much of a stretch to say that he lived there most mornings.” Updike’s knowledge produced not only groundbreaking fiction, but an accurate portrayal of his home base.
A short drive from Philadelphia, Berks County is a magnificent vista of sprawling lowland underlain by rich soil and limestone. Its valleys contain meandering streams, tributaries of the Schuylkill River, once a canal system that delivered anthracite coal from the state’s northeastern cities and towns to Philadelphia. The broad line of the Blue Mountain ridge is a constant presence, while gently sloping hills bound the county’s southern tier.
The urban focal point is Reading, the county seat and Pennsylvania’s fifth largest city. From a distance, the city is an impressive layout of turreted row homes and Romanesque churches, their colorful brick and stonework amplified or shadowed by Mount Penn. In the city’s center is the courthouse, an Art Deco structure strikingly reminiscent of Los Angeles City Hall. A bridge on Reading’s Penn Street links the downtown to its leafy suburbs, towns like Wyomissing or West Reading that could easily blend in to New York’s Westchester County.
Berks County is also the cultural pulse of the Pennsylvania Dutch, a group of German-speaking immigrants who settled in the south-central and eastern portions of William Penn’s colony in the early 18th century. They settled not only in farming communities, but also in Reading, which was far more ethnically homogenous and Protestant compared to most urban centers. The passage of time never diluted their heritage, a tenacious combination of conservatism, family tradition, geographical contentment, and Protestant sensibility. As a political force, they tended to deliver the county to Republican candidates at the presidential level, while supporting Democratic candidates for lower offices. Their support for labor often resulted in perplexing electoral outcomes. During the Depression, Reading was the nation’s only city whose council was completely controlled by the Socialist Party.
It’s upon closer inspection that one can discern the socioeconomic dynamics of Updike’s literary protectorate. Reading’s impressive architecture fails to conceal the city’s many challenges, from strained municipal resources and corrosive blight to rampant crime and drug use. Once a hub for railroads and textiles, Reading now confronts the realities of urban decline. During Updike’s youth, flashing theater marquees, lively nightclubs, and bustling hotels defined “The Pretzel City.” But this reputation had faded by the time Updike described his hometown in the 1970s and 1980s. By 2011, Reading had earned the unfortunate distinction of being the country’s poorest city, with U.S. Census Bureau data showing that its population of 88,000 contained the largest share of residents living in poverty.
A few weeks before the election, the Wall Street Journal’s Bob Davis and Gary Fields profiled Berks County’s fraying social institutions in their “Great Unraveling” series. They noted the county’s 30 percent decline in manufacturing jobs, compounded by a 6 percent decrease in inflation-adjusted median income since 1995. They also noted the changing demographics, with Latinos comprising a majority of Reading’s population and the white working class moving into its sprawling suburbs.
In April, Donald Trump and Bernie Sanders carried Berks County in their respective parties’ primaries. At a Sanders rally in Reading before the primary, a New York Times reporter asked a retired steelworker whom he’d support in a Trump-Clinton matchup. He responded, “I would probably go for Donald Trump.” A drive along Interstate 78 in the succeeding months clearly signaled where the county’s political preference was heading: countless Trump signs adorned the barns and equipment of farms lining the highway. Trump ultimately won Berks County, significantly outperforming Romney’s 2012 showing there and nearly matching Barack Obama’s winning margin from 2008.
If Updike’s novels taught us anything, it’s that the Trump coalition is the consequence of the frustrations that had been budding since the Rabbit era. The problems that Rabbit encountered were decades in the making; their populist strand just happened to reach its explosive point at the polls this November. Trump’s voters represent an economic bracket burdened by financial insecurity, negatively impacted by trade deals, and resentful over current immigration policies. They’re also voters who, like Rabbit, still constitute the largest demographic group in the U.S. But as David Frum wrote earlier this year, these “noncollege-educated people of European descent” are “not only earning less than a generation ago, they are marrying less, raising more children outside marriage, taking more drugs, and dying earlier.”
In places like Berks County, Trump’s supporters personified a labor movement comprising Democrats and Republicans who were devoid of ideology and believed Hillary Clinton represented the policymakers who had eroded the state’s working class. They’re voters who drive through their hometowns, see the structural carcasses of steel mills and textile factories, and lament the shuttered churches and dilapidated homes that once provided spiritual and physical shelter for their ancestors. They’re witnesses to how communities like Reading have suffered from decades of job loss and rising poverty. They’re the reason Trump became the first Republican to win Pennsylvania since 1988.
Updike passed away only a week after Barack Obama’s presidential inauguration in 2009. The author’s departure from this world, where he sought to “give the mundane its beautiful due,” robbed him of the chance to observe the economic tumult, technological change, and political gridlock that ultimately led to Trump’s presidential victory. We’re left wondering how this chronicler of post-industrial America would have interpreted the Obama years and the subsequent populist revolt. Would Updike have created a fictional character, like Rabbit, who listlessly drifts through the unrelenting advance of automation and globalization and the resulting erosion of working-class life? In a year of unforeseen events and surprising political developments, this time Rabbit’s fellow Pennsylvanians didn’t run—they voted.
Charles F. McElwee III works in the government affairs sector in Harrisburg.