Raise your hand if you’re a conservative who has cited Edmund Burke without actually having read him closely.
Really—you’re all scholars of the Irish-born MP and oft-celebrated “father of modern conservatism”?
Okay, what did Burke mean by the phrase “the little platoon”?
Yuval Levin explains in his wonderful new book The Great Debate: Edmund Burke, Thomas Paine, and the Birth of Right and Left:
The division of citizens into distinct groups and classes, Burke writes, “composes a strong barrier against the excesses of despotism,” by establishing habits and obligations of restraint in ruler and ruled alike grounded in the relations of groups or classes in society. To remove these traditional restraints, which hold in check both the individual and the state, would mean empowering only the state to restrain the individual, and in turn restraining the state with only principles and rules, or parchment barriers. Neither, Burke thought, could be stronger or more effective than the restraints of habit and custom that grow out of group identity and loyalty. Burke’s famous reference to the little platoon—“To be attached to the subdivision, to love the little platoon we belong to in society, is the first principle (the germ as it were) of public affections”—is often cited as an example of a case for local government or allegiance to place, but in its context in the Reflections, the passage is very clearly a reference to social class.
Still feeling Burkean? Ready to go the pipe-and-slippers, Brideshead cultist route and declare yourself a loyal subject of the queen?
Levin reminds us that the context in which Burke wrote those words was a long-running intellectual dispute with an English-born radical, a man who was cheering on the secular revolution in France—and, oh, by the way, also one of the forefathers of our own revolution, favored by none other than Ronald Reagan himself—the Common Sense and The Crisis pamphleteer Thomas Paine.
That the rivalry between Burke and Paine cuts both ways through our hearts—this is precisely the kind of dialectic, if you will, that Levin hopes to provoke in the reader.
Make no mistake, though; Levin is a Burkean. In fact, he is the most eloquent exponent of Burkean conservatism, properly understood, since George Will in his 1983 Statecraft as Soulcraft.
While scholarly and measured in tone, The Great Debate is a readable intellectual history that fairly crackles with contemporary relevance.
Indeed, The Great Debate is the must-read book of the year for conservatives—especially those conservatives who are profoundly and genuinely baffled by the declining popularity of the GOP as a national party. How can America, these conservatives ask, the land of the rugged individual, the conquerors of the frontier, choose statism and collectivism over freedom and liberty?!
Levin’s book provides the answer: You’re looking at the Democratic Party all wrong. It’s just as individualist as you are—maybe more so.
And that is the problem!
When, after the massacres at Newtown and the Washington Navy Yard, Republicans refused to outlaw the AR-15 rifle or require background checks for gun purchasers, we were told the party had committed suicide by defying 90 percent of the nation.
When Republicans rejected amnesty and a path to citizenship for illegal aliens, we were told the GOP had just forfeited its future.
When House Republicans refused to fund Obamacare, the government was shut down and the Tea Party was blamed, word went forth: The GOP has destroyed its brand. Republicans face a wipeout in 2014. It will take a generation to remove this mark of Cain.
Eight weeks later, Obama’s approval is below 40 percent. Most Americans find him untrustworthy. And the GOP is favored to hold the seats it has in the House while making gains in the Senate.
For this reversal of fortunes, Republicans can thank the rollout of Obamacare—the website that does not work, the revelation that, contrary to Obama’s promise, millions are losing health care plans that they liked, and the reports of soaring premiums and sinking benefits.
Democrats, however, might take comfort in the old maxim: If you don’t like the weather here, just wait a while.
For, egged on by Bibi Netanyahu and the Israel lobby AIPAC, the neocons are awaiting the return of Congress to start work on new sanctions on Iran. Should they succeed, they just might abort the Geneva talks or even torpedo the six-month deal with Iran.
While shaking a fist in the face of the Ayatollah will rally the Republican base, it does not appear to be a formula for winning the nation.
By 1968, Walter Lippmann, the dean of liberal columnists, had concluded that liberalism had reached the end of its tether.
In that liberal epoch, the 1960s, the Democratic Party had marched us into an endless war that was tearing America apart.
Lyndon Johnson’s Great Society had produced four “long, hot summers” of racial riots and a national crime rate that had doubled in a decade. The young were alienated, the campuses aflame.
Lippmann endorsed Richard Nixon.
For forty years, no unabashed liberal would be elected president.
Jimmy Carter won one term by presenting himself as a born-again Christian from Georgia, a peanut farmer, Naval Academy graduate and nuclear engineer. Bill Clinton ran as a centrist.
So toxic had the term “liberal” become that liberals dropped it and had themselves rebaptized as “progressives.”
Barack Obama, however, ran unapologetically as a man of the left. An opponent of the Iraq war, he had compiled a voting record to the left of Bernie Sanders, the socialist senator from Vermont.
And Obama proudly placed his signature achievement, Obamacare, right alongside, and in the tradition of, liberal giants FDR and LBJ.
This is the new progressivism of the 21st century, Obama was saying, and I the transformational figure who will usher in the post-Reagan era. Where Clinton failed, I will succeed.
But now that Obamacare is coming to be perceived as a political catastrophe, not only does it threaten Obama’s place in history, it could invalidate, indeed, eviscerate the defining idea of the Democratic Party itself.
The Senate majority leader and 51 other Democrats voted today to change the chamber’s rules—departing from over 200 years of tradition—to make a 51-vote majority sufficient to confirm most presidential nominees, though notably not those for the Supreme Court. It’s arguably the biggest change to the institutional character of the Senate since the ratification of the 17th amendment. Consensus-seeking collegiality had already broken down, but now the formal incentive to seek it is gone. That incentive, it need hardly be said, hadn’t proved very effective lately.
In theory you might now get more intensely partisan nominees and greater polarization of the federal bench, as well as cabinets that are even less aware of the need to build more-than-majority support for policies that affect everyone. American government has generally not been a thing of bare majorities, which tend to be unstable, and the political problems of Obamacare—quite apart from its policy and technological ones—are a product of a short-lived Democratic majority trying to make a great change without securing even grudging support from the other side. This all-at-once approach cost the Democrats their House majority in the following election and set the stage for years of acrimony.
Other changes in the way the Congress operates, under Republicans as well as Democrats, have similarly chipped away at the consensual—or at least supermajoritarian—character of the federal government: without earmarks and the old committee system, the wheeling and dealing that could be used to build consensus across party lines has been made much more difficult. Even earlier, reforms to the appropriations process in 1974 paved the way for Congress to become an “incompetent bureaucracy,” as this Bruce Bartlett piece explains, unable to pass budgets in anything like a reliable manner.
Reid’s change to the filibuster rules, and the conditions that prompted it, are another chapter in a decades-long tale of institutional decay. The book still has many pages to go, but one can guess how it ends. In 1776 Americans were so passionate about the idea of representative government that they were willing to fight a revolution for it. Nowadays Congress is lucky if its approval ratings hover just above the single-digit range.
Had there been no Dallas, there would have been no Camelot.
There would have been no John F. Kennedy as brilliant statesman cut off in his prime, had it not been for those riveting days from Dealey Plaza to Arlington and the lighting of the Eternal Flame. Along with the unsleeping labors of an idolatrous press and the propagandists who control America’s popular culture, those four days created and sustained the Kennedy Myth.
But, over 50 years, the effect has begun to wear off. The New York Times reports that in the ranking of presidents, Kennedy has fallen further and faster than any other. Ronald Reagan has replaced him as No. 1, and JFK is a fading fourth.
Kennedy is increasingly perceived today as he was 50 years ago, before word came that shots had been fired in Dallas. That he was popular, inspirational, charismatic, no one denied. But no one would then have called him great or near great. His report card had too many C’s, F’s and Incompletes. His great legislative victory had been the passage of the Trade Expansion Act of 1962. His tax cut bill was buried on the Hill. His triumph had been forcing a withdrawal of Soviet missiles from Cuba. But we would learn this was done by a secret deal for the withdrawal of U.S. missiles from Turkey and a secret pledge not to invade Cuba.
And after the missile crisis, Bobby Kennedy pushed the CIA to eliminate Castro, eliciting a warning from Fidel that two could play this game. Lyndon Johnson said that under the Kennedys, the CIA had been running “a damned Murder Inc. in the Caribbean.” What caused Nikita Khrushchev to think he could get away with putting rockets in Cuba? His perception that JFK was a weak president.
Kennedy had denied air cover for the Cuban patriots at the Bay of Pigs, resulting in the worst debacle of the Cold War. He was then berated and humiliated by Khrushchev at the Vienna Summit in June 1961. In August, Khrushchev built the Berlin Wall. Kennedy sat paralyzed. In September, Khrushchev smashed the three-year-old nuclear test-ban moratorium with a series of explosions featuring, at Novaya Zemlya, a 57-megaton “Tsar Bomba,” the largest man-made blast ever.
“Less profile, more courage,” the placards read.
In Southeast Asia, JFK had Averell Harriman negotiate a treaty for neutralizing Laos, resulting in Hanoi’s virtual annexation of the Ho Chi Minh trail through Laos into South Vietnam. Where Eisenhower had 600 advisers in Vietnam, JFK increased that number to 16,000 and gave his blessing to a generals’ coup in which our ally, President Ngo Dinh Diem, was assassinated. Then and there, Vietnam became America’s war.
Kennedy had made a famous phone call to Mrs. Martin Luther King during the 1960 campaign when her husband had been arrested. Yet, he kept his administration away from the March on Washington and directed J. Edgar Hoover to wiretap Dr. King to learn of his associations with Communists.
Since his death, Kennedy’s reputation has been ravaged by revelations of assignations and mistresses from Marilyn Monroe to Mafia molls to White House interns from Miss Porter’s School. All of this was covered up by his courtier journalists who would collaborate in perpetuating the Kennedy myth and collude in destroying their great hate object, Richard Nixon. Yet, contrast what Nixon did, with what JFK failed to do.
If you are literate today, it does not mean you can write — not even close to it in many cases. But if you were literate in 1863, even if you could not spell, you often could write descriptively and meaningfully. In the century and a half since, we have evolved from word to image creatures, devaluing the power of the written word and turning ourselves into a species of short gazers, focused on the emotions of the moment rather than the contemplative thoughts about consequences and meaning of our actions. Many everyday writers in the mid-19th century were far more contemplative, far more likely to contextualize the long-term meaning of their actions. They meticulously observed and carefully described because, although photography was the hot new medium during the Civil War, words remained the dominant way of communicating thought, memory, aspiration, hope.
Raasch’s theory is not a new one. Back in the 1980s, when the Internet was still in its primordial days and television was king, Neil Postman wrote Amusing Ourselves to Death. His book cautioned against the developing “Age of Show Business,” fed by television’s sensory, visual medium.
Postman believed three “ages” were prominent throughout information’s history: first, ancient oral cultures encouraged the preservation of information through spoken records and stories. When the printing press and writing became more prominent, oral cultures dissolved into the “Age of Exposition”: a time when written records were perceived as holding the greatest truth. Then as photography and videography developed, media began to change again—for the worse. Postman believed we would lose more than writing ability in the wake of the entertainment era: he warned of a depleting mental and emotional capacity. He believed we would become as obsessed with pleasure as the humans in Aldous Huxley’s Brave New World. Postman’s descriptions of distracted, sensationalistic consumers relate well to Raasch’s “species of short gazers.”
We can only imagine what Postman would have said about the Internet; perhaps Raasch gives us a taste when he describes it as “the Great Din”: “Today, throwing barbs and brickbats into the Great Din of the Internet has become as second nature as breathing … The Great Din requires no forethought, no real calculation of purpose or result, no contemplative brake, no need to seek angles or views beyond those that reaffirm or reassure what we think right now.”
Is this truly the future of media? Will we lose any true, deep, thoughtful communication in its havoc of pixels and pictures?
One interesting counter-opinion comes from former Daily Beast editor Tina Brown. Having recently left the world of journalism for event production, Brown has told reporters that she no longer reads magazines herself—in fact, she thinks “the whole writing fad is so twentieth century” (in the words of New York Magazine). But rather than warning of impending havoc and din, Brown calls people back to oral communication: “I think you can have more satisfaction from live conversations,” she said, adding that we are “going back to oral culture where the written word will be less relevant.”
If we experience the “death of writing,” as Raasch puts it, could we come full-circle and return to the age of oral communication? Will grandfathers sit down with their grandchildren and tell them stories, like our ancestors so long ago? One can only hope; but if such an experience were truly to flower from “The Great Din,” it would be rather surprising.
Did the Republicans get their butt kicked by shutting down the government and delaying the debt increase last month? On substance, Republicans in fact won, although there is a lurking danger in the budget settlement that could snatch defeat from the jaws of victory.
The real issue for the Democrats has always been the sequester. As the Washington Post’s Zachary Goldfarb noted, “Democrats hate the sequester because it’s basically the opposite of their vision of domestic investment,” in which the government experts know best how to allocate society’s resources. At the beginning of the year, President Barack Obama, Senate Majority Leader Harry Reid, and House Minority Leader Nancy Pelosi all set reversing the sequester spending cuts as Democrats’ top priority. Reid was especially incensed; he even blamed the president and himself for being “too lenient” in being suckered into the 2011 deal, and this time won a pledge from Obama to keep the White House out of the negotiations.
With Obama president for three more years and the Senate still under Democratic control, Reid held the whip hand. What could conventional Republicans do? With the president refusing to bargain at all, what would have happened if the Tea Party backbenchers in the Senate and House had not gone ahead with the shutdown and debt threats? The whole focus of this year’s debate would have been on overturning the sequester and how the evil Republicans were punishing the poor by refusing to do so, reprising the winning theme used successfully against Mitt Romney. The compliant mainstream media would have happily picked up their favorite theme for their favorite politicians and made the Republicans look just as bad as they actually did on the shutdown. The polls would have ended exactly the same, only the spending cuts would be gone.
“Maybe the folks in Washington, D.C., should tune in their TVs right now and see how it’s done,” said the big winner of Tuesday last.
“I did not seek a second term to do small things,” Chris Christie went on, but “to finish the job—now watch me do it.”
Humility is not the governor’s strong suit.
Yet Christie registered a remarkable victory. He won with 60 percent in a blue state, winning 55 percent of women, half of the Hispanic vote and 20 percent of African-Americans.
If he could replicate those New Jersey numbers nationally in 2016, Chris Christie would be elected president in a landslide.
“[T]his fellow is really on the right track,” says seven-term Sen. Orrin Hatch of Utah, “if the Republican Party is not too stupid.” To fill out Christie’s ticket in 2016, Hatch proposes Susana Martinez of New Mexico, who made eight campaign stops with Christie on Monday.
Democrats concur with Hatch. The headline on the lead story on page one of Thursday’s Washington Post reads: “Democrats Take Aim at Christie: He’s Seen as GOP’s Best Hope for 2016.”
“The Elephant in the Room” is the title of Time‘s cover story.
And with the corporate contributors and Beltway bundlers gravitating to him, Christie is emerging as the establishment’s hope to recapture the GOP from its Tea Party, libertarian, social conservative, and populist wing.
Will Christie be the candidate in 2016?
Put me down as a skeptic.
One of the landmark studies of America, published 120 years ago, is Frederick Jackson Turner’s essay “The Significance of the Frontier in American History.” Turner’s essay was inspired by a line that appeared in the Census report of 1890:
Up to and including 1880 the country had a frontier of settlement, but at present the unsettled area has been so broken into by isolated bodies of settlement that there can hardly be said to be a frontier line. In the discussion of its extent, its westward movement, etc., it can not therefore any longer have a place in the census reports.
Turner recognized that contained within this small passage of officialese was a momentous turning point in American history. The official announcement of the end of the existence of the American frontier marked “the closing of a great historic movement.” In Turner’s view, the existence of the frontier, and all that it entailed, constituted the deepest source of the American character—more than any other explanation, including even the Constitution.
Turner was a fervent Progressive during the years when Progressivism was gaining steam—indeed, he receives top billing in Richard Hofstadter’s study The Progressive Historians: Turner, Beard, Parrington. In Turner’s view, the role played by the frontier—and the type of values and attributes it fostered in Americans—was the root of the progressive thrust in American history, including, importantly in his view, the rise of the sense of American nationalism.
The wilderness has been interpenetrated by lines of civilization growing ever more numerous. It is like the steady growth of a complex nervous system for the originally simple, inert continent. If one would understand why we are to-day one nation, rather than a collection of isolated states, he must study this economic and social consolidation of the country.
What is striking in these and similar passages is how closely Turner’s analysis echoes the hopes and intentions of the Founders of whom the Progressives were often fervent critics. He particularly echoes the Hamiltonians who envisioned a “national system” that would draw the allegiances of people away from local and parochial identities through the soft but persistent pressure of a nationalizing economic and political order. Turner recognized that this thrust toward an increasing national identity would be achieved through the encouragement of the individualistic spirit of the American frontiersman. The John Wayne, Daniel Boone spirit of the self-standing, self-made, independent, free individual would, ironically, forge the conditions for a national identity and usher in the possibility and even necessity of the Progressive stage.
Bill de Blasio is mayor-elect of New York. According to many of de Blasio’s critics as well as his supporters, the unsurprising outcome of Tuesday’s election reflects a decisive turn in the city’s politics. The Nation claims that “Bill de Blasio’s exhilarating landslide victory over Joe Lhota in New York’s mayoral election offers a once-in-a-generation chance for progressives to take the reins of power in America’s largest—and most iconic—city.” In National Review, Kevin D. Williamson evokes John Carpenter’s B-movie classic, Escape from New York.
I say: not so fast. Neither turnout and polls nor de Blasio’s career so far support hope (or fears) that he’ll try to transform New York into Moscow on the Hudson. Mayor de Blasio will not cultivate the chummy relationship with Wall Street that Michael Bloomberg did. But there’s likely to be more continuity between their mayoralties than most people expect. In fact, that continuity is a bigger threat to the city’s future than the immediate collapse that some conservatives fear.
First, the election data. As Nicole Gelinas points out, de Blasio’s election was not the mandate for change that the margin of victory suggests.
Preliminary results show that about 1 million New Yorkers voted yesterday. That’s 13 percent lower than four years ago. Back then, remember, many voters disillusioned with the choices—Mayor Michael R. Bloomberg was running for a third term against the uninspiring Bill Thompson, Jr.—just stayed home. Turnout this year was as much as 30 percent below 12 years ago, when Bloomberg won his first victory. Voters supposedly so eager for change this year didn’t show that eagerness by voting. And a slim majority of the people who did vote—51 percent—told exit pollsters that they approve of Bloomberg, anyway. Though de Blasio’s victory margin was impressive, the scale of the win looks less stellar when put into recent historical context. As of early Wednesday, de Blasio had 752,605 votes—a hair shy of Bloomberg’s 753,089 votes in 2005.
So de Blasio did not win the votes of an unprecedented number of New Yorkers. And many of those who did vote for him also supported Bloomberg. That doesn’t mean that they like everything Bloomberg did. But there’s no evidence here of a progressive tsunami.
What about de Blasio’s career? The tabloid press paid a great deal of attention to de Blasio’s visits to communist Nicaragua and the Soviet Union as a young man. More recently, however, de Blasio worked as a HUD staffer under Andrew Cuomo, and as campaign manager for Hillary Clinton. De Blasio took liberal positions during his tenure on the city council, particularly on symbolic issues involving gay rights. But this is not the resume of a professional radical.
It’s true that de Blasio made “a tale of two cities” the central theme of his campaign. As many observers have pointed out, however, he lacks the authority to enact his signature proposals: a tax increase on high earners, to be used to fund universal pre-K. Nothing’s impossible, but the chances of the state legislature approving such a tax hike are slim. The same goes for several of de Blasio’s other ideas, including a city-only minimum wage higher than the state’s minimum and the issuance of driver’s licenses to illegal immigrants.
The real issues under the de Blasio administration will be matters over which the mayor has some direct control. That means, above all, contracts with city workers, and policing. Will de Blasio blow the budget to satisfy public employee unions? And will he keep crime under control after eliminating stop-and-frisk?
I’m cautiously optimistic about public safety. New Yorkers didn’t like living in fear, and de Blasio is smart enough to know that he’s finished if crime returns. Stop-and-frisk is not the only weapon in the NYPD’s arsenal.
The unions are a bigger problem. New York’s short-term fiscal outlook is reasonably good, so it’s hard to imagine de Blasio playing Scrooge to faithful supporters.
Even so, de Blasio’s mayoralty is unlikely to be a revolutionary moment. As Walter Russell Mead has argued, the biggest risk is that he pursues the same high-tax, high-regulation, high-service style of government on which Bloomberg relied. Those policies have helped transform much of New York into a gleaming shopping mall, which is admittedly a lot better than an urban jungle. But they are a bad strategy for prosperity in the years and decades to come.