A few people may be a little unclear about the argument of my last post on secession as a principle of liberty (or not, as I argue). Its inspiration was the fact that it seemed curious for Americans to long for Scottish secession when the Scots themselves had voted against it. Whatever was being expressed was not sympathy for the Scottish people, so what was it? The answer was a general case for secession as an inherently good thing, in radical libertarian theory, because it leads to smaller states, and maybe no states. I pointed out various problems with this notion, which seems to have greater emotional force than reasoning behind it.
Obviously there are cases in which secession or political breakup—we’ll get to the difference in a minute—can be good things. In the case of Czechoslovakia, division came peacefully and fulfilled a democratic-nationalistic wish on the part of the constituent peoples to govern their own affairs separately. The relationship between such democratic-nationalistic motives and the individual liberty that is dear to U.S. libertarians is complex, but it looks as if there are cases where the former doesn’t harm the latter and may even advance it. That does not mean the two things are always compatible, however, and nationalism—including of the Scottish variety—very often involves policies abhorrent to the advocates of free markets.
Ironically or not, criticizing a general enthusiasm for secessionism elicits a certain amount of patriotic ire from some individualists who play the American Revolution as their trump card. Is the argument that secessionism is generally good because the American Revolution was good, or is it that the American Revolution was good because secessionism generally is? There’s a difference here between constitutionalist libertarians, who take the former position, and radical libertarians, who take the latter. But in any case, both are wrong: questions of political union or breakup depend upon the particulars. The American Revolution didn’t derive its legitimacy from radical libertarian arguments about anti-statist secession, and the revolution doesn’t tell us anything about the merits of other breakaway movements.
The American Revolution itself is a very complicated thing and doesn’t in all senses belong in the category of secession. The Declaration of Independence, for example, describes the break from Britain in terms of a revolution in which an already separate people deposes its monarch and severs its ties with his government in another country. This is framed rather as if the relationship between the American colonies and Britain were akin to the relationship between Scotland and England before the Act of Union. Before that, Scotland and England had a single monarch but separate parliaments and governments—they were, constitutionally speaking, separate countries with a joint head of state. The American colonies had sound grounds for considering themselves a parallel example: they too had their own legislatures, even though they shared a king with the United Kingdom.
Note that had Scottish secession succeeded, it would only have undone the Act of Union while retaining the Queen as head of state—in other words, it would have put Scotland after secession in much the same constitutional position as the American colonies were in before the War of Independence, which is another reason to think “secession” is not the right word for what the colonists were doing. They were already outside of the Union of Great Britain and by their own lights were simply affiliated with it through a shared king. This, by the way, is why the Declaration of Independence is directed against the Crown rather than Parliament, even though by 1776 the latter made most policy decisions.
What the colonists had been demanding before they turned revolutionary was also something rather like what the Scottish got out of the Act of Union: representation in the Parliament of the United Kingdom. The colonists were aggrieved because the king was allowing Parliament, in which Americans were not represented, to set policy for the colonies, over the objections of the colonists’ own legislatures. Self-government was very much the crux of the Americans’ concerns, and it applied not only to legislatures and governors (who were often royal appointments) but also to church governance: even American Anglicans were quite Protestant in character, and they feared that the precedent of the king allowing a Catholic bishop certain authority in Quebec—which Britain had taken in the French and Indian War—would ultimately translate into the king sending Church of England bishops to America to take control of American church governance.
But there was a foreign-policy angle to the revolution as well, one that should disquiet patriotic libertarians who think of America as a freedom-loving republic fighting to separate itself from an evil empire. For the American colonists, Britain was not imperial enough, at least where the Indians were concerned. Britain had provided for the colonists’ security against the French and Indians, and the taxes Britain wanted to impose to help pay for that war were one of the things to which the Americans objected. But it wasn’t just the money: Americans were being taxed for an ongoing foreign policy that failed to do what many colonists dearly wanted to do—namely, seize more western land.
This is why the Declaration’s litany of grievances against the king ends with the claim that he “has endeavoured to bring on the inhabitants of our frontiers, the merciless Indian Savages, whose known rule of warfare, is an undistinguished destruction of all ages, sexes and conditions.” The British did not want Americans taking it upon themselves to settle the west, and they were not much inclined to extend military protection to settlers whose disregard for the frontier provoked Indian attacks. The Declaration is naturally couched in defensive terms, but what many colonists wanted wasn’t defense but a foreign policy that would enable private land-grabs and colonization efforts. It’s no accident that the one really big piece of legislation the U.S. managed to pass under the Articles of Confederation was the Northwest Ordinance. Colonizing the west was an imperative of building the American nation.
Mercenary and strategic motives are hard to separate: the “vacuum” of Indian territory would have been filled by an organized state or states sooner or later—if not by the U.S. then by a European colonial power or by Americans like Aaron Burr (or Sam Houston) carving out republics of their own, or by the Indians themselves forming a lasting, state-like confederation. Any of these alternatives would have had security implications for the Americans whether they were independent or part of the British Empire. Needless to say, the Americans had more freedom to set their own policies for conquest and defense alike if they were out from under the thumb of the king and his Parliament. The independent U.S. itself tried to regulate expansion into Indian territory—one of many respects in which the newly established federal government picked up where the British had left off—and tensions between a restraining central government and eager-to-colonize states and individuals continued. But ultimately an American central government was going to be more sympathetic to expansion than London was.
The security logic of the American Revolution is hard to argue with, and Americans certainly considered what they were doing to be extending freedom—their own, at least. But a libertarian today who wants to take a universal view of things has to see all this as something murkier than a victory for self-governing good against imperial evil. Had America remained British,
the slave trade might well have been abolished as soon and as peacefully as it was in the rest of the British Empire—the very fact that colonial slave-holders did not have formal representation in Parliament (the thing that Americans were clamoring for) was what allowed abolition to take place. Three wars might have been averted: the War of Independence; the unsuccessful U.S. war of conquest that followed in 1812; and the Civil War that arose as a result of a Constitution that divided sovereignty and left the question of slavery open. On the other hand, America would have had some involvement in the Napoleonic Wars—assuming there had been a French Revolution and a Napoleon in the first place without the American Revolution.
Such counterfactuals are troubling, and they can hardly be waved away with talk about American freedoms, such as free speech, that are actually British in origin, if more strongly established here. What would have been lost is not individual liberty but a model of republicanism that is so tightly entwined with the American psyche that in a real sense we would not exist as a people without it. But for that reason—the fact that this deeply, originally Protestant commitment to self-government has defined who we are since long before the revolution—independence was perhaps inevitable and would always have set the world on an unpredictable course.
Ron Paul has stirred a media buzz by praising Scotland’s secession effort—an effort the Scots themselves rejected. Dr. Paul’s views are shared by many libertarians and conservatives, as well as a few folks on the left. Americans tend to think of secession only in the context of our own Civil War, but most acts of breaking away from a larger political unit have nothing to do with chattel slavery. Unfortunately, they don’t necessarily have anything to do with individual liberty either.
“The growth of support for secession should cheer all supporters of freedom,” the former Texas congressman writes, “as devolving power to smaller units of government is one of the best ways to guarantee peace, property, liberty—and even cheap whiskey!” Alas, there’s reason to think otherwise, and not just because Diageo is a London-based multinational.
The specifically libertarian case for secessionism is manifold: in fact, it’s several cases for different things that may not add up to a coherent whole. First, there is the radical theory that secessionism in principle leads to free-market anarchism—that is, secessionist reduction of states to ever smaller units ends with reduction of the state to the individual. Second, there is the historical claim that smaller states tend to be freer and more prosperous. Third is the matter of self-determination, which is actually a democratic or nationalistic idea rather than a classically liberal one but historically has been admixed with liberalisms of various kinds. What it means is that “a people” has “a right” to exit a state along with its territory and create a new state.
A fourth consideration is that suppressing secession may require coercion. And finally there is the pragmatic idea that secession is the best way to dismantle the U.S. federal government, the summum malum for some libertarians. (As an addendum, one can mention the claim that the U.S. Constitution in particular tacitly approves secessionism, but that’s a separate argument from cheering for secession more generally.)
It should be obvious that the first and third claims negate one another, and in practice the third overrules the first: real-world secession never leads to individualist anarchism but only to the creation of two or more states where formerly there was one. The abstract claim that every minority within the newly formed states should then be allowed to secede doesn’t translate into anyone’s policy: instead, formerly united states that are now distinct security competitors tend to consider the residual minorities who belong to the other bloc to be internal security threats. These populations left behind by secessionism may or may not be disloyal, but they are readily used as pretexts for aggressive state actions: either for the stronger state to dismember or intimidate the weaker one in the name of protecting minorities or for either state to persecute minorities and build an internal security apparatus to suppress the (possibly imaginary) enemy within. Needless to say, none of this is particularly good for liberty.
The coercion point doesn’t stand without support from nationalistic or democratic claims. After all, “coercion” is a function of legitimacy—no libertarian thinks that using force on one’s own property against trespassers constitutes coercion. Yet radical individualists have no adequate theory of national self-determination. What gives the people in seceding territory X the right to shoot at people from integrated territory X+Y? “Coercion” is a question-begging argument: it says, on some unstated non-individualistic principle, that the South has the right to shoot at the Union but not vice versa.
Only the second argument for secession is not easily dismissed. It can be divided into two kinds of assertion: 1.) an abstract claim that smaller states are always better (freer, more prosperous, etc.) than larger states, and 2.) concrete historical claims that many in fact have been better.
I’ll offer a few summary remarks on this point. First, smaller states can indeed be freer and more prosperous, although there’s a hitch: circumstances in which this is true tend to be those in which small states are free riders on international security provided by large states. Hong Kong, Singapore, Monaco, San Marino, Belgium, and Switzerland are all cases in point. None of these states is capable of defending itself against large aggressors. Their security depends on great powers keeping the peace on a continental or oceanic scale. Hong Kong had first British, now Chinese protection—hardly an unmixed blessing, to be sure. Singapore had first British, now U.S. protection. Small states such as Monaco, San Marino, Belgium, and Switzerland have derived their security from a balance of power in Europe underwritten by Britain or the United States. None of them alone, or even in concert with one another, could prevail against a Revolutionary France or Nazi Germany (or indeed non-Nazi Germany).
A few radical libertarians seem to think that foreign conquest shouldn’t matter because it just trades one master, one state, for another. But of course, if that’s true, there’s no argument for secession since in practice it too merely trades one state for another. The question that has to be asked is a prudential one: is a particular state more or less free than the alternatives? There’s no abstract, dogmatic answer where secession is concerned.
To note that secession is not a “good idea” in principle is not to say there aren’t good examples of secessions in practice. Czechoslovakia peacefully separated. The decomposition of the Soviet Union was a positive development of world-historic proportions—though surprisingly large numbers of former Soviet citizens themselves disagree: Gallup found last December that “residents in seven out of 11 countries that were part of the union are more likely to believe its collapse harmed their countries than benefited them.”
The leaden lining to the silver cloud of Soviet secession comes in part from the security competition it has entailed: Russia and neighboring Soviet successor states have had difficult dealings with one another—including wars—and ethnic tensions are sometimes grave. Despite all that, the world is well rid of the Soviet Union. But even this sterling example of secession is not without its tarnish.
Closer to home, the case that most Americans think of when they hear the word “secession” runs entirely in the other direction from Soviet disintegration. Successful Southern secession would have entailed results even more illiberal than the outbreak of the Civil War, which is saying a lot. The Southern Confederacy would have maintained slavery and looked to extend it to new territory, including perhaps Western territory claimed by the United States. Instead of fighting a war with the powerful North, however, the Confederacy might have sought expansion to the south through Cuba and Latin America, as indeed some Confederates dreamed of doing. In the North, meanwhile, you would have had industrial “Hamiltonian” policies and a domestic political climate over time much closer to a European level of statism than has ever been possible with the South as part of the Union. A social-democratic North and a slave South, each ready for war with the other, and at least one looking to expand. What’s libertarian about this?
The case can be made that the threat of secession at least imposes a check on central government expansion—a Washington with the secessionist sword of Damocles hanging over its head would have to respect states’ rights. But this neglects reasons why the Union was created in the first place: notably, in the competitive world of empires and nation-states, bigger is more secure—not always, but often enough. Keeping the British Empire at bay—fortified as it was in Canada and for many years on the Mississippi and even in U.S. territory—was best achieved with a federation more tightly knit than that provided for by the Articles of Confederation.
But hadn’t America beaten the British once before under the Articles? Yes—with the help of another predatory superpower, France. A country that has the choice of providing its own security or living at the pleasure of others tends to go for growth, unless, like Japan and Germany in the last century, it gets beaten down. And to say that a territory is too large for self-government begs an important question—how can there be self-government at all if a state is not large enough to be secure?
American independence from Great Britain was in the first place driven by concerns for civil and ecclesiastical self-government: the colonists gambled their security and won. The continental United States proved to be a defensible territory without need of larger imperial union with Britain or permanent alliance with France. America’s neighbors north and south were weak and underpopulated. (Mexico’s population boomed only in the 20th century.) Further wars with the British Empire after 1815 were precluded by a balance of power: the growing military, demographic, and economic power differential between Canada and the booming U.S. meant that Canada, far from being a British imperial threat to the U.S., became a hostage held by the U.S. to insist upon British good behavior. The British could not defend Canada; America did not want war with the Royal Navy. That was the balance. A weaker or fragmented U.S. would have been in less of a position to keep it.
Elsewhere in the world, union and secession are questions of ethnonationalism, but for the English-speaking peoples security has always been the crux. It accounts in large part for why Scotland and England united to begin with. Scotland for most of its history was too weak and poor to resist English power. An independent Scotland was a Scotland subject to English predation. But England’s own security was jeopardized by its weak neighbor, which was at times a near-failed state—quite capable of launching raids across the border, if not much more—and at others, even when it posed no direct military nuisance, was a strategic threat as a potential base for French influence.
Great Britain as an island is readily defensible. So the English solved a security problem, and the Scottish conceded the unchangeable reality that their neighbor to the south was more powerful and prosperous. Scotland could start wars with England; it could not win them. Better for both, then, to have no more. Prosperity may be unequally divided, but Scots would be no worse off as a minority within a union dominated by England than they were as a weaker people outside its borders and legal order. British security was perfected by the Act of Union, and it led to a century of British global preeminence—quite an attainment for a country much smaller than France or Spain.
Today, of course, Britain’s security is guaranteed not by British arms, which proved inadequate by themselves in two World Wars, but by an American alliance. Scotland could indeed act like Switzerland or San Marino in this secure, American-backed European order. But such an “independent” Scotland today would only be choosing a hoped-for union with Europe over union with the rest of the UK. Would America’s enthusiasts for Scottish secession want their own state to secede from the U.S. merely to join Mexico or Canada? Size still matters, and Scotland depends for its security and prosperity on someone else—the U.S. and UK or the U.S. and Europe—in any scenario.
Self-government is only possible within the context of security, and individual liberty arises from the rule of law that self-government makes possible. None of these things is synonymous with the others, but all are intimately related. Union and secession have to be considered, even at the theoretical level, in this setting.
The world is relatively peaceful today not because peace among states is natural but because the power differential between the top and almost everyone else is so great as to dissuade competition. Indeed, the world order is so top-heavy that the U.S. can engage in wars of choice, which have proved disastrous for almost everyone. A world consisting of more states more evenly matched, however, would almost certainly not be more peaceful. Libertarians and antistatist conservatives, of all people, should appreciate that all states are aggressive and seek to expand, if they can—the more of them, the more they fight, until big ones crush the smaller.
For America, as historically for Britain, secession and union are questions of security and power, which undergird prosperity, self-government, and individual freedom. For much of the rest of the world, poisoned by ethnic and sectarian hatreds, secession means nationalism and civil strife. In both cases, breaking up existing states to create new ones is a revolutionary and dangerous act, one more apt to imperil liberty than advance it.
Ron Paul and others who make the case for secession do us all a service, however: these are serious matters that deserve to be taken seriously, not taken for granted. Secession is a thing to be discussed—and finally, as in Scotland, rejected.
Two libertarians familiar to TAC readers—Robert Murphy and Sheldon Richman—have lately offered critiques of my “Why Liberalism Means Empire” essay. Libertarians consider themselves liberals, or at least heirs to classical liberalism, and have been among the most outspoken opponents of “empire” in contemporary American politics. So on the face of it, they have good reason to object to the connection I draw between their ideology and global power they abjure. The trouble is, the objections don’t stand up, and liberalism remains deeply implicated in the security conditions of empire.
Murphy attacks my argument at its strongest point, the case of World War II. “Look at what the Soviets did to Eastern Europe after the Americans provided them with all sorts of aid and attacked Germany from the west,” he writes. “If we dislike that outcome—and McCarthy and I do both dislike it—then to have achieved a more balanced outcome, surely the US should not have jumped in on the side that ended up winning.”
Soviet domination of half of Europe after the war is certainly an undesirable outcome. Can we imagine a worse one—or three? Easily.
Without the U.S., the outcomes available to Europe as a whole in World War II, West as well as East, were a.) Nazi control, b.) Soviet control, or c.) divided Nazi-Soviet control. Would any liberal prefer one of these outcomes to what occurred with U.S. intervention, namely d.) divided U.S./Western-Soviet control?
U.S. intervention certainly did strengthen the USSR. But strength in international affairs is a relative thing. A weaker USSR still strong enough to prevail against Nazi Germany without American help would have been more than strong enough to subdue war-torn Western as well as Eastern Europe. Stalin had fifth columns at the ready among resistance forces in France and elsewhere. As it happened, the USSR was in no position to claim France, Italy, West Germany, and Greece after World War II because the U.S. and the allies America rallied stood in the way. Without that check, little would have prevented Stalin from doing to Western Europe what he did to the East.
Alternatively, had a weaker USSR fallen to Hitler, the Nazis would have had an even easier time consolidating control of the Continent. What force could resist them? Franco and Salazar were not about to do so, whatever their differences with Hitler. Finally, had a USSR not supported by the United States fought the Germans to a standstill, the result might have been the worst of all worlds, with all of Europe unfree and the internal resistance to each totalitarian bloc being drawn toward the ideology of the other totalitarian bloc. A Cold War between the USSR and Nazi Germany would by any measure be worse than the one we had between the USSR and United States.
What of the hope that a draw might have brought anti-totalitarian revolution to Nazi Germany and the USSR? Murphy, Richman, and I all agree that war is bad for liberty and liberalism—but for that very reason, wars fought between totalitarian powers are unlikely to have a liberalizing effect on them. Quite the contrary: wars and crises tend to create conditions that allow totalitarianism to arise in the first place, and war is one environment in which the totalitarian ethos seems to thrive. The idea that the USSR and Nazi Germany, fighting one another, would each have collapsed, giving way to some tolerably liberal government, could only be believed by someone whose ideology insists that he believe it. It’s Lysenkoism applied to history.
Richman has better arguments, but they tend to be tangential to my points. He writes:
McCarthy has a rather liberal notion of liberalism—so liberal that it includes the illiberal corporate state, or what Albert Jay Nock called the “merchant-state,” that is, a powerful political-legal regime aimed first and foremost at fostering an economic system on behalf of masters, to use Adam Smith’s term. (The libertarian Thomas Hodgskin, not Marx, was the first to disparage “capitalists” for their use of the state to gain exploitative privileges.)
What Richman calls a loose, “rather liberal notion of liberalism” is intended to be a broadly accurate description of real-world, actually existing liberalism. A few years ago I commissioned Richman to write an essay on “free-market anti-capitalism,” a term that might sound like a contradiction to people who equate capitalism with free markets. What we have here is a parallel case: just as capitalism in practice is often antithetical to market freedom as understood in theory, so liberalism in practice—a state system involving capitalism, free trade, representative government, legal individualism, religious liberty, etc.—often falls short of the tenets of liberal theory. The problem is, practice comes first.
My essay claims that the security provided by the British Empire and later U.S. hegemony—or American Empire, if we want to be indelicate—has promoted liberal practice, and liberal practice, messy and imperfect though it might be, has promoted liberal theory. The claim here is not deterministic at the individual level: it’s plainly not the case that no one can come up with liberal ideas amid an illiberal environment. Rather, a liberal environment is more conducive than an illiberal one to the extension and refinement of liberal thought among a populace.
This is why the largest concentration of classical liberals in 19th-century politics and the greatest volume of classical-liberal literature were to be found in Britain, and it’s why libertarianism today finds the most followers and is most strongly institutionalized—in think tanks, magazines, and a nascent political movement—in the United States. Liberalism is a luxury security affords, and hegemons have the security in the greatest abundance.
Security by itself is not enough, of course: a state that enjoyed tremendous international security, as Japan did for centuries, might or might not spontaneously develop broadly liberal ideas. Given the presence of liberal seeds, however, security seems to encourage their growth—this was true even in the Soviet-dominated Eastern Bloc during the Cold War and in the USSR itself.
The extended Soviet Empire was distinctly illiberal in ideology but enjoyed supreme security: there was never much prospect that NATO would simply invade Eastern Europe. (Just as NATO deterred the Soviets themselves from doing any invading of the West.) What liberal ideas survived Soviet repression or otherwise made their way through black-market channels into Soviet-controlled domains often met with a welcoming audience, and over decades, under conditions of peace, those liberal ideas grew stronger while the totalitarian ideology of the USSR grew weaker, including in Russia itself. Ironically, the Soviet Union’s greatest success—its conquest of Eastern Europe and guarantee of Russia’s security—contributed to its undoing. It created conditions in which liberalism could grow.
(And note, alas, what is happening in Eastern Europe and on Russia’s periphery now that security competition has returned: nationalism and even fascism are gaining ground.)
Anyone as an individual may be able to hold almost any idea at any time. But of the many ideas to which our minds give rise, only a few find substantial political expression. “McCarthy is wrong in thinking that power, that is, force, rules the world,” Richman writes. “There is something stronger: ideas.” In fact, only some ideas rule the world—the ones that succeed in acquiring power.
But power is a protean thing; it doesn’t just mean state power but any kind of hold over human beings. This highlights one of the paradoxes of liberalism: the ideology gains more power in terms of popular appeal at the expense of the states that make it possible. This is a good thing to the extent that liberal attitudes check abusive government. It’s a bad thing to the extent that liberal attitudes deprive states and populations alike of the wherewithal to combat external threats when they do arise. Pacifism, as a cousin or acute manifestation of liberalism, is a case in point. It’s one of the ideological luxuries made possible by security, but if it were adopted generally, there would soon be no security left, and pacifism would remain a choice for no one but martyrs.
What’s more, radical liberals may call for complete nonintervention, but most self-identified liberals, including a contingent of libertarians, favor humanitarian warfare and aggressive efforts to “liberalize” countries that are insufficiently liberal and democratic. This is another irony of liberalism: it was fostered by non-ideological empires—Britain obtained hers in a fit of absence of mind; America acquired hers with tremendous reluctance and a troubled conscience. But once non-ideological empire has promoted the growth of liberal ideology, that ideology takes on a more radical, demanding character: a liberal minority adopts the anarcho-pacifist position, calling for dismantling the empire today, while a larger number of liberals call for using the empire to promote liberal ideological ends. Reining in empire thus requires reining in the demands of liberalism—realism as an antidote to ideology.
Murphy and Richman both point to the ways in which war and empire have made the United States less liberal in practice. War’s illiberal effects are indeed a major part of my argument: war is the opposite of security, and conditions of war—i.e., the absence of security—are dreadful for liberty. The question is what minimizes conditions of war and maximizes conditions of security.
That’s not a question that can be answered in the abstract; it’s one that must be answered in the context of particular times. In the case of 19th-century Europe, a balance of power safeguarded by the British Empire as an “offshore balancer” seems to have done the trick. In the case of 20th-century Europe, a 45-year balance between the United States and a contained USSR kept the peace from the fall of Nazi Germany until the collapse of the Soviet Union. One thing I hope my essay will do is prompt libertarians to think more seriously about historical security conditions and what viable “libertarian” options there may have been in the foreign-policy crises of the past. If there were no viable libertarian options, that’s a problem for libertarianism.
It’s a practical problem being confronted by Rand Paul right now. What liberal or libertarian thinkers can he draw upon for practical foreign-policy advice? There are a few, but most radical libertarians are simply not interested in real-world foreign-policy choices. And once libertarians do engage with reality, they start to seem a lot less libertarian.
Richman compares the hazards of foreign policy to those of domestic economic planning. In the case of the economy, the libertarian alternative is the free market; no planning. In the case of foreign policy, is the libertarian alternative also no policy? How can a state in a world of states—all of which, as libertarians know, have a coercive character—have no foreign policy? It’s true that the less power foreign-policy planners have the less trouble they can get up to. This is something on which libertarians and realists who favor restraint can agree. But realists recognize that this tendency for too much power to lead to abuse must be weighed against the dangers of other states’ power. Libertarians seem to see no danger in that direction at all.
Any people that has ever been invaded might find that perverse—indeed, my libertarian friends are often confounded by how their fellow libertarians in Poland or Ukraine can be so hawkish. But the U.S. is in an exceptionally strong geostrategic position. Invasion is highly impractical, if not impossible. My essay, however, notes that world conditions can have a dangerous influence on the U.S. even without foreign boots on our soil. On the one hand, foreign ideologies exert a certain attraction on Americans; and on the other hand, Americans have historically been rather paranoid about foreign ideological influence. Threats both real and imagined attend insecurity, and both kinds lead to illiberal policies.
Luckily, there are at present only a handful of geostrategic positions around the planet that offer secure bases for power projection and ideological dominance. North America is one of them. The second is the European continent. And the third is East Asia, which of the three is by far the least island-like and defensible.
Preventing a hostile power from dominating Europe and keeping a balance in East Asia is “empire” enough. Beyond that, prosperity and industrial strength, along with our nuclear arsenal, are the keys to our security. This is a historically realistic vision, one that solves the great problems of the past—what to do about Nazi Germany or the USSR—and the otherwise insoluble problems of the present, such as what to do about the Middle East: namely, minimize our exposure to crises that we cannot fix and that do not affect the top-tier distribution of power. Today what is most ethical and what is politically and strategically realistic coincide reasonably well: we should not seek to enlarge our commitments; we should preserve our naval power; we should use diplomacy and economics to advance our interests and contain disruptive powers.
This is not a strategy of hard-heartedness toward the oppressed peoples of the world. A secure and prosperous U.S. is in a position to be an ideological counterweight to any illiberal state or insurgency, and it can act when necessary only because it does not act when not necessary. Morale is as limited as men, money, and materiel, and wasting any of these—on a strategic level, we wasted them all in Iraq, as the present crisis demonstrates—is bad for our prosperity, our security, and everyone else’s as well.
Realism and restraint are the watchwords. If libertarians have a stronger strategic argument, I’m eager to hear it.
You don’t win a war unless you win the peace. This ought to be clear enough—the situation in Iraq illustrates it perfectly. The U.S. won the war, both in terms of deposing Saddam Hussein and in withdrawing troops from a country nominally at peace in 2011. But that peace was lost from the start: the U.S. had not created institutions that could keep order and prevent the next war from happening.
It’s an old story, and it applies at home as well as abroad. Last year the antiwar movement thought it had won its own metaphorical war—by preventing Obama from launching a real one against Assad’s Syria—yet the subsequent peace was lost. No institution had arisen that could prevent another round of mass-media sensationalism from taking America to war again. No noninterventionist think tank or peace organization commanded the attention or imagination of policymakers, whose minds remained filled only with possibilities provided by interventionists. Neoconservatives and humanitarian interventionists are still almost the only players at the table in Washington, despite their decades of failure.
If peace is to be a reality rather than a dream, its institutions must be built—not in the Middle East but here at home, above all in the precincts where foreign policy is actually made. Policymakers must have more on their menu than the dishes prepared for them by Bill Kristol and Samantha Power.
The American Conservative, in a small but steady way, has been trying to win the peace since 2002, when we were founded to oppose the impending war against Iraq. A decade ago the situation seemed hopeless, even if the fight was as noble as ever. In the 2004 presidential contest, Democrats rebuffed even the mildly antiwar Howard Dean in favor of a hawkish John Kerry, who went down to defeat in November at the hands of the president who had launched the Iraq War.
Movement conservatives closed ranks to silence critics even before the war began. On its eve, National Review published an attack on the most outspoken figures of the antiwar right. And while its author, David Frum, claimed “Unpatriotic Conservatives” was not aimed against all antiwar conservatives, the only one he explicitly exempted from his charges, Heather Mac Donald, kept her dissent almost entirely to herself. As Alexandra Wolfe noted in the New York Observer on March 10, 2003:
Heather Mac Donald, a fellow at the right-wing think tank, the Manhattan Institute, is that somewhat rare breed, an anti-war Republican. Among the “pro-war fanatics” she dines with regularly, she said, “you’re confident in your opinion, but why bother when it’s a futile gesture anyway?” So she mostly keeps quiet.
“I have a friend who works at The Wall Street Journal on the editorial side,” said Ms. Mac Donald, “and he’s anti-war and he won’t even mention it, because there the unanimity is so strong.”
Such unanimity prevailed across the conservative movement, and there was a price to pay for breaching it. Donald Devine elicited outrage when he merely failed to stand and applaud for Bush at an American Conservative Union banquet. For actually criticizing the Republican president and his policies, foreign-policy scholar John Hulsman was fired from the Heritage Foundation. And economist Bruce Bartlett happened to lose his job at the National Center for Policy Analysis just as his book Impostor: How George W. Bush Bankrupted America and Betrayed the Reagan Legacy was about to be published. No wonder the administration’s right-leaning critics kept their mouths shut.
Before 2006, The American Conservative stood alone in D.C. as a conservative institution that refused to surrender its principles to the Bush administration and its wars. Traditional conservatives who had minds of their own that they refused to yield up to any party or president came close to being deprived of a presence in the nation’s capital and conversation. But we wouldn’t go away, no matter the financial hardships we faced.
And then a remarkable thing happened: the public lost its patience. It threw the president’s party out of the House Speaker’s chair and the Senate Majority Leader’s office. Conservatives long disgusted not only by Bush’s policies but by the right-wing omerta that protected him began to speak out loudly and forcefully, even if it cost them their standing in the movement.
Ron Paul soon rallied a new antiwar base on the right. His son was elected to the U.S. Senate in 2010. And today in the House of Representatives there is a liberty caucus with realist and noninterventionist leanings—Justin Amash, Thomas Massie, Mark Sanford, the stalwart Walter Jones. They reinforce Jimmy Duncan of Tennessee, the last of the Republicans who voted against the Iraq War over a decade ago.
What once looked like token resistance from a small institution like TAC helped keep alive a debate and a point of view that otherwise lacked expression. And once the political climate changed, that point of view became a force that could rapidly grow and gain further institutional footholds.
Consider the careers of some of the young people drawn to work at The American Conservative who have gone on to bring something of its sensibility to other outlets. Michael Brendan Dougherty is now at The Week. Jordan Bloom and James Antle are at the Daily Caller and Daily Caller News Foundation. An ocean away, former TAC literary editor Freddy Gray is managing editor of The Spectator in London. Former interns John Glaser and Matthew Feeney are at Cato. Lewis McCrary, now a Robert Novak Journalism Fellow of the Fund for American Studies, was before that managing editor of The National Interest.
The American Conservative has been happy to be an institutional force multiplier—a farm team for a new generation of conservative talent, and during the dark days of the last war fever both a shelter and a megaphone for traditional conservatives unwelcome in the pages of movement magazines and websites. What’s more, The American Conservative will be here even when a future Republican president demands “unanimity” for his wars and other follies. Keeping TAC alive and growing is vital institution-building work, and not just for TAC itself.
To build peace, you have to build institutions. Those institutions will have to be complex, durable, far-sighted, flexible—not ideological, brittle, and simplistic. They will have to confront the realities on the ground, hard political, economic, and strategic realities.
Based on the evidence of more than a decade, this is beyond the abilities of the United States in the Islamic world. So we might try building institutions of peace and civil balance here at home. Institutions that favor restraint. Institutions that encourage conservatism in its most basic sense, a defense of what we have and are in danger of losing: self-government, a strong middle class, national security not existentially jeopardized by even the bloodiest terrorists, and liberties hard won over many generations.
The American Conservative is intended as just such an institution—a small but indispensable one. The scope of debate and dialogue in our pages, online and in print, is part of our mission. This is not a conversation for traditionalist conservatives alone or for libertarians alone or even for the right alone. Our core sensibility is a capacious one, grounded in realism and a Burkean constitutional temperament.
The side of peace can’t repeat the errors of the zealots of war: a with-us-or-against-us mentality, oversimplified analyses, and an ideology to be forced upon the world without regard to the realities of human life and security.
Whatever you donate to The American Conservative helps not only TAC itself but a discourse that was almost silenced ten years ago. No single institution can win the peace alone, but The American Conservative has since 2002 been a foundation stone upon which much more can be built. Help us continue and grow—and win.
You do—you’re supporting it just by reading The American Conservative. Support it some more by making a donation, so a journal you enjoy lives and thrives.
Traditional conservatives have an obvious interest in seeing their ideas presented in the most articulate, passionate, and realistic fashion possible. Bold men and women of the left, meanwhile, enjoy a thoughtful challenge from the right that keeps them on their toes. And independent minds who identify with neither side appreciate principled pluralism in our national discourse, a pluralism made possible by reasonable voices that don’t scream from a box or reduce difficult questions to partisan cliches.
This is why you read The American Conservative. It sharpens, it clarifies, it informs and provokes—as a good magazine should, online or in print. (And in our case, both.) TAC heartens and encourages and invigorates, sometimes by ticking you off. That’s also what a good magazine does.
We are reader-supported, and most of our readers don’t have deep pockets. But there are a lot of you, and you’re a committed and generous group. If you can give, I think you will.
The military-industrial complex has big money—you can see it splashed on subway posters in the D.C. metro system and throughout the pages of all sorts of public-policy magazines. Corporate America has big money—and a devotion to free enterprise that extends only as far as crony capitalism’s bottom line. We’re opposed to war and Pentagon pork; we’re also opposed to an economy that’s increasingly hostile to the middle class, and indeed almost all Americans. So the big advertising bucks will not come our way any time soon.
We don’t need them; we have readers. Their support—your support—keeps the lights on and keeps The American Conservative’s mindshare growing.
You aren’t alone in this struggle: small foundations, especially, are there to help, groups that support such unfashionable but deeply human and conservative causes as beautiful architecture and livable cities. TAC is very much about small voices and small institutions fighting back against the monotonous reverberations of the big media and partisan outlets of big business.
You know why you read The American Conservative. Help us continue bringing it to you—and to wider audiences—by contributing to our fall fundraising campaign. I thank you for it.
Barack Obama has adopted Bill Clinton’s policy toward Iraq: bomb it until it gets better. Clinton—and before him, George H.W. Bush—bombed Saddam Hussein’s Iraq to safeguard the Kurdish north, degrade Saddam’s military capabilities, and perhaps weaken his regime to the point of collapse. Twenty years later, Obama is bombing the Islamic State in Iraq and al-Sham to “degrade and ultimately destroy” ISIS, while protecting the Kurdish north and what remains of the Iraqi state until recently ruled by Nouri al-Maliki.
We are well into the third decade of U.S. military operations against Iraq—dating back to 1991—but a free, stable, non-sectarian state has yet to emerge. Maybe a few more bombing sorties will do the trick.
George W. Bush got one thing right: he recognized that what his father and Bill Clinton had been doing in Iraq wasn’t working. Rather than continue indefinitely with airstrikes and sanctions that would never tame or remove Saddam, Bush II simply invaded the country and set up a new government. In the abstract, that was a solution: the problem was the regime, so change it. But change it into what?
Iraq is a patchwork of tribes and blocs of Sunni and Shi’ite Muslims. A dictator like Saddam could keep order, but since the end of the Cold War America has found dictators distasteful, so Iraq would have to be democratic—which means, no matter how intricate the electoral system might be, one faction would dominate the others. So Iraq plunged into years of sectarian violence, and when it was over, the Shi’ite Maliki was in charge. Sunnis were never entirely happy about this, and they only became less so over time. Once ISIS surged across the border from neighboring Syria—experiencing its own (almost) post-dictatorial disintegration—many Sunnis welcomed them, and the bloodshed resumed.
Obama doesn’t want to answer the question that Bush I and Bill Clinton also avoided: namely, what kind of government could Iraq possibly have after Saddam that would satisfy the United States? Bush II had an answer, and it proved to be the wrong one. Obama knows it’s a trick question. There is no realistic outcome in Iraq that will not involve violence and repression. Either the country must have another dictator (unacceptable), or it must be dominated by one sect or the other (in which case it risks becoming the Islamic State or another Iran—also unacceptable), or else we have to pretend that it’s about to turn into an Arab Switzerland (entirely acceptable, and also impossible).
ISIS is an exceptionally violent revolutionary group, but it’s only a symptom of the more fundamental disease: the lack of a government strong enough to keep order but not so sectarian or tyrannical in temper as to persecute anyone. If Obama is successful against ISIS—as George W. Bush was successful against Saddam—how long before a new evil congeals? ISIS itself is a successor to another terrorist group, al-Qaeda in Mesopotamia (AQI), that was beaten once before in Iraq. The rise of yet another terrorist or tyrannical force is a virtual given, a fact established by more than one cycle of history: as surely as AQI followed Saddam and ISIS followed AQI, something else waits to follow ISIS.
Unable to break this cycle, Obama resorts to bombing because our pundits demand that he “do something.” Leaving Iraq to its own devices, to suffer, burn, and ultimately rebuild, is too cruel, and ISIS with its spectacular propaganda videos makes a great cable news bite and social-media campaign. It’s evil, it’s scary, it’s on YouTube, so what are we going to do about it? Obama would be weak and callous if he did nothing. That he can’t actually do much that matters in the long run is unimportant—our humanitarian urges and Islamophobic fears will be satisfied as long as we get some kind of action right now. So we bomb.
There’s no political risk in bombing, as there is in putting “boots on the ground.” There won’t be too many body bags shipped home to Dover AFB to trouble voters. What’s more, bombing can be of any intensity political conditions demand: if John McCain is howling louder than usual on “Meet the Press,” just drop a few more bombs. That shows you’re a real leader.
This may sound grotesque—not the reality of what Obama is doing and the politics that lead him to do it, but my saying it out loud, when there are real human beings in Syria and Iraq for whom none of this is abstract. ISIS is a deadlier threat to their lives than American bombing is, and real men and women can make choices about violence and politics that needn’t fulfill anyone’s grim projections. There may be no ideal “moderate resistance” to Assad or ISIS itself, but there are many degrees of better and worse, and they are matters of life and death to the people of the region.
All of which is true, and opponents of our 23-year policy toward Iraq, such as myself, should not be complacent about far-away people’s lives. If this is something that war critics must keep in mind, however, supporters must be equally serious about political realities—not immutable realities, but probabilities so strong as to require that our hopes and ambitions take account of them. Peace and tolerance depend on order, and under these circumstances order depends on a strong state. It would be foolish for Obama or anyone else to name in advance what kind of state, under whose control, will emerge victorious, but whenever the Iraqis and Syrians themselves give rise to a leader or faction capable of maintaining order, America must be prepared to accept the result and demand only the most basic concessions to our own values and security.
During the Cold War, it was often enough that a state or faction be anti-Communist for it to receive American approbation. Dictators and sectarians as well as democrats passed the test, sometimes to our regret. After 1989, with our own security unassailable, we raised our expectations of others: we could afford to moralize and cajole. This proved to be disastrous in many cases, as botched attempts at democratization and economic liberalization urged on by the U.S. led to unstable regimes and countervailing extremism. If the U.S. no longer wishes to apply as crude a test for regime acceptability as it did during the Cold War, it must nonetheless devise criteria more realistic than those that prevailed over the past 20 years.
Obama’s bombs and other measures may or may not lead to regime change in the territory now controlled by ISIS. He can’t control that, and he cannot even do much, given the way our media and politics work. But what he can do is begin the long process of clarifying America’s understanding of how much like or unlike us we really expect other regimes to be. If the U.S. can arrive at a non-utopian answer to that question, we can perhaps again have a strategy that matches means to ends—rather than one that falls back on air power as the ever-present means to impossible ends.
Of the many foolish things said about the Hobby Lobby case, a contender for most foolish is the “Buy your own contraception!” snark on the right that runs parallel to the left-wing exaggerations about bosses dictating contraception choices to women. What the snark disguises is that the principle at stake is the same for both sides: if you’re compelled to purchase a service, you—whether “you” are a business or an individual—want to have some say in what it is you’re buying. Since it’s no longer a market transaction when government insists that it must take place, the question of just what is being bought has to become a political and legal question. Women who say that they should get a basic service when they are forced to buy an insurance plan are in exactly the same position as a company that says it has religious objections to certain kinds of services. Neither claim is risible; both arise from the straightforward notion that you should get what you want, and not get what you don’t want, when you have to buy something.
If sex and religion are too polarizing to offer a clear illustration, consider whether “buy your own!” would make sense in a different context. If Washington commanded that everyone must purchase a car through his or her employer, but employers didn’t offer, say, headlights with the vehicle, would having to pay extra out of pocket seem reasonable, or would most people feel ripped off? Conversely, if employers were told that they had to offer, say, gold-plated hubcaps with all cars, wouldn’t they be entitled to object?
With auto parts, it might be possible to come up with a public consensus on what features were reasonable. That’s simply not possible, and not even desirable, where the questions at the heart of the Hobby Lobby case are concerned. Despite the best efforts of ideologues, sex and religion are still personal rather than political matters for most people. Americans do not want their relationships or beliefs supervised by the federal government, or by any government—not by a bureaucracy, not by a court, and not by the democratic process. Why should anyone get to vote on your faith or whether your insurance covers contraception? But the problem with the HHS mandate, and with Obamacare itself, is that it makes these very personal matters unavoidably public as well: matters for bureaucrats, judges, lawyers, politicians, Rush Limbaugh, and the people you do business with.
Apologists for the mandate can say that there’s already a public dimension to these things, which is true. There’s no wall of separation between the people as a political actor and citizens’ personal feelings about sex and religion, and there’s obviously not supposed to be a wall of separation between the people and their government. But there’s a difference between the indirect, tiered influence that private persons exercise on the public and the public exercises on the state—and vice versa—and the kind of simplification that ideologues wish to see, in which individual, community, and state are all harmonized according to a single, unchanging set of values. The trouble with ideologues left or right isn’t just what they want, it’s how oblivious they are to their own excesses: they can’t imagine that anyone could have a reason not to want to subsidize someone else’s contraception or that any woman might feel cheated and demeaned by a company failing to provide insurance that covers birth control. You don’t actually have to agree with the metaphysical apparatus behind either side to see that something conscientious and intimate is being traduced by closing the gap between government, business, and private life.
Public policy is going to involve a clash of values one way or another, and even when one side “wins,” the political fighting doesn’t stop—the stakes are much lower, in practical terms, than ideologues can afford to admit. Popular governments aren’t meant to attain a steady equilibrium; opinions are always in motion, and thus so is politics. In pointing out the overreach that characterizes the simplifiers on both sides, the objective isn’t to arrive at a uniformly agreeable middle policy—some ideal formula for what’s personal and what’s political—but to maintain a certain space, however compromised, for life and feeling at a distance from politics. No house is completely private and invisible to the outside world, but that doesn’t mean we should let ideologues tell us that our walls might as well be transparent. Religion and sex ought to remain more personal than political, imperfect though the separation may be, and policies that more thoroughly mix these things are simply bad.
If you’re in the D.C. area, drop by the Cato Institute at noon Friday for a panel discussion of Ralph Nader’s new book Unstoppable: The Emerging Left-Right Alliance to Dismantle the Corporate State. Nader, AEI’s Tim Carney, and I will be taking part, with the Kauffman Foundation’s Brink Lindsey moderating.
The American Conservative excerpted the book—specifically, the chapter on the forgotten distributist conservatives of the 1930s—in our May-June issue. There’s much else in it that conservatives and libertarians will find fascinating, as Nader explores what figures as disparate as Frank Meyer, Peter Viereck, and Murray Rothbard have had to say about the conjunction and centralization of economic and political power—and what the alternatives might be.
(While I’m touting books, let me also mention that the distributist/agrarian classic Who Owns America?, which Nader discusses in the TAC excerpt, is on sale now from ISI.)
A few weeks ago, Cato’s Christopher Preble and I attended a conference Nader organized, and the two of us participated on a panel to discuss left-right approaches to trimming the defense budget. Here’s the video:
And here are all the panels from that event, including remarks by Grover Norquist.
In late 2008 I put myself through a crash course in the works of Willmoore Kendall, the “wild Yale don,” as Dwight Macdonald called him, who had been one of the founding senior editors of National Review. This was research for an essay that would appear in The Dilemmas of American Conservatism. I’d read some Kendall before—a desultory stroll through The Conservative Affirmation in America, at least—and hadn’t profited much from the experience. But the second, more attentive perusal was different. Kendall himself had told of how R.G. Collingwood had taught him at Oxford to read a book by asking what question the author was trying to answer. I didn’t find that approach too insightful, but I picked up something else from Kendall’s own methods—the habit of asking “What conditions would have to be true in order for this author’s arguments to make sense?”
That’s a more productive thing to ask of a serious work than simply, “Do this author’s arguments make sense?” The latter invites the reader to supply a misleading context: the author’s arguments may not match up with reality, but they must match up at least with his own view of reality, and that’s something worth figuring out and contrasting against whatever the reader thinks he already knows.
Stated so plainly this isn’t likely to strike anyone else as particularly insightful, just as Kendall’s report of how Collingwood reshaped his thinking didn’t do much for me. But that’s a lesson, too: it’s the act of thinking along with a text or teacher, and the new context created by that act, which makes a dead question come alive.
I thought of this when I recently came across Peter Witonski’s 1970 NR review of The Basic Symbols of the American Political Tradition, a book that began as a series of Kendall lectures and was finished after his death by George Carey. The review doesn’t do justice to the book—it elicited a sharp letter from Carey, who thought Witonski hadn’t even read what he purported to be reviewing—but Witonski does capture the effect Kendall can have, even decades after his death, perfectly:
What Kendall is all about is thinking—thinking about theoretical problems in politics. The device is that of the master professor, the man who by definition professes because he is wise, and is wise because he professes. The failure to convince, the difficult prose, are the essence of this device. In not convincing, Kendall makes you think the problem over again and again. I recognized this for the first time several years ago, when I met Kendall, for the first and last time, in a suburb of Paris, and spent many hours arguing with him.
The man, like the writer, was convincing and unconvincing. That night he spent a good deal of time propounding the general idea behind a book he had been engaged in writing, dealing with the American tradition. His argument was, of course, brilliant. But when I left him I was as unconvinced as ever. As I walked away from his flat I found myself thinking about what he had said. Suddenly I realized that I was thinking about such things as the Federalist Papers and the Declaration of Independence with a new freshness and vigor. I was rethinking them. I still did not agree with Kendall, but in his own perverse way he had taught me a great deal in a short period of time about subjects in which I had long since considered myself to be expert. Kendall was a master teacher.
Kendall and Collingwood are by no means alone in this heuristic impact. But it’s a rare thing: there are many memorable books and teachers that impart facts or insights; there aren’t so many who change the way an interlocutor reads.
On air, Liz Wahl quits Russia’s English-language propaganda network.
She’s been getting a bit of snark from Twitter over her belated realization that maybe RT is a less than rigorously objective news source. Yet I’m more exasperated by RT’s viewers than by hosts who are, after all, only making a living, however dubiously, by reading from the Kremlin’s script.* In particular, how can certain libertarians or government-skeptical leftists think that as long as the spin is coming from a government other than America’s it must actually be the truth?
Unfortunately, the answer is all too plain: if you think that the U.S. federal government is the source of all evil in your life, your country, and the world, then it stands to reason—almost—that whatever contradicts Washington is on the side of truth. Moscow and Beijing therefore become beacons of light. The ideologues who fall prey to this don’t necessarily hate America—there’s a distinction between the country and its government, after all—and they don’t think of themselves as pro-authoritarian or, in the case of the Middle East, pro-dictator. But they do think, ultimately, that foreign authoritarians and dictators are really more liberal than the liberal-but-really-authoritarian United States. It’s a sour love affair: the U.S. fails to live up to liberal ideals, or even to come close, so regimes that have no intention of abiding by them must be no worse, or indeed a great deal better.