State of the Union

Communism for Kids

The fence and guard tower at the Soviet forced labor camp Perm-36. Gerald Praschl / Wikimedia

Day by day, it becomes harder to tell parody from reality. If I told you that a major press was publishing a book called Communism for Kids, would you believe me? After due investigation, I can confirm that the title does exist, and that it is deadly serious. Even more remarkable, it comes from a very respectable publisher, namely MIT Press—yes, that MIT. (I cannot presently confirm suggestions of such possible future MIT titles as Sure, Johnny, You Should Take Candy From the Guy in the Van.) While we might like to attribute this project to temporary insanity, it does reflect some larger and really troubling currents in U.S. political discourse.

Communism for Kids is the work of Bini Adamczak, “a Berlin-based social theorist and artist” heavily involved in “queer theory.” When it originally appeared in German, the book was titled Kommunismus: Kleine Geschichte, wie Endlich Alles Anders Wird—roughly, “Communism: A Little Story, How Finally Everything Will Be Different”—without the explicit provocation of being aimed at children. In fact, the book is a simplified, user-friendly account of Marxist theory, illustrated with cartoons. At its heart are a series of case studies in pseudo-fairy tale language, where people explore various economic arrangements before settling on utopian communism.

Somewhere along the line, MIT Press decided to market it “for kids,” inspiring some confusion in the process. Amazon lists it as a children’s book intended for grades 3–7, although also suggesting a much more realistic age range of “18 and up.” Conceivably, the press deliberately chose the new title as a marketing gimmick in order to drive controversy and thereby increase sales. Alternatively, on the basis of their experience in Cambridge, Mass., they decided that there actually were enough play groups that would be delighted to work through Adamczak’s scenarios.

Either way, the book is targeted at “youngsters” broadly defined, and it has attracted some amazingly laudatory blurbs. Celebrity academic theorist Fredric R. Jameson remarks that this “delightful little book may be helpful in showing youngsters there are other forms of life and living than the one we currently ‘enjoy.’” Oh, the lowering severity that the professor bestows on us when we dare “enjoy” anything in our present monstrous dystopia! Novelist Rachel Kushner thinks that Adamczak’s is precisely the book we need at a time when global capitalism has brought us “more inequality than has ever been experienced by humans on earth” (which is a precise inversion of actual historical reality).

Assume for the sake of argument that Communism for Kids is not in fact designed to propagandize third graders, but is rather intended for teens and young adults. Is that not enough of a scandal in its own right? Somewhere in the book, might it not be acknowledged that communism is the bloodiest ideological system in human history? Solely measured by the number of his victims, Mao Zedong alone leaves Hitler in the dust. Could the book not mention such monuments to communist utopia as Kolyma and Vorkuta, among the largest and cruelest concentration camps that have ever existed?

Should it not be said that a solid scholarly consensus now accepts that this record of violence and bloodshed was a logical and inevitable consequence of the communist model itself, rather than a tragic betrayal or deformation? Evil Joseph Stalin did not distort the achievements and goals of Noble Vladimir Lenin: rather, he fulfilled them precisely. Pursuing the “for kids” framework, should we not see some equally cheery volumes such as A Day at the Gulag, and even (for middle schoolers) Natasha Is Shot as a Class Enemy? How about Springtime for Stalin?

As an intellectual exercise, just imagine that a major U.S. press offered a youth-oriented book on some other comparably bloody or violent system, such as National Socialism or white supremacy. The book might contain vignettes showing how young people at first learned to accept racial mixing and miscegenation. Eventually, though, they would realize the deeper underlying evils of Semitic influence. Or to paraphrase the advertising copy for Communism for Kids, such a little book would discuss a different kind of National Socialism, “one that is true to its ideals and free from authoritarianism.” Not of course that the publisher would actually be advocating such a thing, God forbid; it would rather be encouraging debate about the options available to contemporary youth. What reasonable person would object to such free discussion?

Even to describe such a project betrays its intrinsic lunacy. The book would not be treated seriously at any stage; it would not be accepted, and if it were, the press’s personnel would resign en masse rather than be involved in its actual publication. The press itself would likely not survive the debacle, nor should it.

How, then, is that hypothetical instance different from Communism for Kids? Put simply, a great many educated Americans believe that totalitarianism of the left is utterly different from that of the right, and that communism should not be placed in the same toxic category as Nazism or fascism. According to this delusion, Americans who turned to communism through the decades were stubborn idealists, in stark contrast to those monsters who succumbed to racist or fascist theories. For all its possible failings, communism was not evil of itself. To misquote Chesterton, communism was not an ideal that was tried and found wanting, but rather was found difficult and therefore left untried. One day, though, when the stars align, we will do it right!

Such a benevolent view of communism is appallingly false and betrays a near total ignorance of the history of the past century. When we treat communism with tolerance or levity, we are scoffing at literally tens of millions of murdered victims. This is a disgusting moral idiocy, for which we must blame our educational institutions, and our mass media.

For a publisher like MIT Press to reinforce that view, to trivialize the communist historical record, is unpardonable. Have they no decency?


Scandal-Free Obama

Beyond weakening the administration, the seemingly incessant wave of Trump scandals reinforces liberals’ narrative of the previous president. As The New Republic remarked after the resignation of Michael Flynn, “Obama went eight years without a major White House scandal. Trump lasted three weeks.” Or as Obama himself boasted in December, “we’re probably the first administration in modern history that hasn’t had a major scandal in the White House.” To the horror of conservatives, who can cite a litany of official misdeeds during the Obama years, the apparent integrity of that era will feature prominently as historians evaluate that presidency. (Spoiler: as those historians are overwhelmingly liberal, they will rate it very highly indeed.)

In a sense, though, both sides are correct. The Obama administration did a great many bad things, but it suffered very few scandals. That paradox raises critical issues about how we report and record political events and how we define a word as apparently simple as “scandal.”

Very little effort is needed to compile a daunting list of horror stories surrounding the Obama administration, including the Justice Department’s disastrous Fast and Furious weapons scheme, the IRS’s targeting of political opponents, and a stunningly lax attitude to security evidenced by Hillary Clinton’s email server and the hacking of millions of files from the Office of Personnel Management. Even on the available evidence, the IRS affair had most of the major elements of something like Watergate, and a detailed investigation might well have turned up a chain of evidence leading to the White House.

But there was no detailed investigation, and that is the central point. Without investigation, the amount of embarrassing material that emerged was limited, and most mainstream media outlets had no interest in publicizing the affair. Concern was strictly limited to partisan conservative outlets, so official malfeasance did not turn into a general public scandal.

The scale of misdeeds, however, does not by itself determine official statistics or public concern. To understand this, look for instance at the recently publicized issue of sexual assaults on college campuses. The actual behaviors involved have been prevalent for many decades, and have probably declined in recent years as a consequence of changing gender attitudes. In public perception, though, assaults are running at epidemic levels. That change is a consequence of strict new laws, reinforced by new mechanisms for investigation and enforcement. A new legal and bureaucratic environment has caused a massive upsurge of reported criminality, which uninformed people might take as an escalation of the behavior itself.

Political scandal is rather like that. To acknowledge that an administration or a party suffers a scandal says nothing whatever about the actual degree of wrongdoing that has occurred. Rather, it is a matter of perception, which is based on several distinct components, including a body of evidence but also the reactions of the media and the public. As long ago as 1930, Walter Lippmann drew the essential distinction between the fact of political wrongdoing and its public manifestation. “It would be impossible,” he wrote, “for an historian to write a history of political corruption in America. What he could write is the history of the exposure of corruption.” And that exposure can be a complex and haphazard affair.

We can identify three key components. First, there must be investigation by law enforcement or intelligence agencies, which can be very difficult when the suspects are powerful or well-connected. Facing many obstacles to a free and wide-ranging investigation, the agencies involved will commonly leak information in the time-honored Washington way. The probability of such investigations and leaks depends on many variables, including the degree of harmony and common purpose within an administration. An administration riven by internal dissent or ideological feuding will be very leaky, and the amount of information available to media will accordingly be abundant.

Second, a great deal depends on the role of media in handling the allegations that do emerge. Some lurid tidbits will be avidly seized on and pursued, while others of equal plausibility will be largely ignored. That too depends on subjective factors, including the perceived popularity of the administration. If media outlets believe they are battering away at an already hated administration, they will do things they would not dare do against a popular leader.

Finally, media outlets can publish whatever evidence they wish, but this will not necessarily become the basis of a serious and damaging scandal unless it appeals to a mass audience, and probably one already restive and disenchanted with the political or economic status quo. Scandals thus reach storm force only when they focus or symbolize existing discontents.

The Watergate scandal developed as it did because it represented a perfect storm of these different elements. The political and military establishment and the intelligence agencies were deeply divided ideologically, both amongst themselves and against the Nixon White House. Leaks abounded from highly placed sources within the FBI and other agencies. Major media outlets loathed Nixon, and they published their stories at a time of unprecedented economic disaster: the OPEC oil squeeze, looming hyper-inflation, and even widespread fears of the imminent end of capitalism. The president duly fell.

But compare that disaster with other historical moments when administrations were committing misdeeds no less heinous than those of Richard Nixon, but largely escaped a like fate. Victor Lasky’s 1977 book It Didn’t Start With Watergate makes a convincing case for viewing Lyndon Johnson’s regime as the most flagrantly corrupt in U.S. history, at least since the 1870s. Not only was the LBJ White House heavily engaged in bugging and burgling opponents, but it was often using the same individuals who later earned notoriety as Nixon-era plumbers. In this instance, though, catastrophic scandals were averted. The intelligence apparatus had yet to develop the same internal schisms that it did under Nixon, the media remained unwilling to challenge the president directly, and the war-related spending boom ensured that economic conditions remained solid. Hence, Johnson completed his term, while Nixon did not.

Nor did it end with Watergate. Some enterprising political historian should write a history of one or more of America’s non-scandals, when public wrongdoing on a major scale was widely exposed but failed to lead to a Watergate-style explosion. A classic example would be the Whitewater affair that somewhat damaged Bill Clinton’s second term but never gained the traction needed to destroy his presidency. In that instance, as with the Iran-Contra affair of 1987, the key variable was the general public sense of prosperity and wellbeing, which had a great deal to do with oil prices standing at bargain-basement levels. Both Reagan and Clinton thus remained popular and escaped the stigma of economic crisis and collapse. In sharp contrast to 1974, a contented public had no desire to see a prolonged political circus directed at removing a president.

So we can take the story up to modern times. The Obama administration did many shameful and illegal things, but the law-enforcement bureaucracy remained united and largely under control: hence the remarkably few leaks. The media never lost their uncritical adulation for the president, and were reluctant to cause him any serious embarrassment. And despite troublingly high unemployment, most Americans had a general sense of improving conditions after 2009. The conditions to generate scandal did not exist, nor was there a mass audience receptive to such claims.

So yes, Obama really did run a scandal-free administration.

What you need for an apocalyptic scandal is a set of conditions roughly as follows: a deeply divided and restive set of bureaucrats and law-enforcement officials, a mass media at war with the administration, and a horrible economic crisis. Under Trump, the first two conditions assuredly exist already. If economic disaster is added to the mix, history suggests that something like a second Watergate meltdown is close to inevitable.


Don’t Forget the Epic Story of World War II

www.hacksawridge.movie

The World War II film Hacksaw Ridge is in contention for multiple Oscars, and I hope it wins a gaggle of them. It is a fine, well-made film, and a rare attempt in mainstream cinema to portray the heroism of a faithful Christian believer. Having said that, I have to lodge an objection. Without the slightest ill intent, the film contributes to a pervasive lack of understanding or appreciation of the U.S. role in that vastly significant conflict, the popular memory of which is utterly dominated by radical and leftist perspectives. For most people under forty, the war is recounted in terms of the country’s allegedly pervasive racism, bigotry, and sexism, in which the only heroes are those resisters who defied that hegemony. It has become Exhibit A in the contemporary retrojection of modern-day culture wars into the transmission of American history.

Hacksaw Ridge tells the story of Desmond Doss, a devout Seventh-day Adventist whose religious views forbade him to bear arms. As a conscientious objector, he served as a medic, and found himself on the extraordinarily bloody battlefields of Okinawa. His feats of courage and self-sacrifice earned him the first Congressional Medal of Honor ever awarded to a conscientious objector. No one would have dared invent such a story, which clamored to be told. But here is the problem. If such a treatment were part of a broad range of accounts of the war, then it would be a wonderful contribution, but it does not form part of any such continuum. While the main narrative of the war has faded into oblivion, major events like Okinawa are recalled only as they can be told from a perspective that appeals to liberal opinion, and even to pacifists.

For many years, I taught a class on the Second World War at Penn State University, and I have an excellent sense of the materials that are available in terms of films, textbooks and documentaries. Overwhelmingly, when they approach the American role in the war, they do so by emphasizing marginal perspectives and racial politics, to the near exclusion of virtually every other event or controversy.

At that point, you might legitimately ask whether minority contributions don’t deserve proper emphasis, as of course they do. Waco, Texas, for instance, was the home of the magnificent Dorie Miller, an African-American cook on the USS West Virginia, who responded to the Japanese attack on Pearl Harbor by blasting at enemy aircraft with a machine gun. Miller was a superb American hero, as also was (for instance) Daniel Inouye, of the Japanese-American 442nd Regimental Combat Team, who suffered terrible wounds and was later awarded a Congressional Medal of Honor. The legendary Tuskegee Airmen produced a legion of distinguished (black) fliers, but we might particularly cite Roscoe Brown, one of the first U.S. pilots to shoot down one of the Luftwaffe’s terrifying new jet fighters. All these individuals, and many like them, have been lauded repeatedly in recent books and documentaries on the war, for instance in Ken Burns’s 2007 PBS series The War. They absolutely deserve to be remembered and honored.

But they should not be the whole story, and in modern cultural memory, they virtually are. If you look for educational materials or museum presentations about America in World War II, I can guarantee you will find certain themes or events constantly placed front and center. By far the most prominent topic in the great majority of films, texts, and exhibitions is the Japanese-American internments. Depending on their approach, other productions will assuredly discuss women’s role on the home front, and “Rosie the Riveter.” Any actual tales of combat will concern the Tuskegee Airmen, or the Navajo code-talkers. Our students enter classes believing that the Tuskegee fliers were basically the whole of the Allied air offensive against Germany.

A like emphasis dominates feature films of the past couple of decades such as Red Tails (2012, on Tuskegee) and Windtalkers (2002, the code-talkers). Especially when dealing with the Pacific War, such combat-oriented accounts strive very hard to tell their tales with a presumed objectivity, to avoid any suggestion that the Japanese were any more motivated by pathological violence and racial hatred than the Americans. That approach was amply illustrated by Clint Eastwood’s sprawling duo of Flags of Our Fathers and Letters From Iwo Jima (2006). Western productions virtually never address the mass murders and widespread enslavement undertaken by the Japanese regime. Not surprisingly, the Japanese neo-militarist hard Right loved Eastwood’s Flags and Letters. (Fortunately, you are still allowed to hate Nazis, or we wouldn’t have the magnificent Saving Private Ryan.)

The consequences of all this are apparent. For many college-age Americans today, America’s war was largely a venture in hypocrisy, as a nation founded on segregation and illegal internments vaunted its bogus moral superiority. If awareness of Nazi deeds prevents staking a claim of total moral equivalence, then America’s record is viewed with a very jaundiced eye.

Even setting aside the moral issues, the degree of popular ignorance of the war is astounding. I have complained that the materials available for teaching military history are narrowly focused and tendentious, but the opportunities even to take such courses have all but collapsed in recent years. Most major universities today will not hire specifically in military history, and do not replace retirements. Courses that are offered tend to be general social histories of the home front, which can be excellent in themselves, but they offer nothing of the larger context.

In terms of actual military enterprises, under-40s might at best know such names as Pearl Harbor, Omaha Beach (exclusively from Saving Private Ryan), and maybe Iwo Jima (from Flags / Letters). Maybe now, after Hacksaw Ridge, they will know something about Okinawa, but only as seen through the eyes of one pacifist. (So what were U.S. forces actually doing in Okinawa? Why did the battle happen? How did it end?)

Military buffs apart, younger Americans know nothing about the Battle of the Bulge, which claimed nineteen thousand American lives. They have never heard of Guadalcanal, or Midway, or the Battle of the Coral Sea: the series of battles that prevented the Pacific from becoming a Japanese lake and the main trade route of its slave empire. They know nothing about the land and sea battles that liberated the Philippines, although that could be politically sensitive, as it would demand coverage of the mass killings of tens of thousands of Filipino civilians by Japanese occupiers. That might even raise questions about the whole moral equivalence thing.

Younger Americans know nothing of the Battle of Saipan, one of the truly amazing moments in U.S. military history. Within just days of the American involvement in the D-Day campaign in France, other U.S. forces on the other side of the planet launched an invasion of nearly comparable scale against a crucial Japanese-held island, in what has been described as D-Day in the Pacific. In just a couple of days of air battles related to this campaign, U.S. forces in the Marianas destroyed six hundred Japanese aircraft, an astounding total. Japan never recovered.

Quite apart from any specific incident, most Americans have virtually no sense of the course of the war, or American goals, or the political context. Nor will they appreciate the stupendous feats of industrial organization that allowed U.S. forces to operate so successfully on a global scale, and which laid the foundations for all the nation’s post-war triumphs. There was so much more to the story than Rosie the Riveter.

Nor do they appreciate the critical role of the war in creating American identity and nationhood, in forging previously disparate immigrant communities into a new national whole. So the Civil War was the American Iliad? Then World War II was our Aeneid, an epic struggle against authentic evil, which at once created the nation and framed its destiny. It should not be commemorated as a study in victimhood and injustice.


What Was ‘America First’?

Charles Lindbergh. Photo: Library of Congress

In his inaugural address, Donald Trump used a slogan that he had already quoted approvingly in earlier speeches: “From this day forward,” he said, “it’s going to be only America first. America first.” Liberal writers profess to find the phrase terrifying, a confirmation not just of Trump’s dictatorial instincts, but also of his racial and religious prejudice. Sidney Blumenthal is one of many to recall that the slogan “was emblazoned on the banner of the movement for appeasement of Hitler.” In reality, the original America First movement of 1940–41 was far broader and more complex than this critique might suggest, and was actually much more respectable and even mainstream. It was a sincere anti-war movement that drew from all shades of the political spectrum. Its later stigmatization as a Nazi front group is tragic in its own right, but it also closes off legitimate paths of public debate that have nothing whatever to do with authoritarianism or bigotry.

The America First Committee (AFC) was formed in September 1940 and operated until the Pearl Harbor attack. In all that time, though, historical attention focuses on just one shameful moment, namely the speech that Charles Lindbergh gave in Des Moines, Iowa, on September 11, 1941, when he openly attacked Jews as a force driving the country toward war. The speech was appalling—and worse, Lindbergh’s anti-Semitism really did represent the view of a small minority of AFC supporters. That event, above all, fatally tainted the memory of America First. In modern television documentaries, the movement is usually mentioned alongside authentic Nazi groups like the clownish German American Bund, whose members paraded in brown shirts and swastikas. The impression we are left with is that the AFC was a deeply unsavory pressure group that tried to undermine the political will of a nation united behind the Roosevelt administration in its determination to fight Hitler when the proper time arose. Any ideas associated with America First must, by definition, be regarded as anti-Semitic, pro-Nazi, and toxic.

The fundamental problem with that view is that on most critical issues, the AFC held positions that were close to a national consensus, and in these matters, it was FDR and the interventionists who were the minority. The AFC can be understood as the clearest institutional manifestation of the nation’s deep-rooted anti-war sentiment.

It is very difficult today to understand just how deeply and passionately pacifist the U.S. was through the 1930s, and how strongly that sentiment persisted almost to the outbreak of war in 1941. That had nothing to do with anti-Semitism, or with sympathy for Hitler. Rather, it arose from a widespread perception of the First World War as an unmitigated catastrophe. According to this consensus, U.S. involvement in that earlier war arose because of the machinations of over-mighty financiers and plutocrats—what we might call the 1 percent—who deployed vicious and false propaganda from London and Paris. Also complicit were the arms dealers, for whom the phrase “merchants of death” now became standard.

Throughout the New Deal years, Republicans and Democrats alike worked to expose these crimes, and to ensure that they could never be repeated. Between 1934 and 1936, the Senate committee headed by Gerald Nye provided regular media copy about the sinister origins of U.S. participation in the First World War, a theme taken up in bestselling books. Those critical ideas led to the creation of a detailed series of Neutrality Acts of 1935–37, designed to prevent U.S. involvement in any new conflict. From 1935 through 1940, Congress repeatedly considered the Ludlow Amendment, a proposed constitutional amendment that would have made it impossible to go to war without the consent of a national referendum except in cases of direct attack. As late as August 1941, when the House of Representatives approved a measure to extend the term of military draft service and ensure that the U.S. would retain a large fighting force, it did so by a single vote: 203–202.

Anti-war ideas saturated popular culture. Look, for instance, at the 1938 film You Can’t Take It With You, directed by the thoroughly mainstream Frank Capra. In one scene, the sympathetic anarchist played by Lionel Barrymore mocks the bureaucrat who is trying to make him pay taxes. What, he asks, do you want the money for? To build battleships? They’re never going to be any good to anyone.

From 1939 to 1941, the Nazi-Soviet pact brought American communists wholeheartedly into the peace crusade, with results that are embarrassing in retrospect. Today, popular-culture historians cherish the memory of iconic folk singers like Pete Seeger, Woody Guthrie, and the Almanac Singers. But listen to the songs they were singing in 1940 and 1941, which mock the Roosevelts as warmongers who dutifully serve the interests of the British Empire: why should I die for Singapore? One tuneful Almanac singalong compares FDR’s appetite for war with his agricultural policy. The government destroys crops to keep prices high, says the song, so let’s plow under, plow under, every fourth American boy!

In such an atmosphere, even cynical jokes went viral. In 1936, Princeton students organized a prank organization called the Veterans of Future Wars. Since the government was so likely to draw them into idiotic wars, they argued, could they please have their bonuses now, instead of after the conflicts had ended? Veterans of Future Wars became a national campus sensation, with 60,000 members at its height. As in the 1960s, the anti-war cause was a campus youth movement, with its own potent folk songs.

From 1939, a series of ad hoc anti-war committees and campaign groups emerged across the country, out of which the AFC coalesced. America First had 800,000 paying members, and that at a time when membership meant far more than merely ticking a box on a website. Some AFC leaders were hard-right anti-New Dealers, but many others were former members of the Progressive movement, dedicated to social improvement and liberal reconstruction. Across the ideological board, women were very well represented. The churches were another key source of recruitment. Most American Roman Catholics, especially the Irish leadership, disliked the prospect of an alliance with imperial Britain, but many AFC supporters were liberal Protestants, organized through the mainline’s flagship magazine, Christian Century. AFC was cross-party, and cross-ideological.

Lindbergh’s horrible speech apart, virtually none of AFC’s campaigning or publicity materials so much as mentioned Jews. Instead, they raised a series of challenging questions about whether American interests were well-served by direct involvement in a war that might have consequences as grim and futile as those of its predecessor. What corporate interests desired such an insane conflict, and which media? Why were Americans so persistently vulnerable to manipulation and crude propaganda by foreign nations? Why would people not understand the administration’s blatant efforts to provoke an incident that would start a new war? (The AFC was actually dead right here. Throughout late 1941, FDR really did strive to provoke clashes between U.S. destroyers and German U-boats.)

Such questions are all the more relevant in light of more recent historical developments: questions about the nature of executive power in matters of foreign policy and the official use of propaganda and deceit to achieve a political end. You do not need to be a crypto-Nazi to oppose any war except one fought in the direct defense of national interests.

If Donald Trump’s administration wants to start a national debate about those aspects of the America First tradition, then good for them.


Sexual Blackmail’s Long History

Maybe Donald Trump is an aficionado of weird sexual practices, and maybe not: I really have no way of knowing. But I do know that stories of precisely this sort are a very common tactic used by intelligence agencies to discredit political figures at all levels. With very little effort, I can also tell you exactly how such tales emerge and how they fit into the modus operandi of particular agencies and governments, and there is no excuse whatever for other people not to know this.

The fact that U.S. agencies and media are being so uncritical of these latest Trump exposés and “dossiers” is alarming. The whole affair should alert us to the wave of bogus stories we can expect to be polluting our public life in the next few years.

Since so many roads in this affair seem to lead back to Russia, some background might be helpful. In 1992, the senior KGB archivist Vasili Mitrokhin defected to the British, carrying with him an amazing haul of secret documents and internal archives. Working with respected intelligence historian Christopher Andrew, Mitrokhin published a number of well-received books, including The Sword and the Shield (1999). The Mitrokhin Archive demonstrates the vast scale of Soviet disinformation—the invention and circulation of stories that were untrue or wildly exaggerated, commonly with a view to discrediting critics or opponents of the USSR. These functions were the work of the KGB’s Service A, which was superlative at planting stories around the world, commonly using out-of-the-way news media. Service A also did a magnificent job of forging documents to support their tall tales. (Even serious and skeptical historians long accepted the authenticity of a particularly brilliant forged letter notionally sent by Lee Harvey Oswald.) Such disinformation was a major part of the KGB’s functions during the 1970s and 1980s, which were the formative years of the up-and-coming young KGB officer, Vladimir Putin.

One instance in particular illustrates Service A’s methods and how triumphantly they could shape public opinion. The Soviets naturally wanted to destroy the reputation of FBI Director J. Edgar Hoover, a dedicated and lethally effective foe of Soviet espionage until his death in 1972. Even after his death, the vilification continued, and in the context of the time, the deadliest charge that could be levied against him was that of homosexuality. Accordingly, the Mitrokhin Archive shows how hard the Soviets worked to spread that rumor.

Now, Hoover’s real sexual tendencies are not clear. He was a lifelong bachelor who maintained a very close friendship with a male subordinate, and our modern expectations would be that he was a closeted homosexual. Having said that, during Hoover’s youth it was common for members of the same gender to share close non-sexual friendships. What we can say with total confidence is that Hoover was fanatically conscious of his privacy and security, to the point of paranoia, and there was no way that he would do anything public that could arouse scandal or permit blackmail.

That context must make us very suspicious indeed of claims that Hoover was not only homosexual, but also a transvestite, stories that have become deeply entrenched in popular culture. But the origins of the cross-dressing stories are very suspect. The only source commonly quoted is a 1991 affidavit by a woman who claimed that some decades earlier she had met a heavily dragged-up Hoover at a private event, where he allegedly wished to be addressed as “Mary.” The story is multiply improbable, most would say incredible, and has been rejected even by historians who normally have nothing good to say about Hoover. Yet the tale circulated widely, and wildly. In 1993, searching for a new head of the FBI, the ever-tasteful new president Bill Clinton joked that it would be hard to find someone to fill Hoover’s pumps.

We can’t prove that Service A invented or spread the specific cross-dressing stories about Hoover, but they fit exactly with other Soviet efforts through the 1970s and 1980s. And that legend has subsequently become a truth universally acknowledged.

The Russians were absolutely not alone in circulating disinformation. Such tactics are common to most major intelligence agencies, including those of the U.S. But looking at the recent Trump revelations, we constantly encounter Russian sources for scabrous tales that are about as improbable as that of “Mary” Hoover.

The Trump dossier that is the source of current attention might be accurate, but there are real grounds for suspicion. The main source is a respected and credible British private agency, which like most organizations of its kind draws heavily on former members of that nation’s mainstream intelligence service. But this particular dossier was evidently not compiled by the standard means used by such agencies, where competing sources are evaluated and judged in an adversarial process. (Team A makes the best case for one interpretation, Team B argues the opposite, and you see which side makes the most credible argument.) Rather, the Trump material was gathered on a parti pris basis with the specific goal of collecting negative information on the then-candidate. This is in consequence a partisan report, which just did not exercise the kind of skepticism you would expect in a normal intelligence analysis. The stories (including the one involving the fetish) look not so much like kompromat (compromising material, which might be true) but rather seem to be Russian dezinformatsiya, disinformation.

I was stunned to read a recent story in the British Guardian suggesting that the CIA and FBI had “taken various factors into consideration before deciding [the dossier] had credibility. They include Trump’s public comments during the campaign, when he urged Russia to hack Hillary Clinton’s emails.” Is there really anyone who did not hear that “urging” as a classically outrageous Trumpian joke? Yet that is here cited as major confirmation of his sinister Russian connections.

Why would the Russians create such material, if it might discredit a figure who actually promised to be a close ally on the global stage? Perhaps it was a kind of insurance, to keep Trump in line if he became too hostile to future Putin actions. Far more likely, it was part of a general series of stories, scares, myths, and downright falsehoods that have circulated so freely over the past 18 months, with the collective goal of discrediting the U.S. democratic system and fomenting dissension, paranoia, and outright hatred in American public life. The Russians control our elections! The Russians watch everything you say and write, no matter how private you think it is! Russian hackers gave the Rust Belt states to Trump! Trump is a Putin puppet!

If the Russians were seeking to undermine the American political order, to discredit the U.S. presidency, and in short to destabilize the United States, then they have succeeded magnificently in their goal.

Over the next few years, Donald Trump’s many critics will be very open to accepting any and all wild stories about him and his circle. Some of those rumors might even be true. But a great many will be disinformation stories spread by the Russians and, who knows, by other international mischief makers. It would be wise to exercise some caution before falling for the next tall tale.


Checks and Balances

We have recently heard a great deal about the alleged evils of the Electoral College, and progressive opinion would currently like to abolish what it views as an obstacle to true multicultural democracy. Reasonable people can disagree about the virtues of the college itself, but the underlying arguments here are deeply troubling for what they suggest about the very widespread ignorance of the Constitution. This is a matter of rudimentary civics education.

The chief indictment of the Electoral College is that it stands in the way of the expression of the voting majority. The problem with this argument should be apparent: very few people actually believe in using simple electoral majorities to decide each and every political question.

If we did that, then all contentious issues would be resolved not by the decisions of courts or elected assemblies, but by referenda or plebiscites. And if the United States did work on that basis, many victories that progressives cherish—such as the end of formal racial segregation, the legalization of abortion, and the recognition of same-sex marriages—would have been achieved much later than they were or not achieved at all. Quite assuredly, the mass immigration that has occurred since the 1970s would not have taken place. The Founders knew plenty of examples of direct, unmediated democracy from the ancient world—plebiscites, referenda, votes on ostracism—and they absolutely rejected them.

The fact that particular institutions stand in the way of the popular will does not mean that they are anti-democratic. The United States does not work on the basis of crude majoritarianism, and it was never intended to do so. Any elected representative who fails to understand that is in the wrong profession.

Progressives further allege that the Electoral College’s undemocratic setup betrays the institution’s origins in the political needs of slave states: states’ power in the college is based on their total number of representatives and senators, and the Three-Fifths Compromise originally gave Southern states extra representatives in proportion to their (non-voting) slave populations. The college as an institution is thus tainted by a history of racial oppression. Only by ending it can the country march along the path to true racial equality.

But if you actually read the Founding Fathers, you see they are constantly trying to balance and reconcile two competing forces, namely the people’s role as the basis of government and the need to restrain and channel the shifting passions of that same people. That is why the U.S. government has the structure it does of House, Senate, and presidency, each with its different terms of office. The goal is to ensure that all three will not suddenly be elected together in a single moment of national fervor or insanity. While those institutions all represent the people, they do so with different degrees of directness and immediacy.

At every stage, the Founders wished to create a system that defended minorities from the majority, and moreover to protect smaller and less powerful regions and communities from their overweening neighbors. That protection did not just involve defending the institution of slavery, but extended to any number of other potential conflicts and rivalries of a regional nature: urban versus rural, merchants against manufacturers, farmers against merchants and manufacturers, skeptics versus pious true believers.

The critical need for intermediary institutions explains the role of states, which is apparently such an acute grievance for critics of the Electoral College. As the question is often asked today: why does that archaic state system stand in the way of the national popular will, as expressed through the ballot box? But let’s pursue the argument to its logical conclusion. If the Electoral College is really such an unjustified check on true democracy, its sins are as nothing beside those of the U.S. Senate, whose members hold power regardless of the relative population of their home states. California and Texas each have two senators, and so do Wyoming and Vermont. Is that not absurd? Back in the 1920s, the undemocratic quality of that arrangement was a major grievance for the leaders of populous industrial states, who could not understand why the half-million people of Arizona (say) had exactly the same Senate clout as the 10 million of Pennsylvania.

The Founders were also far-sighted enough to realize that those crude population figures would change dramatically over time. Probably in 20 years or so, Arizonans will outnumber Pennsylvanians. In the long run, things really do balance out.

So the next time you hear an argument that the Electoral College has served its purpose and should be abolished, do press the speaker on how far that argument should be taken. Why should we have a U.S. Senate at all, and should it too be abolished forthwith? Why do we have more than one house of Congress anyway? Why don’t we make all decisions through referendum?

The Founders actually did know what they were doing, and Americans urgently need to be taught to understand that.


When ‘Paranoia’ Is Justified

The U.S. has a lengthy tradition of paranoia and conspiracy theory dating from colonial times onwards, and it would not be difficult to cite many examples of that sad trend—times when Americans directed their suspicions toward such groups as Catholics, Jews, and Mormons. Today, that paranoid tradition is usually deployed against conservatives, with Trumpism presented as the latest manifestation of the American Paranoid Style.

Running alongside that, though, is an equally potent tradition of false accusations of paranoia: of using mockery and psychobabble to minimize quite genuine threats that the country faces. Yes, Sen. Joe McCarthy made wild and even ludicrous claims about communist subversion, but communists had in fact infiltrated U.S. industry and politics, a potentially lethal menace at a time when the U.S. and USSR stood on the verge of apocalyptic warfare. We were dealing with far more than a “Red Scare” or vulgar “McCarthyism.” And through the 1990s, anyone suggesting that al-Qaeda might pose a serious threat to New York or Washington was accused of succumbing to conspiracy nightmares about James Bond-style supervillains.

Sometimes, alleged menaces are bogus; but sometimes the monsters are real. Sometimes, they really are out to get you.

Next year, American paranoia will be much in the news when we commemorate the centennial of U.S. entry into the First World War. At that time, xenophobic and nativist passions unquestionably did rage against people derived from enemy nations, above all Germany. Vigilante mobs brutally attacked German individuals and properties, and a whole culture war raged against any manifestations of German language, literature, or music. Outrageous scare stories circulated about German plots, spies, and terrorists. When those stories are recounted in 2017, you can be certain that they will conclude with telling contemporary references about modern-day paranoia and xenophobia, especially directed against Muslims.

But how valid were those historic charges? Some years ago, I researched the extensive domestic security records of the state of Pennsylvania during the First World War era, at a time when this was one of the two or three leading industrial areas, critically significant to the war effort. Specifically, I looked at the internal records of the Pennsylvania State Police, an agency notorious in left-wing circles as a bastion of racism, reaction, and anti-labor violence. The agency was on the front lines of concerns about spies and terrorists, as the primary body to which citizens and local police would confide their suspicions about German plots.

What I found about those security efforts differed massively from the standard narrative. The first impression you get from those confidential documents is how extraordinarily sensible and restrained those police officers actually were. In the face of hundreds of complaints and denunciations, the agency’s normal response was to send in an undercover cop, who would investigate the alleged German traitor. On the vast majority of occasions, the officer would then submit a report explaining why Mr. Schmidt was in fact harmless. Yes, the officer might say, Schmidt has a big mouth, and cannot restrain himself from boasting about German victories over the Allies. He might on occasion have said something stupid about how the gutless Americans will never be able to stand up against the Kaiser’s veterans. On the whole, though, he is a windbag who should be left alone. Quite frequently, the reports might say, sympathetically, that Mr. Siegel is a harmless and decent individual whom the locals dislike because of his strong accent, and they really should stop persecuting him. Or that all the evidence against Mr. Müller was cooked up by hostile neighbors.

Overall, the internal security efforts in Pennsylvania at least impress by their sanity, decency, and restraint. That might be a tribute to the human qualities of the cops in question, although it is also true that enemy aliens were so abundant in Pennsylvania that nobody could feasibly have tried to jump on every misplaced word. But you look in vain for evidence of official paranoia. Ordinary people might have been “spy mad,” as a report noted, but the cops weren’t.

So innocuous was the general run of reports that I tended to believe those that did refer to sedition, which was no myth. A great many Germans or German sympathizers wound up in trouble because they had said or done things that were truly destructive in the context of a nation at war. They did publicly laud German armies, disparage U.S. forces, and spread slanders about American atrocities in Mexico as well as on the Western Front. Some even flew the German flag and denounced German neighbors who supported the war effort.

And then there were the spies and terrorists. When you read next year about the alleged paranoia of the time, do recall the genuine German conspiracies of the time, such as the Black Tom attack that occurred in Jersey City in July 1916, while the U.S. was still at peace. German saboteurs destroyed a U.S. munitions shipment destined for the Allies, in the process unleashing an explosion so large that it made people in Maryland think they were hearing an earthquake. Also recall that German secret agents really had formed working alliances with dissident groups on U.S. soil—Irish Republican militants in the big cities, Mexicans in the Southwest.

If the Germans ever did plan to strike again at the U.S. war effort, Pennsylvania would surely be their first target, and petty acts of arson and sabotage abounded. How sensible, then, were the state police to focus their efforts on German sympathizers who really did look and act like potential terrorists? In one typical case, a young German miner named Wagner, in Carbon County, was heard making pro-German remarks. This was alarming because he was located right in the heart of the strategic coal country. On further examination, Wagner speculated in detail about just how the Allies could be crushed militarily. He then boasted to an undercover officer that he knew how to convey information to Philadelphia, where radio transmissions could carry it to the Mexican border, and thence to German agents. The investigating officer concluded, in a judgment far from “hysterical,” that Wagner “is a dangerous man … for he is very loyal to Germany; would like to work for the Fatherland, and his people who are in the war.”

I mentioned the comparison between “paranoia” in the Great War and modern Islamophobia. Actually, that Islamic parallel is closer than we might think. Then as now, some people spread worthless slanders about foreigners and aliens, but also, then as now, some of the nightmare stories were actually true. Among the mass of harmless ordinary migrants devoted to working to improve themselves and their families, there really were, and are, people out to destroy America and Americans. Despite all the horror stories we hear about idiots in 1917 striking at the Kaiser by kicking a dachshund in the street, German spies and terrorists really existed, and they posed a lethal threat.

I mention this context now because you are not going to hear much about it in the coming year, when we will once again be lamenting the American Paranoid Style.


Will Faithless Electors Cause a Constitutional Crisis?

This past presidential election was among the most unsavory in U.S. history, and it might not even be over yet. Unless we are very careful, this election could yet come close to crippling constitutional government.

The problem is straightforward enough: namely, that large sections of the losing side stubbornly refuse to admit defeat. That is bad in itself. Around the world, one of the commonly accepted criteria in judging a democratic society is whether the losing party agrees to stand aside after losing an election. Initially, Hillary Clinton conceded defeat, and did so with grace and maturity, partly (it seems) under White House pressure. Still, an alarming number of her followers remained diehards, and the Clintons have now joined the demand to recount votes in Wisconsin and other swing states.

That recount demand is rooted in some murky statistics, and some assertions that can now be shown to be bogus. Initially, New York magazine quoted an expert study purporting to show that votes in critical states had been manipulated by computer hacking. According to those early claims, a technical comparison of regions within particular states showed that places using electronic voting turned out surprisingly high votes for Trump relative to areas that relied on paper ballots. Hence, said the advocates, there was prima facie evidence of malfeasance, and votes in swing states should be audited carefully before any final decisions can be made about the election’s outcome—however long that process might take.

Actually, these suggestions were absurd, and were promptly recognized as such by many liberal media outlets. The allegations were, for instance, utterly rejected by quantitative guru Nate Silver, who is anything but a Trump supporter. As he and others noted, the non-urban areas that were vastly more inclined to vote for Trump were also the ones most likely to use electronic voting. Big cities, in contrast, commonly used paper ballots. Obviously, then, we would expect votes cast by computer to lean heavily toward Trump in comparison to those marked on paper. That result certainly did not mean that Russian techies in secret fortresses in the Urals were hacking computers in Wisconsin and Michigan to delete votes cast for Hillary Clinton. Not long after the New York story, the main computer expert cited made clear that he himself did not accept the hacking explanation, although he still felt that an electoral autopsy was called for. That process is now underway, and with the support of the Clinton camp.

The fact that such mischievous allegations have even been made bespeaks liberal desperation at the defeat of their candidate, and a bottom-feeding attempt to seek any explanation for the catastrophe those liberals feel they suffered on November 8. Sadly, though, these unfounded allegations will remain alive in popular folklore for decades to come, with the simple takeaway: Republicans stole the 2016 election.

Those electronic issues pale in comparison with Democratic Party resistance in the Electoral College, where electors are scheduled to meet on December 19. Normally, they would simply be expected to confirm the results of the November ballot, but liberals have demanded that they do the opposite, and actively nullify the result. Some electors have already stated that they will refuse to accept the majority votes cast in their Trump-leaning states. Notionally, these “Hamilton electors” will take this course not from any partisan motivation but rather to draw attention to the perceived injustices of the Electoral College system, which in their view should be replaced by a national popular vote. Online petitions urging other electors to join the defection have garnered millions of signatures.

Donald Trump’s lead in the college was so substantial—probably 306 to 232—that a handful of “faithless electors” should not affect the overall result, which could be overturned only through the concerted efforts of dozens of pro-Hillary activists. That is extremely unlikely to happen, but all the credentialed experts dismissed as unthinkable so many other things that actually have happened in this turbulent year.

For the sake of argument, imagine that enough electors go rogue to flip the election. Think through the likely consequences of such an outcome—in which Hillary Clinton is inaugurated in January, rather than Donald Trump. It is inconceivable that a Republican Congress would accept this result. It would offer zero cooperation in any legislative efforts, and it would presumably stonewall any and all approval of Clinton-nominated officials or judges. The only way to operate the government in those circumstances would be for the president to make extensive use of executive orders, and to fill official posts through an unprecedented volume of recess appointments. Theoretically, that method might even be used to fill Supreme Court vacancies. Constitutional government would have broken down, and we would be facing something like a Latin American presidential dictatorship. For several years, Washington’s political debate would be reduced to something like a Hobbesian war of all against all.

Does anyone really want to see a Clinton presidency at such a cost?

Nor is it easy to see how such a cycle could ever be broken once set in place, and particularly how the precedent set in the Electoral College could ever be overcome. Would not Republican electors seek revenge in 2020 or 2024? In that event, the November elections would become merely an opening gambit in an interminable legal process.

It is also ironic to see Hillary’s supporters demanding action in the Electoral College on the grounds of her convincing win in the popular vote. As they argue, how could any administration seriously claim a “mandate” with just the 46 percent or so of that vote earned by Donald Trump? Older election aficionados might cast their minds back to 1992, when an incoming Clinton administration decided to go full steam ahead on a number of quite radical policies, including a bold attempt to establish a national health-care system. The president then was Bill Clinton, who owed his presidency to gaining just 43 percent of the popular vote. Mandates are strange and flexible beasts.

Through the years, we have witnessed a number of elections so catastrophic that they seemingly threaten the existence of one or the other party. In the mid-1970s, few serious observers believed the Republican Party would survive the Watergate crisis, and similar pessimism reigned on the right following Obama’s victory in 2008. Yet despite such disasters, political currents soon changed, and Republicans won historic victories in 1980 and 2010. The despairing Democratic Party of the late 1980s likewise managed to resurrect itself sufficiently to hold power through much of the following decade. The lesson is straightforward: complain all you like about defeat, but console yourself with the prospect of future recovery and victory, probably in as little as two years’ time. To that extent, the American political system is remarkably forgiving of even egregious failure.

But that system also depends on elections securing clear and commonly agreed outcomes, in accord with principles very clearly described in the Constitution. If those decisions are not accepted, and are subject to constant sniping and subversion, then that Constitutional settlement will simply run aground.

If people don’t learn to lose, the Constitution fails.

As for me, I will breathe again on December 20.

Philip Jenkins is the author of The Many Faces of Christ: The Thousand Year Story of the Survival and Influence of the Lost Gospels. He is distinguished professor of history at Baylor University and serves as co-director for the Program on Historical Studies of Religion in the Institute for Studies of Religion.

White Christian Apocalypse?

I have been deliberately holding off on election-related comments on matters about which I have little new to contribute. On one critical issue, though, contemporary debate and theorizing really are trespassing on my areas of expertise.

For some 15 years now, I have been writing about the idea of the U.S. becoming a majority-minority country, in which no single ethnic or racial group constitutes a majority. I discussed this, for instance, in my book The Next Christendom, back in 2002. That idea has recently become quite standard and orthodox, and is an increasingly familiar element of political rhetoric, especially among liberals and Democrats. But at least as the idea is appearing in the media and political discourse, it is being badly misunderstood, in two critical ways. For some, these misunderstandings arise from excessive optimism; for others the flaw lies in pessimism. These points may seem stunningly obvious, but as I say, they escape a lot of otherwise informed commentators. Consciously or otherwise, observers are letting themselves be deceived by the fluid nature of American ethnic classifications.

Firstly, and obviously, “minority” is not a uniform category.

After the recent election, I saw plenty of articles saying this was the last gasp of White America before whites lost their majority status, maybe sometime around 2040. Well, 2040 is a long way off, but let us look at the projections for what the U.S. population will look like in mid-century, say in 2050. The best estimate is that non-Latino whites will make up some 47 percent of that population, Latinos 29 percent, African-Americans 15 percent, and Asians 9 percent. Allow a couple of percentage points either way.

In that situation, “whites” will indeed be a minority. But the future U.S. will be a very diverse nation, with multiple communities whose interests might coincide on some issues but not others. The fact that whites will be a minority in 2050 does not, for instance, mean that African-Americans will have unlimited latitude to achieve their goals, or that blacks can count on the reliable support of Asians and Latinos. On some issues, yes, on others, no. Just to take a specific issue, a distinctively African-American issue like reparations for slavery is presumably not going to appeal to the mass of Latino or Asian-American taxpayers any more than it presently does to old-stock whites.

I have actually talked with people who are convinced that by 2050, African-Americans will be a majority in this country. No, they won’t, not even close. Actually, the African-American share of the population will not even grow that substantially. The figure was around 12 percent in 1980, and it is projected to reach only about 15 percent by 2050. Much of that growth reflects newer African migration, from communities that generally do not identify with African-American politics or traditions.

Also, what do we mean by “white”? Historically, the category of “whiteness” has been very flexible, gradually extending over various groups not originally included in that constituency. In the mid-19th century, the Irish were assuredly not white, but then they became so. And then the same fate eventually befell Poles and Italians, and then Jews. A great many U.S. Latinos today certainly think of themselves as white. Ask most Cubans, or Argentines, or Puerto Ricans, and a lot of Mexicans. Any discussion of “whiteness” at different points in U.S. history has to take account of those labels and definitions.

Nor are Latinos alone in this regard. In recent controversies over diversity in Silicon Valley, complaints about workplaces that are overwhelmingly “white” were actually focused on targets where a quarter or more are of Asian origin. Even firms with a great many workers from India, Taiwan, or Korea found themselves condemned for lacking true ethnic diversity. Does that not mean that Asians are in the process of achieving whiteness?

Meanwhile, intermarriage proceeds apace, with a great many matches involving non-Latino whites and either Latinos or people of Asian origin. (Such unions are much more common than black-white relationships.) Anyone who expects the offspring of such matches to mobilize and rise up against White Supremacy is going to be sorely disappointed.

The second point specifically concerns the book The End of White Christian America, by Robert P. Jones, a work I found rewarding and provocative. But the title has been much cited and misused (not Jones’s fault!). Typically doom-laden was the Washington Post’s headline, “White Christian America Is Dying,” and the takeaway for most liberals is: and good riddance.

Reading some post-election comments, it seemed as if commentators were expecting the “white Christian” population to evaporate, which it won’t do. Firstly, non-Latino whites will of course remain, and will still, at least through the 2050s, constitute by far the nation’s largest ethnic community. A 47 percent community still represents an enormous plurality. Actually, the scale of “white Christian” America will be far more substantial even than that figure might suggest, given the de facto inclusion of other groups—especially Latinos, and possibly Asians—under the ethnic umbrella. Intermarriage accelerates the expansion of whiteness.

Whites are not going away, and nor are Christians. One great effect of the 1965 Immigration Act was to expand vastly the range of ethnic groups in the U.S., and the new arrivals were overwhelmingly Christian in origin. That is obviously true of Mexicans, but also of Asian-Americans and Arab-Americans. New generations of Africans tend to be fiercely Christian. The American Islamic population, for instance, was and remains tiny as a proportion of the national total, and it will remain so.

So no, we are not looking at the end of white Christian America, nor at its imminent passing. In 2050, this will be a much more diverse country, religiously and ethnically. But if you are waiting for the White Christian Apocalypse, you may have the wrong millennium.

Of Monsters and Black Lives

Lonnie David Franklin is a monster.

Franklin, the “Grim Sleeper,” is a convicted serial killer who, between 1985 and 2007, murdered 10 to 200 women in Southern California. (I will return to that extraordinarily wide range of estimates later.) The fact that he is an African-American is centrally relevant to my purpose here. He is an appalling object lesson in the lethal power of official racism, past and present. At the same time, though, his case should also serve as a serious caution in current debates about reforming American policing in an age of urban unrest.

On a personal note, one of my odder claims to fame is that in the early 1990s, I pioneered the academic study of serial murders committed by African-Americans. At that time, characters like Ted Bundy and Jeffrey Dahmer had become worldwide folk villains, but the burgeoning literature on serial killers made next to no reference to black offenders. Some commentators even suggested that such killers did not exist, making this a distinctively white pathology. Knowing what I did about the number of real black offenders, I disagreed strongly. I argued that African-Americans made up perhaps 15 percent of all U.S. serial killers, and subsequent research has supported that figure.

In stressing the abundance of black multiple killers, my goal was not of course to highlight the savagery and cruelty of any particular race, but rather to point to the neglect of minority victims. This actually gets to the issue of how serial murder happens, and why we need to consider much more than the evil or derangement of any given individual. Everything ultimately depends on the availability of victims and the priority that society places on them.

A vicious thug who likes to kill police officers, corporate CEOs, or Catholic priests is unlikely to claim more than one victim before the authorities start paying attention and reacting very forcefully. Hence, the man does not become a serial killer in the first place. If, though, a comparably disturbed offender chooses instead to target “low-value” or disposable individuals, such as street prostitutes, he can kill a great many victims without the police taking much notice.

That is all the more true if we also factor in a social ambience where life is assumed to be short and tenuous, for example in an era of gang wars and rampant drug abuse. If police find a corpse in such circumstances, it simply does not become a high investigative priority. Often, in fact, the dead are not even recognized as murder victims, but are simply dismissed as likely drug overdoses. Even when women survive such attacks and escape from their assailants, police generally pay little attention to any complaints they might make.

This is where race comes in so centrally. One of the golden rules of homicide is that generally, like kills like. People tend to kill within their own social setting, and commonly within their own class and race (and often, their own neighborhood). Throughout American history, some black men have committed savage, random violence against people of their own race, and to that degree, they are exactly the same as their white counterparts. Consistently, though, the fact that their victims are also black, and usually poor, means that police have paid little attention to those crimes, allowing individual offenders to count their kills in the dozens or hundreds. Even if they are arrested and convicted, media bias has meant that such offenders receive little public attention, leading police and government to underplay or ignore the problem of serial violence in minority communities. Racial bias thus contributed to the mass victimization of poor communities, and above all of poor women.

Exhibit A in this story would be Los Angeles in the 1980s and early 1990s, the era of crack wars and rampant gang struggles, when murder rates were skyrocketing. Police focused heavily on those crimes and pathologies, and largely neglected the mass slaughter then underway of poor minority women, whose deaths were basically noted in passing. California media in the 1980s identified a prolific serial killer called the “Southside Slayer,” who in retrospect might have been a composite of six or seven independent and unrelated offenders. At more or less the same time, Los Angeles was the hunting ground for several terrifying serial killers, men such as Louis Craine, Michael Hughes, Chester Turner, and Lonnie Franklin himself—all African-American. DNA evidence suggests that other yet unidentified killers were also active in the same years, and often the very same streets.

And that was just Los Angeles. The total number of victims involved here is unknown, and probably unknowable. Lonnie Franklin, as I mentioned, was officially implicated in around ten deaths, but a substantial collection of portrait photographs was found in his possession. If in fact they are trophies of his other, unrecorded victims, then we might be counting his victims in the hundreds—virtually all black and Latina women.

Similar stories could be told of other crisis-ridden inner-city areas across the nation. Other notorious names included Lorenzo Gilyard in Kansas City and Anthony Sowell in Cleveland. Such offenders are not rare, and what they have in common is their choice of marginalized victims: poor, minority, female, and commonly drug users or prostitutes.

The solution would be to reshape police priorities so that forces place a much higher premium on minority victims and are more sensitive to the possible presence of compulsive sexual criminals. There should be no “low value” victims. Put another way, the message would be that black lives matter, and especially black women’s lives. Through the years, community-activist groups have made strides in this cause, so that murder series are now more likely to be acknowledged, but much remains to be done.

And this is where we face a paradox. As black communities have protested against violence and discrimination by police, the resulting conflicts have strongly discouraged police from intervening in minority areas, reducing proactive interventions. Although this is debated, much evidence now suggests that the immediate result has been an upsurge of crime and violence in those areas, through the “Ferguson Effect.” Police tend to ask why they should go into an area unnecessarily if what they do is going to end up on YouTube and the evening news. In fact, such an idea is by no means new. After the urban rioting of the mid-1960s, police massively reduced their footprint in most inner-city areas, and the consequence was the jaw-dropping escalation of violence and homicide between 1967 and 1971.

Today, we are only beginning to see the first inklings of the Ferguson Effect and the consequences of the reduced police presence. The less police intervene in troubled minority areas, the easier it will be for poor victims to die and disappear, and for men like Lonnie Franklin to hunt without check. In the worst-case scenario, these could be very good times for serial predators, not to mention rapists and domestic abusers.

Less policing means more crime, and more victims. If you reduce levels of policing sufficiently, you will create a perfect ecology for victimization.

Obviously, this is not a simple dichotomy: the choice is not between policing that is interventionist and brutal, on the one hand, versus total neglect on the other. What we need, ideally, is effective, color-blind policing firmly rooted in particular communities, where all groups can rely on the good intentions and law-abiding character of their police forces. Trust is everything.

But that situation will not come overnight. In the interim, withdrawing or reducing the police presence runs the risk of endangering a great many ordinary people, whose lives absolutely must matter. We are talking about equal justice, and equal protection.

Terrorism With the Religion Taken Out

When bombs went off in New York City’s Chelsea neighborhood on Saturday night, state and city officials said some very silly things. But understanding those remarks is actually key to understanding U.S. policy toward terrorism in general.

The immediate response of Mayor Bill de Blasio was to reject possible links to terrorism as such. Gov. Andrew Cuomo, meanwhile, declared that “a bomb exploding in New York is obviously an act of terrorism, but it’s not linked to international terrorism—in other words we find no ISIS connection, etc.” Both men also rejected possible linkages to another bombing earlier in the day at Seaside Park, NJ, which had targeted a Marine charity run.

At the time both comments were uttered, nobody had any idea who had undertaken the attacks or what their motives were. Feasibly, the attacks could have been launched by militants working in the cause of international Islamism, or black power, or white supremacy, or none of the above: they might have been the work of a lone mad bomber furious at his maltreatment by the power company. De Blasio was thus right to leave open the question of attribution, but he was wrong to squelch the potential terrorist link. His comment was doubly foolish given that the New Jersey attack had happened the same day and involved similar methods, which in itself indicated that an organized campaign had begun. As Cuomo had no hint who the attackers were, he actually could say nothing whatsoever about whether they had any connections to the Islamic State.

The New York Times, meanwhile, headlined that the explosion was “intentional,” which was helpful, as it meant that a New York City garbage can had not detonated spontaneously.

Why on earth would de Blasio and Cuomo make such fatuous comments, especially when their security advisors must have been telling them something radically different? (NYPD intelligence and counter-terrorist operations are superb.) Why, particularly, would they make remarks that are virtually certain to be disproved within days?

Both de Blasio and Cuomo made an instant decision to go to the heart of the matter as they saw it, which was not analyzing or discussing terrorism, but rather preventing hate crimes and pre-empting “Islamophobia.” In doing this, they were closely echoing the approach of Barack Obama, who has explicitly stated that the danger of terrorism is wildly overblown, as fewer Americans die from terrorism than die from slipping in their bathtubs. (Thank heaven Obama was not president in December 1941, or he would presumably have been lecturing the American people about how small the casualty figures were on the USS Arizona when set beside road deaths.) In contrast, the real and pressing danger facing the nation is ethnic and religious hatred and bigotry, which is bad in itself, and which also threatens U.S. prestige and diplomatic clout in the wider world.

Combating that threat must take absolute priority. That means (among other things) systematically underplaying and under-reporting any and all violent incidents committed by Muslims, or even overtly claimed for Islamist causes. Where Islamist claims are explicitly made, the waters must be muddied by suggesting other motives—presenting the assailant as a lone berserker, motivated perhaps by psychiatric illness or homophobia. We rarely hear this ubiquitous strategy identified or named, so I offer a name here: this is the U.S. policy of de-Islamization.

With a mixture of bemusement and despair, we watch the extremely limited coverage of the savage attack in St. Cloud, Minn., where a Somali man approached strangers in a mall, asked them if they were Muslim, and attacked any and all non-Muslims with a knife while shouting “Allahu akbar.” The Islamic State rapidly acknowledged the assailant as one of its soldiers. The FBI has labeled the attack “a potential act of terrorism.” (You think?) Rather than focusing on the attacker or his motivations, CNN’s coverage of the incident emphasized that “Community leaders fear anti-Muslim backlash, call for unity.” If you have the languages, you are much better off accessing French or German news sources for coverage of such American events.

What about Cuomo’s “international terrorism” point? This represents a throwback to what should be an extinct classification system for terrorist attacks.

In years gone by, some terror attacks were launched by U.S. citizens working in various causes, while others were the work of international forces. The latter might include an Iraqi militant assassinating dissidents in Michigan. But the label also had ethnic and religious overtones. In the 1980s and 1990s, domestic terrorism usually implied white supremacists or neo-Nazis, while “international” commonly denoted Islamic or Middle Eastern connections.

That distinction made sense when the U.S. had a small Muslim population, very few of whom were tied to international causes or organizations. That situation is now totally different, and most of the numerous Islamist terror attacks on U.S. soil over the past decade have been undertaken by U.S. residents or citizens. Orlando killer Omar Mateen was born in New York State, and Fort Hood terrorist Nidal Hasan was a Virginian serving in the U.S. Army. The man currently identified as a suspect in the Chelsea attacks is Ahmad Khan Rahami, a naturalized U.S. citizen.

All these events are thus domestic terror attacks, but they were committed in the name of global Islamist causes, specifically of the Islamic State. So why does the domestic/international dichotomy matter any more?

When Cuomo said the Chelsea attacks were not international in character, what he meant to imply was that they were neither Islamic nor Islamist in inspiration. His statement was simply deceptive, and was part of the larger campaign to de-Islamize the present terror campaign.

Whoever the next president may be, I am not too concerned about how “tough” they aspire to be toward terrorism in general. I just want them to acknowledge the deadly seriousness of the situation this country faces from domestic guerrilla campaigns, and most importantly, the religious and political causes in which most of that violence is undertaken.

Let’s end de-Islamization.

Low-Tech Terror

If a mysterious alien ray swept every gun off the North American continent tomorrow, very ordinary and low-skilled militants could still perpetrate horrendous violence quite comparable to last month’s Orlando attack. In understanding and forestalling terrorist threats, access to weaponry is only one consideration out of a great many, and that fact is crucial to contemporary political debates.

But without guns, without so-called “assault weapons,” how could terrorists kill innocents in large numbers? One answer to that question comes from the Chinese city of Kunming, where in 2014 a group of eight Islamist militants from the Uighur people stormed the local rail station, killing 29 civilians. As private access to firearms is extremely difficult in China, the killers used long bladed knives, and used them to devastating effect. That tactic has been repeated, and some horrible Chinese “spectaculars” have reached international attention. Last year, the same Uighur movement attacked a Xinjiang coal mine, reportedly killing 50 workers.

Such mass knife attacks occur quite frequently in China, and by no means always for political motives. Still, when a tactic proves so successful in one country, it attracts the attention of jihadist propaganda outlets such as al-Qaeda’s magazine Inspire, which bring such methods to global attention. IS especially recommends that followers around the world use whatever means are available to attack and kill unbelievers, and if guns and explosives are not easily found, then knives are quite acceptable.

Knife attacks have one major drawback for terror groups, namely the large numbers of people needed to inflict mass casualties. Mobilizing a group of eight attackers is difficult without a high danger of penetration and exposure. Other forms of non-traditional violence, though, can easily be committed by solitary lone wolves, and for that reason they are warmly advocated by IS and al-Qaeda.

The main weapons in question are vehicles, and the U.S. was the scene of one pioneering experiment in this form of terror. In 2006, an Afghan immigrant named Omeed Aziz Popal used his SUV to attack civilians in the San Francisco Bay Area, killing one person and injuring nineteen. His intentions were very clear: as one observer remarked, “He was trolling for people.” After hitting his victims, he returned to run over their prone bodies. Don’t worry if you have never heard of the crime: it was poorly reported, and in a way that made virtually no reference to the driver’s ethnicity or religious background. The same year, Iranian Mohammed Reza Taheri-azar used his SUV to attack passers-by on the campus of the University of North Carolina at Chapel Hill, injuring nine. The driver cited 9/11 pilot Mohammed Atta as his special hero.

If such attacks have not recurred in the United States itself, they have happened repeatedly in other countries, with the clear implication that tactics and methods are being developed through trial and error, awaiting full scale deployment. By far the commonest venue for these assaults has been Israel, presumably because militants there find it all but impossible to obtain guns or explosives. Vehicles, though, are much easier, and Palestinian guerrillas have used cars and also heavier machines such as tractors and bulldozers. Jerusalem alone has witnessed several such attacks since 2008, each with a number of fatalities. Uighurs (again) have used vehicles to ram crowds in Beijing.

The year 2014 marked a turning point in this saga, when IS propagandist Abu Muhammad al-Adnani urged an all-out campaign of lone wolf violence. Find an unbeliever, he said, and “Smash his head with a rock, or slaughter him with a knife, or run him over with your car, or throw him down from a high place, or choke him, or poison him.” Multiple vehicle attacks occurred around that time. A man yelling “Allahu akbar!” drove down eleven pedestrians in the city of Dijon, and the very next day, Nantes witnessed an almost identical attack by a separate militant. Also in 2014, a recent convert to Islam in Quebec used his car against two members of the Canadian military.

So far, the most striking thing about these lone wolf vehicular attacks is just how relatively small the casualties have been, but that could change very easily. It would be easy to imagine drivers choosing denser crowds, during busy shopping seasons or major sporting events. In this scenario, long lines of fans or shoppers or travelers represent a target-rich environment. On such occasions, a determined driver not afraid of being killed could easily claim twenty or more fatalities.

Whatever else we might say about limiting access to firearms (even assault rifles), such a policy of itself would do nothing whatever to prevent these kinds of low-tech violence. The solution lies in efficient forms of intelligence gathering, monitoring and surveillance, combined with psychological profiling. The danger with such methods is that they will not pick up every potential assailant, while running a serious risk of producing lots of false positives, aggressive blowhards who in reality will never commit a crime. Just how to walk that particular tightrope, between effective prevention and respecting rights to free speech, is going to offer a major challenge to law enforcement agencies of all kinds.

And yet again, it would be very useful if our political leaders felt able to speak the name of the actual cause for which all those murderous guns and knives and cars are being deployed. Perhaps that is too much to hope.

Is Brexit National Suicide?

Over the past week, I have often been asked what I think about the British referendum vote to leave the European Union, or to seek “Brexit.” My standard response is that I would be happy to explain, provided the sight of me ranting and shrieking obscenities does not bother my listeners. In my view, Brexit is nothing short of national suicide. It is a cataclysm that Britain can and must avert, at all costs.

In saying this, I run contrary to the view of many conservative writers who seem delighted by the vote. Those observers make some legitimate points about the referendum and the national and anti-globalization values that it proclaimed. Yes, the vote did represent a powerful proclamation by a silent majority who felt utterly betrayed and neglected by global and corporate forces. The Leave vote, they rightly think, was a mighty voice of protest.

Actually leaving, though, is a radically different idea. At its simplest, it means that Britain would abandon its role as a dominant power in Europe, a continent that presently has its effective capital in London, and English as its lingua franca. It also means giving up countless opportunities for young people to work and travel across this vast area.

Alright, perhaps those losses are too tenuous and speculative, so let’s be very specific, hard-headed, and present-oriented. How would Britain survive outside the EU? If the country had three or four million people, it could revert to subsistence agriculture, but it doesn’t—it has 64 million. That means that the country absolutely has to trade to survive, whether in goods or services. Fantasies of global commercial empires apart, the vast majority of that trade will continue to be what it has been for decades, namely with other European nations. All discussions of Brexit have to begin with that irreducible fact.

So trade on what terms? Since the referendum vote, it has become starkly apparent that none of the Leave militants had given a moment of serious thought to this issue.

One attractive model is that of the associated nation, which enjoys access to the free market, but is exempt from EU laws and regulations. Conservative politician Boris Johnson recently published an op-ed suggesting just such a model, drawing on the example of Norway, in what has been called a kind of EU-Lite. Beyond accessing the single market, he also specified that Britain would be able to maintain continent-wide mobility for its own people, while restricting immigration of foreigners into Britain. He also declared that future fiscal deficits could be solved by the limitless veins of fairy gold to be found under his house. Well, I am making up that last part, but it is perhaps the most plausible part of his scenario. European leaders made it immediately clear that no form of association would be contemplated under which Britain could exclude migrants. Mobility of labor must run both ways.

And that Norwegian example demands closer inspection. What it means in practice is that Norway pays a hefty price for its EU relationship and market access: it continues to pay very substantial sums into the EU budget while agreeing to easy immigration policies. The only thing it lacks is any say whatever in EU policy-making.

Let me put this in U.S. terms. Imagine that Texas seceded from the union. The American President is amenable to the scheme, and explains how it would work in practice. Henceforward, he says, Texas would be completely independent! It would however continue to pay federal taxes, while having no control of immigration or border policy. Nor would it benefit in any form from federal aid, support or infrastructure projects. Oh, and Texas would no longer have any Congressional representation in Washington, to decide how its funds were spent. It seems like a bad idea to me, continues the president, but hey, it’s your decision. Enjoy your sovereignty!

As they begin to consider the effects of Brexit, the Leave leaders are facing an irreconcilable contradiction. On one side stand the more mainstream figures, like Boris Johnson, who will very soon be pleading for a Norway-style association model, with all the negatives I suggested earlier. Against them will be the populists, like Nigel Farage’s UKIP, who will accept nothing that implies open immigration, and hence no form of EU-Lite. Rejecting open immigration, though, also means abandoning any hope of access to the single European market. If implemented, that course would mean industrial and financial collapse.

But there is a good side to that outcome! As the British economy disintegrated, millions would be forced to leave the country to seek their livelihoods elsewhere, and among those would be many of the recent immigrants whom UKIP so loathes. Who would choose to remain in a beggared and impoverished junkyard? The immigration problem would thus solve itself, almost overnight.

Realistically, the most likely outcome for Britain is some kind of association status, which means many of the burdens of EU membership without the essential plus of being able to control the process from within, at governmental level. And the advantages of seeking that solution rather than the present model of full EU membership are… are… hold on, I’m sure I can finish this sentence somehow. No, in fact, there aren’t any advantages.

Full British membership in the EU as constituted presently—with all its manifold flaws—is infinitely superior to any possible alternative outcome.

But surely, one might object, the referendum can be neither reversed nor ignored? Actually, it can be, easily, if any politician has the guts to do so. Nor do we need a second referendum to achieve that result. Contrary to the impression given by many media reports, the recent referendum was an advisory and nonbinding affair, with no necessary legal consequences whatever. In terms of its necessary impact on legislation, it had precisely the same force as a Cosmopolitan magazine survey on sexual predilections. In the context of constitutional laws and customs established over a millennium or so of British history, the referendum exists for a single purpose, namely to advise the deliberations of the Crown-in-Parliament. Parliament must vote on this issue, and if it decides to overturn the result, then so be it.

If at that point, British parliamentarians still decided to validate the Brexit result, then so be it, and may their country’s ruin be on their conscience. But the decision remains entirely theirs.

Who Threatens Democracy?

The United States is currently facing a truly dangerous and unsettling political movement that poses real challenges to traditional concepts of democracy. That phenomenon is, of course, anti-Trumpism. Speaking personally, nothing could induce me to vote for Mr. Trump, but the violent opposition to him is becoming alarming.

“Trump Could Threaten U.S. Rule of Law, Scholars Say,” reads a New York Times headline. Specifically, these experts warn of his “contempt for the First Amendment, the separation of powers and the rule of law.” And if they are experts, and if the Times has bothered to telephone them, then their views are of course impeccably objective. There are multiple issues here, not least that another Clinton presidency would assuredly involve precisely the same hazards, presumably to the point of impeachable crimes, yet the Times is not seeking expert opinions on those matters.

But right now, let us consider the “rule of law” in the present election. In San Jose, anti-Trump protesters just physically attacked and chased supporters leaving a meeting, and events like that are becoming commonplace. They are assuredly going to escalate over the next few months. That prospect determines attitudes on both sides. Every left-wing activist group knows it is duty-bound to express its opposition to Trump, and supporters know that they are likely to be attacked if they attend meetings.

We can guarantee that certain things are going to happen within the next two months. One is that at least a handful of Trump supporters are not going to turn the other cheek. They know they cannot rely on police protection, and so some will turn up to meetings prepared to defend themselves, possibly with firearms. At that point, someone is going to be wounded or killed. And then expect a media outpouring about the inherent violence of Trump, his supporters, and the political Right. These animals are vicious! When attacked, they defend themselves.

The other prediction we can make with fair certainty is that in mid-July, we are going to be facing a major political crisis. The Republican convention will be held in Cleveland July 18-21, and it will assuredly be held in a state of siege. The exact outcome of that event very much depends on police behavior, preparation, and organization. If protesters can be kept sufficiently far removed, then perhaps some semblance of order can be preserved. If not, it is possible that the convention itself might be forced to suspend its activities. Either way, it is highly likely that individual convention delegates and participants are going to be attacked and, conceivably, harmed.

Political protests on some scale are not new, and political conventions are a natural target. But in modern U.S. history, has there ever been a national election where the candidates of one party were simply unable to show their faces without being met by violence? Where mob action simply makes it impossible for one campaign to function? We are not here talking about the candidate of some neo-Nazi sect or Klan group, but the Republican Party itself.

Ultimately, this is all a matter of policing and the workings of the criminal-justice system. In recent years, American police forces have become very conscious of the need to avoid overreaction at protests and public gatherings, for fear of generating embarrassing film that shows up on YouTube. In a version of the notorious “Ferguson Effect,” they have become much gentler in their approaches than they were in earlier years. Witness, for instance, the decision to allow groups like Black Lives Matter to block roads without facing even the danger of arrest. The reasons for caution are understandable, but something has to change. If the police cannot maintain public order sufficiently to allow the functioning of something as critical as a national election, have we not ventured into a full-scale national crisis?

If national elections cannot be held in safety, has democracy not ceased to function?

The Nuclear-Free Nightmare

B-17 Flying Fortresses of the U.S. 8th Air Force bombing Dresden on April 17, 1945. (Everett Historical / Shutterstock)

Visiting Hiroshima last week, President Obama expressed the wish that in the future no community would ever have to suffer the horrors inflicted on that city in 1945, and moreover, that the bombing should never be forgotten. Those sentiments were obvious and unexceptionable. Much more debatable, though, was his restatement of his desire to see a world free of nuclear weapons. If expressed as a general platitude, that is fine, but if it represents any kind of serious strategy or policy goal, it is dangerous to the point of insanity. Hard though this may be to hear, we desperately need our nukes.

The year 1945 marked the beginning of an era of unprecedented peace and security in Europe, which with brief interruptions has continued until the present day. That glorious new age did not happen because of a sudden moral improvement, or the rise of the European unification movement, or any ideological tilt. It happened because both superpowers knew that if either broke the security balance within Europe, the result would be a catastrophic nuclear war that would have annihilated Europe, and deeply damaged the homelands of both the U.S. and the Soviet Union. That balance of terror removed the vast advantage that the Soviets had from their enormous superiority in land forces. That seventy years of peace is founded on the existence of large numbers of nuclear arms.

If we try to rewrite the history of Europe without assuming the existence of those weapons, it is scarcely possible to imagine such an enduring peace, except on the basis of the permanent total mobilization of all European powers, as well as the nations of the Anglosphere. Such mass mobilization seems bizarre to anyone brought up in our own nuclear world, but the idea was a reality from the Napoleonic Wars onwards. The normal calculation was that for every million people in a nation’s population, you could sustain two divisions. Only a society spoiled by the security of Mutual Assured Destruction could have forgotten such brutal mathematics.

A future world without nuclear weapons would have to return to mass mobilization and universal conscription in order to counterbalance the advantages of great continental states unafraid to venture “boots on the ground” in their many millions—China obviously, but also Russia, Pakistan, and India, among others. Fortunately, the modern U.S. would have the blessing of all the women who would happily agree to be drafted to serve alongside the millions of soldiers and sailors the nation would need to keep in uniform to maintain its security. How does a 12-million-strong standing U.S. army sound?

Not, of course, that all future wars would necessarily be slogging matches between infantry, or trench warfare. Even setting nuclear weapons aside, we can see clearly how the world would have developed after 1945 without that overwhelming deterrent. By the end of the Second World War, the U.S. and Britain had perfected the art of destroying cities by air raids involving thousands of aircraft, using firebombs and napalm. Even in those early years, they were inflicting death tolls running into the tens or hundreds of thousands. Those technologies would presumably be much more advanced today, so that any non-nuclear war could be incredibly destructive, and would claim many millions of lives.

During the height of the Cold War, the British also toyed with the idea of quite impressive non-nuclear deterrents. By the early 1950s, the cutting edge of their military thinking was Operation Cauldron, which sought deterrents based on biological warfare. The most promising components in this witches’ brew included brucellosis, tularaemia, and also pneumonic and bubonic plague. We know about these efforts today because of the truly chilling 1952 incident in which the trawler Carella inadvertently wandered into British test waters and was exposed to some of these hellish agents. Rather than quarantining the sailors, or even notifying them of the dangers they faced, the military began covert surveillance to track their health and see whether they might spread infections when they reached British ports. The Carella scandal has shed light on the larger world of Cauldron.

That was 1952. Presumably the range of catastrophic biological agents available to even small powers today is vastly greater, and even more lethal.

So exactly what part of the nuclear-free world are we pining for? The constant mobilization and militarization of all major societies? The investment in massed bomber fleets to charbroil the cities of any potential enemies? Or the total dependence on biological deterrence, with the President’s finger constantly on the bubonic trigger? Dare I say that none of this actually sounds attractive? The main difference between that hypothetical world and the nuclear-armed world we know is that those alternative weapons would have stood a far greater chance of actually being used.

In 1970, the British heavy rock band the Groundhogs issued an album with the seemingly appalling title Thank Christ for the Bomb. Today, we might imagine this as shock for shock’s sake, like later punk numbers, but it was anything but that. The thoughtful lyrics of the title song stated a simple thesis, namely that the twentieth century had witnessed two hugely damaging wars that had killed millions, but that since 1945 the advanced nations at least had avoided any recurrence of such a fate. The reason for that sea change, said the Groundhogs, was quite clear, which is why they exclaimed, “Thank Christ for the bomb.” That argument demands examination, and respect.

If President Obama wants to reduce the number of nuclear weapons among all powers, he should be encouraged. If he wants to see arsenals of at most a few hundred warheads for even the greatest nations, with correspondingly smaller hoards for smaller nations, all well and good. If, though, he seriously contemplates a world without nuclear weapons, then he is utterly and perilously ignorant of modern history.

Recognize the Islamic State

Few would deny that the recent horrors in Brussels demand action, beginning, certainly, with a thoroughgoing purge of Belgian security and intelligence agencies. Other proposed remedies are shambolic, meaningless, or simply farcical, such as the idea of police patrolling “Muslim areas” in the U.S. (Which areas? Where? With what goal?) I would suggest, though, that one urgent priority is for the U.S. and its European allies to consider immediate political and diplomatic recognition of the Islamic State. Let me explain the grounds for that proposal.

European societies are legendarily open, and most have a strong aversion to anything that might look like curbing dissent. That willingness to tolerate virtually any opposition to government is vastly enhanced by the commitment to multiculturalism, and the perceived need to avoid persecuting or targeting ethnic and religious minorities. Only that tradition can explain the utterly perplexing comments that regularly appear in media reports of Islamist activities in Europe, especially after major terror attacks. We read that a certain area in Sweden or England or Germany is a notorious trouble spot, with so many local men having traveled to Syria and returned. Elsewhere, we hear that intelligence agencies are stretched to the limit in keeping track of militants. In Britain, for instance, the security services report the hard work of keeping tabs on a couple of thousand known jihadis. After an attack, media usually report that the X brothers were “known to the police” as likely militants.

Let me ask the questions that seem obvious to me, but which apparently elude European agencies. Firstly, and most obviously, the category of jihadi is quite distinct from that of political dissenter, or critic of the regime, or radical reformer. Of its nature, it implies a willingness and eagerness to engage in armed violence against democratic regimes, or to preach and advocate such activities. Given the present terrorist threat in Europe, support or advocacy for such views constitutes a clear and present danger to peace, safety, and public order. It therefore involves conduct that in the Anglo-American legal tradition clearly comes within the ambit of the criminal code, under such labels as sedition and incitement to kill. I am not an expert on Roman Law traditions, but I assume those countries have comparable notions of criminal behavior, of the advocacy of violence falling short of the deed itself.

Why are European governments not enforcing these laws? Why do they not go beyond proscribing organizations to prosecuting and punishing each and every individual member or office holder? Why are there no mass sedition trials? If direct criminal prosecutions are difficult, why can suspects not be interned for the duration of the emergency? Why, in short, are “known jihadi sympathizers” walking the streets?

Leading on from that, if a person has traveled from a European nation to Syria or Iraq in the past four or five years, the presumption is surely that they did so with a willingness to support the activities of the same terror group that has been active on European soil, namely the Islamic State, the Daesh. This places them in the category of jihadi, with the sanctions outlined above. Why are they ever, under any circumstances, allowed re-entry into Europe? Indeed, let us encourage their travels to the Middle East, where they can form a concentrated target for attack and annihilation by multiple nations. But return? Never.

Please understand, I am not naïve about the workings of intelligence, and I realize there are excellent reasons for allowing real or apparent terror suspects to wander loose. I once debated a conservative writer who was appalled that the British allowed a certain blowhard imam to remain free and active in London. My argument was that intelligence services often allow such figures much liberty precisely because they are double agents or informants, and they must be seen to be active in radical movements as a means of facilitating surveillance and penetration of terror organizations. My fear, though, is that the enormous latitude allowed to European jihadis does not, generally, result from such familiar tradecraft. European governments are simply too confused or gutless to round them up and jail them.

All of which brings me to recognizing the Islamic State. Presently, the Daesh is viewed as a dangerous terrorist group, membership in which constitutes illegal behavior in Western nations. But suppose that it was recognized internationally as a state, and its sympathizers and agents continued to advocate or practice violence against Western governments. In that case, they would be advocating or committing acts of irregular warfare, which would constitute treason. That would be all the easier if Western states formally declared war against the Islamic State.

The prospect of treason charges would really, seriously, force Islamist ideologues to think very hard about the nature of their propaganda and activism. That redefinition would also make it vastly easier to frame and press charges, and to inflict maximum criminal penalties.

The people who would be happiest with such a development would be the leaders of most European mosques and Islamic organizations, who get very tired of banging their heads against the obstructive attitudes of police agencies. When sane, moderate imams denounce the troublemakers in their midst, they would like nothing better than to see those fanatics put away for a great many years.

So, please, let’s recognize the Islamic State, and force its supporters and adherents to come to terms with the implications of advocating violence on the part of an enemy nation. Let them become traitors and saboteurs, and suffer accordingly.

I do ask one final question. Both Western powers and Russia are commendably anxious to avoid targeting civilian populations with tactics like carpet-bombing Mosul or Raqqa. Fair enough, and hence the countless pinpricks of drone attacks. But why on earth do those cities, and other Daesh strongholds, still have the slightest access to power, water, sewage disposal and desalination, and every other facility that permits the continuation of normal civilized existence? Cutting off those pleasant advantages would force an immediate and irreversible crisis within Daesh territories, pushing those already unhappy with the regime into immediate revolt. I am sure the states allegedly pledged to smashing the Islamic State have good reasons for not striking at such obvious targets, but offhand, I can’t think of any.

History and the Limits of the Climate Consensus

“Winter Landscape” by Joos de Momper the Younger, via the Walters Art Museum / Wikimedia Commons. The 1620s, in the heart of the Little Ice Age.

I began my career as a historian of the century following 1660, an era of harsh climatic conditions that often affected political and cultural history. Some periods in particular, especially the years around 1680 and 1740, stand out as uniquely stressful. Extreme cold led to crop failures and revolts, social crises and apocalyptic movements, high mortality and epidemics, but it also spawned religious revivals and experimentation. If you write history without taking account of such extreme conditions, you are missing a lot of the story. That background gives me an unusual approach to current debates on climate change, and leads me to ask some questions for which I genuinely do not have answers.

I believe strongly in the supremacy of scientific method: science is what scientists do, and if they don’t do it, it’s not real science. Based on that principle, I take very seriously the broad consensus among qualified scientific experts that the world’s temperature is in a serious upward trend, which will have major consequences for most people on the planet—rising sea levels and desertification are two of the obvious impacts. In many religious traditions, activists see campaigns to stem these trends as a moral and theological necessity. Personally, I love the idea of using advanced technology to drive a decisive shift towards renewable energy sources, creating abundant new jobs in the process.

Speaking as a historian, though, I have some problems with how the limits of that climate consensus are defined, and with how these issues are reported in popular media and political debate.

Climate scientists are usually clear in their definitions, but that precision tends to get lost in popular discourse. To say that global warming is a fact does not, standing alone, commit us to any particular cause of that trend. Equally beyond dispute is that the climate has changed quite radically through the millennia. Climate change of some scale has happened, is happening, and will happen, regardless of any human activity. The issue today is identifying and assessing the human role in accelerating that process.

This point gets lost in popular debate when people who should know better denounce “climate change” as such. See for instance the recent papal encyclical Laudato Si’, with its much-quoted statement that

Climate change is a global problem with grave implications: environmental, social, economic, political and for the distribution of goods. It represents one of the principal challenges facing humanity in our day.

Well, not exactly. “Climate change” is a fact and a reality, rather like the movement of tectonic plates, or indeed like evolution. Particular forms of climate change may be exceedingly harmful and may demand intervention, but that is a critical difference. It is interesting to compare the English phrase “climate change” with the French used at the recent COP21 Paris meetings, the Conférence sur les Changements Climatiques 2015: changes, plural, not change. Do you want to see a world without a changing climate? Look at the Moon.

That brings us to the human contribution to current trends. The basic theory here is simple and (rightly) generally accepted: carbon emissions create a greenhouse effect, which increases planetary temperatures. It should be said, though, that the correlation between emissions and temperatures is none too close. That is especially true when we look at the phenomenal growth in emissions from India and China since the 1980s, which should in theory have caused a global temperature increase far above anything we actually see. Sure, the effects might be delayed, but the correlation still does not work very well.

That disjunction is particularly telling in the very recent era, from 1998 through 2012, when emissions carried on rising sharply but temperature increases were slow or stagnant. This hiatus, or slowdown, in global warming remains controversial. Some recent studies challenge the whole idea of a hiatus; others accept it, but offer differing explanations for its cause. Now, the fact that scientists disagree strongly on a topic certainly does not mean that the underlying theory is wrong. Arguing and nitpicking is what scientists are meant to do. But that lack of correlation does raise questions about the assumptions on which any policy should proceed.

That also gets us into areas of expertise. Climate and atmospheric scientists are convinced not only that the present warming trend is happening, but that it is catastrophic and unprecedented. That belief causes some bemusement to historians and archaeologists, who are very well used to quite dramatic climate changes through history, notably the Medieval Warm Period and the succeeding Little Ice Age. The latter era, which prevailed from the 14th century through the 19th, is a well-studied and universally acknowledged fact, and its traumatic effects are often cited. The opening years of that era, in the early-to-mid 14th century, brought some of the worst social disasters and famines in post-Roman Europe, which were in turn followed by the massacre and persecution of dissidents and minorities—Jews in Europe, Christians in the Middle East, heretics and witches in many parts of the world. A cold and hungry world needed scapegoats.

Contemporary scientists tend to dismiss or underplay these past climate cycles, suggesting for instance that the Medieval Warm Period was confined to Europe. Historians, in their turn, are deeply suspicious of such dismissals, and the evidence they cite is hard to wave away. Do note also that the very substantial Little Ice Age literature certainly does not stem from cranky “climate deniers,” but is absolutely mainstream among historians. Are we seeing a situation where “qualified and credentialed scientific experts” stand head to head against the “qualified and credentialed social scientific experts” known as historians?

If in fact the medieval world experienced a warming trend comparable to what we are seeing today, albeit without human intervention, that fact does challenge contemporary assumptions. Ditto for the Little Ice Age, which really and genuinely was a global phenomenon. Incidentally, that era involved a drop in temperature of some 2 degrees Celsius, roughly the same as the rise that is projected for coming decades.

The 2015 Paris Conference declared a target of restricting “the increase in the global average temperature to well below 2°C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5°C above pre-industrial levels.” It is certainly important to set a baseline for such efforts, but what on earth is intended here? Which pre-industrial levels are we talking about? The levels of AD 900, of 1150, of 1350, of 1680, of 1740? All those eras were assuredly pre-industrial, but the levels were significantly different in each of those years. Do they want us to return to the temperatures of the Little Ice Age, and even to the depths of cold in the 1680s? The winter of 1684, for instance, remains the coldest ever recorded in England. Or are the bureaucrats aiming to get us back to the warmer medieval period, around 1150?

Seriously, does any climate scientist claim that “pre-industrial” temperature levels were broadly constant worldwide for millennia, in fact since the end of the last true Ice Age some 12,000 years ago, and that they only moved seriously upwards at the start of industrialization? Really? And they would try to defend that? In that case, we should just junk the past few decades of writing on the impact of climate on history, undertaken by first-rate scholars.

If pre-industrial temperature levels varied as much as they actually did, why did the Paris conference so blithely incorporate this meaningless phrase into its final agreement? Did the participants intend “pre-industrial levels” to mean, roughly, “the good old days”?

I offer one speculation. Maybe the “Little Ice Age” was the planet’s “new normal,” a natural trend towards a colder world, and we should in theory be living in those conditions today. All that saved us was the post-1800 rise in temperatures caused by massive carbon emissions from an industrializing West. If that is correct, and I say it without any degree of assurance, then I for one have no wish whatever to return to pre-industrial conditions. Climate scientists, please advise.

Historical approaches are also useful in pointing to the causes of such changes, and thus to how much climate change originates quite independently of human action. One critical factor is solar activity, and historians usually cite the Maunder Minimum. Between 1645 and 1715, sunspot activity virtually ceased, and that cosmic phenomenon coincided neatly with a major cooling on Earth, which now reached the depths of the Little Ice Age. In fact, we can see this era as an acute ice age within the larger ice age. If you point out that correlation does not of itself indicate causation, you would be quite right. But the correlation is worth investigating.

So do I challenge the global warming consensus? Absolutely not. But that does not mean that all critical questions have been satisfactorily answered, and many of them depend on historical research and analysis. Pace the New York Times and large sections of the media, there is no such thing as “established science” that is immune to criticism. If it is “established” beyond criticism and questioning, it’s not science.

Scientific claims must not be taken on faith.

Philip Jenkins is the author of The Many Faces of Christ: The Thousand Year Story of the Survival and Influence of the Lost Gospels. He is distinguished professor of history at Baylor University and serves as co-director for the Program on Historical Studies of Religion in the Institute for Studies of Religion.

The Islamic State’s Retro Map

Policymakers and media people—as well as anyone interested in the Middle East, Islam, terrorism, and related issues—need to be talking about al-Jazira. I am not talking here about the Qatar-based media operation that we usually call al-Jazeera. Rather, I am referring to those regions of Eastern Syria and Northern Iraq that have been in the news so much recently because they are the main stamping grounds of ISIL, and the core of the Islamic State, or Daesh.

This is not just a question of applying a handy geographical label. If we don’t understand the Jazira, and its deep historical implications, we are missing so much of the present story.

The area spans major portions of the old states of Iraq and Syria (can I now assume these states are defunct as actual units?), and has its main centers at Ar-Raqqah, Mosul and Deir ez-Zor. Those territories feature very prominently on the military maps and targeting charts of most major Western air forces, not to mention the Russian and Syrian militaries. When those cities feature in Western media, it is usually in the context of the slaughter of hostages or the expulsion of religious minorities.

Looking at a map of the Islamic State, Westerners find it hard to describe the area except in terms of “fragments of old Syria and Iraq.” Actually, though, these regions belong to a specific and long-established unit with its own well-defined, if turbulent, history, one very familiar to historians of the Middle East. Ever since the early days of Islam, commentators have used the term al-Jazira (“the island”) for these parts, together with the southeastern corner of present-day Turkey. The term derives from the “island” between the rivers Tigris and Euphrates. Originally, it referred to Northern Mesopotamia, the area which combined with the “black land” further down the rivers to form the state of Iraq. By extension, though, it also included those other borderland countries that eventually found their way into Syria and Turkey.

To say this area has a substantial history is a gross understatement. That story would include, for instance, most of the early development of Near Eastern civilization, not to mention the whole of early Syriac Christianity. Once upon a time, the area was as densely packed with churches and monasteries as any region of Europe, and cities like Nisibis were vital intellectual and spiritual powerhouses. Ar-Raqqah itself was once the mighty early Christian city of Kallinikos, with its bishopric and monastery. Mosul remained, until very recent years, one of the greatest Christian centers in the whole Middle East.

In Islamic times, of course, that history took a very different direction, but the area has always served as a distinct region, with a natural unity and geographical logic. The more you read this history, from the seventh century through the 20th, the more that fundamental unity becomes apparent. Any Muslim ruler seeking to establish wider power had to control the Jazira, even if his natural base was much further afield, in Baghdad or Damascus. Meanwhile, the Byzantine and Islamic empires contended to secure dominance here.

Yet that task was far from easy. The rough and complex terrain made it difficult to suppress independent-minded dissidents, who found a natural home here. That meant ethnic minorities, but also religious sects. Over the centuries, this is where you found the strongholds of the apocalyptic Kharijite sect, the Yezidis, Assyrian Christians, and (more generally) the Kurdish people.

Of its nature, this is warrior country, from which hard-bitten fighters expanded to conquer what they viewed as the effete city dwellers to the south and west. To those city dwellers, al-Jazira always has been dangerous borderland or bandit country. Anyone familiar with the long history of the U.S.-Mexican border will have an excellent sense of the mutual prejudices and stereotypes that prevail here, not to mention the subcultures of endemic violence.

Modern policymakers should take many significant lessons from this history, but two in particular stand out. One is that the limits of al-Jazira certainly do not end at the old Syria/Iraq borders, but extend deep into Turkey, and that this is the natural direction for any future expansion of the Islamic State. That fact must be central to the thinking of Turkish policymakers. If the Islamic State is a continuing fact, then it is imperative for Turkey to maintain good relations with its rulers, and to draw firm boundaries. That entity will still be in place long after the Americans, Russians, and French have lost interest and gone home.

The other great fact is that al-Jazira is now starkly divided between two competing forces, namely ISIS and the Kurds, both of whom operate freely across the three states that notionally control the area. Any projections of the future of this region must centrally emphasize that reality, rather than the role of the ghost states operating from Baghdad and Damascus.

If any Western regime is thinking of restoring old Syria and Iraq, it is operating in a world of delusion. The central issue in the Middle East is the unity of the Jazira, and just who will control this critical heartland.

Philip Jenkins is the author of The Many Faces of Christ: The Thousand Year Story of the Survival and Influence of the Lost Gospels. He is distinguished professor of history at Baylor University and serves as co-director for the Program on Historical Studies of Religion in the Institute for Studies of Religion.

The Challenge of Lone Wolf Terrorism

cunaplus / Shutterstock.com

The horrific recent attack in San Bernardino has attracted a good deal of bafflement from media and government alike. What kind of Islamist assault was this? If Syed Farook and Tashfeen Malik were not acting together with a “real” terror group or network, were they really terrorists? Were they taking orders from some mysterious head or group, still unidentified? Actually, there is an easy and unsettling answer to all these questions. Yes, they certainly were terrorists, and they were following one of the most dangerous tactics known in that world, “leaderless resistance.” The fact that U.S. law enforcement is still so startled by this method, and so utterly unprepared, is deeply alarming.

Do not for a second think that by using the term “resistance,” I am justifying these disgusting crimes, or comparing them to guerrilla resistance movements. Rather, I am using a well known technical term, albeit one with a very odd history.

Amazingly, the story goes back to the U.S. ultra-Right of the 1980s. Far Rightists and neo-Nazis tried to organize guerrilla campaigns against the U.S. government, which caused some damage but soon collapsed ignominiously. The problem was that federal agencies had these movements thoroughly penetrated by informers, so that every time someone planned an attack, it was immediately discovered by means of either electronic or human intelligence.

The collapse of that endeavor led to some serious rethinking by the movement’s intellectual leaders. Extremist theorists now evolved a shrewd if desperate strategy of “leaderless resistance,” based on what they called the “Phantom Cell or individual action.” If even the tightest of cell systems could be penetrated by federal agents, why have a hierarchical structure at all? Why have a chain of command? Why not simply move to a non-structure, in which individual groups circulate propaganda, manuals and broad suggestions for activities, which can be taken up or adapted according to need by particular groups or even individuals?

To quote far Right theorist Louis Beam,

Utilizing the leaderless resistance concept, all individuals and groups operate independently of each other, and never report to a central headquarters or single leader for direction or instruction … No-one need issue an order to anyone.

The strategy is almost perfect in that attacks can neither be predicted nor prevented, and there are no ringleaders who can be prosecuted. The Internet offered the perfect means to disseminate information. Already in the mid-1980s, the neo-Nazi networks were pioneering early adopters of the electronic bulletin boards that preceded the World Wide Web.

In 1989, Rightist intellectual William Pierce published Hunter, a book that provides a prophetic description of leaderless resistance in action. It portrays a lone terrorist named Oscar Yeager (German, Jäger) who assassinates mixed-race couples. The book is dedicated to Joseph Paul Franklin, “the Lone Hunter, who saw his duty as a white man, and did what a responsible son of his race must do.” Franklin, for the uninitiated, was a racist assassin who launched a private three-year war in the late 1970s, in which he murdered interracial couples and bombed synagogues. The fictional Yeager likewise launches armed attacks against the liberal media, and against groups attempting to foster good relations among different races and creeds.

Central to the book is the notion of revolutionary contagion. Although the hero (for hero he is meant to be) cannot by himself bring down the government or the society that he detests, his “commando raids” serve as a detonator, to inspire other individuals or small groups by his example. “Very few men were capable of operating a pirate broadcasting station or carrying out an aerial bombing raid on the Capitol, but many could shoot down a miscegenating couple on the street.” He aimed at the creation of a never-ending cycle of “lone hunters,” berserkers prepared to sacrifice their lives in order to destroy a society they believe to be wholly evil.

Politically, the U.S. ultra-Right was too weak in the 1990s to follow Pierce’s model, and the movement never fully recovered from the Oklahoma City attack. But the Hunter tactics live on in the modern Islamist world, and specifically in the influence of Yemeni-American propagandist Anwar al-Awlaki, who was killed by a U.S. drone strike in 2011. Reading texts from al-Qaeda’s online magazine Inspire, we so often hear what sound like direct echoes of Pierce, and especially of leaderless resistance.

That connection is actually not hard to explain. We know that al-Awlaki spent the 1990s in the U.S., where he would have had easy access to the rich array of paramilitary books and manuals circulated by far Right and survivalist mail-order firms, which also sold anti-Semitic tracts. Both kinds of writing would have appealed to a budding jihadi. If he dabbled at all in this subculture, he would very soon have encountered the books of Pierce, whose titles were best-sellers in those catalogues. Particularly in the early nineties, Hunter was the hottest title in this literary underworld. Even if al-Awlaki never read Hunter itself, he would certainly have heard discussions of leaderless resistance, which was all the rage on the paramilitary Right in those years.

In light of that, look at the domestic terror attacks in the U.S. over the past decade, all of them the work of “lone wolves” or of tiny hermetic cells made up of siblings or married couples. Think of the Tsarnaevs in Boston, of Nidal Hasan at Fort Hood, of Muhammad Youssef Abdulazeez in Chattanooga, and now of Farook and Malik in San Bernardino. Think also of the many instances—never fully catalogued and collated—of Islamist “lone wolves” driving cars into crowds. Call it “self-radicalization” if you must, but what we have in progress, in the contemporary United States, is a textbook example of a Hunter-inspired campaign of leaderless resistance.

What that means is that virtually none of the counter-terror tactics currently deployed by U.S. agencies has the slightest relevance to detecting or preventing future attacks. You cannot track leaders because there are none; you cannot infiltrate the group or turn participants; and all the propaganda and terror methods needed are freely available online. Militants get their training and propaganda from al-Qaeda publications like Inspire, and more recently from the ISIS/Daesh magazine Dabiq.

For the sake of argument, let us accept the optimistic view that 99 percent of American Muslims flatly reject terrorism. Not counting any future migration, that would still leave one percent of the whole, or some 30,000 potential Islamist militants. That is enough people for 10,000 leaderless cells.
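To spell out the arithmetic behind those figures (the population base and cell size are implied here rather than stated): one percent of a U.S. Muslim population of roughly three million is some 30,000 people, and 30,000 people organized into cells of about three members each comes to roughly 10,000 cells.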

Addressing this issue would seem to be the absolute number one priority of U.S. law enforcement in the coming years. Shall we begin, at least, by naming the enemy?

Philip Jenkins is the author of The Many Faces of Christ: The Thousand Year Story of the Survival and Influence of the Lost Gospels. He is distinguished professor of history at Baylor University and serves as co-director for the Program on Historical Studies of Religion in the Institute for Studies of Religion.

The Case for Mosque Surveillance

I am about to do something highly uncharacteristic, namely to agree with Donald Trump. Recently, Trump made a statement, apparently shocking to many, that “I want surveillance of certain mosques.” If you know anything about terrorism and counter-terrorism, there is only one thing shocking about this remark, namely that anybody is disturbed by it. Only the shock shocks.

Put it another way: look at a recent remark from British comedian Frankie Boyle, who complains about the French and Russian bombing campaigns in the Middle East. Wrong solution, he says. What we really need is “an urgent debate about how to make public spaces safer and marginalized groups less vulnerable to radicalization.” Those are excellent and critical suggestions. Without extensive surveillance and intelligence operations, though, such strategies are irrelevant and suicidally dangerous.

Here is the problem. In order to investigate terrorism, law enforcement agencies absolutely must use various clandestine methods, including surveillance, infiltration and double-agent tactics. Agencies wishing to suppress terrorism must, of necessity, operate in a complex and dirty clandestine world. In order to succeed, they must often do things that they cannot publicize frankly.

The best and perhaps only means to defeat subversive or terrorist groups is to keep them under detailed and constant surveillance, a strategy that must be coupled with infiltration and penetration. This needs saying because official rhetoric so often emphasizes enhanced security to protect airports and public buildings against terrorist attacks; at least, that is what agencies say publicly.

Americans are all too familiar with the searches they endure before flying on commercial airlines. Some of these precautions are useful and necessary, but most have no impact whatever on the likelihood of a terrorist assault. If you fortify aircraft, terrorists attack airports; if you fortify airports, they can bring down aircraft with missiles fired from remote locations; if you defend aviation altogether, they attack ships; and so on. Does the TSA operate perfect security at its checkpoints? Then terrorists can simply gun down the long lines waiting to pass through those checkpoints.

It is a near miracle that nobody has yet blown up a ship packed with high explosives in a U.S. port, creating something close to the effects of a nuclear blast. To see what I am talking about, just Google “Halifax 1917”.

Overwhelmingly, security precautions and airport searches are solely designed to raise public consciousness, a feel-good strategy without any real effects or benefits. The fact that we have not to date had “another 9/11” has next to nothing to do with airport security operations. It is the result of intelligence, pure and simple.

No government can defend itself against terrorism solely by means of protection and security. This is all the more true when we consider the international dimension. Even if we assume the impossible, and the U.S. became wholly invulnerable to terrorist attack, this would still leave an almost infinite number of targets worldwide.

Think of Europe, an area somewhat smaller than the continental U.S., yet divided into over 30 separate nations with competing police jurisdictions. Most also operate within the Schengen system, so that there are effectively no internal border controls. This makes it easy for militants to escape detection, to operate freely across borders, and to drive across several countries in a single day. Meanwhile, Europe is host to hundreds of major U.S. targets, including embassies, military bases, business offices, airline offices, and resorts and tourist sites where Americans gather in large numbers.

And that is just Europe: American targets are just as exposed in the Middle East, in the Asia-Pacific region, or in Latin America. There is simply no way that the U.S. can mobilize forces to defend every single potential target in the world against a global organization like ISIL/Daesh or al-Qaeda, or to place U.S. troops at every port and airport, every embassy and tourist destination. Of itself, talking about “making public spaces safer” is delusional.

Let me say this simply: in a terrorist war, any effort that is solely defensive is bound to fail. Just to protect is to lose.

The only way to defeat attacks is to prevent them from being launched, and that means finding out what the extremists are going to do before they do it, and stopping them. That demands surveillance and infiltration: electronic intelligence and human intelligence alike.

But surveillance over what? In an ideal world, terrorists would wear T-shirts with the word TERRORIST printed in large letters, so they could be picked out easily by the security forces. Unfortunately, they do not do so, and strenuously resist attempts to isolate them from the people. They operate through above-ground organizations and institutions.

One critical concept here is insulation, which British counter-terrorism expert Frank Kitson defines as “a functional system of associations, clubs and other groupings designed to carry out specific tasks.” It is, in short, a means by which terrorists can hide among the larger population. Through legal and above-ground organizations, terrorists spread propaganda, and recruit. The groups can be used as testing grounds, to observe the efficiency and dedication of young militants who might eventually be drafted into the terrorist organization. In advanced stages of a campaign, they can be used to smuggle and store weaponry, and train militants in their use.

Depending on the context, those above-ground groups might take many forms—labor unions, political parties, social clubs, ethnic and religious pressure groups—but in the Islamist world, that chiefly (but not exclusively) means mosques. I could easily list a hundred European mosques that presently serve this crypto-terrorist function, and many do so quite flagrantly.

This point might be obvious, but let me say it clearly. The vast majority of U.S. mosques presently serve no such role, and their members would utterly reject radicalism, extremism, or violence. Even where there is an extremist presence, it operates absolutely contrary to the wishes of the mainstream of the congregation. The main thing the imams in those places want is for the police to help them kick out the extremists, and not to be too gentle in doing so.

But any terrorist Islamist presence in the U.S., present or future, does and will use mosques in this way. If you do not keep such mosques under surveillance—and in particular those “certain mosques” already leaning in radical Salafist directions—you might as well abandon any and all pretense of trying to limit or suppress terrorism on U.S. soil.

“Surveillance” in this instance emphatically means human intelligence within the mosque. That means recruiting informants within it, and trying to bring radicals over to your own side, to see what extremists are going to do before they do it. Just how and where is radicalization being undertaken? Who are the key militants? Are there weapons present? What are the overseas connections? And if that means recruiting and controlling imams and religious teachers, all the better.

These tactics are absolutely fundamental to European counter-terrorism approaches, and nobody there has the slightest doubt of that fact. It is public knowledge, and effectively beyond political criticism. If U.S. agencies claim that they are not doing the same things right now, in American mosques, they are simply deluding the public. They will worry about the freedom-of-religion lawsuits later.

So, God help me on this, in this instance, Trump is right. The only thing he is doing wrong is talking about it publicly.

Again: In a terrorist war, any effort that is solely defensive is bound to fail. To protect is to lose. We must understand and accept all the necessary consequences of that fact.

Philip Jenkins is the author of The Many Faces of Christ: The Thousand Year Story of the Survival and Influence of the Lost Gospels. He is distinguished professor of history at Baylor University and serves as co-director for the Program on Historical Studies of Religion in the Institute for Studies of Religion.
