Bradley Birzer

Ray Bradbury Was the Coolest Non-Conformist on the Planet

Ray Bradbury at the Miami Book Fair International, 1990. MDArchives/Flickr   

An American original, Ray Bradbury will almost certainly enjoy a high reputation for centuries to come. The future will remember him for hundreds of short stories and at least four profound novels of gothic Americana: Fahrenheit 451; The Martian Chronicles; Something Wicked This Way Comes; and Dandelion Wine. Almost completely ignored by critics, and even by devoted fans, are Bradbury’s last several novels, many of them noir mysteries, often with a supernatural twist as well as the author’s trademark humor and irony. All excellent, these include From the Dust Returned; Farewell Summer; Death is a Lonely Business; A Graveyard for Lunatics; and Let’s All Kill Constance. Yet, it is almost always the short story we think of when we think of Bradbury. Almost every one of his novels comes from his compiling short stories and tying them together through some narrative device. And, of course, most American students are introduced to Bradbury’s work through one or more of his short stories appearing in an anthology.

Two themes (among many) lurk behind almost every corner in his fictional soul: dystopian conformity and autumnal imagination. This piece will only deal with the first of these two themes, leaving autumn for another piece.

Dystopia is to be found wherever and whenever too much power has accumulated, destroying the honed order of our ancestors in favor of some matrix to promote individual or generational ego. Yet, Bradbury also believes in its opposite, utopia. Utopia graces our lives, however, only in the imagination, especially when we remember childhood, energy, magic, and love. And it’s not enough merely to remember; we must contextualize and give order to our varied experiences of wonder. Bradbury’s utopia, then, is an ecstasy of imagination at its highest. “Life is short, misery sure, mortality certain,” Bradbury wrote in 1973. “But on the way, in your work, why not carry these two inflated pig bladders labeled Zest and Gusto.”

Of all of his works, Fahrenheit 451 remains the most famously dystopic. Yet, when an interviewer asked him in 1996 if he had tried to present “a bleak view of the future” in the vein of Brave New World or Nineteen Eighty-Four and to “write a cautionary story,” Bradbury not atypically balked. “That’s fatal. You must never do that. A lot of lousy novels come from people who want to do good. The do-gooder novel. The ecological novel. And if you tell me you’re doing a novel or a film about how a woodsman spares a tree, I’m not going to go see it.” Much as Willa Cather had once tried to explain her art as art, not as politics, Bradbury too rejected the idea that a good author writes with an intended purpose. Instead, he has an idea, something precious and magical, and he follows it, plays with it, nurtures it, and pursues its essence. In the end, good art will reveal a truth, but not always the truth an author originally desired to convey.

Cover art for ‘Fahrenheit 451,’ which is 65 years old this year.

Still, even Bradbury could not fully disguise or dismiss his own political and cultural view of the world. When asked what the truth was that emerged from Fahrenheit 451, he admitted he wrote it in response to “Hitler and Stalin and China, where they burned God knows how many books, killed God knows how many teachers.” Add to this, he feared, the disaster of Joseph McCarthy in the early 1950s, and free thought and free expression would collapse in America. Siding with Alexis de Tocqueville, Bradbury feared that true oppression in the United States would be a soft despotism, with the culture being run by progressive busybodies, moralizing and oppressing with a myriad of rules and acceptable attitudes. Fahrenheit 451, thus, anticipated political correctness almost three full decades before it emerged as a pressing issue in the late 1980s. As Bradbury explained decades after Fahrenheit 451’s publication, he hoped to prevent the future more than to predict it. The medium of science fiction allows so many possibilities. “Whether or not my ideas on censorship via the fire department will be old hat by this time next week, I dare not predict,” he admitted in 1953. “When the wind is right, a faint odor of kerosene is exhaled from Senator McCarthy.”

When pushed on the issue, Bradbury admitted that he was a civil and economic libertarian of some sort. He despised talking about or even thinking about politics, but he also hated that the political sphere was consuming all other spheres of life, ruling over everyday lives and limiting everyday decisions. Though he might forgive and even encourage government funding for the sciences, he wanted a government that promoted (or left alone) the average person, believing that representatives and bureaucrats too easily abused their powers. Tellingly, as a young man, his favorite books were written by Ayn Rand, Albert Jay Nock, and Irving Babbitt.

In his Martian Chronicles, published in 1950, Bradbury had imagined another dystopian future in which all imaginative works had been destroyed, much as they would be in Fahrenheit 451.

Everything that was not so must go. All the beautiful literary lies and flights of fancy must be shot in mid-air! So they lined them up against a library wall one Sunday morning thirty years ago, in 2006; they lined them up, St. Nicholas and the Headless Horseman and Snow White and Rumpelstiltskin and Mother Goose—oh, what a wailing!—and shot them down, and burned the paper castles and the fairy frogs and old kings and the people who lived happily ever after (for of course it was a fact that nobody lived happily ever after!), and Once Upon A Time became No More! And they spread the ashes of the Phantom Rickshaw with the rubble of the Land of Oz; they filleted the bones of Glinda the Good and Ozma and shattered Polychrome in a spectroscope and served Jack Pumpkinhead with meringue at the Biologists’ Ball! The Beanstalk died in a bramble of red tape! Sleeping Beauty awoke at the kiss of a scientist and expired at the fatal puncture of his syringe. And they made Alice drink something from a bottle which reduced her to a size where she could no longer cry ‘Curiouser and curiouser,’ and they gave the Looking Glass one hammer blow to smash it and every Red King and Oyster away!

Bradbury’s talents also interested the governmental agency set up to destroy the U.S. Constitution in the name of protecting it, the Federal Bureau of Investigation. Demonstrating a level of buffoonery perhaps unprecedented in its history, the FBI opened an ongoing investigation of Bradbury, fearing his literature to be subversive and, bizarrely, possibly communist. An informant told the FBI that Bradbury “was probably sympathetic with certain pro-Communist elements.” The evidence? At a meeting of screenwriters, some members asked openly whether or not to ostracize from their discussion members of the Communist Party as well as those who embraced the Fifth Amendment to the U.S. Constitution. In a not atypical fit of passion, Bradbury stood and shouted at his fellow members, denouncing them as a lot of “Cowards and McCarthyites.”

Further, the FBI informant claimed, Communists had embraced “the field of science fiction” as it was a “lucrative field for the introduction of Communist ideologies.”

Bradbury, in particular, now declassified FBI documents claimed, wrote stories “slanted against the United States and its capitalistic form of Government.”  One must wonder who these communist science fiction writers were, ready to pollute the minds of thousands of smart nerds: Robert Heinlein, Isaac Asimov, C.S. Lewis, Walter Miller?  

It becomes rather clear in the FBI’s own investigation of Bradbury that “communist” did not mean Marxist or Leninist or Stalinist or Maoist. Rather, it meant anyone who did not support the 1950s conformist culture of corporate and crony capitalism, Washington’s soft despotism, and what Eisenhower would call the “Military Industrial Complex.”

For the FBI, “communist” also meant those who actually believed in the Bill of Rights, especially the Fifth Amendment. By this standard, Bradbury was indeed a “communist.” Perhaps a serious one. But, then again, so were Thomas Jefferson, James Madison, and millions upon millions of other Americans. Imagine a world in which average citizens might carry a pocket constitution with them. Communists, all! In The Martian Chronicles, he had the audacity to criticize the robber barons of American history.

We Earth Men have a talent for ruining big, beautiful things. The only reason we didn’t set up hot-dog stands in the midst of the Egyptian temple of Karnak is because it was out of the way and served no large commercial purpose. And Egypt is a small part of Earth. But here [Mars], this whole thing is ancient and different, and we have to set down somewhere and start fouling it up. We’ll call the canal the Rockefeller Canal and the mountain King George Mountain and the sea the Dupont Sea, and there’ll be Roosevelt and Lincoln and Coolidge cities and it won’t ever be right, when there are the proper names for these places.

As a piece of art, The Martian Chronicles offers a culturally conservative view of imperialism, hubris, and exploitation. The Martians, for Bradbury, serve as an allegory for the classical world of democratic Athens and republican Rome as well as of the noble and natural republicanism of North American Indians. Through a series of vignettes, all set on Mars, Bradbury examines some of the most important existential issues of the human condition.

If anything, Bradbury only grew more and more libertarian as he aged. He despised the censorship of soft despotism, and he found the “politically correct” movement of the late 1980s and early 1990s—then only in its infancy—repulsive in the extreme. On the 40th anniversary of Fahrenheit 451, Chronicles asked the famous author what he thought of the movement.

Someone said to me recently, aren’t you afraid? No, I said, I never react in fear; I react in anger. As with graffiti, you must counterattack within the moment, not a day, a month, or a year later. All the politically correct terrorists must be driven back into the stands. There is no place for them in the open field of democratic ballplaying.

Amen, Mr. Bradbury. Amen.

His response should be the response of all right-thinking people.

Bradley J. Birzer is The American Conservative’s scholar-at-large. He also holds the Russell Amos Kirk Chair in History at Hillsdale College and is the author, most recently, of Russell Kirk: American Conservative.



What if Custer Were A Lone Survivor?

Author Brad Birzer and wife, Professor Dedra Birzer, in Kansas in 1998. (Birzer)

Sometime in the late fall of 1998, my pregnant (our first child) wife and I drove from Kansas City to Hutchinson, Kansas. En route, we stopped at Council Grove, an old, homey eastern Kansas town in the Flint Hills, once a part of the Santa Fe Trail. Somewhat famously, there’s an elm tree in town known as “the Custer Elm.” Whether it’s still there or not, I have no idea, but the sign that accompanied the elm read: “General Custer and his famous 7th Cavalry camped under this tree in 1867 shortly before his tragic massacre by Sitting Bull.” My wife and I laughed and laughed. Being bizarre history nerds, we thought that was hilarious. First, neither of us thought much of Custer as a human being. Second, Custer did not meet his doom until 1876. If nine years equates with “shortly,” we wondered if the author of the sign had an Elvish life span. And, third, Crazy Horse, not Sitting Bull, killed Custer. Sitting Bull wasn’t even at the Battle of Little Bighorn. He was—befitting his position as medicine man—on home guard duty, protecting the Sioux villages during the battle. Please don’t get me wrong. Neither my wife nor I am cynical, nor do we fail to appreciate how much Kansans love their history. Being a native Kansan, I know very well how much Kansans appreciate their history. It’s hard to drive more than five miles without hitting a spot of some historical significance, marked and described for any and all travelers and wanderers across the Wheat State.

On a serious note, the dreadfully mistaken sign prompted a rather deep discussion about the nature of history, what we can know, what we cannot know, and what we have to accept—in necessary humility—as absent from the record and subject, then, to individual interpretation.

In a far more humorous vein, H.W. Crocker III addresses every one of these questions—though often in a sideways, non-linear, indirect way—in his most recent novel, Armstrong, the first of his “Custer of the West” series. As I had a chance to mention a week or so ago at The American Conservative (and please indulge me as I obnoxiously quote myself):

A satirical alternative history about Michigan’s own George Armstrong Custer, simply and cleverly entitled Armstrong. In Crocker’s world, Custer survived a butchering by Crazy Horse at the Battle of Little Bighorn and has become a Victorian paladin and celebrity, doing everything over the top and then some more beyond the top. Crocker knows his history, so his anti-history is knock-down, pain in the stomach, hilarious.

I quote this not to be troublesome and arrogant, but to note that my views of a few weeks ago have only strengthened. Since finishing that book, I have given it much thought. Indeed, it keeps hovering over my other thoughts, and it has prompted me to ask a whole series of questions about the nature of history. Yes, the kind of questions that sign in Council Grove first posed almost two decades ago.

Without hyperbole, Crocker’s Armstrong caused so much laughter—as well as thought—that my sides and stomach did actually hurt. The model of the book is rather clearly Mark Twain’s Connecticut Yankee in King Arthur’s Court. Considering Twain’s standing in 19th-century American literature, I do not make this comparison lightly. That book, too, at least until its horrific and brutal ending, has given me laughter fits for much of my adult life. Crocker’s wit is certainly as good as Twain’s, amazingly enough, as both share the talent of poking fun without destroying. More on this later in the review. And it’s not just Twain that one thinks of with Crocker’s novel. There’s also more than a bit—often quite explicit—of Yellow Journalism, dime store fiction, Pulp, Batman, and even Kurt Vonnegut, Joseph Heller, and Flannery O’Connor in here as well. Outside of the written word, there’s quite a bit of cinema, too, but mostly of Zelig-like quality. Yet, if there is one movie that most will probably think of when reading Armstrong, it’s Dances with Wolves. Countering that sappy, feel-good, New Agey leftism, however, Crocker turns Kevin Costner on his head. This is Dances with Wolves as written by Bill Gaines. If I’ve not expressed it well yet, let me be blunt. This novel is over the top. And mightily and gloriously so. Yet, even within the slapstick outrageousness, there lurk and hover very meaningful and subtle points and comments. In other words, Crocker has produced a fictional masterpiece.

As noted above, the novel begins on June 25, 1876, the day of the massacre at Little Bighorn in Montana. On that day, it should be remembered, George Armstrong Custer unwittingly led part of the Seventh Cavalry into a trap. Frustrated by his post-Civil War career, Custer was often hotheaded, arrogant, and reckless. Those qualities (or lack thereof) caught up to him and, sadly, the men under his command on that day, as Crazy Horse led a pan-Indian coalition against the American invaders. Regardless of what is P.C. and what is not in 2018, the Indians were clearly the defenders, nobly rejecting the invasion while attempting to protect home and hearth. Their victory, though decisive, proved fleeting, as the waves of emigrants and their livestock would soon overcome the Plains Indians, no matter how noble they might be. It should also be remembered that, whatever his failings, Custer had been a true hero in the American Civil War. On July 3, 1863, in particular, he had led his men against those of Confederate Jeb Stuart, preventing the latter from assaulting the rear of Union lines at Gettysburg. The goal—at least as Robert E. Lee had envisioned it—was for Stuart’s cavalry to hit the center rear of Union lines just as Pickett’s men hit the front of the line. In a several-hour, cock-of-the-walk, medieval-style joust and duel between Union and Confederate cavalries in the East Field, Custer prevented aid from reaching Pickett and Lee. Simply put, Custer had mattered. By 1876, not so much.

Crocker creates a world in which Custer alone—at least among the American military—survived that day, having been protected by Rachel, a white captive of the fictional Boyanama (!) band of Sioux. She claimed Custer as her slave, but he claimed her as his ward. As part of his captivity, the Sioux tattooed his arm, drawing a picture of his beloved wife, Libby, on his biceps with the motto around it: “Born to Ride.” Escaping his captors, he decides that he must remain incognito, hoping to clear his name after the disaster at Little Bighorn. During the novel, in fact, he takes many names, all of them hilarious. The most frequent name he takes, however, is Armstrong Armstrong (yes, you read that correctly), thus the title of the novel.

Over ten chapters, Custer’s adventures never cease. From Sioux captive to chorus girl to mock Indian to fake U.S. marshal, Custer finds himself leading a group of enslaved victims in Bloody Gulch, Montana, a company town controlled brutally from the top down by one ruthless man and scoundrel, Larson. Interestingly enough, this man claims to be empowered by the U.S. government to hold such authority. Custer seemingly accepts this, also claiming to be empowered by the U.S. government. He rationalizes this as acceptable because of his hatred of all things Republican and U.S. Grant-related. Larson is a Republican, and he, Custer, a Democrat. When Rachel proclaims him a “liberator” in the vein of Abraham Lincoln, he quickly corrects her. “No, Rachel, not like Lincoln. He was a Republican. What this country needs is a good Democrat in favor of lower taxes, a return to sound money, free trade, a smaller reformed government that spends more on the army, and honest administration—especially after two terms of that baboon Grant.”

Don’t make too much of the quote, though. This is not a political book. Not in the least. It’s a book of adventure and social commentary. The social commentary, though, is simply drop-dead hilarious. Among Custer’s allies in the book are a former Confederate officer and dandy, a Latin-speaking, Catholic Crow Indian, a number of Chinese acrobats, and a seemingly unlimited supply of beautiful women—all of whom swoon over Custer’s manliness. At one subtle moment, Custer perks up considerably when he learns that having been inducted into the Boyanama Sioux might very well allow him to have multiple wives. Though Crocker takes this no further, it’s pretty clear that Custer hopes this might happen.

In the social commentary that pervades the whole story, much as Mark Twain does in Connecticut Yankee in King Arthur’s Court, Crocker plays with radically politically incorrect stereotyping. Yet, the humor to be found—gut-holding at times—comes not because Crocker’s depictions are shocking (they are), but because they’re hilarious. As a gifted writer and thinker, Crocker crafts stereotypes that artfully reveal the true essence of humanity, the individual person, and virtue. It’s a stunning accomplishment, frankly.

Finally, it has to be noted that the entire book is written as a long letter to his faithful and devoted wife, Libby, letting her know that he survived Little Bighorn as well as retelling his manly adventures. For some bizarre and funny reason, Custer is convinced that his lusty comments about the legs and shapes and skin tones and hair color of every woman he meets will make Libby appreciate his manliness even more. The joke, though repeated incessantly throughout the book, never gets old.

Armstrong is satire and fiction at its finest. Crocker has given us a treasure and one that, this reviewer hopes, will be but the beginning of a series of Custer’s adventures in the West. Viva, Armstrong Armstrong!  Now, to return to Council Grove, Kansas, and see if Crocker’s new novel has forced the town leaders to change that sign. . . . After all, each only hovers on the edge of reality.



What Exactly Is ‘The West’?

From the book, “Frank Brangwyn and His Work. 1910” Published in 1911. (Public Domain)

Every autumn, I have the great pleasure of teaching what we at Hillsdale College call “Western Heritage.” It’s the first core course that every entering student must take. In classes of 15 to 20 students each, we read primary sources ranging from Genesis to Plato, Aristotle to St. John, Cicero to St. Augustine, and Thomas Aquinas to John Calvin. Though I’ve been teaching this course since the fall of 1999, I have never once found myself bored or tired or uninspired. Quite the opposite, actually. Whatever my faults, this course has made me a better person and a better thinker. And, judging by how my students embrace the material, the same is true for them.

Yet, every fall, as I prepare for the class, one question lingers. It’s a question I’ve never been able to answer to my own satisfaction. What is the West? Ever since the “politically correct” movement began in the 1980s in the United States, the West’s critics have denounced it—with everything from quiet seething to outright brutality and invasive protests—as racist, sexist, and imperialist. The critics can be as emotionally violent as they are intellectually dull.

In the 1980s, at least, even its critics took a canon of authors and texts seriously, asking only that the canon be more inclusive in terms of race and gender. Frankly, I miss those days. Today, at the vast majority of schools and colleges, the West is something hideous and embarrassing, to the point that the term itself can trigger almost automatic hatred and dismissal.

Let’s leave the critics aside for now, with one important caveat: a recognition that they’re simply wrong.

Even the terminology suggests much good. In much of the ancient Mediterranean, the West was the land of the gods, known as the Blessed Isles, the Blessed Realm, or, of course, Atlantis. Plutarch wrote:

These are called the Islands of the Blest; rain falls there seldom, and in moderate showers, but for the most part they have gentle breezes, bringing along with them soft dews, which render the soil not only rich for ploughing and planting, but so abundantly fruitful that it produces spontaneously an abundance of delicate fruits, sufficient to feed the inhabitants, who may here enjoy all things without trouble or labour.

Even the Egyptians, often regarded as a people almost entirely separate from the other western powers, believed that Isis and Osiris, representing justice and immortality, reigned from their mysterious realm in the West.

The idea of the gods living in the West proved so strong that the early Church had a difficult time explaining how Jesus came out of the East. As a way to convince pagans to convert to Christianity, the Church described Christ as the “perfect offering” from “east to west,” thus arguing that Christ had sovereignty everywhere, preferring neither east nor west.

Whatever successes the Church had in explaining this, the mystery of the West motivated everyone from Columbus to Coronado to J.R.R. Tolkien.

No one, however, prior to the sixteenth century thought of the West as synonymous with Europe. The ancient Latins had employed the term “Europa,” but it was an idea of freedom, not an actual place. The term Europe did not come into vogue until the very early 16th century as a way to distinguish Christian Europe from the Americas to the west and the Muslims to the south and east. Since roughly 893 AD, most educated Europeans had referred to their world simply as “Christendom” or the Christiana res publica. Alfred the Great, as far as is known, was the first to use the term, and he employed it to describe those peoples who resisted the Vikings. Most Christians, however, simply referred to what is now Europe as some variant of middangeard or Middle-earth. Even western Christians did not think of the Orthodox Churches as being “Eastern” until the Crusades.

One of the greatest historians of the last century, Christopher Dawson, thought of the West as a tradition, one that blended, almost seamlessly, the classical world with Christianity. He is worth quoting at length on this:

This tradition is entirely different from the influence of the pagan culture, which continued to exist in a submerged subconscious form; for it affected those elements in Christian society which were most consciously and completely Christian, like monasteries and the episcopal schools. Consequently, it is impossible to study Christian culture without studying classical culture also. St. Augustine takes us back to Cicero and Plato and Plotinus. St. Thomas takes us back to Aristotle. Dante takes us back to Statius and Virgil, and so on, throughout the course of Western Christian culture. And the same is true of Eastern Christendom in its Byzantine form, though this only reaches Russia . . . second hand and infrequently. But the same is true of theology, at least its more advanced study. The whole of the old theological literature of Catholic Christendom, both East and West, is so impregnated by classical influences that we cannot read the Greek and Latin Fathers, or even the Scholastic and 17th-century theologians without some knowledge of classical literature and philosophy.

Critically, for Dawson, literature, philosophy, and theology defined that tradition; he ignored the role of politics and political boundaries, or saw them, at best, as of secondary importance.

If there is such a thing as a tradition of the West—say, from Marathon to Waterloo—then we should probably accept the Battle of Thermopylae (480 BC) as its origin. Near the end of the Persian Wars, the Spartan war king Leonidas and his 300 men held off nearly 100,000 battle-hardened Persians for days. As Herodotus described it, scathingly:

But Xerxes [the Persian god king] was not persuaded any the more. Four whole days he suffered to go by, expecting that the Greeks would run away. When, however, he found on the fifth that they were not gone, thinking that their firm stand was mere impudence and recklessness, he grew wroth, and sent against them the Medes and Cissians, with orders to take them alive and bring them into his presence. Then the Medes rushed forward and charged the Greeks, but fell in vast numbers: others now took the places of the slain, and would not be beaten off, though they suffered terrible losses. In this way it became clear to all, and especially to the king, that though he had plenty of combatants, he had but very few men.

Betrayed by a fellow Greek, Leonidas and his 300 were slaughtered, but their legacy remains, and deeply so. If this was the beginning of the West, the West was born in sacrifice, justice, and resistance. When the Persian tyrant demanded that Leonidas and his men lay down their arms and surrender, the Spartan king supposedly replied: “Molon labe.” That is, come and take them. Liberty, as the ancients understood it, did not mean what it may mean to some today—that every person may do what he or she likes unless it physically harms another person. Rather, freedom meant liberty from the control of a “god-king,” as was common in the East.

Such sacrifices have not been uncommon in the West. Even leaving the death of Jesus Christ aside for the sake of argument—after all, who can compare—we have the examples of Socrates being executed in his defense of Truth; Marc Antony’s men murdering the greatest of Roman Republicans, Marcus T. Cicero; the uncounted numbers of martyrs who died in the arenas; the Jesuits in North America, and so on.

Yet, if Leonidas unleashed what we might call western patriotism in 480—that is, something to fight against—the West must also have something to fight for.

The West did not, of course, invent sacrifice, no matter how well citizens of the West have embraced it for the past two and one-half millennia. It did, however, invent something unique in the world, something to fight for.

Sometime around the year 510 BC, a full thirty years before the death of Leonidas, a number of Greek thinkers in what is now the extreme western coast of Turkey wanted to know what the origin of all things might be. Could it be air, water, earth, or fire? And they wanted to know why all of life seems cyclical: life, middle age, and death; and spring, summer, fall, and winter. Yet, the world did not end at the end of each cycle; it began anew. This proved universally true.

It must be noted that every civilization—east to west—has a form of ethics, a way to treat those in the in-group. It was uniquely in the West, though, that philosophy—the love of wisdom and the search for universal principles—arose. Ethics tells me how to treat my neighbor, but only philosophy allows me to understand that the person beyond my neighbor is still a fellow human. After all, each person is a universal truth wrapped in a particular manifestation.

Of those primary elements that might be the source of all being, as the first Greek philosophers argued, the one that won out over time was the one Heraclitus named Logos—meaning inspiration, word, fire, thought, imagination. It is no wonder that the most Greek of the four Christian Gospels, that of St. John the Beloved, declares Jesus Christ as the Logos, or that St. Paul believed Jesus the source of all being, reconciling all things through the Cross.

In other words, far from being racist and sexist, western civilization was the first to argue for the universal concept of the dignity of the human person, regardless of his or her accidents of birth. Those, today, who attack western civilization have absolutely no idea that their very freedom comes from those “dead white males” they so hate.



My Ántonia at 100

Willa Cather ca. 1912 (Public Domain)

When it comes to considering America’s greatest writers, it would be foolish to ignore Willa Cather as a contender. Indeed, it is quite possible that her 1927 novel, Death Comes for the Archbishop, is the great American novel, rivaling anything that came before or since.

Yet, Cather was consistent. While not at the level of Death Comes, her 1913 O Pioneers! and her 1925 The Professor’s House certainly come close. Shadows on the Rock (1931), too. Of all her novels, though, the one that most rivals Death Comes is her 1918 novel, My Ántonia. When the book first appeared, that nastiest and most difficult of critics, H.L. Mencken, had nothing but praise for it and its author. She is, he wrote approvingly, “isolated in accomplishment” and “isolated from all current rages and enthusiasms.” Devoid of heroes, plots, love affairs, and any pretense to change the world, My Ántonia sees the world through the eyes of an immigrant, a poor Bohemian who becomes one with the land she works. “But what Miss Cather tries to reveal is the true romance that lies even there—the grim tragedy at the heart of all that dull, cow-like existence—the fineness that lies deeply buried.” Cather succeeds at making real and critical what is often ignored or hidden. “Miss Cather’s method inclines more to suggestion and indirection. Here a glimpse, there a turn of phrase, and suddenly the thing stands out, suddenly it is as real as real can be—and withal moving, arresting, beautiful with strange and charming beauty,” he continued. And, then, surprisingly, Mencken offered his highest praise: “I commend this book to your attention, and the author no less. There is no other American author of her sex, now in view, whose future promises so much.”

A full century later, Mencken’s review still holds true. In almost every way, Cather writes at a level beyond every other American author. One could be forgiven, after giving any of Cather’s novels only a cursory read, for believing her writing style somewhat juvenile and superficial. Such a reading, though, would be dead wrong. In her many writings on the meaning of art, Cather criticized anything that might be blatant, political, or over the top. True art, she believed, contained the author’s entire view of life, but it did so by layering, not by berating. “Art, it seems to me, should simplify,” she explained. “That, indeed, is very nearly the whole of the higher artistic process; finding what conventions of form and what detail one can do without and yet preserve the spirit of the whole.” Thus, she argued, the partaker of the art fills in all of the details that the artist has intentionally trimmed and cut, making the art belong as much to the recipient as it does to the artist. “Any first-rate novel or story must have in it the strength of a dozen fairly good stories that have been sacrificed to it.”

In this Stoic effort, Cather understood that nothing should be produced without every aspect of it meeting the highest standards of excellence possible. This applies to that which is seen as well as that which is not. As Steve Jobs would explain nearly eight decades later, every created thing should be excellent in every one of its aspects. He cited the example his father offered him. If a carpenter makes a stunning oak chest of drawers but uses pressboard for its back—presuming that no one will ever see it—the entire piece of furniture is junk. So it was with all of Cather’s novels. Additionally, Cather argued in the same vein as T.S. Eliot—no real art is revolutionary. Rather, it is always at its best when it’s evolutionary. The artist knows when to compromise only when she or he knows the rules and knows what needs to be broken for real artistic progress. At first, every artist “is wedded to old forms, old ideas, and his vision is blurred by the memory of the old delights he would like to recapture.” The artist, though, can only break barriers when he knows exactly what those barriers are. The writer, in particular, can never actually write about the essence of hate or love. Instead, he can only write of the human person as understood or distilled by hate or love. All emotion and ideas can only be understood in relation to character and person. If his own ideology clouds his art, the artist, in good conscience and taste, should forsake art and work “in a laboratory or a bureau.”

Like the Great Plains about which the author so gorgeously writes, little that the eye first observes is true. The roots of the grasses one sees on the plains run nearly six times longer than what grows above ground, hiding—at least traditionally—deer, buffalo, elk, birds of all kinds, snakes, and bobcats. Equally important, far from flat—as many crossing I-70 lament—the plains roll and break, deceiving the eye as to depth and distance. On clear days, one can see for miles and miles, day or night, even when the latter is not illuminated by the all-pervasive heat lightning of summer. The Great Plains unveil treasure after treasure to those who explore. The same is true of Cather’s novels.

Though named after a Bohemian immigrant, the novel My Ántonia is really about the radically diverse life—human and otherwise—on the Great Plains, as understood by an emigrant from Virginia, Jim Burden. In the opening scene, Burden and Cather meet on a train, discussing their lost friendships of youth, including their mutual friend, Ántonia Shimerda, from Black Hawk (Red Cloud), Nebraska. Burden is now a lawyer for a large railway concern in the East, but he fondly remembers growing up with his grandparents on their Nebraska homestead. From the moment he arrived there from Virginia, Ántonia, though a few years older, dominated his cultural outlook and development. From the beginning to the end of the novel, she is a sprite, an earth goddess, and a force of nature, something fully human and yet superhuman as well. Everything that Jim thinks and remembers of Ántonia is synonymous with his memories of childhood and the country in which he grew. Even after life had taken its toll on Ántonia’s physical appearance, Jim could not help but see her inner greatness.

She lent herself to immemorial human attitudes which we recognize by instinct as universal and true. I had not been mistaken. She was a battered woman now, not a lovely girl; but she still had that something which fires the imagination, could still stop one’s breath for a moment by a look or gesture that somehow revealed the meaning in common things. She had only to stand in the orchard, to put her hand on a little crabtree and look up at the apples, to make you feel the goodness of planting and tending and harvesting at last. All the strong things of her heart came out in her body, that had been so tireless in serving generous emotions.

Married, but without any children, and financially successful, Jim recognizes that Ántonia—with her patch of land, her dedicated husband, and her innumerable children—has embraced and understood life at its most profound level. Jim can only describe Ántonia’s land and family in mythic terms. Her children are fauns and Ántonia, herself, is a “rich mine of life, like the founders of early races.”

Though married, Jim admits,

Do you know, Ántonia, since I’ve been away, I think of you more often than of anyone else in this part of the world. I’d have liked to have you for a sweetheart, or a wife, or my mother or my sister—anything that a woman can be to a man. The idea of you is a part of my mind; you influence my likes and dislikes, all my tastes, hundreds of times when I don’t realize it. You really are a part of me.

For years, critics categorized and dismissed Willa Cather as a mere regional writer, a Nebraskan and little more. To a great extent, this was true, as Cather often wrote about the American frontier, though she was equally adept at describing it in the Canadian hinterlands, on the Great American Plains, and in the American Southwest. In all her frontier novels, she focused on three vital themes: the fundamental necessity of personal virtue and sacrifice; the communal effort; and the unforgiving but sacramental elements of nature and, especially, the land itself. My Ántonia explores all three themes. Those who came first either broke the land or simply broke. Those who followed owed everything to the first ones, but rarely did they exhibit the same spark of life.

Those girls had grown up in the first bitter-hard times, and had got little schooling themselves. But the younger brothers and sisters, for whom they made such sacrifices and who have had ‘advantages,’ never seem to me, when I meet them now, half as interesting or as well educated. The older girls, who helped to break up the wild sod, learned so much from life, from poverty, from their mothers and grandmothers; they had all, like Ántonia, been early awakened and made observant by coming at a tender age from an old country to a new.

Those who attempted to make it on their own—what in the 1920s would be called “rugged individualism”—almost always failed and went mad. The subduing of nature took the entire community. Having migrated across the Atlantic, leaving everything once known, the immigrants often fared best. “One result of this family solidarity was that the foreign farmers in our county were the first to become prosperous.” Critically, those immigrant farmers brought with them the skills, manners, and attitudes of the old world, usually expertise in food, music, the arts, furniture, etc., setting them a cultured step above the native-born American emigrants. Typically, though, the native-born emigrants took the immigrants’ poor use of English as a sign of unintelligence.

While every sentence, paragraph, and chapter in the novel exudes a beauty, truth, and goodness, no character embodies these more than the tragic figure of Mr. Shimerda, the father of Ántonia and the one who first spoke the title of the novel. A gifted artisan and musician, he left Bohemia only at the insistence of his wife—she a product of a forbidden relationship. A man of much intellect and skill, he had always been sought out by his Bohemian community for advice and wisdom. In Nebraska, though, not only was he a nothing, he was incapable of understanding the land or working it. He became less than nothing, a burden to his family. Upon arriving on the Great Plains, he entered a deep depression. Right before Christmas, he killed himself with a shotgun.

In some unfathomable way, Mr. Shimerda became the spirit of the land after his death. Because he had committed suicide, no cemetery would accept his body. The family buried him at what would be a crossroads. Jim, though Protestant, wonders about the fate of his soul. “I knew it was homesickness that had killed Mr. Shimerda, and I wondered whether his released spirit would not eventually find its way back to his own country,” he considered. “I thought of how far it was to Chicago, and then to Virginia, to Baltimore— and then the great wintry ocean. No, he would not at once set out upon that long journey. Surely, his exhausted spirit, so tired of cold and crowding and the struggle with the ever-falling snow, was resting now in this quiet house.” Cather’s passage describing Shimerda’s grave is one of the finest in all American literature, well worth quoting at length.

Years afterward, when the open-grazing days were over, and the red grass had been ploughed under and under until it had almost disappeared from the prairie; when all the fields were under fence, and the roads no longer ran about like wild things, but followed the surveyed section-lines, Mr. Shimerda’s grave was still there, with a sagging wire fence around it, and an unpainted wooden cross. As grandfather had predicted, Mrs. Shimerda never saw the roads going over his head. The road from the north curved a little to the east just there, and the road from the west swung out a little to the south; so that the grave, with its tall red grass that was never mowed, was like a little island; and at twilight, under a new moon or the clear evening star, the dusty roads used to look like soft grey rivers flowing past it. I never came upon the place without emotion, and in all that country it was the spot most dear to me. I loved the dim superstition, the propitiatory intent, that had put the grave there; and still more I loved the spirit that could not carry out the sentence— the error from the surveyed lines, the clemency of the soft earth roads along which the home-coming wagons rattled after sunset. Never a tired driver passed the wooden cross, I am sure, without wishing well to the sleeper.

Few if any novels have so captured the spirit of the American character, in all of its majesty and nobility. Though many critics loved Cather, and her novels sold very well, her conservative politics had soiled her reputation by the end of the 1930s, and she became, in literary circles, a non-person for many decades. Only in the 1960s and 1970s did her reputation again soar. Today, Nebraska has done mighty things to keep the memory and legacy of her greatest artist alive. If you’re crossing I-70 or I-80, do not hesitate to stop at Red Cloud, her hometown, and the setting of all of her Great Plains novels. Celebrate the mind, art, and imagination of the most American of American authors.

En route, if so blessed, you might just feel the spirit of a Bohemian, out of place and yet fully in his right place.




Glass Hammer: Giving Meaning to Time & Space

America’s single most innovative and interesting rock band is also, sadly, one of its least known and appreciated. This needs to end, and the sooner, the better for all concerned.

Amazingly enough, the band Glass Hammer is now celebrating its 26th birthday and is about to release its 17th studio album. This is an astounding achievement in the world of art and, especially, in the world of rock. To their further credit, the band exists because its two founders were and are perfectionists, refusing to compromise on their own vision of what excellence is.

Creating Glass Hammer in 1992, long-time friends Steve Babb and Fred Schendel—who had played in several ’80s metal bands—decided to dive into what they loved most: complicated, intricate, baroque, over-the-top rock. At the time of the band’s creation, the term “progressive rock” was more than out of favor, evoking for most the horrors of bloated songs, the wearing of capes, the stabbing of keyboards with knives, and lyrics about Hobbits. If Babb and Schendel had hoped to avoid the “progressive rock” stereotype, though, they failed miserably. If anything, their music—what they called “fantasy rock,” bringing the speculative and imaginary worlds of C.S. Lewis, J.R.R. Tolkien, and others to life—was even nerdier than “progressive rock.”

Promoting their music on TV in the early 1990s, they readily found a market for their particular brand of fantasy rock. Singing about Hobbits, it seems, was far from unpopular. Indeed, the band has made money on every one of their releases, whether studio, live, or compilation. And, while their fanbase might be smaller than, say, those of the many acts that appear regularly on pop and rock radio, their fans are dedicated and, not surprisingly given the type of music, quite intellectual and serious.

Rock DJ Chris MacIntosh (a.k.a. “Grandfather Rock”) proclaims the albums of the band to be “sonic masterpieces.” Another longtime devotee, Dennis Cussen, notes that the band’s integrity and artful lyrics have not only inspired him but have also sustained him during difficult times. That the band can inspire while also pursuing excellence, Cussen claims, “is a gift from above.” Babb’s childhood and lifelong friend, Robert Clay Smith, first introduced him to progressive rock in high school. As Smith so accurately writes, “These are sincere and truly gifted musicians and singers whose hearts are in the right place and are providing people with uplifting music which is truly good not only for the ears but also for the heart, soul and spirit.” It would be hard to disagree with any of these statements, at least from this author’s perspective. Nor are such sentiments unique to the three quoted here. Glass Hammer fans across the years would all say something similar.

With the release of its first album in 1993, Journey of the Dunadan, Glass Hammer also inadvertently and, at the time, unknowingly contributed significantly to the current revival of progressive rock, now known as “third-wave prog.” In this, they joined Britain’s Marillion, California’s Spock’s Beard, and Sweden’s The Flower Kings as third-wave prog’s founders.

To support the band and their vision, Schendel, Babb, and Babb’s lovely and brainy wife, Julie, founded a state-of-the-art sound studio in Chattanooga, Tennessee, in 1994. The studio, Sound Resources, records anything that can be recorded, but it specializes in album, music, and audio book recording, engineering, and production. As studio owners, Babb and Schendel can also spot, identify, and cultivate talent. As such, Glass Hammer has existed as much as a project as a band over its 26 years. The only real constants have been Babb and Schendel, with the two men recruiting the best of those they meet.

Among the most important recruits over the years has been Susie Warren Bogdanowicz, a very talented and extremely attractive mother of four from Florida. Babb and Schendel recognized her many gifts immediately, bringing her into the Glass Hammer family with the release of their fourth studio album, Chronometree, in 2000. Though the band has had many singers—including, most famously, Jon Davison (now of Yes)—Bogdanowicz not only possesses the most angelic voice in rock, she also, quite frankly, possesses the single finest voice in all of rock in this year of our Lord, 2018. Certainly other rock vocalists—such as Big Big Train’s David Longdon as well as Headspace’s Damian Wilson—offer as much integrity, but none have the range and the power of Bogdanowicz. Prior to joining Glass Hammer, she had been the lead singer of an alternative rock band during the 1990s.

Over their first sixteen studio albums, the members of Glass Hammer have proven themselves, time and again, to be expert storytellers and myth makers. They have certainly embraced the works of Lewis and Tolkien, but they have also gone beyond just their heroes. They have become—in word and note—every bit the bards that their heroes were. Their albums have dealt rather profoundly with everything from the Roman empire to the mysteries of death to the horrors of H.P. Lovecraft to the nobility of the soldiers of World War I. And, while the vast majority of bands see their creativity and purpose ebb after their first and second albums, Glass Hammer has just gotten better with age. As I look back over my own notes and reviews of the band since 2002, I see a constant. In almost every review of every new album, I write something to the effect of: “This is a band at the top of their game, with this release being better than all previous releases.” And, I’ve meant it every time. That kind of excellence and integrity is rare in any place in any time in history.

The latest studio album to appear from the band, Chronomonaut, will be released on October 12, with autographed pre-orders getting underway on September 12 via the band’s website. It shows both the creative and the mischievous sides of the band. It’s a sequel to the 2000 album, Chronometree, the album that brought Bogdanowicz into the Glass Hammer family. That album, the fourth by the band, was as hilarious as it was inventive. Its story revolves around “Tom,” a young man obsessed with the lyrics of progressive rock albums. More than a tinge satirical, Babb and Schendel were making fun of themselves, noting that not uncommon perfectionist and OCD streak that runs through all lovers of progressive rock.

Truth be told, we progressive rockers are not just the nerds of the rock world, we’re the snobs of the rock world. Every note, every lyric, every album cover, and every credit has to be analyzed over and over and over some more. While non-proggers could accuse us of many things, inattention to detail is not one of them.

Tom, the protagonist of Chronometree, though, takes this even farther than most of us did in the 70s and 80s. He becomes convinced that the lyrics from Yes, Genesis, King Crimson, Kansas, and other progressive rock bands are, in actuality, a secret, coded, perhaps scriptural language from another world.

In a series of social media posts and videos—all very much in the vein of Stranger Things—Tom has returned over the past year. He calls himself “The Elf King” now, and he signs his name Tom Timely. Though the posts from Tom are current, the videos date back to the summer of 1983. A young Tom [ok, a little spookily, this could easily be a young Brad, yours truly] in 1983 ponders the deeper meanings of life, space, and time, but he also complains that his efforts to create the perfect progressive rock band have been foiled by his bandmates caring more about their girlfriends than about the band itself. Because Tom seeks perfection, the actions of his friends and bandmates are nothing short of a Platonic betrayal of the True.

When asked about the meaning of the sequel, Babb responds:

Chronomonaut deals with nostalgia, which I think every modern prog fan can relate to. How many of us, fans and musicians alike, are trying to recapture some lost glory of our youth? In a way I try to musically and lyrically elaborate on a C.S. Lewis quote, the same one in which he coined the phrase “the inconsolable secret”: “These things—the beauty, the memory of our own past—are good images of what we really desire; but if they are mistaken for the thing itself they turn into dumb idols, breaking the hearts of their worshipers,” Lewis wrote. “For they are not the thing itself; they are only the scent of a flower we have not found, the echo of a tune we have not heard, news from a country we have never yet visited.”

Babb cautions, though, that while there’s nothing wrong with romanticizing the past (and it would be hard to find a greater romantic than Babb), there is always the danger of mistaking our past for perfection and failing to prepare ourselves not just for our future in this world, but in the next as well. Babb, it should be noted, not only cares about the artistic integrity of his music, but he’s also admirably unafraid to share his own faith and beliefs through his art.

For Chronomonaut, Glass Hammer is: Fred Schendel (keyboards); Steve Babb (bass); Aaron Raulston (drums); and Susie Bogdanowicz (vocals). Though I have singled out Babb and Bogdanowicz for praise in this piece, I must also note that Schendel is one of the best keyboardists you will ever hear, certainly superior even to such greats as Rick Wakeman, and Raulston’s drumming is, at once, forceful and nuanced. Certainly, he is one of the top drummers in the world today. Babb, too, is an excellent bassist, the equal of Geddy Lee and the late Chris Squire.

If you’re interested in the worlds of art, myth, and fantasy; if you believe in excellence; and if you desire to reach the Socratic good, true, and beautiful, you need look no further than Glass Hammer.



Hope on a Rose

(Credit: Laura Smith)

Had things happened differently, I would be celebrating the eleventh birthday of my daughter, Cecilia Rose Birzer, today. I can visualize exactly what it might be like. A cake, eleven candles, hats, cheers, goofiness, photos, and, of course, ice cream. I imagine that she would love chocolate cake—maybe a brownie cake—and strawberry ice cream. Her many, many siblings cheer her, celebrating the innumerable smiles she has brought the family. As I see her at the table now, I see instantly that her deep blue eyes are mischievous to be sure, but hilarious and joyous as well. Her eyes are gateways to her soul, equally mischievous, hilarious, and joyous. She’s tall and thin, a Birzer. She also has an overabundance of dark brown curls that match her darker skin just perfectly. She loves archery, and we just bought her first serious bow and arrow. No matter how wonderful the cake, the ice cream, and the company, she’s eager to shoot at a real target.

She’s at that perfect age, still a little girl with little-girl wants and happinesses, but on the verge of discovering the larger mysteries of the teenage and adult world. She cares what her friends think of her, but not to the exclusion of what her family thinks of her. She loves to dance to the family’s favorite music, and she knows every Rush, Marillion, and Big Big Train lyric by heart. She’s just discovering the joys of Glass Hammer. As an eleven-year-old, she loves princesses, too, and her favorite is Merida, especially given the Scot’s talents and hair and confidence. She has just read The Fellowship of the Ring, and she’s anguished over the fate of Boromir. Aragorn, though—there’s something about him that seems right to her.

If any of this is actually happening, it’s not happening here. At least not in this time and not on this earth. Here and now? Only in my dreams, my hopes, and my broken aspirations.

Eleven years ago today, my daughter, Cecilia Rose Birzer, was strangled by her own umbilical cord. That which had nourished her for nine months killed her just two days past her due date.

On August 6, 2007, she came to term. Very early on August 8, my wife felt a terrible jolt in her belly and then nothing. Surely this, we hoped, was Cecilia telling us she was ready. We threw Dedra’s hospital bag into the car as we had done four times before, and we drove the 1.5 miles to the hospital. We knew something was wrong minutes after we checked in, though we weren’t sure what was happening. Nurses, doctors, and technicians were coming in and out of the room. The medical personnel were whispering, looking confused, and offering each other dark looks. Finally, after what seemed an hour or more, our beloved doctor told us that our child—a girl, it turned out—was dead and that my wife would have to deliver a dead child.

We had waited to learn the sex of the baby, but we had picked out names for either possibility. We had chosen Cecilia Rose for a girl, naming her after my great aunt Cecelia as well as St. Cecilia, the patron saint of music, and Rose because St. Rose of Lima is the preferred saint for the women in my family and because Sam Gamgee’s wife was named Rosie.

I had never met my Aunt Cecelia, as she had died at age 21, way back in 1927. But she had always been a presence in my family, the oldest sister of my maternal grandfather. She had contracted tetanus, and the entire town of Pfeifer, Kansas, had raised the $200 needed and sent someone to Kansas City to retrieve the medicine. The medicine returned safely to Pfeifer and was administered to my great aunt, but it was too late, and she died an hour or two later. Her grave rests rather beautifully just to the west of Holy Cross Church in Pfeifer valley, and a ceramic picture of her sits on her tombstone. Her face as well as her story have intrigued me as far back as I can remember. Like my Cecilia Rose, she too had brown curly hair and, I suspect, blue eyes. She’s truly beautiful, and her death convinced her boyfriend to become a priest.

The day of Cecilia Rose’s death was nothing but an emotional roller coaster. A favorite priest, Father Brian Stanley, immediately drove to Hillsdale to be with us, and my closest friends in town spent the day, huddled around Dedra.  We cried, we laughed, and we cried some more–every emotion was just at the surface. I’m more than certain the nurses thought we were insane. Who were these Catholics who could say a “Hail Mary” one moment, cry the next, and laugh uproariously a few minutes later? Of course, the nurses also saw just how incredibly tight and meaningful the Catholic community at Hillsdale is. And, not just the Catholics—one of the most faithful with us that day was a very tall Lutheran.

Late that night, Dedra revealed her true self. She is—spiritually and intellectually—the strongest person I know. She gave birth with the strength of a Norse goddess. Or maybe it was just the grace of Mary working through her. Whatever it was, she was brilliant. Any man who believes males superior to females has never seen a woman give birth. And, most certainly, has never seen his wife give birth to a dead child. Cecilia Rose was long gone by the time she emerged into the world, but we held her and held her and held her for as long as we could. With the birth of our other six children, I have seen in each of them that unique spark of grace, given to them alone. Cecilia Rose was a beautiful baby, but that spark, of course, was absent, having already departed to be with her Heavenly Father.

For a variety of reasons, we were not able to bury her until August 14.  For those of you reading this who are Catholic, these dates are pretty important. August 8 is the Feast of St. Dominic, and August 14 is the Feast of St. Maximilian Kolbe.

Regardless, those days between August 8 and August 14 were wretched. We were in despair and depression. I have never been as angry and confused as I was during those days. Every hour seemed a week, and the week itself seemed a year. I had nothing but love for my family, but I have never been as angry with God as I was then and, really, for the following year, and, frankly, for the next nine after that. We had Cecilia Rose buried in the 19th-century park-like cemetery directly across the street from our house. For the first three years after her death, I walked to her grave daily. Even to this day, I visit her grave at least once a week when in Hillsdale. In the first year after her death, I was on sabbatical, writing a biography of Charles Carroll of Carrollton. Every early afternoon, I would walk over to her grave, lie down across it, and listen to Marillion’s Afraid of Sunlight. Sometime in the hour or so of each visit, I would just raise my fist to the sky and scream at God. “You gave me one job, God, to be a father to this little girl, and you took it all away.” In my fury, I called Him the greatest murderer in history, a bastard, an abortionist, and other horrible things. I never doubted His existence, but I very much questioned His love for us.

Several things got me through that first year: most especially my wife and my children, as well as my friends. There’s nothing like tragedy to reveal the true faces of those you know. Thank God, those I knew were as true in their honor and goodness as I had hoped they would be. A few other things helped me as well. I reread Tolkien, and I read, almost nonstop, Eliot’s collected poetry, but especially “The Hollow Men,” “Ash Wednesday,” and the “Four Quartets.” I also, as noted above, listened to Marillion. As strange as it might seem, my family, my friends, Tolkien, Eliot, and Marillion saved my life that year. I have no doubt about that. And, nothing gave me as much hope as Sam Gamgee in Mordor. “Sam saw a white star twinkle for a while. The beauty of it smote his heart, as he looked up out of the forsaken land, and hope returned to him that in the end the Shadow was only a small and passing thing: there was light and high beauty forever beyond its reach.” As unorthodox as this might be, we included Tolkien’s quote in the funeral Mass.

A year ago, my oldest daughter—the single nicest person I have ever met—and I were hiking in central Colorado. We were remembering Cecilia Rose and her death. Being both kind and wise, my daughter finally said to me, “You know, dad, it’s okay that you’ve been mad at God. But, don’t you think that 10 years is long enough?” For whatever reason—and for a million reasons—my daughter’s words hit me at a profound level, and I’ve been more at peace over the last year than I’ve been since Cecilia Rose died. I miss my little one like mad, and tears still spring almost immediately to my eyes when I think of her. I don’t think any parent will ever get over the loss of a child, and I don’t think we’re meant to. But, I do know this: my Cecilia Rose is safely with her Heavenly Father and her Heavenly Mother, and almost certainly celebrating her birthday in ways beyond our imagination and even our hope. I have no doubt that my maternal grandmother and grandfather look after her, and that maybe even Tolkien and Eliot look in on her from time to time. And, maybe even St. Cecilia herself has taught my Cecilia Rose all about the music of the spheres. Indeed, maybe she sees the White Star. Let me re-write that: I know that Cecilia Rose sees the White Star. She is the White Star.

Happy birthday, Cecilia Rose. Your daddy misses you like crazy, but he does everything he can to make sure that he makes it to Heaven—if for no other reason than to hug you and hug you and hug you.



What is Classical Liberal History?

“Alcibiades Being Taught by Socrates,” by Marcello Bacciarelli (1776-77) (public domain)

What is Classical Liberal History? Edited by Michael J. Douma and Phillip W. Magness, Lexington Books, 2018, 268 pages.

Ever since choosing history as a profession, I’ve been as fascinated with the actual philosophy of history (if one should exist) as I have been with the actual history of a thing, person, or event itself. After a quarter of a century of wrestling with the role of human agency, I have come to the conclusion that Friedrich Hayek was right all along in his own understanding of the “knowledge problem” and “methodological individualism”—that each person is simply too complex, in and of himself, to be studied at any meaningful level.

Free will renders so much null and void. Amen. That is, as the brilliant philosopher and economist James Otteson has noted repeatedly, if you believe in human liberty, you have to accept that you simply cannot predict with any meaningfulness the events of tomorrow. Yes, there are trends, to be sure, but free will means that almost anything—from the good to the ill—is possible.

Yet the trajectory of academia has gone the other way since roughly the 1890s, toward determinism.

A few years ago, I had the privilege of having coffee with a Pulitzer Prize-winning historian. In fact, the prize had been announced just a few hours earlier on the day of our coffee date. When we began talking about human agency and history, she declared—thoughtfully but firmly—that no such thing as free will could or ever would exist. Instead, every aspect of our nature was predetermined in some way or another by material factors. And, it must be noted, this is a very creative person. Raised on “race, class, and gender,” to be sure, but no academic automaton. She is very much her own person. Yet even she was negating her own free will and creativity by declaring such things mere mechanisms. Given the most important and distinguished thought of the 19th century—especially from Darwin, Marx, and Freud—it is no surprise that the 20th and, thus far, the 21st centuries have been dominated by materialist thought. It explains much about the current and shabby state of civilization—from the loss of liberal education to the triumph of power. One need only give a cursory examination to the media, to the average classroom, or to Facebook to see that the outrage culture has been fueled by the failure to understand individual dignity and creativity. There seem to be no free agents anymore, only those who haven’t quite caught up with “the program.”

For better or worse, my response as a professional historian was to turn to thinkers I did trust: to Cicero, to Augustine, to Burke, to Smith, and to Hayek. Each of these greats had noted time and again that no system could exist to explain all. To varying degrees, each believed in an eternal order and ordering, but each recognized that, in the here and now, no system could be known or understood or propagated by any one person, one group of persons, or humanity as a whole. Each person is simply incapable of knowing everything. Thus, as I saw it, my answer to this inability to understand history and humanity systematically—that is, my way of rejecting the quantification (and, consequently, the reduction and dismissal) of the human person, of being an anti-ideologist, and of rejecting the idea of a system—has been to become a biographer, to study the most fundamental aspect of existence, the individual human person. As such, I have, for the most part, seen race, class, and gender as mere parts of human existence, not as the whole or even as determining parts of order and society. Each person is unique, born in a certain time and a certain place never of his own choosing. Yet, when coming of age, he or she chooses almost every moment of every day. Some of these choices are limited by things such as race, class, gender, ethnicity, language, religion, neighborhood norms, and education, but none of these need necessarily be determinants. As the grand J.R.R. Tolkien explained in a letter to W.H. Auden, each person is an allegory of a universal principle, robed in the garments of time and place.

While I certainly do not believe that biography is the only form of legitimate history, I am more than a little partial to it. With biography, the biographer gets to “know” the subject, intimately. Never will the biographer be totally objective, unless he or she is a mere antiquarian. Instead, the truly good biographer uses his own soul, experience, and reason to understand the choices of his subject. Indeed, the best biography is always one in which the biographer is as apparent as the subject. Thus, when we read Arnn on Churchill, McCullough on Adams, or Pearce on Tolkien, we are learning as much about Arnn, McCullough, and Pearce as we are about Churchill, Adams, and Tolkien.

In an extraordinarily thoughtful, well-conceived, and well-edited book, What is Classical Liberal History? (Lexington Books, 2018), editors Michael J. Douma and Phillip W. Magness bring together 13 scholars (including themselves) to answer the most important questions about the historian’s craft. Not surprisingly, Hayek is frequently invoked in the book.

Some of our finest historical thinkers—from Sarah Skwire and Jonathan Bean to David Beito and Hans Eicholz—ask vital questions about the role of liberalism, properly understood, in human society. Penetratingly, these authors look at industrialism, feminism, scientism, civil liberties, historicism, and progressivism. What is most appealing about this wonderful collection is that each author takes seriously the radical tendencies of modern and post-modern academics, finding the good within the questions asked and raised in mainstream academia, even while finding the answers provided by most academics insufficient.

In the perceptive and rather fetching introduction, Douma notes that his goal is to counter the tendencies of conservatism and progressivism in historical thought, each of which improperly considers the past as a way to understand morality, often focusing on colossal entities such as nations or great men. In other words, by Douma’s definition, classical liberals would not be too thrilled with the biographers mentioned above. Yet, Douma insists, unlike all other historical schools, “classical liberal historiography is based upon the principle of methodological individualism central to the classical liberal tradition.” Further, he notes, classical liberal historiography is the “study of individual action in the past.” As much as I appreciate what Douma is doing—and he is an excellent writer, thinker, and scholar with a great future ahead of him—I remain unconvinced that any of these things is specific to classical liberalism. I would be happy to be persuaded otherwise.

If one takes What is Classical Liberal History as a negative statement on what exists in the world of mainstream thought and academia, this book is brilliant. Indeed, the writings of Skwire, Eicholz, Beito, Magness, and Bean are so good as to be a bit intimidating. These are each scholars at the height of their abilities, and their abilities would make any scholar—of whatever political and cultural persuasion—blush.

If one sees it as a fundamental and comprehensive take on history, though, it will become as ideological as those it complains about. My criticism is minor, but I think it is just. For example, the editors (and, admittedly, this is just the nature of editing) might be perceived as forming a clique. Frequently, the scholars chosen cite only a few common authorities and sources and then usually refer to each other. Nowhere in the book do some of the most important historians and thinkers of our day—such as Mark David Hall, Rob McDonald, Richard Gamble, Mark Kalthoff, Paul Rahe, Richard Samuelson, Adam Schwartz, Greg Schneider, Gerald Russello, Patrick Deneen, or Bruce Frohnen—even make an appearance. Others, such as Kevin Gutzman and Otteson, get only the briefest mention. Even Burt Folsom, arguably the best-selling libertarian historian alive today, merits only a single mention. Indeed, beyond Folsom, only Paul Moreno of the Hillsdale College department of history gets a mention at all. Given that this department is, by far, the single largest collection of conservative and libertarian historians anywhere in the world, this seems a huge omission. Similar comments might be made about Ashland, Grove City, the University of Dallas, and the University of St. Thomas (Minnesota).

As much as I enjoyed reading What is Classical Liberal History?, I can only hope it is meant as a beginning, not an end. Further, I hope that readers take it as an invitation, not as an exclusion.

If the radical plainness and sameness of current academia and the conformity of collectivist and consumerist culture are to be combated, and the dignity of the human person is to be understood, it will do no good merely for the classical liberals and the conservatives to form sides and distinguish themselves from one another. Douma notes in the introduction that “classical liberal history begins with the recognition of the inherent worth of the individual.” I have no doubt that this is a central feature of classical liberal historiography. But Russell Kirk—the father of all post-war conservatism—would have said (and did say) exactly the same thing. And Pope John Paul II noted in his 1996 address on Christian humanism that the beginning of all goodness resides in recognizing the human person as an unrepeatable center of dignity and will. Perhaps one could charitably state that Kirk and John Paul were bound to get at least something correct, but, again, I remain unconvinced that classical liberal historiography is the best way to promote human liberty and dignity. The question of human dignity is as old as philosophy itself, beginning with Heraclitus and Socrates.

From my own perspective, the best history is still biography, and, for what it’s worth, biography seems to me the best Hayekian (and Ciceronian and Augustinian and Burkean and Smithian and Ottesonian) manner in which to approach history, a way to recognize the universal and the particular, a way to understand how free will allows the individual human person to navigate through difficulties and challenges—material and otherwise—in his whirligig of existence. To quote one of my favorite thinkers of the post-modern world, “if you choose not to decide, you still have made a choice.”  If this is classical liberal, so be it. If this is conservative, so be it. If this is progressive, so. . . well, no, even I can’t go there. I have never considered myself a classical liberal, but I have always considered myself libertarian. In the end, though, I hope that what we write as historians is just good history and scholarship, whatever label is given it. “I will choose a path that’s clear. I will choose free will.”



Trust No One: The X-Files at 25

Aliens, Neanderthals, poltergeists, more aliens, bumbling bureaucrats, cabals, still more aliens, pyrokinetics, artificial intelligence, religious cults, were-creatures, eugenicists, mutants, stoners, more bumbling and malicious bureaucrats, and a few more aliens. Quite the motley crew of characters. Then, after the teaser, the smeared and ghostly credits play, usually with a final thump and the statement: “The Truth is Out There.” Every once in a while, though, variations surprise us, such as “Trust No One,” “Resist or Serve,” “Everything Dies,” and “Apology is Policy.”

These creatures, these possibilities, these seductions, and these impossibilities haunted our Friday nights back in 1993, as one show revolutionized everything from entertainment to libertarian paranoia. Admittedly, some of season one seems a bit dated in hindsight, but there’s a reason The X-Files mattered so much when it first arrived on our television sets 25 years ago this fall, and why the series remains relevant (and alive) to this day.

Created by Chris Carter, a native Californian who had worked for Disney in the 1980s, and aired on the then-relatively new national network Fox (one of its two main characters, in a not-so-subtle nod to Murdoch’s rising empire, even has “Fox” as his unlikely first name), The X-Files offered intelligent pulp for the right-thinking person. The camera angles, the cinematography, and the lighting were of the big screen more than the small, no matter how limited the special-effects budget. The lighting was extraordinarily creepy, and made even creepier by Mark Snow’s haunting soundtrack of atmospheric electronica. The show spanned a full nine seasons (from 1993 to 2002), two theatrical releases (1998 and 2008), several bestselling novels (by Kevin J. Anderson and Charles Grant), and two rock albums inspired by the show, before going on hiatus, only to be revived Netflix-style over the last two years on Fox.

Aside from Star Trek and perhaps Breaking Bad, it would be difficult to find a series that has had a greater influence on American culture over the last half century. Not only have the characters and stories become ingrained in American folklore and folkways, but most modern entertainment owes it a debt. Some of its influences have been obvious—such as Fringe and Stranger Things—while others, such as Battlestar Galactica and Salem, have offered more subtle nods. Even one of America’s foremost scholars of Shakespeare, the brilliant Paul Cantor, has written extensively and intelligently on the show, its symbolism, and its significance.

The X-Files, of course, did not emerge out of a vacuum. In terms of American culture and imagination, it followed in the line of Poe and Hawthorne, Bradbury and Lovecraft, O’Connor and Percy, and Serling and King. It also followed almost three decades of American interest in the occult, UFOs, and hauntings. Even figures we incorrectly remember as staid—such as the founder of post-war conservatism, Russell Kirk—gave considerable attention to these irrationalities in their various books and articles of the period. Interest in UFOs had become so intense during the late 1940s and 1950s (after the supposed Roswell alien crash landing of 1947) that the Air Force had responded with its own form of “Internal Affairs” for the supernatural and alien, Project Blue Book. The program ended in 1970, after citizens’ groups proclaimed it to be fraudulent, existing for the sake of coverup and to protect the interests of the federal government and the armed forces.

Most critically, like all good horror and science fiction, The X-Files played upon—and at times exacerbated—our deepest fears as a people. This is not to suggest that it would have failed at a different time in our history, but it certainly didn’t hurt that libertarian angst over Gulf War I, Ruby Ridge, and Waco, the general conservative distrust of the New World Order, and the rise of popular and consumerist radio talk shows coincided with its success. And while Watergate was two decades past by the time The X-Files began, Nixon’s specter of corruption hangs over the entirety of season one.

The main character, Fox “Spooky” Mulder (David Duchovny), reflected the paranoid curiosity of every intelligent and liberty-minded American in the post-Cold War era, wondering just what the heck was going on in an age that rapidly and horribly left Ronald Reagan behind, even while claiming to speak in his name. Dr. Dana Scully, Mulder’s skeptical and Watson-esque sidekick, reflected our more rational and patriotic side, the side that still wanted government to be our ally, not our master. At first, Scully is somewhat bland, merely a foil to the witty Mulder. As the first season progresses, though, Gillian Anderson brings much more depth to her character, especially with regard to her Roman Catholicism. Naturally, the issue of faith as opposed to reason appears frequently. In the pilot—which takes place a year before the series—the FBI assigns Scully to The X-Files, hoping that she’ll temper Mulder’s paranoia.

In the first season, we see a number of bosses come and go. Some are ambitious. Some are blindly patriotic. Some are even well-meaning and well-intentioned. All, however, are so arrogant as to be unable to see beyond their own agendas, whether for the private good or the greater good. Only Mulder fully serves the common good, with Scully defiantly loyal to the person of Mulder but not to his ideas or instincts. Mulder, wounded at almost every level except his intellect, remembers his younger sister’s disappearance when the two were just children, and is now convinced that she was abducted by aliens. Obsessed with the loss, he made a name for himself as an expert on the psychotic behavior of serial killers, which led him to take over the Bureau’s most neglected division, the X-Files—files that, it seems from oblique references in the show, are even older than the FBI itself. There he earned the mocking nickname “Spooky Mulder” from his unreflective peers at the Bureau.

In what would be a running gag on any other show but seemed plausible in this one, Mulder almost always saw what he wanted to see—that which is mysterious and gnostic—while Scully almost always missed everything that wasn’t purely rational, except when it was related to faith and Catholicism. In a nod to its audience (not typical, given the pandering in most Hollywood movies and TV shows of that day), Scully and Mulder are both well-educated, each with advanced degrees, and each prone to speaking in complicated sentences with actual and meaningful vocabulary and grammar. Never did they speak down to the audience, though exposition could be long at times.

Vitally, Mulder is the one person who always stands for truth, no matter the cost to himself or his reputation. A crusader to his core, he never backs down. At one critical moment in season one, as the FBI is hoping to shut down the X-Files, Mulder fiercely challenges the committee trying to crucify him: “No one, no government agency, has jurisdiction over the truth.” It’s the kind of thing that made my soul soar at age 25. At age 50, it still does.

In almost every way, the show understood the nuances of the post-Cold War world better than the so-called intellectuals and neoconservatives of that era. The X-Files was not concerned with the “end of history,” but with the threatened annihilation of what was (and is permanently) human. Though cheesy at times, it dealt seriously with the corruption of the republic, the militarization of the police, and the rise of empire abroad.

The last several years have seen The X-Files’ first nine seasons remastered on Blu-ray. The results, not surprisingly, are glorious visuals (even the weak effects of the early seasons seem better, if somewhat quaint) and even more stunning audio. Mark Snow is, frankly, a musical genius, and hearing his soundtrack on Blu-ray takes even his work to a new level, hitherto unappreciated on VHS and DVD.

The revival of the series on Fox, which began in 2016, is excellent. The show remains true to its roots while having improved at every level the writing, acting, and special effects. Even the music has remained good. Both Duchovny and Anderson are at the very height of their abilities, with acting talents so strong that each seems somewhat beyond human.

For a variety of reasons, I stopped watching television shows (other than the news) around 1981, when I was in seventh grade. I never stopped going to the cinema and watching classic movies, but TV in the 1980s did nothing for me. I was far more interested in exploring my local environs, reading, and writing. Yet for whatever reason, I happened to see the pilot of The X-Files on its first appearance. I had just turned 25. A friend taped the shows for me (God bless him!) and I finally bought my first TV in 1996, wanting to watch both The X-Files and Babylon 5. As I look back on 1993, when I was working on my Ph.D., The X-Files was as important in forming my worldview as were the many hours I spent reading Liberty, Reason, National Review, Chronicles, and The Freeman.

Re-watching the series now with my children reminds me just how critical the decade of the 1990s was. Though it probed the most difficult issues of society, conformity, individuality, empire, and humanity, The X-Files existed in a more innocent era, a pre-9/11 time that still believed a national security state to be but one possible future, not the nightmare dystopia it has become. Ronald Reagan gave us breathing room, economically and globally. We lived on the economic capital until 2008, but we blew the global stability within two years of Reagan’s leaving the White House.

Equally, The X-Files gave us a warning not only about creeping empire and the New World Order, but the dangers of an intrusive security state that claimed to exist in the name of its people but retarded the free will of those same people at every turn. Carter’s warning was a massive and brilliant one. For the most part, we ignored it, and are now suffering for it immensely. When it comes to governments, corporations, and colossal entities, I agree with Carter, Mulder, and Scully: trust no one.


Does Anyone Remember the Spirit of ‘76?

‘The Bicentennial Barn’ outside Glen Arbor, MI, in 1976. Jimflix!/Flickr

As spring turned into summer in 1976, I was nine years old, living in what I then presumed was the norm for America: the rather idyllic, medium-sized Kansas town of Hutchinson (affectionately known in Kansas as “Hutch”).

My neighbors were loving, caring, and intelligent. My parish—one of three in town—was led by a devout priest who had immigrated from Ireland in 1950. When I later became an altar boy, he even knighted me. I loved my school, but especially my teacher, Miss Mackey. I had wonderful friends who helped me explore the surrounding wheat fields and the nooks and crannies of a prosperous area, filled not only with agriculture of all types but also with the harvesting of abundant local natural commodities: oil and salt. Indeed, my hometown boasted the world’s second-longest grain elevator (the Soviets built a slightly bigger one just to spite us), and we sat atop the world’s (then) second-largest salt deposits. Our high school mascot was even, appropriately and creatively, the mythical “Salt Hawk.”

When it came to entrepreneurship, we were quite excellent as well. The headquarters of Farmarco/Farmland resided in Hutch, and, even more impressively, the truly upscale and innovative grocery chains Dillons and Kwikshop (now owned, for better or worse, by Kroger, and not so upscale anymore) began and were headquartered in Hutch. The president and vice president at the time had even been Ph.D. students who worked with Milton Friedman. The town also hosted the state fair, an excellent public library (my second crush—after Miss Mackey—was Miss Canfield, the reference librarian), some of the finest city parks in America, and a planetarium that would soon grow into the world-famous “Cosmosphere.” We even had, at one point, three different book shops in town, and my mom—ever gracious—encouraged me to read from as far back as I can remember. Yet she was also “hands-off,” and, in typical 1970s fashion, she free-range parented me. Who wanted to watch TV when I could bike throughout the town and countryside, exploring, observing, discovering, and uncovering? Saturdays were always the same, as were summer days: up by 7, getting my own breakfast, and off to find friends. The only rule—I had to be home by dinner time.

Yet, even at age nine, I knew that not all was right with the world, especially for those not privileged to live in Hutchinson. Several images haunted me at the time: our betrayal of the South Vietnamese (I can still see in my mind’s eye the Vietnamese women throwing their babies into the departing American helicopters) and the Cambodians; the fear of a Soviet strike on American soil; and the devastating demoralization that accompanied Watergate. I even vaguely understood that inflation and the energy crisis were harming America at home and abroad. Believe it or not, one of the first political letters I ever wrote (yes, I was nine) was a letter of complaint and concern to President Gerald Ford, who graciously responded with a form letter. And, just to make sure that those of you reading this don’t think me too highbrow, I also wrote letters to Bill Gaines of Mad magazine, who responded far more graciously than had President Ford, sending me not only a personal letter but photographic images of Alfred E. Neuman and some rare and unusual publications from his company.

I must—if it is not already clear from my narrative so far—point out something that many Americans don’t know or, even if they do, don’t understand. Kansans are as proud to be Kansans as Texans are to be Texans. Kansans, in typical humble fashion, just aren’t quite as loud about their pride as are Texans. To Kansans, Kansas is not “flyover country.” Rather, it is a place where generations upon generations of immigrants have broken the soil and made it the breadbasket of the world. Kansans not only wave to every other driver on the back roads of the state, they plant and mark historical spots about every two or three miles. To Kansans, Kansas is a place where everybody helps everybody else, and class distinctions make little to no difference. A Kansan is a Kansan is a Kansan. Though I had no idea how unusual this was as a child, Hutchinson even boasted a very sizable Mexican community and a somewhat sizable black population. No doubt, on average, the whites of Hutchinson lived in the central and northern (that is, nicer) parts of town, but the various peoples got along rather well, especially given the kind of conflict one found in other places in the 1970s. And, as with so many good communities in the 1970s, we graciously welcomed those who had made it out of South Vietnam.

Before getting to the actual heart of this piece, I must note one other thing about Kansas. Kansans are deeply conservative, broadly defined. I have a close high school friend who once began a joke by stating, “You know how Kansans are divided into two types of folks: conservative and really, really conservative.” She nailed it, perfectly. My mother—already praised above—still regretted the loss of Barry Goldwater’s presidential run in 1964. She was (and remains) pretty diehard about this. I can still see our family room with perfect clarity. On the bookshelves just to the left of the hearth and fireplace sat Goldwater’s books, the Encyclopedia Britannica Great Books (all of them), The Fountainhead, Dr. Zhivago, and the family bible. The mix pretty much represented everything we believed.  

Kansans are so conservative, they even call boxelder bugs “democrats.” No joke! I had no idea that the bug had any other name than “democrat” until my first year in graduate school. I was on a walk with my beloved advisor, Anne Butler (RIP), and a boxelder bug jumped on one or the other of us, and I exclaimed in pure innocence, “Oh, sheesh, I hate democrats.” Anne, being rather left wing, just laughed and laughed, especially as I explained to her that—and I was certain—that was the proper name for the bug. I now know better. And, while I did know a few actual Democrats when growing up, they mostly lived in Wichita and Kansas City, not in respectable towns like Hutchinson. The division in Hutch was not Republican and Democrat, but Republican and Libertarian.

As skeptical as my family was about the war in Vietnam and Richard Nixon (even before the Watergate scandal), we knew the communists to be almost pure evil. My mom still remembered her own childhood of processing with the priests, the nuns, the Blessed Sacrament, and her neighbors around Hays, Kansas, praying for the end of communism, the greatest force of evil in the world.

When I wasn’t off with my friends, looking for adventure in the environs in and around Hutchinson, I would spend hours upon hours setting up my plastic army men, always having the U.S. take out the Vietcong. After the hours of patient placement and encampment, I happily blew up my VC with firecrackers in a mere matter of minutes.

All of this brings me to July 4, 1976. Kansans, by and large, were more than disgusted with Washington, D.C., New York (especially the UN), and California (Hollywood) that year. There was a feeling—not an inaccurate one—that our leadership had not only failed us and the world, but had failed so miserably that it might take generations to get back to a true understanding of the original intent of the founding. There was no doubt—though it was more in humility than smugness—that Kansans thought the world should look much more like Kansas. To be frank, there’s an entire history of the patriotic summer of 1976 that has yet to be written. It was, at least to those of us who lived it, an awakening to the good, the true, and the beautiful of American constitutionalism and republicanism. It was real, it was deep, and it was organic.

Three events stick out—clear in my memory—even after 42 years. First, many of my neighbors began framing and hanging in their homes copies of the Declaration of Independence. Even as a nine-year-old, I remember the discussions in the neighborhood, at a friend’s house, and even in church about the meaning of that august document. It was treated—by all—with a deep awe and reverence, a text long forgotten but eternally true.

Second—and whether other cities in America did this or just Hutch, I don’t know—a group in town, with the permission of the fire department and the city government, not only allowed but actively encouraged every child in the town to claim a fire hydrant and paint it as one of the founding fathers or mothers. While this might sound hokey to the modern reader, it was a powerful symbol of patriotism throughout the entire community. Every single fire hydrant—at least as I remember it—was a celebration of the founders.

My older brother, Todd, and I claimed the hydrant at the end of a neighboring street, Bannock Burn. At the time, it was a dead end, but it has since become a link to a recent housing development. Regardless, at the end of that street, rather proudly, stood our version of John Hancock, wearing the powdered wig of the era and a yellow frock coat.

Third, one of the men in town I respected most—and I still miss dearly—was a great intellect, patriot, and gentleman, Bob Gottschalk (RIP), the manager of the Kansas State Fair. On the evening of July 4, thousands upon thousands of Kansans descended upon the fairgrounds. The local rock radio station, KWHK 1260-AM, provided the soundtrack over loudspeakers. My family and I arrived early, laying out our beach towels and claiming a prime spot before twilight hit. The show, once it began, lasted for an hour or two, and I still remember rather clearly hearing Kansas’s (yes, the rock band) “Song for America” (I’m not sure we realized how critical of America it actually was) and Styx’s “Suite Madame Blue” being played. Gottschalk and the radio station lit the fireworks, timed rather perfectly to the music.

It was a night I will never forget, and the day itself will always be deeply ingrained in my soul.

Something profound happened that summer in the middle of a little appreciated state. The citizens of Hutchinson, Kansas, tired and sick and angry, took the celebration into their own hands. It was no mere social barbecue that year. Rather, with a fierce determination to express ourselves, July 4, 1976, was a meaningful and abiding display of love for what the founding fathers had given us.  

Truly, we understood and lived out the “Spirit of ’76.” 


Marillion: Prog-Rock’s Bards of Alienation

Album cover for Marillion’s “Brave” in 1994

“The Cold War is done, but those bastards will find us another one.”  

This cry might have come from any current reader of The American Conservative alive in the early 1990s—well, maybe without the bastard part. But still, an anguished expression from Russell Kirk or Pat Buchanan? Why not? After all, as TAC editor Bob Merry recently and wisely noted, so many so-called conservatives of the early 1990s “kicked Reagan to the curb” the moment they inherited the Republican Party. And it seems they kept kicking, mutating a military that came into existence solely to defeat the Soviets into a world peace-keeping force, a new Delian League. The bastards did find us another one.

And then: “They’re here to protect us, don’t you know. So get used to it. Get used to it.”  

James Bovard or Virginia Postrel? Or some other grand libertarian of a quarter of a century ago? Why not?

Actually, the words are prog rock lyrics from Marillion’s album Brave (1994).

Taking its name from The Silmarillion, the 1977 book released by the Tolkien Estate, Marillion formed in 1978 and released its first album, A Script for a Jester’s Tear, in 1983. With their third album, Misplaced Childhood, they hit it big as the name “Kayleigh” became a repeated word on MTV and all across European radio. In his excellent book Citizens of Hope and Glory, Stephen Lambe explains: “‘Kayleigh’ penetrated the UK public consciousness like no other song in Progressive Rock with the exception of ‘Dust in the Wind,’ which had done pretty much the same thing for Kansas in the USA in 1977.”

Indeed, the name Kayleigh—in a variety of spellings—became an important and trending name for girls throughout the English-speaking world.

The band’s frontman, “Fish,” was none other than the charismatic and Goliath-sized Scottish singer Derek Dick. After one more album, Fish left, and Marillion hired Steve “h” Hogarth to take over lead vocals. To this day, the band’s decision to replace Fish rather than call it quits has deeply divided Marillion fans. Where Fish was clever and staccato, Hogarth is romantic and suave, bringing a fundamentally different sound.

With Hogarth, the band wrote and recorded 1989’s Seasons End and 1991’s Holidays in Eden. By corporate standards, the albums sold relatively well, each reaching number seven in the United Kingdom. Hoping the band would deliver another album that sold as well as Misplaced Childhood, EMI’s representative told them: “Look what you guys need to do is to make a quick, sharp, surefire album, and, you know, get back to basics,” according to bassist Pete Trewavas. By “back to basics,” EMI’s man surely meant: produce another “Kayleigh” and make a huge profit.

Though now gone as an independent label, EMI, it should be remembered, was a major label from the 1960s through the 1990s, having made its fortune with artists such as The Beatles, Queen, Iron Maiden, Elvis Costello, R.E.M., The Smiths, Kate Bush, and Pink Floyd. It had, however, already shown a more adventurous side in 1988, when it allowed Mark Hollis to make arguably the finest album of the rock era, Talk Talk’s Spirit of Eden. A commercial failure but a fan and critic favorite over the past 30 years, Spirit of Eden took 16 months to make, and EMI initially lost a considerable sum on it. The album, however, has transcended everything, including its label and its band. With its airy atmospherics, its silky flow, and its religious and sacramental lyrics, Spirit of Eden became known as the first “post-rock” album, breaking every genre and subgenre then in existence. It has since sold over 500,000 copies.

Marillion chose one of their sound engineers, Dave Meegan, to produce their third album with Hogarth. Having studied with famed producer and musician Trevor Horn (The Buggles, Yes, Seal), Meegan had also worked, perhaps most famously, with U2 on The Joshua Tree (1987) and Rattle and Hum (1988).

Meegan, it turned out, loved the gothic elements of sound—that is, the jagged nature of nature and of music, the uneven and the odd. Setting up their studio in a castle in the southwest of France, Meegan and the band recorded everything, from a fire crackling to random conversations to the air moving through hallways and turrets—all, it seemed, to pick up the random atmospherics of a passing ghost. “The theory was that if we fed all the mics onto tape then we’d pick up any passing ghosts as well. You can’t hear them but I can feel them here and there,” the atheist Hogarth remembered. “Brave is all about the spiritual aspect of life dominated by the non-spiritual, so we filled the songs with as many sounds and pictures as we could dream up—I sent our sound engineer out at dawn one morning to record silence for the beginning of the album!”

EMI’s desire to make a fast pound faltered almost immediately, as Meegan and Marillion began to echo Hollis’s work on Spirit of Eden. As it turned out, Meegan and the band took eight months to write the album and another seven to produce it. Certainly, this was no longer normal commercial territory. Even the members of Marillion began to express concern over the time it was taking. But Meegan responded: “The way I see it, we could make a masterpiece or we could just make a very average record and you’ve got to decide which one you’d like.” The band chose “masterpiece.”

Inspired—or frightened—by the real-life news story of a young English woman found wandering on the Severn Bridge, unsure of her own identity, history, or age, Hogarth centered the concept of Brave on alienation, a person trapped in an overwhelmingly mechanistic and progressive world:

The babble of the family

And the dumb TV

Roar of the traffic and the thunder of jets

Chemicals in the water

Drugs in the food

The heat of the kitchen and the beat of the system

The attitude of authority

The laws and the rules

Hit me square in the face, first morning at school

The heroes and the zeroes

The first love of my life

When to kiss and to kick and to keep your head down

When they’re choosing the sides

I was never any good at it

I was terrified most of the time

I never got over it

I got used to it

Though many artists have depicted alienation, no one has done it quite as well as Hogarth and Marillion. From the moment the album begins to its final second 72 minutes later, Brave captivates its listeners.

The band Marillion at the O2 ABC, Sauchiehall Street, Glasgow, 30th November 2016

As with Hollis’s Spirit of Eden, Brave never lived up to the hopes EMI had invested in it. Marillion would record only one more album for the label, its phenomenal 1995 release Afraid of Sunlight. And despite the time and effort that went into Brave, it never caught on the way Misplaced Childhood had. In his diary entry dated March 29, 1994, Hogarth expressed his frustration:

The general reaction to Brave can only be described as euphoric. Why does media and radio despise us? I guess you either love or loathe Marillion…. Conversion is a long and quantum jump. Our music seems to be behind a locked door. It’s fun in the room if you find the key!

Following Afraid of Sunlight, Marillion decided to leave not just EMI but the entire corporate rock world behind. Amazingly, the band charted its own course, and even more amazingly, it succeeded. As Hogarth told Prog magazine in 2016, commemorating their 20th year of independence, labels are “not doing it because they love you. If they’re really honest, they’re not even doing it because they’re excited about the music.”

In true entrepreneurial form, the band made its 1999 album, marillion.com, available to its subscribers through the website of the same name. It proved a huge success. A year later, the band innovated again, offering the very first crowdfunded album in the history of rock. Since then, the band has produced one success after another. Though they’ve never sold in the numbers they did in the 1980s, they have, perhaps, the most loyal fan base in the rock world today. Those who love Marillion—such as Gianna Englert, a young political theorist at Southern Methodist University; famous Lutheran organist and music theorist Rick Krueger; and the venerable libertarian Tom Woods—follow the band through every tour and keep up with its every nuance.

Whatever Hogarth’s worries in 1994, he and Marillion have transcended their critics as well as themselves in almost every way that matters. When the album came out, one of its most ardent supporters was South African filmmaker Richard Stanley, who retold the story of Brave in an art film of the same title that year, using the album as a soundtrack. Though the ending of the film is, arguably, more dour than the end of the album, the film not only does justice to the album, it does justice to the art.

Twenty-four years later, the album remains as vibrant as ever. Just this year, audiophile and musician Steven Wilson remixed and re-released Marillion’s Brave. The deluxe, 60-page hardback edition of the re-release comes with the original mix of the album, the 2018 mix of the album, two DVDs of a 1994 concert, and a Blu-ray with the full Wilson mix in 24-bit, promo films, a bonus track, and a 70-minute documentary.

Music journalist Stephen Humphries (Christian Science Monitor, Prog, Boston Globe, American Way) explains the brilliance of Brave best:

Hogarth, an uncommonly emotionally honest singer, turns one person’s search for connection, meaning, and redemption into something that is universal to the listener. I often tell people that Marillion reaches emotional and musical planes that most other bands don’t know exist. Brave exemplifies those qualities. The beauty of Steven Wilson’s remix is how it creates even greater clarity and space in the instruments so that Steve Hogarth’s voice can seep through to the listener’s soul.

Looking back over the past three decades, it’s rather clear that the bastards did indeed bring us another one, and they do simply want us to just get used to it. Yet in large ways and small ways, we normal folks keep fighting back—through truth, beauty, and goodness.

This piece benefitted immensely from the writings of Paul Stump, Phill Brown, Roie Avin, Stephen Lambe, Stephen Humphries, Rich Wilson, and Jerry Ewing.

Bradley J. Birzer is The American Conservative’s scholar-in-residence. He also holds the Russell Amos Kirk Chair in History at Hillsdale College and is the author, most recently, of Russell Kirk: American Conservative.

Thomas Jefferson is America and America is Thomas Jefferson

Thomas Jefferson (Library of Congress)

In almost every way, when we think about Thomas Jefferson, we think about America. When we’re proud of our heritage, we focus on his Declaration of Independence, his founding of the University of Virginia, his design of Monticello, his massive library, and his sending forth of Lewis and Clark. In our uncertainty and embarrassment, we turn to his ownership of slaves, his alleged obsession with Sally Hemings, and his bitterness towards Alexander Hamilton.

In so many ways, Thomas Jefferson is America, and America is Thomas Jefferson.

A conservative can appreciate him for his classical education, a libertarian for his promulgation of natural rights, a liberal for his love of choice, and a progressive for his optimism. Jefferson’s “right to life” might well adorn an anti-abortion placard, while his “right to liberty” might equally stand on a pro-abortion one. A 19-foot-tall marble version of the man might greet all who come to the imperial city of Washington, D.C., while a hastily drawn mimeograph of his face might appear at a protest in Tiananmen Square. Each is equally identifiable.

This month marks the 275th anniversary of his birth. The moment should give us all pause. Just who exactly was this extraordinary man—a man who seems to be the best of us, the worst of us, and, in some strange and mysterious way, also above us?

Whatever gifts Jefferson had, he was, to be sure, a man. He was born, he lived, and he died. He had a beloved wife, Martha, and when she passed all too young, his greatest friend and ally became his daughter, also named Martha. Certainly he loved books and wine, tangible pleasures, no matter the extraordinary talents of his mind. Six feet and one or two inches tall, Jefferson, one visitor wrote late in his life, possessed a “face streaked and speckled with red, light gray eyes, white hair,” and a stance that was “bony, long and with broad shoulders, a true Virginian.” He wore “shoes of very thin soft leather with pointed toes and heels ascending in a peak behind, with very short quarters, gray worsted stockings, corduroy small clothes, blue waistcoat and coat, of stiff thick cloth made of wool of his own merinos and badly manufactured, the buttons of his coat and small clothes of horn, and an under waistcoat flannel bound with red velvet.” If nothing else, the description reminds us that Jefferson lived in another age, a more elegant and somehow voluptuous one, one that feels far more Hogwarts than Goldman Sachs.

If America has produced a more intelligent man—that is, at least as well rounded as Jefferson was in his intellect, perceptiveness, and creative drive—that person has yet to come forward. Few would claim Jefferson was a great president, but even in the White House he was unique. Washington might have had more fortitude and Lincoln more resolve, but Jefferson had already given us his everything by the time he entered office in 1801. He had, in the manner of a classical demi-god, articulated and perhaps bestowed upon us our founding mission, our purpose, and our greatest contribution to the world: the belief, however poorly practiced and implemented, that ALL men are created equal, each endowed by his creator with certain inalienable rights.

This contribution, though, raises vital questions not just about the man but, by extension, about America. Exactly what were Jefferson’s sources and influences? Was he merely a French radical living in the hinterlands of Western civilization? Certainly some have argued so. After all, when asked, he admitted in 1789 that he loved Isaac Newton, Francis Bacon, and John Locke above all others in the Western tradition as “the three greatest men that have ever lived, without any exception.” Taken at face value, this is an extraordinary claim by any standard, even one far less majestic than Jefferson’s. While each was an Englishman, each was also quite recent and modern in Jefferson’s day. And while Newton, Bacon, and Locke might each be highly intelligent in and of themselves, taken together they seem a bit radical and mischievous. Additionally, given Jefferson’s own lifelong pursuit of the classics and liberal arts, one must ask: where are Greece and Rome in all of this, let alone medieval and Reformation England?

In his own extraordinary work on Jefferson, Thomas Jefferson: Apostle of Americanism, the French-born American man of letters and Princeton professor Gilbert Chinard claimed that simply because Jefferson admired someone—no matter to what degree—it didn’t mean that person’s ideas were reflected tangibly and measurably in his writings. Endowed with an intense intelligence, Jefferson could well separate what he knew to be true, what might be true, and what ought—but never would—be true.

Chinard argued forcefully that when it came to the Declaration as well as to the laws of Virginia, Jefferson understood what would and would not work in America. “No greater mistake could be made than to look for his sources in Locke, Montesquieu, or Rousseau,” Chinard argued, most certainly exaggerating to make a point. “The Jeffersonian democracy was born under the sign of Hengist and Horsa, not of the Goddess Reason.” As proof, Chinard—himself, it should be remembered, of French birth and stock—drew upon John Adams’ description of Jefferson’s proposed seal of the United States in 1776: “Mr. Jefferson proposed, the children of Israel in the wilderness led by a cloud by day, and a pillar by night—and on the other side, Hengist and Horsa, the Saxon chiefs, from whom we claim the honor of being descended, and whose political principles and form of government we have assumed.” Even if you’re an extremely intelligent reader—and, after all, you wouldn’t be here at The American Conservative if you weren’t—you might be scratching your head. Newton and Locke, certainly. You know them well. But Hengist and Horsa? Who on God’s green earth are these two? Unless you spend your time reading early medieval Celtic or Anglo-Saxon poetry—such as Beowulf—or modern British fantasy by C.S. Lewis and J.R.R. Tolkien, the names probably mean little or nothing. The two Saxon chiefs reside more in myth than in history, at least as professional historians understand the term.

For Jefferson, though, Hengist and Horsa represented the great republican tradition of the Germanic tribes sitting under the oak trees, deciding what was common law and what was not, speaking as representatives of their people in the Witan, and living as free men, bound to no emperor. To the American founding generation, Hengist and Horsa were as real as Cincinnatus, the Roman republican who threw down the sword, refused a permanent dictatorship of the city, and walked into the country to spend his life as a farmer. In the long scheme of things, the accuracy of the founders’ understanding of history matters little. They believed in Cincinnatus, Hengist, and Horsa, and they acted accordingly.

It isn’t hard to find the classical world that intrigued Jefferson’s mind. Probably no one has documented this as well as Carl Richard in his 1994 magnum opus, The Founders and the Classics. As late as 1810, Jefferson complained that any understanding of current events took precious time away from his reading of Tacitus and Homer. Roughly a decade later, he admitted, “I feel a much greater interest in knowing what has happened two or three thousand years ago than in what is now passing.” Though he loved Homer and Tacitus most, Virgil was not far behind. When Jefferson founded the University of Virginia in the late 1810s, he noted that all the science in the world meant little if a student failed to learn Greek and Latin. He wanted to exclude all professors and students who could not readily read the classics in their original language. Only this way could the nature of man, the temptations of power, and the attainment of the virtues truly be understood.

All of this came together for Jefferson in what he called the “harmonizing sentiments of the day,” a phrase historian Hans Eicholz has beautifully explicated. In an 1825 letter to Henry Lee explaining the purpose behind the Declaration, Jefferson wrote:

This was the object of the Declaration of Independence. Not to find out new principles, or new arguments, never before thought of, not merely to say things which had never been said before; but to place before mankind the common sense of the subject, in terms so plain and firm as to command their assent, and to justify ourselves in the independent stand we are compelled to take. Neither aiming at originality of principle or sentiment, nor yet copied from any particular and previous writing, it was intended to be an expression of the American mind, and to give to that expression the proper tone and spirit called for by the occasion. All its authority rests then on the harmonizing sentiments of the day, whether expressed in conversation, in letters, printed essays, or in elementary books of public right, as Aristotle, Cicero, Locke, Sidney, etc.

If Jefferson really is the best mind that America has produced—he probably is—and if his greatest contribution to the world is his Declaration that all men are endowed with certain inalienable rights, we would be fools to ignore our classical and medieval lineage. Indeed, we would become nothing more than mischievous European radicals, bent on altering all things inherited from man and God, giving neither his proper due.

This article (and my views, such as they are) benefitted immensely from the thoughts and words of Dedra Birzer, Gilbert Chinard, Hans Eicholz, Winston Elliott, Kevin Gutzman, Christian Kopff, Don Lutz, Dumas Malone, Rob McDonald, and Carl Richard.

Bradley J. Birzer is The American Conservative’s scholar-in-residence. He also holds the Russell Amos Kirk Chair in History at Hillsdale College and is the author, most recently, of Russell Kirk: American Conservative.

Russell Kirk at 100

Even among the odd, Russell Amos Kirk was unusual. Perhaps only in America could such an eccentric and anti-individualist individual have arisen. And arise he did.

One hundred years ago, Kirk entered the world. Born to poverty-stricken but bookish Anglo-Saxon Celts on the wrong side of the tracks in Plymouth, Michigan, probably very few looked at his young parents and believed them capable of producing a genius. Kirk’s mother was a quiet saint, but his father was a ne’er-do-well who never quite got his life together and certainly never earned any respect from his only son. Like so many who settled in Michigan during the 19th century, the Kirks and their relatives had come with the first waves of immigration to the northern American colonies, slowly migrating across New England, upstate New York, and towards the Great Lakes. They had lost their Puritanism at some point, these old Yankees, and were to become the supporters of Abraham Lincoln and the proud backbone of the Union army during the American Civil War. Northern and agrarian, they replaced their lost Knoxian faith with not only cherished books but also with séances, faith healings, levitations, and bizarre spiritual liturgies.

Like so many traditionalists of the 20th century, Russell Amos Kirk (“Jr.”) revered his grandfather while dismissing his father. The traditionalist sympathies and piety he cherished were absent in his father’s generation, which deemed such things unworthy even of consideration. All real hope rested in his grandfather’s cohort. Russell especially admired his maternal grandfather, Frank Pierce, a well-read and rough Stoic character.

Unsure of what greater things to believe, young Kirk found certainty in his mother and in her father. Through their influence, he read everything he could find, from the collected works of James Fenimore Cooper to Thomas Jefferson to Karl Marx, all while still in his tweens. He never stopped reading, and possessed amazing recall courtesy of what was almost certainly a photographic memory. Of all the great things in the world, however, nothing bested a walk with his quietly certain grandfather. On those walks, Russell felt his mind sharpen, his soul enlarge, and his world come into focus.

Not realizing how abnormal it was for a 12-year-old to read and devour all the knowledge and wisdom around him, Kirk never felt the sting of poverty, so immersed was he in the life of the mind.  

When moved, he began to write, and once he started, he never stopped. One of his most loyal students, Wes McDonald (RIP), noted that Kirk probably wrote more in his lifetime than even the most educated American reads in a lifetime. Having been privileged to read all of Kirk’s published books, articles, and reviews, as well as his private correspondence and papers, I can affirm McDonald’s suspicion. Everywhere Kirk traveled, he took with him his three-piece tweed suit, a swordstick, and a typewriter. The eminent scholar Paul Gottfried claimed that watching Kirk at the typewriter was akin to watching Beethoven compose. According to those who knew Kirk well, he could type while carrying on a conversation. His photographic memory allowed him to reference things he had read years before without looking them up. Sometimes in a furious day, he might answer dozens or even hundreds of letters, and often in a furious night, he could produce a full book chapter.

Almost as soon as Kirk entered Michigan State College as an undergraduate in 1936, Professor John Abbott Clark took him under his care and introduced him not just to the profoundly important but already neglected works of Irving Babbitt and Paul Elmer More, but to Socratic and Ciceronian humanism as a fundamental part of the Western tradition. During his college years, Kirk combined his love of romantic literature, the humanist ideals of Babbitt and More, and the stoic wisdom of his grandfather into what would be recognized by 1953 as modern conservatism. While earning an M.A. in history at Duke in 1940 and 1941, Kirk also discovered an intense love of Edmund Burke, whom he’d encountered through Babbitt and More in college but only indirectly. It was while writing his M.A. thesis on the rabid Southern republican John Randolph of Roanoke that Kirk first felt the influence of the greatest of the 18th-century Anglo-Irish statesmen. Though many scholars—from Daniel Boorstin to Leo Strauss to Peter Stanlis—were also re-discovering Burke (along with Alexis de Tocqueville) in the 1940s, it was Kirk’s 1953 work, The Conservative Mind, that would once again make Burke a household name in America and, to a lesser extent, in Great Britain.

By the time America entered World War II, a very young Kirk—rather enthusiastically Nockian and anarchistic—already despised Franklin Roosevelt for his mistreatment of ethnic and religious minorities at home and abroad and his militarization of the American economy. As much as Kirk hated Hitler, he did not see FDR as a viable alternative. Succumbing to the draft in the late summer of 1942, Russell Amos Kirk, B.A., M.A., endured in the military the only way he knew how: by spending all of his free time reading. Before shipping off to training at Camp Custer in Michigan (he would spend much of the war as a company clerk in the desert wastes of Utah), Kirk purchased every work of Plato and the Stoics that he could find. From his childhood to his death, he kept a copy of Aurelius’s Meditations close to him. As in the rest of his life, it would serve as his greatest comfort during the war. As he wrote in a personal letter, “everything in Christianity is Stoic”:

Really, the highest compliment I can pay to the Greeks is that they could understand and admire the Stoics and admit their own inferiority. Were the Stoics to ask the moderns the rhetorical questions they asked the Greeks, the moderns also would accept the questions as rhetorical—but would answer them in exactly the opposite manner.

In imitation of Aurelius, his own war diaries attempted to describe the world around him through the lens of the Logos, the eternal order of the universe as the Greeks and Romans understood it. “‘Nothing is good but virtue’—Zeno,” Kirk scrawled across the cover of his first diary.

That same Stoicism, however, also made Kirk profoundly aware of the majesty of nature, even in the desert wastes of the Great Basin. Though he had always been an excellent writer, something fundamentally changed in his view of the world in September of 1942. Kirk, though only 24 when he wrote these words, is worth quoting at length:

This is written in the dead of night (and why shouldn’t it be the dead of night? All else is dead here, and has been ever since the beginning of time). …I handle special orders, travel orders, daily bulletins, and the like—a great many stencils to type—and am a star contributor to the Sand Blast, our paper, a copy of which I’ll send you once we get the next issue out; I intend to do some brief literary criticism for it, once the post library opens. Officers are affable, hours required are briefer than those I had as a civilian, and the work is very light and sometimes infrequent. …I’ve grown to endure the country in true Stoic fashion, and take a certain pleasure in feeling that I’m a tough inhabitant of one of the most blasted spots on the continent. There’s enough leisure here, and that’s a lot; the winters are said to be dreadful, but I have found fears exceed realities here, as everywhere. Already we have very cold mornings and evenings, and as I write a great sand-laden wind very chilly, is howling around the shacks of Dugway. Coming here tends to make me lean toward the Stoic belief in a special providence—or, perhaps, more toward the belief of Schopenhauer that we are punished for our sins, in proportion to our sins, here on earth; for I’d been talking of Stoicism for two or three months before I burst into Dugway and there never was a better and sterner test of a philosophy, within my little realm of personal experience—to be hurled from the pleasures of the mind and the flesh, prosperity and friends and ease, to so utterly desolate a plain, closed in by mountains like a yard within a spiked fence, with everywhere the suggestion of death and futility and eternal emptiness. But, others, without any philosophy, live well enough here; and, as Marcus Aurelius observes, if some who think the pleasures of the world good still do not fear death, why should we?

Though disgusted by the ill treatment of Japanese Americans and the dropping of the atomic bombs, Kirk remained loyal to, if not uncritical of, the United States. The Army finally released Kirk in 1946, and after two years of teaching Western civilization at Michigan State, he accepted a place in the doctoral program at the University of St. Andrews in Scotland in 1948. It was after a messy breakup with a girlfriend in the fall of that year that a dispirited Kirk devoted himself to “an invigoration of conservative principles.” The five years of research and writing that followed would become The Conservative Mind, published in the spring of 1953.

Amidst today’s whirligig of populist conservatism, crass conservatism, and consumerist conservatism, we conservatives and libertarians have almost completely forgotten our roots. Those roots can be found in Kirk’s thought, an eccentric but potent mixture of Stoicism, Burkeanism, anarchism, romanticism, and humanism. It is also important—critically so—to remember that Kirk’s vision of conservatism was never primarily political. Politics should play a role in the lives of Americans, but a role limited to its own sphere, one that stays out of rival areas of life. Family, business, education, and religion should each remain sovereign, devoid of politics and politicization. Kirk wanted a conservatism of imagination, of liberal education, and of human dignity. Vitally, he wanted a conservatism that saw all persons—regardless of the accidents of their birth—as individual manifestations of the eternal and universal Logos.

A hundred years after the birth of Russell Amos Kirk, those are ideas well worth remembering.

Bradley J. Birzer is The American Conservative’s scholar-in-residence. He also holds the Russell Amos Kirk Chair in History at Hillsdale College and is the author, most recently, of Russell Kirk: American Conservative.

What Does Classical Education Have to Do With Revolution?

Juan Antonio Ribera’s “Cincinnatus Leaves the Plough to Dictate Laws to Rome” (circa 1806) (public domain).

As I read about the political insanity this weekend and the ridiculous blame game for the looming government shutdown—will it be remembered as Trump’s fault or as Schumer’s fault?—I can’t help but think about what no one is talking about: how to solve our $21 trillion national debt. This number breaks down to a little over $170,000 per U.S. taxpayer.

It’s infuriating that the politicos attempt (and, more often than not, manage) to distract us from the real issue. There’s an Orwellian element to all of this, whether intentional or not. The most important issue is so critical that it overwhelms our faculties: Washington, D.C., and our federal government are, at this point, simply insolvent. Whether the cause is mainly social spending or military spending, we’re insolvent. At some point, everyone will see the federal government for what it is, and, at that point, the collapse will be not just swift but horrific. Yet there seems to be no reform coming. At least no serious reform.

Even the most pro-interventionist of the American founders, Alexander Hamilton, could never have imagined or desired the kind of federal government we have now. When he wrote of “energy” in government, he meant it as a means of restraint. To give “energy” to government meant, at least to Hamilton, giving the federal government the means to execute the powers expected of it by its Constitution. Rather brilliantly, he argued that a government charged with a duty but not empowered by the specific rules of that government to accomplish its duty would merely make up its own rules, thus taking government away from restraint and toward leviathan. Though many libertarians think of Hamilton as the touchstone for all future expansive government, they’re wrong. Even Alexander Hamilton desired ways to limit the expansion of government, and whether he wanted a strong executive or not, he envisioned a small, commercial republic as the proper outcome of the American revolution.

Over the previous three pieces in this series, “The Origins of the Rise of the Modern Nation State,” I’ve focused almost exclusively on the classical understanding of government. There is, I must confess, a method to my madness. One need only look at the actual classical words and symbols used by the founders to see how immensely indebted they were to the ancients. The U.S. Senate, for example, is modeled on the Maryland Senate, which is modeled on the Roman Senate. “Senate” comes from the Latin for “old wise men.”  If only!  

Two Roman fasces grace either side of the Speaker’s chair in the U.S. House of Representatives. (Official White House Photo by Pete Souza)

Or, even more blatantly, look at our Capitol building. While we might expect our founders to have designed it as something grand and spectacular, such as the Hanging Gardens, the Taj Mahal, or even the English Parliament, they chose an architectural style from the height of the Roman Republic. Which, of course, is also why a Washington with thousands of armed guards, black SUVs, roadblocks, and rooftop surface-to-air missiles looks so ominous. Nothing is worse, when regarding the symbols of authority, than the militarization of republican architecture. The fasces of Congress quickly look like the fasces of Mussolini. Even if we don’t recognize it immediately, something in us recalls how readily Rome succumbed to the temptations of power as we drive around the D.C. of 2018.

The hold of the classical world on the founding mind, however, is much deeper than architecture or names. To enter college in one of the nine schools available in the American colonies in, say, 1750, one had to prove fluency in Greek and Latin. The grand historian of the period, Forrest McDonald, and his wife, Ellen, explained:

Just to enter college during the eighteenth century—which students normally did at the age of fourteen or fifteen—it was necessary, among other things, to be able to read and translate from the original Latin into English (I quote from the requirements at King’s College—now Columbia—which were typical) “the first three of Tully’s Select Orations and the first three books of Virgil’s Aeneid,” and to translate the first ten chapters of the Gospel of John from Greek into Latin, as well as to be “expert in arithmetic” and to have a “blameless moral character.”

To be prepared for a college education, pupils began studying Greek and Latin around the age of six or seven. Indeed, one thing we in the world of schooling for democratic citizenship often forget is that all education in the 18th Century was classical education (even the term, “classical education,” would be redundant to the 18th Century mind). One was supposed to learn reading, writing, and arithmetic at home. Schools taught only Greek, Latin, and classical literature. Even farm children, with only a year or two of schooling in their lives, spent their school days drilling Greek and Latin.

The truly enterprising student would also study Italian, if for no other reason than to read Dante in the original.

This is a world 300 years and a million miles apart from ours. It is no wonder, though, that George Washington (one of the few founders not liberally educated, interestingly enough) chose the mythic republican Cincinnatus and the republican rebel Cato the Younger as his exemplars or that the founders as a whole wanted a republic. This understanding of the classical world pervaded all of America, even the America that had not received much classical education, if any. Names such as George (from the Greek for “farmer”), Narcissa, and Romulus were not uncommon proper names. Towns and counties took the names Homer, Athens, Remus, etc. Though not every American had read Virgil’s Aeneid, every American knew something about Aeneas, Troy, and Dido. Tellingly, the McDonalds reminded us, when American officers and French officers spoke on the field of battle during the Revolutionary War, they spoke in Latin, the only common language they shared. The index to the Federalist Papers quickly reveals as much, with 56 references to the classical and medieval world of the West and no references to John Locke.

Among the Romans, the American founders most appreciated and idealized the stoic Cato the Elder, the martyr Cicero, the poet Virgil, the historian Livy, and the theorist Tacitus. While the founders knew and studied the Greeks, it was the Roman Republicans that inspired them and the Roman imperials that terrified them.

“The Revolutionary leaders were men of substance—propertied, educated. They read. And what they read made it easier for them to become rebels because they did not see rebels when they looked in the mirror,” historian Trevor Colbourn has written. “They saw transplanted Englishmen with the rights of expatriated men. They were determined to fight for inherited historic rights and liberties.”  

When writing the Declaration of Independence, Thomas Jefferson explained that he drew on ancient sources:  

This was the object of the Declaration of Independence. Not to find out new principles, or new arguments, never before thought of, not merely to say things which had never been said before; but to place before mankind the common sense of the subject, in terms so plain and firm as to command their assent, and to justify ourselves in the independent stand we are compelled to take. Neither aiming at originality of principle or sentiment, nor yet copied from any particular and previous writing, it was intended to be an expression of the American mind, and to give to that expression the proper tone and spirit called for by the occasion. All its authority rests then on the harmonizing sentiments of the day, whether expressed in conversation, in letters, printed essays, or in elementary books of public right, as Aristotle, Cicero, Locke, Sidney, etc.

John Adams—the first American to argue for independence, as early as 1765—said much the same as Jefferson in 1774:

These are what are called revolution principles. They are the principles of Aristotle and Plato, of Livy and Cicero, of Sidney, Harrington, and Locke; the principles of nature and eternal reason.

Unlike the French or Russian revolutionaries, attempting to create, in the words of Shakespeare, a “brave new world,” the American patriots turned the world right-side up. They desired a republic rooted in right reason, first principles, and the Natural Law. God had written the republican principles of the American Revolution into nature herself. “We do not by declarations change the nature of things, or create new truths, but we give existence, or at least establish in the minds of the people truths and principles which they might never have thought of, or soon forgot. If a nation means its systems, religious or political, shall have duration, it ought to recognize the leading principles of them in the front page of every family book,” a leading Anti-Federalist wrote in the aftermath of the war for Independence.

For this reason, the modern American conservative has a duty to know not just the origins of the American republic, but its origins in the Roman republic. After all, if we’re not conserving these things, what is it worth to be a conservative?  

When the founders of the United States created her, they wanted a republic, not an empire; a government, not a state; and a commonwealth, not a democracy.

Bradley J. Birzer is the president of the American Ideas Institute, which publishes TAC. He holds the Russell Amos Kirk Chair in History at Hillsdale College and is the author, most recently, of Russell Kirk: American Conservative.


Bring on the Conservative Debate for Immigration

In recent years, probably no matter has split nationalist and populist conservatives from libertarian and anti-statist conservatives more than that of immigration. Yet very few conservatives are actually taking the time to debate or discuss this issue, so fundamental to understanding the very essence of who we are as an American people. Too many suppositions and assumptions have taken on the air of truth, and, if for no other reason than that, the topic demands good discussion and vigorous debate. In particular, the modern American conservative should praise Gerald Russello and The University Bookman for their ongoing symposium dealing with the whole swirling mess. We need much more of this. It’s too important to leave to emotion or passion alone.

As Christians around the world celebrated the arrival of the Three Kings—the Magi of the Orient—on Epiphany, the president of the United States called for $33 billion to shore up America’s borders, $18 billion of it for the wall.

Would the Magi have been admitted in 2018? “Excuse me, Balthasar, but I need to see that your papers are in order.  Oh, I’m sorry, but your gift of myrrh exceeds our 3.2 ounces of liquid allowed.”  

Perhaps President Trump simply chose his timing poorly, but it would be impossible for the Christian to miss the irony.

As a professor of the western canon, the Great Ideas of the West, and the western tradition, I find it nearly impossible to claim that there is a long tradition of excluding those who “aren’t us.” Even the most cursory examination of the issue reveals that the best of western thinkers have considered political borders a form of selfish insanity and a violation of the dignity of the human person.  The free movement of peoples has not only been seen as a natural right throughout much of the western tradition, but it has also been seen as a sacred one.

In the gloriously pagan Odyssey, Odysseus survives, again and again, because the highest commandment of Zeus is to welcome the stranger and protect him with all that one has. To this day, one finds remnants of this tradition throughout the Mediterranean as the stranger is greeted with olive oil, bread, and, depending on the predominant religion of the region, wine. As staple crops of the ancient world, these signified not just acceptance but actual joy at the arrival of the stranger. The god of the hearth stood as patron of the sojourner.

The Athenians, during the tumultuous fifth century before Christ, prided themselves on admitting into their communities not just strangers but even their very enemies. After all, what did the Athenians have to hide? Why not expose the ignorant to truth? Let the oppressed see how a free people live.

During the vast, long expanse of the Middle Ages, the Germanic peoples not only thought of themselves as residents of their own little piece of Middle-earth (Midgard), but they also thought of themselves as citizens of what King Alfred the Great labeled Christendom, the Christiana res publica, as well as believing themselves sojourners en route to the City of God. What Christian could allow—in good conscience—the accidents of birth such as gender or skin tone in this Vale of Tears to trump the possibilities of eternal salvation in the next? Neither Greek nor Jew, neither male nor female. . . .

Nothing in Christendom better represented the ideals of the free movement of peoples than did the Great Charter of 1215, forced upon King John at Runnymede.  Though points 1 and 63 of the Magna Carta demanded freedom of the Church from political interference, points 41 and 42 reveal how fundamental the movement of peoples is to the sanctity of the common law.

  41. All merchants shall have safe and secure exit from England, and entry to England, with the right to tarry there and to move about as well by land as by water, for buying and selling by the ancient and right customs, quit from all evil tolls, except (in time of war) such merchants as are of the land at war with us. And if such are found in our land at the beginning of the war, they shall be detained, without injury to their bodies or goods, until information be received by us, or by our chief justiciar, how the merchants of our land found in the land at war with us are treated; and if our men are safe there, the others shall be safe in our land.
  42. It shall be lawful in future for anyone (excepting always those imprisoned or outlawed in accordance with the law of the kingdom, and natives of any country at war with us, and merchants, who shall be treated as if above provided) to leave our kingdom and to return, safe and secure by land and water, except for a short period in time of war, on grounds of public policy—reserving always the allegiance due to us.

If we accept the Magna Carta as one of the most important documents in the history of western civilization, we Americans cannot afford to ignore it, its intent, or its specifics.  Common law demanded that a people—and the person—move freely, border or not. Even in time of war, the enemy must be treated with dignity.  

Equally important, can we Americans afford to ignore that the pagans, such as Odysseus, as well as the Christians, such as King Alfred, stood alike for the free movement of peoples and the welcoming of the stranger? To this day, the Roman Catholic Church, following the Hebraic Decalogue, teaches: “The more prosperous nations are obliged, to the extent they are able, to welcome the foreigner in search of the security and the means of livelihood which he cannot find in his country of origin. Public authorities should see to it that the natural right is respected that places a guest under the protection of those who receive him.” To be sure, the immigrant must fulfill his or her duty as a citizen as well.

As an American conservative, I am not suggesting that we should surrender our own free will to the dictates of the past or even to any one religion, but I do think we would be foolish beyond measure to ignore the advice of our ancestors. And, for what it’s worth, the best of our ancestors believed in the free movement of peoples.

When it comes to the specifically American tradition of immigration and the free movements of peoples, the issue becomes more complicated.  

Imagine for a moment that the great waves of immigration never came to America.  In the colonial period, among those who freely chose to cross the Atlantic, you would have to dismiss the Anglicans to Virginia, the Puritans to New England, the Quakers to Pennsylvania, and the Scotch-Irish. Of the unfree peoples, you would have to take out all of those of African origin. In the 1840s, remove the Germans, the Scandinavians, and the Irish.  In the 1880s through the 1910s, remove all Greeks, Poles, Jews, Italians. . . .  

Yes, the native American Indian population would be justly celebrating, but, from any relatively objective view, there would be no America.

Between 1801 and 1924—with the critical exception of the Chinese and the Japanese—no peoples were barred from entry into the United States. Congress forbade further Chinese immigration in 1882, and a gentlemen’s agreement ended Japanese immigration in 1907. Otherwise, until 1921 and 1924, any person of any continent, of any religion, of either gender, of any skin color, or any other accident of birth could enter the United States and take up residency the very day of arrival. Only those with known criminal records or those suffering from tuberculosis were turned away.

Unless you are a full-blooded American Indian (less than one percent of the present United States population), you, American reader, would not be here without some ancestor having immigrated—freely or by force—to the United States. And possibly from what one might crassly dismiss as a “sh-hole country.”

Thus, our ancestors not only expressed their favor of the freedom of movement among peoples in their writings and laws, but, when push came to shove, they also voted with their feet.

Since the tragedies of September 11, 2001, we Americans have surrendered not just our liberties but our very souls to the false notion and false comfort of governmentally-provided security.  Tellingly, we have even closed off what was once the freest and longest border in the history of the world, our border with our extremely kind and polite neighbor to the north, Canada.

Again, I am not suggesting we must be slaves to the past, nor am I suggesting that we should dismiss the legitimate security concerns of a sovereign people. But, as an American people, we came into being because of the free movement of peoples. We rebelled against the designs of the 18th-century British, and we mocked the 19th-century Europeans and their passports and border guards.

Now, we seem to have become them.  

If we continue to build walls around our country, really, then, just who are we?  Only in the last generation or so have so many American conservatives become convinced of the necessity of the vast array of restrictions on those who wish to become a part of the United States.  Perhaps they are right, but, regardless, there is much to discuss.

Bradley J. Birzer is the president of the American Ideas Institute, which publishes TAC. He holds the Russell Amos Kirk Chair in History at Hillsdale College and is the author, most recently, of Russell Kirk: American Conservative.


Heroism and Realism in Christopher Nolan’s Batman

“I see a beautiful city and a brilliant people rising from this abyss. I see the lives for which I lay down my life, peaceful, useful, prosperous and happy. I see that I hold a sanctuary in their hearts, and in the hearts of their descendants, generations hence. It is a far, far better thing that I do, than I have ever done; it is a far, far better rest that I go to than I have ever known.”

—Obituary for Bruce Wayne, taken from Dickens’s A Tale of Two Cities

This article appears in the January/February 2018 issue of TAC.

In 2005, Time Warner released Batman Begins, the first high-budget film by Anglo-American filmmaker Christopher Nolan (who later did Dunkirk, Inception, and Interstellar), known at the time only to a few cinema nuts for his low-budget but intensely artful and intellectual films (Memento). The Batman franchise—from novels to comic books to movies to toys—had been a hugely profitable property for Time Warner for years. Still, most Americans viewed Batman as a really neat comic book figure. “Be yourself. Unless you can be Batman. Then, be Batman.” When the character had appeared on screens, it was as a countercultural buffoon on television in the 1960s, then two decades later as a big-screen gothic and carnivalesque weirdo in the hands of Tim Burton and his followers. Only Bruce Timm’s excellent animated Batman, which aired afternoons during the early 1990s, did the character justice, but this version, given the medium, reached only a handful of diehard Batman fans.

And so with Nolan the question emerged: could this newcomer to big film projects transfer his cinematic intensity and intellectualism to Batman, thus transforming him from pop sensation to a cultural mainstay, giving the property gravitas and the studio profit?

The answer, it turns out, was yes. Two central elements of Nolan’s filmmaking characterized his particular Batman genre. First, he brought the characters into the realm of realism. They reside in the actual world, not a fantasy world, and events and developments can all be explained rationally. Second, Nolan fashioned his central character not from the pastel pages of a comic book but rather from America’s western legend, the frontier mythos that captured the national consciousness so powerfully in multiple movies of the 1940s and TV shows of the 1950s. This western legend, or myth, was larger than any single person, event, or even culture. And there were no antiheroes in that cultural fare, which focused on the daunting challenge of extending the essence of Western civilization to those forbidding and often dangerous lands of the Rocky Mountains and beyond. It took real heroes to do that.

Thus does Nolan’s Batman trilogy stand today as a remarkable cultural achievement. Indeed, his third film in the trilogy, The Dark Knight Rises, was not only the best of the three but is arguably one of the finest movies ever made, a true achievement of the cinematic arts, certainly worthy of an Alfred Hitchcock or a John Ford. It also may be the single most important defense of Western civilization ever to reach a Hollywood screen. That the severe cultural liberals of the West Coast didn’t rip it to shreds indicates they probably didn’t watch it—or perhaps didn’t understand it.

In crafting his Batman movies, Nolan pulled together his longtime core development team—his wife, Emma, and his brother, Jonathan—but he also turned to his troupe of actors from previous projects, including Christian Bale and Cillian Murphy. Nolan, though a longtime Batman fan, had never been a collector or reader of comic books, and he concluded that he needed an expert in the original comic book Batman. He wisely turned to David Goyer, an Ann Arbor native and lifelong comic book fan and writer. Goyer not only had written for DC and Marvel (the two main comic book companies and friendly rivals) but also had written some extraordinary film scripts, such as the Gothic noir dystopia Dark City (1998), arguably one of the most imaginative science-fiction films ever made.

Partly because he was untried in this kind of filmmaking and partly because of his own artistic sensibilities, Nolan developed no initial plan for any sequels. He wanted every member of his team to see this film as a one-time opportunity, holding nothing back in its making. As he put it:

People ask if we’d always planned a trilogy. This is like being asked whether you had planned on growing up, getting married, having kids. The answer is complicated. When David and I first started cracking open Bruce’s [Bruce Wayne’s] story, we flirted with what might come after, then backed away, not wanting to look too deep into the future….I told David and Jonah to put everything they knew into each film as we made it. The entire cast and crew put all they had into the first film. Nothing held back. Nothing saved for next time.

As Nolan approached the story, he decided two critical things. The first was his insistence on realism. If something happened that could not be explained rationally, he excised even the idea of it. Everything from the Batmobile to the reaction of the police had to be utterly realistic. If the Batman headgear needed two ears, there needed to be an explanation for those two ears. If Batman jumped from building to building, there needed to be a reason why and an explanation as to how. The second was his embrace of the myths of the American West. Here Nolan was tapping into something already in the Batman mythos but not explicitly understood by the larger public. Like Natty Bumppo of James Fenimore Cooper’s “Leatherstocking Tales” and Mark Twain’s Huck Finn, Bruce Wayne/Batman stands as a crucial American symbol. If Bumppo and Finn personify the American frontier of the 19th century, so Wayne/Batman is the great mythological figure of 20th- and 21st-century urban America. Named after the Revolutionary War general Mad Anthony Wayne, and coming from one of the wealthiest of American families (builders and defenders of Gotham City, a Platonic shadow of New York, populated by 30 million people), Bruce Wayne considers it his aristocratic duty to protect the poor and oppressed from the wealthy and corrupt. He is an Arthurian but also deeply American figure.

In the Batman stories as developed in the comics over the last three decades, it has come to light that the Waynes have been keepers of the Holy Grail in the present day, descendants of Arthur from 2000 years ago. As with Arthur, Wayne surrounds his Batman persona with a number of knights (the Gotham Knights) who serve under such codenames as Nightwing, Robin, Oracle, and others. While, to our modern eyes, they seem much like a Marine platoon, they more properly resemble the Catholic military orders of the High Middle Ages. As with Arthur, Wayne must enter the Chapel Perilous, time and again, to keep the darkness of the Waste Land at bay.

Brilliantly, Nolan wrapped the eventual Dark Knight Trilogy into the significance of myth, and the significance of myth into the story. When attempting to explain to Alfred, his father figure, butler, and accomplice, what he hoped to do when returning to Gotham City, Wayne says: “People need dramatic examples to shake them out of apathy and I can’t do that as Bruce Wayne. As a man I’m flesh and blood. I can be ignored. I can be destroyed. But as a symbol, as a symbol I can be incorruptible, I can be everlasting.” 


Several themes inform each movie. The first movie deals with justice and fear; the second with free will and anarchy; the third with hope and reformation.

In Nolan’s typical but eccentric way, the first movie jumps repeatedly in time, creating a whole out of non-linear storytelling. The essential tale is familiar, at least to Americans born after 1939, but Nolan adds his own tastes and vision.

The son of the wealthiest couple in the greatest city of the Western world, Gotham, Bruce Wayne, as a young boy, stands with his parents in “Crime Alley,” having left the opera. A killer shoots both parents and takes their money and jewels. Left an orphan, Bruce is raised by the family butler, Alfred Pennyworth. After dropping out of Princeton and ineffectively confronting the man he believes ordered the hit on his parents, Wayne departs Gotham City, traveling throughout the world for years, learning what it means to fight, to suffer, and to survive. The purpose, as he sees it, is to hone all his skills—physical as well as intellectual—and return to Gotham to protect the innocent.

In Batman Begins, Wayne finds himself in a high Tibetan temple belonging to an evil and inverted type of Knights Templar, the “League of Shadows,” an organization that demands the end of corruption of public officials. It calls them to account by destroying any city that has become unrepentantly corrupt. They claim responsibility for having destroyed Rome, Constantinople, and London, across the centuries. Now they say they will take out Gotham. The leader is a man named Ra’s al Ghul, an Arabic title meaning, “Head of the Demon.” The League portrays itself as superior to all political organizations in promoting what it sees as justice, a harmony derived from a very Nietzschean desire for the “will to act.” The League of Shadows, Ra’s al Ghul explains, has been a check against human corruption for thousands of years. “We sacked Rome. Loaded trade ships with plague rats. Burned London to the ground. Every time a civilization reaches the pinnacle of its decadence, we return to restore the balance.”

Though trained by the League of Shadows to be the heir to Ra’s al Ghul, Wayne rejects its brutal philosophy, destroys its temple, and returns to Gotham, presuming incorrectly that he has destroyed the League.

Back in Gotham, he assumes the primal symbol of his own fears, a Bat, hoping to employ terror against evil. As Batman, he stands as a living gargoyle, adorning the cathedral of Western civilization while driving away rival evils. In his fight, he relies on four persons to sustain him: Alfred, to serve as Watson to his Holmes; Police Lieutenant James Gordon, the only honest cop in Gotham; Lucius Fox, a master engineer and entrepreneur; and Rachel Dawes, the one true love of his life, now an assistant district attorney. Rachel particularly complicates Wayne’s life, as she is unsure of his sanity and his intentions, especially in his assumption of the Bat persona.

The movie—operatic in a Wagnerian way from opening to final scene—concludes with Wayne barely defeating the revived Ra’s al Ghul and his League of Shadows. In the war against the League, Wayne Manor is destroyed, Rachel reveals that she cannot love a man who fights crime as a Bat, and a poison is loosed upon an area of Gotham known as “The Narrows,” a decayed part of the city that houses the poor and the insane. The consequences of this poison remain unknown as the movie ends, but Lieutenant Gordon, now fully in alliance with Batman, shows him the “calling card” of a new masked criminal, a grimy playing card of a joker.

The second movie, The Dark Knight (2008), begins with the Joker and his henchmen stealing from a mob bank. The heist, filmed as a tribute to such neo-noir crime classics as The French Connection (1971) and Heat (1995), goes off as planned, introducing the audience to the face of diabolic anarchy and insanity, the stunning Joker (played by Heath Ledger, who died of a drug overdose after filming).

Unlike the first movie, filmed almost entirely in shadow, with vertical lines and a Gothic noir aesthetic, The Dark Knight presents a much shinier and sunnier Gotham, its architectural lines straight, sleek, clean, and horizontal. This story, atypically for Nolan, is linear, driving relentlessly from the opening heist to the final tragic moments.

The story centers on the Joker’s attempt to destroy Gotham from within, through anarchy. As he puts it:

I’m a dog chasing cars … I wouldn’t know what to do with one if I caught it. I just do things. I’m just the wrench in the gears. I hate plans. Yours, theirs, everyone’s. Maroni has plans. Gordon has plans. Schemers trying to control their little worlds. I’m not a schemer, I show the schemers how pathetic their attempts to control things really are. So when I say that you and your girlfriend was nothing personal, you know I’m telling the truth.

He may denigrate plans, but the Joker is a master chessman, planning and scheming, always three or four moves ahead of his opponents. According to all law enforcement databases, the Joker should not exist—no fingerprints on record, clearly trained in some form of special ops, outfitted entirely in custom clothes.

Though The Dark Knight deals with anarchy and plans, the movie also probes the notions of free will and duality. If we choose A, are we doomed to follow B? If we follow B, have we destroyed all future options? The question manifests itself most particularly in the personal story of Harvey Dent, a young and courageous district attorney, ready to become the face of decency in Gotham, a White Knight, replacing Batman’s Dark Knight.

To prove that no real goodness resides in the world, the Joker plays upon Dent’s weaknesses, killing his girlfriend (Rachel Dawes, also Wayne’s one love) and driving him to madness and evil deeds. In the final scene, with Gotham not knowing that Dent had succumbed to the Joker’s dark spirit, Dent takes Lieutenant Gordon’s family hostage. Struggling to protect Gordon’s children, Batman plunges over a building ledge with Dent, who is killed. Batman tells Gordon that he will take the blame, though he has done nothing wrong. The movie ends with Batman having saved hundreds of lives, defeating the Joker and Dent, but now becoming a hunted man, vilified as a murderer. “You’ll hunt me. You’ll condemn me. You’ll set the dogs on me.” But Gordon explains to his son that Batman is “a silent guardian, a watchful protector, a dark knight.”

Hoping to shed the vigilante mantle of the Batman, Wayne and Commissioner Gordon had placed all of their hopes on Dent, only to watch the Joker expose the unrestrained, abusive side the district attorney had kept hidden. Rather than allow the symbol of the White Knight to die with him, Wayne and Gordon hide the truth, letting Dent be remembered as a martyr and placing the blame for his killings on the Batman.

After The Dark Knight, Nolan insisted he had no intention of making a third movie. The death of his friend Heath Ledger had rattled him, inhibiting any return to the world of Batman. But Batman wouldn’t leave him alone, he later explained, and he had to make the third movie to find out how the story would unfold.

Nolan’s third Batman film was inspired by Charles Dickens’s A Tale of Two Cities, though much of the dialogue might have been written by the Anglo-Irish statesman Edmund Burke, considered by many the father of modern conservatism. The movie is in essence a retelling of the events of the French Revolution in Paris. In place of Robespierre is a mercenary, chemically-enhanced villain from either Eastern Europe or the Mideast, known only as Bane. This character is the creation of noted writer Chuck Dixon, a man both admired and reviled in the comic book world for his conservatism. Bane, working with several of Wayne’s competitors in business, has spent six months rebuilding the infrastructure of Gotham City, secretly lacing all of the concrete in streets, bridges, tunnels, and sewers with explosives. He considers himself the fulfillment of the infamous League of Shadows.

Coming out of retirement at 43, Batman investigates. But, when he encounters Bane in the sewers, the evildoer breaks his back. Bane takes the injured Wayne to a prison somewhere in the Middle East (filmed in an ancient city on the Pakistan-Indian border) and leaves him there to die. Languishing in this hell hole with his broken back, Wayne is reduced to watching TV, specifically, a single Gotham City channel. Bane wants Wayne to see the fall of Gotham as it implodes and collapses in on itself from the weight of its own corruption. In lines that could have been lifted straight out of Volume I of Alexander Solzhenitsyn’s The Gulag Archipelago, Bane explains to Wayne that while he is happy to have broken Wayne’s body, the prison is meant to destroy his soul.

Returning to Gotham, Bane detonates his explosives, destroying the city’s infrastructure as well as its access to and from the main island—the equivalent of Manhattan. In taking over the city, he warns the United States not to intervene or he will unleash a nuclear weapon with a six-mile blast radius, killing all on the island. By separating Gotham from the United States, Bane has created his own city-state. In the new, conquered city of Gotham, Bane frees all prisoners of Blackgate Prison (the Bastille) and declares the city to be under control of “the people.” The people, Bane says truthfully, have been deceived by the leadership of Gotham. Harvey Dent was not a White Knight but an insane, murderous criminal. Thus all of Gotham’s successes over the previous eight years, since Dent’s death, have been lies. Bane declares:

We take Gotham from the corrupt. The rich. The oppressors of generations who’ve kept you down with the myth of opportunity. And, we give it to you, the people. Gotham is yours. None shall interfere. Do as you please. . . . For an army will be raised. The powerful will be ripped from their decadent nests, and cast into the cold world the rest of us have known and endured. Courts will be convened. The spoils will be enjoyed. Blood will be shed.

In Soviet style, the criminal, the insane, and the poor ravage the homes, property, and persons of the wealthy, inverting the entire socio-economic structure of Gotham. The people—under the judgeship of Dr. Jonathan Crane, the “Scarecrow” and creator of the poisons in the first film—establish courts to sentence the wealthy for having preyed upon the poor. All such trials end in the execution of the guilty. This level of anti-communist passion has not been seen from Hollywood since Roland Joffé’s revealing if horrifying 1984 look into Cambodia under the Khmer Rouge, The Killing Fields.

Meanwhile, in his Middle Eastern prison, Bruce Wayne heals and regains his strength, spiritually as well as physically, climbing out of the pit (Plato’s Cave), liberating himself and his fellow prisoners. The fact that only one person had ever escaped from this prison heartens Wayne, for he calculates that if one person had escaped he could too.

Returning to Gotham City, Wayne as Batman takes control of the remaining police under Gordon’s command, raising a counter-revolutionary army. Leading hundreds of police into battle, he and his greatest ally, a somewhat reformed jewel thief named Selina Kyle, battle Bane and his revolutionaries. In hand-to-hand combat outside the Gotham City stock exchange, Kyle and Batman barely defeat Bane. Still, there remains the nuclear bomb. Taking his Bat—a hover aircraft based on the Harrier and the helicopter—Batman flies the bomb out of the city, over the Atlantic, and lets it detonate safely. Everyone assumes, however, that Batman sacrificed himself in saving Gotham.


At the funeral, attended only by four loved ones, a shell-shocked Gordon, who has only now come to realize the true identity of Batman, reads from A Tale of Two Cities. Looking at Wayne’s grave, next to those of Wayne’s mother and father, Alfred breaks down, believing that his entire life has been a failure. He had wanted to serve the Wayne family but had overseen its death.

The movie ends with Wayne Manor becoming St. Swithin’s, a home for orphaned boys led by a Catholic priest. Lucius Fox begins to suspect that Wayne survived the flight over the Atlantic; Gordon refurbishes the long-disused Bat signal; and Alfred, sitting at an outdoor café, sees Bruce and Selina together, in love, at a neighboring table.

In Nolan’s expert hands, Batman becomes what he was always meant to be: an American Odysseus, an American Aeneas, an American Arthur, an American Beowulf, an American Thomas More. Indeed, it would be hard to find another figure in popular or literary culture who more fully embodies the traditional heroism of the West than Bruce Wayne. He most closely resembles Aeneas, carrying the culture of charity and sacrifice into the darkest and most savage parts of his world. Like St. Michael, he guards the weak, the poor, and the innocent. Like Socrates, he will die for Athens (Gotham) as it should be rather than as it is. Like Beowulf, he asks nothing for himself, merely the opportunity to wage the never-ending war against evil.

And in the third film, Western civilization survives, but only barely and only with incredible sacrifice at every level. “I see that I hold a sanctuary in their hearts, and in the hearts of their descendants, generations hence,” Dickens had written.

While some might still see merely a children’s comic book superhero made glittery with a Hollywood budget in the Dark Knight Trilogy, it would be impossible not to recognize Nolan’s genius in these films. Unlike, say, Peter Jackson, who dumbed down The Lord of the Rings, Christopher Nolan leavened Batman. Jackson diminished Tolkien, while Nolan enlarged Batman.

Since his creation in 1939 by two young Jewish artists in New York, Batman has served as a critical cultural marker for American and Western civilization. If we treat him like a clown, as did the 1960s TV series, we do not know who he is—or who we are. If we treat him like a Gothic carnival freak, as did Tim Burton, same thing. If we treat him as the great American hero and symbol of an urban age, as did Nolan, we have a chance at survival.  

Bradley J. Birzer is the president of the American Ideas Institute, which publishes TAC. He holds the Russell Amos Kirk Chair in History at Hillsdale College and is the author, most recently, of Russell Kirk: American Conservative.

A Mighty Symbol of a Free People Over the Millennia

Remains of the sanctuary of Zeus Dodonaios in Dodona. Credit: Fingalo/Creative Commons

One of the best, but sadly least known, political scientists of the past century, Donald Lutz, recognized exactly how important symbols can be to a free and ordered people. Communities across time share “symbols and myths that provide meaning in their existence as a people and link them to some transcendent order,” Lutz argued in the preface to a Liberty Fund collection of American colonial documents. In so arguing, Lutz followed a number of critical thinkers, ranging from Eric Voegelin to Russell Kirk to Robert Nisbet. Unfortunately, a people, a person, a government, a bureaucracy, or a corporation can readily pervert such symbols, stripping them of their original meaning and deploying them to shape a society’s consciousness in ways directly contrary to what the symbols originally meant. Such is the power of symbols.

One of the most fascinating symbols of a republic in the Western tradition, from the Romans through the Germanic barbarians to the American founders to the founders of the Republican Party, is the mighty oak. As noted in the previous essay on the rise of the modern nation state, all republics must exist, by their very nature, as reflections of nature herself. They are, in essence, organic, necessarily experiencing birth, middle age, and death. How easily one might transfer this to the oak, thinking of its own stages, from acorn to prevailing giant to corrupted and hollowed-out shell: once a thing of nearly infinite possibilities, ultimately food for termites.

Yet, as a symbol, the oak itself has remained alive and well for a free and ordered people not just over generations but over millennia. How much healthier for those of us who crave ordered liberty to see our representation in a majestic thing of nature rather than in a person, who in our fallen humanity is too often transformed into a god or demigod.

To see the importance of the oak, we must turn back to the Romans at the end of the Republic, nostalgically clinging to and idealizing what was.

When her father unjustly declared neutrality in the matter of the Trojans, Venus intervened on behalf of her son, Aeneas, bestowing upon him divine weaponry.

But the goddess Venus,
lustrous among the cloudbanks, bearing her gifts,
approached and when she spotted her son alone,
off in a glade’s recess by the frigid stream,
she hailed him, suddenly there before him: “Look,
just forged to perfection by all my husband’s skill:
the gifts I promised!  There’s no need now, my son,
to flinch from fighting swaggering Latin ranks
or challenging savage Turnus to a duel!”
With that, Venus reached to embrace her son
And set the brilliant armor down before him
under a nearby oak.

Aeneas takes delight in the goddess’ gifts and the honor of it all
as he runs his eyes across them piece by piece.
He cannot get enough of them, filled with wonder,
turning them over, now with his hands, now his arms,
the terrible crested helmet plumed and shooting fire,
the sword-blade honed to kill, the breastplate, solid bronze,
blood-red and immense, like a dark blue cloud enflamed
by the sun’s rays and gleaming through the heavens,
the burnished greaves of electrum, smelted gold,
the spear and the shield, the workmanship of the shield,
no words can tell its power . . .
There is the story of Italy,
Rome in all her triumphs. There the fire-god forged them,
well aware of the seers and schooled in the times to come.

When the greatest of Roman republicans, Marcus Tullius Cicero, offered the world the first treatise on the natural law, On the Laws, he began with the image of an oak, deeply rooted not just in the soil but in the poetic imagination itself. “I recognize that grove and the oak tree of the people of Arpinum: I have read about them often in the Marius. If that oak tree survives, this is surely it; it’s certainly old enough,” Atticus begins. To which Quintus famously answers, “It survives, Atticus, and it will always survive: its roots are in the imagination. No farmer’s cultivation can preserve a tree as long as one sown in a poet’s verse.” Indeed, Quintus continues, this very oak might have been planted by the one god. Certainly, the name of the oak will remain, tied to the sacred spot, long after nature has ravaged it.

In his History of Early Rome, Livy informs us that a consecrated oak sheltered the praetorium, a seat of waiting and contemplation for foreign guests and ambassadors from the Senate. Likewise, Suetonius reminds us that Mars especially favored the oak as a tree symbolizing divine authority.

The Mediterraneans, though, held no monopoly over a mythic understanding of the oak, as the Germanic tribes far to the north considered the tree the symbol of their god of justice, Thor. When the Anglo-Saxons and Scandinavians met to decide the fate of inherited and common law (which laws to pass on, which to end, and which to reform), they met as a Witan or Althing under the oaks.

Christians, knowing the oak to be so utterly rooted in the pagan tradition, knew not whether to love or to hate the tree. According to St. Bede, when St. Augustine of Canterbury called a conference of church leaders in 603, he did so at an oak, knowing the Anglo-Saxon fondness for the tree. There, at what became known as Augustine’s Oak, the evangelist called for unity in proclaiming the gospel. Two generations earlier, Bede records, St. Columba had done something similar, building a monastery among the Celts known as Dearmach, “Field of Oaks.” Even at the most famous of medieval monasteries, Lindisfarne, Finan built the church not out of traditional stone but rather, according to the custom of the peoples of that region, “of hewn oak, thatched with reeds.”

When St. Boniface, a century later, encountered a group of Hessians still worshipping the oak of Thor, he attacked the tree with his axe, with nothing short of awesome bravado. According to the hagiographic legends surrounding Boniface, the oak exploded into four parts moments before the blade touched its bark. So astounded were the pagans at his daring that St. Boniface seized the moment to begin proclaiming the gospel. Where the ruined oak stood, according to hagiographic myth, an evergreen grew in its place. As darkness fell and Boniface continued to preach, his followers placed candles all around and upon the evergreen, thus creating the first Christmas tree.

St. Boniface, it turns out, tried this trick one too many times, the last in 754, when some Thor worshippers decided to stick with Thor, beheading the poor Catholic evangelist.

If Boniface undid the oak as a direct representation of a god, he could not undo its importance to justice, as it remained a symbol of the law and of a free people. When the grand Christian King Alfred the Great met with his men in the late 800s to judge the inheritance of the common laws of the Anglo-Saxon people, they, too, met under an oak. Critically, Alfred and his Witan judged the laws; they did not create them, believing such actions illegal. A ruling body can only judge what it has inherited, not create laws out of nothing. Such a power belongs to God alone, working through his people across time.

Perhaps, then, St. Boniface’s actions merely rendered unto God what was God’s, and unto the community what was the community’s.

The symbol of the oak remained a powerful one in colonial America, especially as the various communities along the eastern seaboard continued their own observance of the traditional common laws and, above all, in their Declaration of Independence. Though the Liberty Poles and Liberty Trees of the 1760s through 1780s were not exclusively oak, oaks made fine ones, and newly freed American communities regularly planted oaks to celebrate their independence from Britain. Pamphleteers, not surprisingly, used the acorn and the oak as symbols of America’s independence and hardihood.

When Congress rashly passed the democratic Kansas-Nebraska Act in 1854—a law that claimed that the enslavement of an entire people could be decided by mere majority vote—angry republican citizens of Michigan formed a third party, the Republican Party, in Jackson, Michigan, under, not surprisingly, a grove of oaks.

Whatever one in the early twenty-first century might think of Jupiter or Thor, the oak remains a mighty symbol of a free people, a people ready to remember and reclaim what is rightfully theirs by the grace of the Creator and the created order. The oak reminds us of strength in the face of nasty and bitter times, returning us to the nourishment of what makes us strong and free: the duty to govern ourselves in a fashion becoming to God and nature and, equally important, to the dignity of the human person. Unlike oppressive governments that rely on cults of personality, the republic relies on the nature of nature and the nature (good and bad) of the human person.

Origins of the Modern State Part I

Origins of the Modern State Part II




The Republic Was Never Supposed to Be Forever

“Romans in the Decadence of the Empire” By Thomas Couture, 1847 (public domain).

As I write this second part of my series on the origins of the modern nation state, our own nation state looks, financially, nothing short of pathetic. At the end of 2017, the federal government’s official estimate for deficit spending is $666 billion. For all kinds of reasons, this is a really scary number, and not just because it calls to mind the mark of St. John’s envisioned beast. Rrroawr!  $666 billion is a number so terribly large that it is difficult for any of us, even those of us not suffering from innumeracy or apocalyptic dread, to comprehend. And, of course, this is just the recorded and admitted deficit spending for one year. That is, it accounts only for those things the government admits to, on the books and on budget.

According to the U.S. Debt Clock, we’re at nearly $21 trillion in debt, and the number increases so quickly that seizures might very well result. As the number made my stomach turn, I thought, perhaps the site should come with a warning akin to those found on PS4 and Xbox games. That’s all we need, right?  Another law and another regulation.

As Tom Woods and all sensible economists have recently claimed, the United States of America is simply insolvent. The only shocking thing is that no one in the mainstream media or financial institutions seems to care.

Whither the American republic? It is worth remembering that no one founds a republic believing the republic will last forever. To believe such a thing automatically negates one’s conservatism. Like all living things, a republic must experience a birth, a middle age, and a death. The question is never if a republic will die, but when. The stronger its soul, the healthier its body. Conversely, the less a people have a purpose, the faster will they decline. A republic, American or not, is a res publica—a common good, a good thing, a public thing. Whether our government still resembles the republic of the American founders is yet another question, and one for another post.

It is also worth remembering that in the long history of Western civilization, no political arrangement, with only the rarest exceptions, has lasted more than a few centuries. Political bodies come and go. The two longest-lived institutions in the West are not political but ethnic and religious. The oldest sustained cohesive people in the world are the Jews, and the oldest institution in the West is the Latin church. We can conservatively date the first at 4,000 years old and the second at roughly 2,000. Not a single political body that existed at the time of Pentecost still exists today. Indeed, even the form of government that so predominates in the world, the roughly 200 nation states, did not exist until the fifteenth century.

In the previous post, I mentioned what a libertarian skeptic God seems to be, as understood in the Books of Samuel and in Jesus’ handling of the coin of the Roman Empire. This skepticism about what would later be called caesaro-papism arrived not just with the Jews but with the ancient Greeks and Romans as well.

The classical Greeks believed in community rule, that is, rule localized to each polis, its citizens deciding over and across time what rules, norms, and laws should prevail. At the height of ancient Greece, roughly 150 poleis existed, each with its own form of government. The Athenians were relatively democratic, the Spartans monarchical and militaristic, and the Corinthians free traders. What they held in common was a contempt for the Oriental (Persian) belief in a godking. Equally, the Persian godkings Darius and Xerxes despised the Greeks and what they perceived as their anarchic and archaic liberty. When the Persians warred against the Greek poleis in the early fifth century, their war was far more about pride than logic. As the eminent twentieth-century historian Christopher Dawson argued, the Persian War was, at its essence, a spiritual struggle.

The Greek patriot Herodotus gloriously described one moment of the Persian invasion: the defense of the Gates of Fire (Thermopylae) by Leonidas and his 300 Spartans.

But Xerxes was not persuaded any the more. Four whole days he suffered to go by, expecting that the Greeks would run away. When, however, he found on the fifth that they were not gone, thinking that their firm stand was mere impudence and recklessness, he grew wroth, and sent against them the Medes and Cissians, with orders to take them alive and bring them into his presence. Then the Medes rushed forward and charged the Greeks, but fell in vast numbers: others now took the places of the slain, and would not be beaten off, though they suffered terrible losses. In this way it became clear to all, and especially to the king, that though he had plenty of combatants, he had but very few men. (Herodotus, The History, Book VII).

Real men, Herodotus implied rather strongly, fought because they chose to fight, not because they were forced to. Only “free societies” allow the flourishing of real manhood. However brave a Persian might be, no real man could fight for Xerxes. Such warriors were, simply put, slaves, playthings of a false godking. “It was as ‘free men,’ as members of a self-governing community, that the Greeks felt themselves to be different from other men,” Dawson argued.

It would not be absurd to argue that when the last Spartan died at Thermopylae, the Occident was born. Though the Greeks (under the hubris of the Athenians) ultimately squandered their inheritance, falling into empire, civil war, and ruin by the end of the fifth century, the successes of the first few decades of that century are not lessened. The Greek achievement against the Persians proved a glorious watershed in the history of liberty, in the history of dignity, and in the history of civilization.

A full three decades before the Spartans and Persians battled at the Gates of Fire, the farmers of Rome overthrew their Etruscan overlords, proclaiming, within a year of their rebellion, a republic. True to their fears of godkings, the Romans insisted that their republic was not created (implying that a man or group of men had the divine ability to declare such a thing out of nothing) but, rather, grew. Our republic, Cicero writes in his dialogue On the Republic, “in contrast, was not shaped by one man’s talent but by that of the many; and not in one person’s life time, but over many generations” (Cicero, On the Republic, Book II). Though far from perfect, the Roman republic grew, adapted, and evolved over centuries, lasting 400 years before succumbing to the dread fate of outright empire.

Again, one must remember that no republicans believe their republic can last forever. A republic, by its very essence, must rely on its organic nature, a living thing that is born, flourishes, decays, and dies. It is, by nature, trapped in the cycles of life, bounded by the walls of time. While a cosmic republic might exist, as understood in Cicero’s “Cosmopolis” and Augustine’s “City of God,” it exists in eternity and, therefore, aloof from time.

For better or worse, the Roman Republic reflected not just nature, but the Edenic fall of nature as well. We can, the Roman republican Livy recorded, “trace the process of our moral decline, to watch, first, the sinking of the foundations of morality as the old teaching was allowed to lapse, then the rapidly increasing disintegration, then the final collapse of the whole edifice.” The virtues of the commonwealth—the duties of labor, fate, and piety—gave way to the avaricious desires for private wealth. When young, the Romans rejoiced in the little they had, knowing that their liberty from the Etruscans meant more than all the wealth of the material world. “Poverty, with us, went hand in hand with contentment.” As the republic evolved and wealth became the focus of the community, not sacrifice, so the soul decayed. “Of late years,” Livy continued, “wealth has made us greedy, and self-indulgence has brought us, through every form of sensual excess, to be, if I may so put it, in love with death both individual and collective.” (All Livy quotes from The History of Early Rome, Book I)

Not long before his own martyrdom at the hands of a would-be Caesar, Mark Antony, Cicero lamented:

Thus, before our own time, the customs of our ancestors produced excellent men, and eminent men preserved our ancient customs and the institutions of their forefathers. But though the republic, when it came to us, was like a beautiful painting, whose colours, however, were already fading with age, our own time not only has neglected to freshen it by renewing the original colours, but has not even taken the trouble to preserve its configuration and, so to speak its general outlines. For what is now left of the ‘ancient customs’ on which he said ‘the commonwealth of Rome’ was ‘founded firm’? They have been, as we see, so completely buried in oblivion that they are not only no longer practiced, but are already unknown. And what shall I say of the men? For the loss of our customs is due to our lack of men, and for this great evil we must not only give an account, but must even defend ourselves in every way possible, as if we were accused of capital crime. For it is through our own faults, not by any accident, that we retain only the form of the commonwealth, but have long since lost its substance. (Cicero, On the Republic, Book IV)

As we consider our own nation state with its immense debt and bloated empire, we might wonder whether Cicero’s words, written during the reign of the first Caesar, might not equally apply to 2017.



An Ode to Progressive Rock

Progressive Rock favorite, Big Big Train in the UK. (Team Rock)

There are few joys in this insane world greater than the pleasure of really artful music, whatever the genre, whatever the market. And of all of the rock bands in the world, the best might very well be England’s Big Big Train (BBT). Well known in the U.K. and Europe, they remain relatively unknown in North America, to our shame. To my mind, the only band that rivals BBT is Tennessee’s Glass Hammer.

As a band and a project, BBT has been around since the early 1990s. Like most living things, it has aged considerably (though quite gracefully) over the past two decades. Guided by its brilliant founding members Greg Spawton and Andy Poole, BBT is now made up of eight full-time members, including one from Sweden and one from the U.S.

Its most famous member is Dave Gregory, formerly the lead guitarist for XTC. Like every member of the band, Gregory is an extraordinary musician pursuing a high art. He is also, I’m happy to note, a true gentleman and, like everyone in the band, a perfectionist. From the beginning of its existence, BBT has honed its complex song structures, riveting melodies, and gorgeous historical, poetic, and mythic lyrics. Almost all of the band’s songs celebrate excellence, innovation, and struggle. Typical themes include World War I and II ace fighters, beekeepers, medieval saints, architects, and survivors of trauma. Lyrically, the band is levels above almost anything being written in popular culture today, and, in the rock-pop world, certainly well beyond Elvis, Madonna, and Lady Gaga.

BBT resides in a sub-genre of rock music known popularly as progressive rock, art rock, or, more affectionately, “prog.” Prog began in the mid-1960s as an attempt to present rock music as an art form rather than an emotional reaction. American lovers of prog generally date its advent to the Beach Boys’ “Pet Sounds,” while the British usually point to “Sgt. Pepper’s” as the beginning. One of the most important rules about prog is that there are generally no rules. Traditionally, progressive rock combines the odd tempos and time signatures of African-American jazz with classical European tonal and compositional structures.

For a time, between about 1970 and 1976, progressive rock, led by such bands as ELP, Genesis, and Yes, sold millions of records. But the genre faded in the late 1970s, outcompeted by the less complicated (many would say less talented) punk rock movement. Where progressive rock succeeded during this period, it did so by incorporating elements of hard rock and metal, as Canada’s Rush did; by joining the experimental New Wave and post-New Wave scene, as Peter Gabriel did; or, lasting just a bit longer, by embracing Cold War existentialism, as Pink Floyd did.

By 1980, though, all but the most diehard fans of straightforward prog—if such a thing could exist or ever did exist—considered the genre to be pretentious, overly complicated, and bloated. Even books that attempt to cover the genre sympathetically, such as Ed Macan’s Rocking the Classics and Dave Weigel’s The Show That Never Ends, are written in the past tense. When most commentators speak of prog, they do so in mocking tones, remembering Yes’s Rick Wakeman wearing gaudy wizard cloaks.

Beginning in the early 1990s, however, a whole new group of progressive rock artists emerged, especially as the internet began to decentralize the music market and connect various parts of the globe, one to another. Between 1994 and 2000, progressive rock once again gained a substantial following in Europe, the Middle East, India, and the Americas.

This movement, known as “third-wave prog,” has yet to subside. In America, Neal Morse, Glass Hammer, and Dream Theater dominate. In the U.K., Steven Wilson, BBT, and Marillion hold prominent positions. (To be sure, the heart of the third-wave resides in England, centered around editor and kingmaker Jerry Ewing and his geekish and stylish Prog magazine.)

For TAC readers, it’s worth noting that a whole host of serious American conservative and libertarian writers—ranging from Steve Hayward to Tom Woods to S.T. Karnick to Jason Sorens to Steve Horwitz to Sarah Skwire to Aeon Skoble to Carl Olson to Bruce Frohnen—share a deep and abiding affection for prog.

Additionally, progressive music has typically embraced intelligent topics, offering cultural and political criticisms that take anywhere from six to (*gasp*) 78 minutes. The average prog song is three to four times the length of the average pop song. Many prog songs remain strictly instrumental in their opening five to six minutes, with the vocalist finally entering long after the average pop song would have ended. And as in jazz, progressive musicians often play extended passages or solos for impressive amounts of time, emphasizing complexity as well as spontaneity.

This brings us back to that English wonder of wonders, BBT. Though Spawton and company had limited success in their first decade, it was not until 2009 that the band’s current form began to take shape. That year, Spawton recruited and introduced three key elements of their future success: the extraordinary American drummer Nick D’Virgilio (younger brother of Mike D’Virgilio); the minstrel English vocalist, flautist, and composer David Longdon; and the gentleman guitarist Dave Gregory. The album that BBT released that year, The Underfall Yard, is every bit as good and meaningful as Brubeck’s Time Out, Simon and Garfunkel’s Bookends, Davis’s Kind of Blue, and U2’s War. While any listener would expect the traditional rock instruments of guitar, bass, drums, and keyboards to be present, few would expect the incorporation of a full English brass band, woodwinds, and strings. The themes of The Underfall Yard revolve around the mysteries of Venus, abandoned industrial areas as archeological wonders, entrepreneurial visionaries, Dante-esque architects, and electrical storms off the coast of England.

Big Big Train in concert. (Credit: Simon Hogg)

None of this ever screams—or even whispers—pretense. Through the extraordinary talents of BBT, it all comes across as a perfect and necessary whole, as though the members of the band have found something that had always existed in the created order but had yet to be seen since the fall of Eden. Once seen (or heard), it can never be unseen, nor should it be. By the time the listener reaches the end of The Underfall Yard, he cares immensely about that which has been lost and should never have been forgotten:

These are old places stood in the way,

Grass grown hills and stone.

Parting the land

With the mark of man,

The permanent way.

Using available light,

He could still see far.

Far have we traveled from the pop ejaculations of “hey, baby, baby.” Spawton and Longdon reach into the realms of Chesterton, Eliot, and Betjeman.

Indeed, Spawton’s lyrics, as quoted above, might have been written, at least in their intent if not in their wording, by the greatest of 18th-century thinkers, Edmund Burke. We see the past because of the sacrifice of our ancestors. We see the present as a means to honor, through piety, those who struggled for us, whether they knew us or not. And we see the future, however dimly, only by the available light of tradition, reason, and nature. Could this not be Burke’s mysterious incorporation of the human race: the dead, the living, and the yet to be born comprising one profound community, transcending the limitations of time and place?

Since 2009, BBT has grown vigorously, adding a full-time keyboardist, Danny Manners, another lead guitarist, Rikard Sjöblom, and a violinist, Rachel Hall. And since the release of The Underfall Yard, they’ve just kept getting better and better with one masterpiece after another: Far Skies Deep Time in 2010, English Electric Full Power in 2013, and Folklore in 2016, in addition to an EP and two live albums.

I believe 2017, however, has truly been the year of BBT. This year alone, the band has released two full albums, Grimspound and The Second Brightest Star; a free 34-minute song, “London Song”; and, on December 1, a Christmas EP.

There exists not a single false step in any of this. And, despite a mass of releases, quantity has never overwhelmed quality. BBT brims with ideas, especially as Spawton and Longdon play off each other, forming a sort of prog Lennon/McCartney for the 21st century. This is a band at the top of its game, bursting with joy over expressing itself and its art.

Let me note just two additional things.

First, as noted above, there’s a Stoic perfectionist streak in most progressive rock, nowhere more so than in BBT. When the band releases something, Spawton makes sure every single aspect of it is right, from the music to the lyrics to the packaging.

Second, and equally important, the band members do not see themselves as aloof geniuses, tapping into the esoteric music of the spheres. For BBT, the art is real, but so is the audience. They excel at creating and leavening community, not only among themselves, but for and among the rest of us as well.

As part of the northern tradition of the Beowulf poet, King Alfred, the eddas, and the sagas, the band sings on one of its 2017 albums:

Here, with book in hand

Follow the hedgerow

To the meadowland

Here with science and art

And beauty and music

And friendship and love

You will find us

The best of what we are

Poets and painters

And writers and dreamers

If you thought rock was an artless and juvenile spewing of emotion, think again.

Pick up any album by BBT and be dazzled, not just by the artistry of the music, but by the invitation to be a part of the art and to become immersed fully into something that is unapologetically true, good, and beautiful.

Cynics need not apply.

Bradley J. Birzer is the president of the American Ideas Institute, which publishes TAC. He holds the Russell Amos Kirk Chair in History at Hillsdale College and is the author, most recently, of Russell Kirk: American Conservative.

The Origins of the Modern State: A Conservative View

Saint Peter and Saint Paul, Jusepe de Ribera, 1616. (Public domain)

Part I

Over the last several years, amidst the swirls of overt corruption, immigrant “hordes,” rising “national security” concerns, police militarization, bloated empire, and the so-called deepening of the “deep state,” conservatives and libertarians of all stripes have pondered the meaning of the modern state. Most recently, Paul Moreno has brilliantly considered the rise of The Bureaucratic Kings, Alex Salter has wisely questioned the relationship of anarchy (the Bohemian, Nockian variety) to conservatism, and, though I have yet to read it, the always thoughtful Jason Kuznicki of Cato recommends James C. Scott’s Against the Grain: A Deep History of the Earliest States. Believe me, I am intrigued. Each of these authors and recommenders, of course, owes an immense debt to Robert Higgs’s pioneering magnum opus, Crisis and Leviathan (1987), and Higgs, in turn, followed in the footsteps of such 20th-century greats as Christopher Dawson, Robert Nisbet, Friedrich Hayek, and Joseph Schumpeter.

Some conservatives will immediately balk at such analyses. Students of Leo Strauss want to remind us that politics, properly understood in the Aristotelian sense, is high, not sordid. Students of Russell Kirk want to remind us that order is the first concern of any society and that to look too deeply at the origins of a state is a form of pornographic leering and peeping. And Christians of every variety consider the 13th chapter of St. Paul’s letter to the Church in Rome as having closed the matter before it ever needs discussion. God, according to a literal reading of St. Paul’s letter, commanded each of us to “submit to the supreme authorities. There is no authority but by act of God, and the existing authorities are instituted by him; consequently anyone who rebels against authority is resisting a divine institution.”

While modern Christians might claim this answers every question about the legitimacy of state action, such a reading is hardly mainstream in the history of Christianity. The Prophet Samuel, of course, feeling cast aside by the ill favor of his people, had a fierce argument with them after consulting God about their demand to centralize the government under a monarch. God assured him that this would be foolish:

He will take your sons and make them serve in his chariots and with his cavalry, and will make them run before his chariot. Some he will appoint officers over units of a thousand and units of fifty. Others will plough his fields and reap his harvest; others again will make weapons of war and equipment for mounted troops. He will take your daughters for perfumers, cooks, and confectioners, and will seize the best of your cornfields, vineyards, and olive-yards, and give them to his lackeys. He will take a tenth of your grain and your vintage to give to his eunuchs and lackeys. Your slaves, both men and women, and the best of your cattle and your asses he will seize and put to his own use. He will take a tenth of your flocks, and you yourselves will become his slaves.

God seems to have been the first hard-core decentralist anti-statist, but Samuel’s people refused to listen, and God granted them, against His better judgement, a monarchy.

Jesus, holding a coin of his day, stamped with Ruler of Things Temporal on one side and Ruler of Things Spiritual on the other, told His followers that they must render unto Caesar what is Caesar’s and to God what is God’s. For better or worse, He did not elaborate, but it is rather clear that the body politic has no right to interfere with the body spiritual.

Even St. Paul wrote the thirteenth chapter of Romans in the context of a much larger letter that dealt entirely with the nature of the human person as citizen. Not surprisingly, he addressed this letter to the Christians who lived at the very center of the empire. The letter itself is deeply complex, full of nuance, and, one would wish, resistant to proof-texting. In order, St. Paul addresses citizenship within the Natural Law, within Judaism, within the Gospel of Jesus, within Creation itself, within Judaism once again, within God’s will for each person in history, within the Body of Christ, and, finally, in chapter 13, within the secular authorities of the world. To suggest that one could readily take any one of these discussions and commands apart from the others is as wrong as it is absurd. While I would never claim to know exactly what St. Paul wants of us, I can state with certainty that no easy answer suffices. St. Paul was as individual in his personality as he was in his thought.

Three centuries after St. Paul wrote his letter to the Romans, and after horrific massacres, hunts, and martyrdoms at the hands of the Roman imperial authorities, Christians found themselves, if not quite legal, at least no longer illegal after the Edict of Milan of 313. Not until 380 did the Roman government declare Christianity fully legal, and, twelve years later, in 392, it offered Christianity a monopoly. For eighteen years, though many Romans grumbled about the privilege given exclusively to Christianity, none openly challenged it, until the barbarian hordes invaded the city of Rome on August 24, 410. Then all hell broke loose, and the grumbling pagans became outraged pagans, demanding recognition that the forsaking of the gods for the Christian God had resulted in the fall of the Eternal City.

In the years prior to the invasion, St. Ambrose of Milan had forbidden the Roman emperor to receive communion after the emperor had sanctioned the massacre of rebellious civilians. This Ambrosian doctrine established that while the powers spiritual did not possess force of arms, they did have the right to deny the sacraments of the Holy Church to those who wielded political and military power while in grave sin. Ambrose’s excommunication worked, and the emperor accepted and endured an extended penance before being received back into the arms of the church. Such power remains to this day, as seen most powerfully in the modern age in a Polish Pope’s shaming of an Evil Empire.

Ambrose’s close friend, St. Augustine, elaborated this Catholic distrust of state power most effectively and most persuasively in his magisterial The City of God (412-428). Though long, the relevant passage is worth quoting at length:

Justice being taken away, then, what are kingdoms but great robberies? For what are robberies themselves, but little kingdoms? The band itself is made up of men; it is ruled by the authority of a prince, it is knit together by the pact of the confederacy; the booty is divided by the law agreed on. If, by the admittance of abandoned men, this evil increases to such a degree that it holds places, fixes abodes, takes possession of cities, and subdues peoples, it assumes the more plainly the name of a kingdom, because the reality is now manifestly conferred on it, not by the removal of covetousness, but by the addition of impunity. Indeed, that was an apt and true reply which was given to Alexander the Great by a pirate who had been seized. For when that king had asked the man what he meant by keeping hostile possession of the sea, he answered with bold pride, “What thou meanest by seizing the whole earth; but because I do it with a petty ship, I am called a robber, whilst thou who dost it with a great fleet art styled emperor.” [St. Augustine, City of God, Book IV]

Whatever one might personally think of St. Augustine in the early 21st century matters little. Outside of Holy Scripture, nothing in the western Middle Ages mattered as much as his City of God. For all intents and purposes, it was the handbook for the next thousand years of the West. Even so, we moderns and post-moderns almost never turn to the medieval period to understand political theory. For the medieval greats, what mattered most was not what form government took, but how moral it was, how ethical it was, and how protective of the powers spiritual it was. As much as the medievals studied Paul, they did so through the lens of Augustine. Paul’s Letter to the Romans, especially chapter 13, was anything but simple.

For those of us living in the last six hundred years of history, attuned as we are to the doings of the nation-states, at home and abroad, the medieval world is as far from us as Ray Bradbury’s imaginary civilizations on Mars.

Yet, as good and true conservatives, we in this present whirligig we call civilization must return to first principles and right reason. If we are to understand the modern state, we must understand its origins.

Part II, coming to an American Conservative website near you.



Promoting Human Dignity Since 1844: Hillsdale College

Hillsdale College. Credit: CreativeCommons/Flickr/Kingpin1000

For all the wrong reasons (and none of them correct), Hillsdale College became an important part of this weekend’s Senate debates over tax reform. Taking it upon himself to become the crusader for everything “progressive,” Senator Jeff Merkley of Oregon proudly proclaimed on Twitter and Facebook that Hillsdale College wants “to have permission to discriminate in selecting students.” Of course, Senator Merkley did not mean that the college discriminates in its selection process, as any real university would, seeking and recruiting the best and the brightest, but rather that the college discriminates to make sure it stays racially white. Or, as he not so delicately put it, Hillsdale College “specializes in discrimination.”

I have no way to judge whether the Senator spoke out of ignorance or maliciousness, but I can state this definitively: he knows absolutely nothing about Hillsdale College, and, frankly, if he possesses even an ounce of decency, he will formally apologize for his claims.

A group of abolitionist Free Will Baptists founded Hillsdale College in 1844, though they stipulated that the college could not be denominational. Instead, true to their abolitionist beliefs, the founders of the college forbade any discrimination based on the accidents of birth. In other words, Hillsdale—from day one of its existence, as defined by its charter—allowed a person of either sex and of any racial, ethnic, or religious background to study there. The college became, understandably, a hotbed of abolitionist sentiment, and it was the rare prominent abolitionist of the antebellum period who did not grace Hillsdale with a visit and a speech. Most prominently, perhaps, Frederick Douglass spoke here. True to our heritage, President Larry Arnn dedicated a statue to the great anti-slavery orator just this past spring. That statue, along with statues of a Civil War soldier and of Abraham Lincoln, greets the visitor to Hillsdale’s beautiful campus in southern Michigan.

As noted above, though, Hillsdale was not just color-blind from day one; it was also the first college or university in the United States to grant women the right to earn a liberal arts degree. Others allowed women to study home economics, but at Hillsdale they were treated just as well as men, studying the Great Ideas, the Great Minds, and the Great Books of western civilization.

When Abraham Lincoln called for volunteers to suppress the Confederate rebellion in the spring of 1861, almost every male at the college answered that call, making it unique among northern colleges. Indeed, outside of the military academies, not a single institution of higher learning offered anywhere near Hillsdale’s level of participation. Hillsdale men (and, of course, women, though in non-combat positions) served the Union stunningly, especially in the 2nd, 4th, and 24th Michigan regiments. The 24th, the fifth of the five regiments making up the justly famous Iron Brigade, sacrificed themselves in one of the most horrific moments of the Civil War, the first day of the Battle of Gettysburg. Positioned at a bottleneck on the western side of the little Lutheran Pennsylvania town, the 24th Michigan, outnumbered nearly 10 to 1, fought so fiercely that the Confederate invaders held back despite their superiority in numbers. When Lee found out about the timidity of his own troops, he was furious. Had his troops broken the 24th Michigan, they could have readily taken the high ground of Little Round Top and the surrounding areas. The Hillsdale men who gave their lives that day in what must have seemed a hopeless cause may very well have changed the course of American and western civilization. Today, the fourth floor of Delp Hall, which houses the history department, is dedicated to their sacrifice, a seminar room displaying paintings of that hot, humid afternoon in Pennsylvania as one Hillsdale man after another succumbed to enemy fire.

During the 1950s, at the height of the struggle for black civil rights, Hillsdale’s football team, led by the intrepid Muddy Waters, refused to play in the Tangerine Bowl because black players were not allowed on the field. Hillsdale’s team would have gone into the 1955 bowl game with a 9-0 record.

Your author—yours truly—has had the privilege of teaching at this college for over eighteen years.  To this very day, I am more than proud to note, Hillsdale remains 100% blind when it comes to the color, race, ethnicity, and religion of its students. Not only do we not ask a student to identify any race or ethnicity on his or her application form to the college, but we keep absolutely no data about such things. We believe in character, not skin color. We love intelligence, not appearance. We love the individual, not the group.

Though I can only speak for myself and not for the college as a whole (for I have no authority to do so), I can state that far from “specializing in discrimination,” we might be the one institution in western civilization that most adamantly refuses, at every level, to “specialize in discrimination.”

Though I do not have the privilege of knowing or even understanding Senator Merkley, I can state with certainty that while he makes a show of calling for “equality,” he really means a drab uniformity and collectivized tapioca. As Dr. Arnn, the single best college president in the world, has reminded us many times, Hillsdale was anti-discrimination long before the federal government was. In fact, he notes, the federal government finally adopted our position on the issue of race and ethnicity, not the other way around. Hillsdale had to remind the United States, over and over again, of the Founding intent as expressed in the Declaration of Independence and the Northwest Ordinance of 1787.

Rather than speaking about that of which he knows nothing, perhaps Senator Merkley would consent to visit our campus. I would happily show him our statues, which so beautifully reveal our devotion to liberal education as well as to the dignity and beauty of each human person, each a unique expression of a majestic Creator. I would happily introduce him to my extraordinary colleagues and to my ever-curious students. I would also take him to Oak Grove Cemetery, a sacred site on the northernmost part of town where over 300 Civil War veterans lie interred, along with the first historian of Hillsdale College, Ransom Dunn. In 1854, Dunn became so disgusted with Washington politics, and especially with the Democratic Party under Stephen Douglas, that he helped form an independent movement to prevent the extension of slavery into the American West. After much deliberation under a grove of oak trees in Jackson, Michigan, the movement’s members finally settled on a name: the Republican Party.

As a historian at one of the finest institutions of higher learning in existence, I ask only that the Senate neither help nor hinder us. Hillsdale College does not take one single penny from the federal government, and our students take not one single penny in federal loans. Just leave us alone, and we’ll be fine. Indeed, leave us alone, and we’ll continue to show the world how best to educate and how best to promote the dignity of every single human person, regardless of race, ethnicity, or religion.

