Though Congress and the president are out of town, the final weeks of August have seen the arrival of an unexpectedly critical moment. The brutal beheading of James Foley by ISIS (the Islamic State of Iraq and Syria) confirmed that there remains a Sunni jihadist terrorism problem in the Mideast: decimating al-Qaeda and killing Osama bin Laden didn’t end it. It shouldn’t be forgotten that America’s destruction of the Iraqi state in 2003 created the opportunity for ISIS to grow and thrive, as America’s Sunni allies, Saudi Arabia and the Gulf states, gave ISIS financial backing.
How to respond? The usually wise Andy Bacevich suggests that ISIS constitutes a negligible threat to America, a superpower an ocean away, and that bombing it has become—like bombing elsewhere—America’s substitute for a genuine national security strategy. Bacevich suggests we ought to butt out, except perhaps to give aid to countries genuinely threatened by ISIS. There is much to this argument, as there is little inclination among the American people to send ground troops once again into Iraq. And even if we were willing to reconstitute and send an occupation force, what good would it do? In a similar vein, Paul Pillar argues that overestimating ISIS as a threat is perhaps more likely, and more dangerous, than underestimating it.
But few are comfortable with doing little or nothing: ISIS is undoubtedly barbaric, with the potential to spread. In important ways the situation resembles the months after 9/11, in which America was brutally confronted with the sudden emergence of a Sunni extremism that had not previously been deemed a major problem.
Then as now, an influential group of neoconservatives, tightly allied with Israel, had a very specific idea of what they wanted the United States to do. The neocons aspired then—and aspire still—to an almost endless series of American wars and invasions across the entire Middle East. Because in 2001 we were already engaged in a sort of shadow war with Saddam Hussein—Iraq was under a semi-blockade and America was enforcing a no-fly zone over the country—Iraq was the logical starting point. But for the neocons Iraq was only a beginning. “Real men want to go to Tehran” was the neoconservative semi-jokey catchphrase of the time, and they quite seriously expected that after Baghdad was digested as an appetizer, they could steer the United States into war with Iran—then as now a top Israeli priority. That an American war with Iran was an Israeli priority does not mean Israel opposed the Iraq war: polls at the time indicated that Israel was the only country in the world where large popular majorities were enthusiastic about George W. Bush’s Iraq invasion, and Israeli politicians were regularly invited to appear as guests on American news talk shows in order to beat the Iraq invasion drums. John Mearsheimer and Stephen Walt’s indispensable book The Israel Lobby contains pages filled with quotations from Israeli leaders making hawkish pronouncements to American audiences; the quotes are a necessary corrective to present Israeli efforts to proclaim that an American invasion of Iraq was never really an Israeli objective.
If ISIS is to be contained or defeated without using American ground troops, it is necessary to examine the regional forces ready to fight it. There are of course the Kurds, a small group which can perhaps defend its own region, if that. The biggest potential player is Iran. With its majority-Shia population, Iran takes a dim view of Sunni jihadism; the Iranian population was pretty much the only one in the Muslim world to display open sympathy with Americans after 9/11. By the standards of the Middle East, Iran is a scientific powerhouse, with a large, freedom-aspiring middle class and a considerable artistic community. According to published reports, Iranian tanks have engaged ISIS near the Iranian border—probably with American approval. We are likely, I would guess, to hear more about Iranian tank brigades in the coming months, and perhaps even to root for them.
The other serious force willing to fight ISIS is Syria, led by the Alawite Bashar al-Assad. Assad is a dictator, as was his father. His regime is strongly supported by Syria’s Christians, by Iran, and by Hezbollah, the Shi’ite militia in neighboring Lebanon. Syria has been caught up in a civil war of shocking brutality for the past four years. The largest faction opposing Assad is ISIS—and American arms distributed to the Syrian “rebels” have often ended up in ISIS hands. By opposing Assad, the United States has in effect been feeding ISIS.
Thanks to the website Open Culture, I came across George Orwell’s 1940 review of Hitler’s Mein Kampf. Not only does Orwell suss out precisely the nature of Hitler’s menace and the source of his popularity, he provides a neat thumbnail description of European liberals and social democrats that could easily attach to today’s American Democrats:
Also [Hitler] has grasped the falsity of the hedonistic attitude to life. Nearly all western thought since the last war, certainly all “progressive” thought, has assumed tacitly that human beings desire nothing beyond ease, security and avoidance of pain. In such a view of life there is no room, for instance, for patriotism and the military virtues. The Socialist who finds his children playing with soldiers is usually upset, but he is never able to think of a substitute for the tin soldiers; tin pacifists somehow won’t do. Hitler, because in his own joyless mind he feels it with exceptional strength, knows that human beings don’t only want comfort, safety, short working-hours, hygiene, birth-control and, in general, common sense; they also, at least intermittently, want struggle and self sacrifice, not to mention drums, flags and loyalty-parades. However they may be as economic theories, Fascism and Nazism are psychologically far sounder than any hedonistic conception of life.
Dig that prescient reference to birth control!
There’s a variety of reasons—see Santayana, Garry Wills, and our own Dan McCarthy—why liberalism leads to force and coercion. But it’s simply not the case that progressivism or modern liberalism or whatever you want to call it is akin to European fascism and Nazism, a virulent outgrowth of German romanticism that should not be confused with the rationalist-materialist hubris of Marx, Engels, and scientific socialism. Since I began blogging semi-regularly four years ago, the conceit that Nancy Pelosi should, well, check her sleeve for a swastika has been a constant irritant.
I’m glad to learn that the great Orwell would have been similarly irritated.
William Deresiewicz’s July article for The New Republic, “Don’t Send Your Kid to the Ivy League,” launched an insightful, heated debate about the nature and meaning of education. In the newest edition of The New Yorker, Nathan Heller has penned a thought-provoking review of Deresiewicz’s newly released book, Excellent Sheep: The Miseducation of the American Elite and the Way to a Meaningful Life, specifically considering its discussion of self-focus and career, and comparing it with the writings and thought of Robert A. Nisbet.
In considering what universities teach, Deresiewicz asks whether they produce self-realization: “The highest function of art, and of literature in particular, is to bring us to that knowledge of ourselves that college ought to start to give us,” he writes. But Heller objects to this:
The groovy lore of college—the notion that it is a place to find yourself, follow your passions, learn to think in ways that benefit the world … Nisbet thought that these ideals were mostly feel-good bunk. Since when was it the university’s responsibility to solve all of society’s problems? he asked. And why should a professor rich in knowledge have to teach things that a callow nineteen-year-old considered “relevant” and “meaningful”? Academe ought to focus on the one thing that it actually did well: letting scholars teach what they knew. That teaching might nurture intellectual skills that the students could use in the real world, but how it did so was mysterious and, anyway, beside the point.
This older idea of knowledge—as something inherently and, indeed, objectively good—is contrary to “knowledge” in the conception of Deresiewicz and others, which must be self-applicable in order to be meaningful. But it’s true that Deresiewicz’s call for self-actualization is a bit more complicated than this (Heller notes that Excellent Sheep is “full of such confusions”). In an interview with The Atlantic‘s Lauren Davis, Deresiewicz explained the meaning of his book’s title:
… [Students are] sheep, because they have never been given an opportunity to develop their ability to find their own direction. They’re always doing the next thing they’re being told to do. The trouble is that at a certain point, the directives stop. Though maybe not, because even when it comes to choosing a career, there are certain chutes that kids, especially at elite colleges, tend to get funneled towards.
Here, Deresiewicz is denouncing the career-focused structure of learning and the limits it imposes on self-reflection, rather than denouncing the actual subject matter of learning itself. Later on, he applies these thoughts to the classroom and the way students absorb knowledge. “The main point,” he says, “is to know yourself so you know what you want in the world. You can decide, what is the best work for me, what is the best career for me, what are the rewards that I really want.”
These two realms in which Deresiewicz identifies a lack of “self-awareness”—the technical/professional and the personal/intellectual—are connected by the ingredient of time. Students are so busy pursuing parent- or university-imposed trajectories that they have no time to reflect on their learning: what it’s for, and how they ought to apply it. Deresiewicz has some good points here: it is true that, though knowledge itself is objective, we will be unable to see how it threads its way through our lives unless we cultivate a healthy amount of reflection.
But Heller has good points, too: he acknowledges that education ought not revolve around career paths and money, but also sees the tension between career and academics as the “fragile,” eternal balance of the university. And despite Deresiewicz’s criticisms of students’ frantic schedules, Heller writes that “the truest intellectual training could be how to stay calm, and keep thinking clearly, in the high-strung culture in which students need to make their lives.” These are truly lessons that will remain relevant throughout a person’s life. Heller seems to say that being an “excellent sheep” isn’t so bad, if you are truly an excellent one. Perhaps being a sheep is the first step along a path to eventual intellectual independence and prudence.
Deresiewicz sees the frantic chaos in students’ lives, and argues self-reflection will assuage their depression and give them meaning. Heller sees the chaos, and suggests that this is the way in which gold emerges from dross. It could be that both are right, to an extent: the key is, perhaps, in the “excellent” part of “excellent sheep.” If the core of an education is good and insightful, it will cultivate strong and thoughtful minds. Some students may still run frenzied and stressed between internships and fellowships, papers and projects, exams and speeches. But the meat of what they learn, their daily bread of knowledge, will sustain and grow their self-reflective souls. And eventually, whether during college or after, they will be sheep no more.
The decisions that determined the fate of the great nations and empires that failed to survive the 20th century are well known. For the Kaiser’s Germany, it was the “blank cheque” to Austria after Sarajevo. For Great Britain, the 1939 war guarantee to Poland. For the Third Reich, it was the June 1941 invasion of Russia. For the Empire of the Sun, the decision to attack Pearl Harbor. And for the Soviet Empire, it was the invasion of Afghanistan.
As for the United States, historians may one day concur with the late Gen. Bill Odom. For the lone superpower to survive that century, the decision to invade and occupy Iraq was the most disastrous blunder in its history. George W. Bush held out the promise of a peaceful Mesopotamian democracy as a magnet for all Arab nations. What we produced is a broken land awash in blood, a country severed by tribe and faith: a Kurdish north, Shia south and a Sunni west controlled by the savages of an “Islamic State” even al-Qaeda hates and fears.
In Syria, where the United States has been aiding rebels to bring down Bashar Assad, that Islamic State now controls the northern and eastern half of the country. In Libya, where we delivered the air and missile strikes to smash Col. Gadhafi’s forces, Islamist fanatics have gained the upper hand in the civil war for control of that country. In all three countries, the United States, which claimed to be battling dictatorship to bring democracy, helped to create the power vacuum these Islamists have moved to fill.
We are the enablers of the Islamic State.
How grave is the threat? ISIS is a “direct threat to our homeland” says Rep. Peter King. “An existential threat” echoes Sen. Lindsey Graham, “I think of an American city in flames.” The Islamic State “is beyond anything we’ve seen,” says Sec. Chuck Hagel, an “imminent threat to every interest we have.” America is “in the most dangerous position we’ve ever been in,” says Sen. Jim Inhofe, “They’re crazy out there. And they are rapidly developing a method to blow up a major U.S. city.”
Undeniably, these are bloodthirsty religious fanatics who revel in beheadings and crucifixions and have exhibited battlefield bravery and skill. But are 17,000 jihadi fighters in landlocked regions of Iraq and Syria really an imminent and mortal threat to an America with thousands of nuclear weapons and tens of thousands of missiles and bombs and the means to deliver them?
How grave is this crisis? Consider the correlation of forces. Who are the vocal and visible friends and fighting allies of ISIS? They are nonexistent.
Fear not the rise of the machines? That appears to be the advice given by MIT economist David H. Autor in a paper he recently presented at the Kansas City Fed’s symposium in Jackson Hole, Wyoming. Responding to a significant uptick in economists’ concern over the effects of automation on employment, including the “stunning” results of a poll suggesting “that a plurality of mainstream economists has accepted—at least tentatively—the proposition that a decade of technological advancement has made the median worker no better off, and possibly worse off,” Autor suggests that it will be a lot harder to automate us all away than many journalists and expert commentators have indicated.
Autor argues that “Polanyi’s paradox,” whereby “We can know more than we can tell…” saves us from the threat of total automation dislocation, because there will always be jobs that rely on a variety of particularly human skills and tasks, skills and tasks that we can’t entirely explain to ourselves, much less to a computer. As the philosopher Michael Polanyi himself put it, “The skill of a driver cannot be replaced by a thorough schooling in the theory of the motorcar; the knowledge I have of my own body differs altogether from the knowledge of its physiology.”
While jobs consisting almost entirely of routine tasks—those easily codified into rules that can then be automated—have been and will continue to be replaced by machine labor, there is, according to Autor, a natural buffer to keep many people employed (if not necessarily well paid). In fact, Autor sees a significant opportunity for computer-enhanced human labor, for “tasks that cannot be substituted by computerization are generally complemented by it.” The construction worker, in his example, has to manage too many variables in a fluid environment to be automated away. However, he can be given a backhoe to replace his shovel, enhancing the productivity of his labor while making backhoe-trained workers more valuable than the merely shovel-ready.
This is a blue-collar example of “skill-biased technical change,” more traditionally described by Autor’s fellow MIT professors (and techno-employment pessimists) Erik Brynjolfsson and Andrew McAfee:
Technologies like robotics, numerically controlled machines, computerized inventory control, and automatic transcription have been substituting for routine tasks, displacing those workers. Meanwhile other technologies like data visualization, analytics, high-speed communications, and rapid prototyping have augmented the contributions of more abstract and data-driven reasoning, increasing the value of those jobs.
Brynjolfsson and McAfee discuss in their book a polarization of the employment market, where high-skill abstract-task intensive jobs are increasingly well compensated, and well complemented by machine labor. Low-skill Polanyi paradox jobs, like janitorial work and home health care, are also insulated from being automated away, but as Autor describes it, they are too well-insulated to even benefit from automation complementing their labor. Because their jobs require only the minimal amount of human reasoning that any competent adult can provide, their wages are depressed by the large supply of interchangeable labor. Middle-skill jobs, however, are nearly wiped out in Brynjolfsson and McAfee’s analysis.
Here, too, Autor finds some reason for more optimism.
Yesterday, Attorney General Eric Holder announced the latest in a series of massive, multi-billion-dollar settlements extracted from Wall Street’s largest banks as punishment for their role in the subprime mortgage bubble and bust that played a role in collapsing the economy in 2008. Charlotte-based Bank of America, paying for the sins of its own lenders as well as those of investment firm Merrill Lynch and subprime specialist Countrywide (both of which it acquired during the 2008 crisis), will be paying a record $16.65 billion settlement, including an also-record $9.65 billion in cash. This follows last month’s Citigroup settlement of $7 billion (including a then-record $4 billion in cash), and last fall’s $13 billion JPMorgan Chase settlement.
The numbers are truly eye-popping, and the repeated use of “record” has a nice, satisfying ring to it. But just before the latest BoA settlement was announced, William D. Cohan denounced the entire exercise as “fine theater with the obvious caveat that nothing even remotely close to justice had been served.” Far from being impressed by the sums involved, Cohan sees the settlements as hush money, paid by Wall Street executives eager to keep the details of their worst behaviors out of the public eye.
The American people are deprived of knowing precisely how bad things got inside these banks in the years leading up to the financial crisis, and the banks, knowing they will be saved the humiliation caused by the public airing of a trove of emails and documents, will no doubt soon be repeating their callous and indifferent behavior.
Instead of the truth, we get from the Justice Department a heavily negotiated and sanitized “statement of facts” about what supposedly went wrong. In the case of JPMorgan, the statement of facts was 21 pages but contained little of substance beyond the fact that an unidentified whistle-blower at the bank tried to alert her superiors to her belief that shoddy mortgages were being packaged and sold as securities. Her warnings went unheeded and the mortgages were packaged and sold all the same.
JPMorgan’s CEO and famed finance guru Jamie Dimon reportedly made a personal call in the lead-up to a federal lawsuit being filed in order to bump his settlement offer by, ultimately, $10 billion, hoping “the meeting would avert the lawsuit, which threatened to spotlight the bank’s questionable mortgage practices before the financial crisis.” The $13 billion was, the Times reported, half of one year’s profits for the bank.
Indeed, the Times followed up yesterday to show how even Bank of America was getting off lighter than the numbers might make it seem at first blush. “At issue is how much of the cost of the $7 billion in ‘soft dollars,’ or help for borrowers, the bank will bear under the settlement,” the Times said, as writing down principal on loans may have already been factored into BoA’s plans, or they may not even own the loans anymore, having sold them off in the infamous mortgage-backed securities. The investors who bought the bad bonds “note that the government promotes the settlements as punishment for dumping faulty loans on investors, but it devises deals that saddle investors with some of the costs.”
And in a rather perverse function of intersecting laws,
The actual pain to the bank could also be significantly reduced by tax deductions. Tax analysts, for instance, estimate that Bank of America could derive $1.6 billion of tax savings on the $4.63 billion of payments to the states and some federal agencies under the settlement. Shares of Bank of America jumped 4 percent on Thursday, suggesting investors believe that the bank could take the settlement in stride.
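The arithmetic behind those analysts' figures is easy to sketch. Note that the 35 percent rate below is an assumption—the statutory U.S. corporate rate at the time—not a number taken from the settlement itself:

```python
# Back-of-the-envelope check of the reported tax savings.
# Assumptions: $4.63B of payments are deductible, taxed at the
# then-statutory 35% corporate rate (not figures from the settlement).
deductible_payments = 4.63e9          # payments to states and federal agencies
assumed_tax_rate = 0.35               # statutory corporate rate, pre-2018

tax_savings = deductible_payments * assumed_tax_rate
print(f"tax savings: ${tax_savings / 1e9:.2f}B")        # ≈ $1.62B, matching the ~$1.6B estimate

headline_settlement = 16.65e9         # announced Bank of America total
effective_cost = headline_settlement - tax_savings
print(f"effective cost: ${effective_cost / 1e9:.2f}B")  # ≈ $15.03B
```

On these assumptions, roughly a tenth of the headline number comes straight back as a tax deduction—consistent with the Times's point that the settlement costs the bank less than it appears to.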
Moreover, the Justice Department and states began to get creative in their negotiations with BoA.
Among the demands of the “protesters” in Ferguson is that the investigation and prosecution of police officer Darren Wilson be taken away from St. Louis County Prosecutor Robert McCulloch. McCulloch is biased, it is said. How so? In 1964, his father, a St. Louis police officer, was shot to death by an African-American. Moreover, McCulloch comes from a family of cops. He wanted to be a police officer himself, but when cancer cost him a leg as a kid, he became a prosecutor.
Yet, in 23 years, McCulloch has convicted many cops of many crimes, and has said that if Gov. Jay Nixon orders him off this case, he will comply. Meanwhile, he is moving ahead with the grand jury. As for Gov. Nixon, he revealed his closed mind by demanding the “vigorous prosecution” of a cop who has not even been charged and by calling repeatedly for “justice for [Brown’s] family” but not Wilson’s.
What has been going on for two weeks now in Ferguson, with the ceaseless vilification of Darren Wilson and the clamor to arrest him, is anti-American. It is a mob howl for summary judgment, when this case cries out not for a rush to judgment, but for a long, tedious search for the whole truth of what happened that tragic day.
For conflicting stories have emerged. The initial version was uncomplicated. On August 9, around noon, Brown and a companion were walking in the street and blocking traffic when ordered by Wilson to move onto the sidewalk. Brown balked, a scuffle ensued. Wilson pulled out his gun and shot him six times, leaving Brown dead in the street. Open and shut. A white cop, sassed by a black kid, goes berserk and empties his gun.
Lately, however, another version has emerged.
Fifteen minutes before the shooting, Brown was caught on videotape manhandling and menacing a clerk at a convenience store he was robbing of a $44 box of cigars. A woman, in contact with Wilson, called a radio station to say that Brown and Wilson fought in the patrol car and Brown had gone for the officer’s gun, which went off. When Brown backed away, Wilson pointed his gun and told him to freeze. Brown held up his hands, then charged. Wilson then shot the 6'4", 292-pound Brown six times, with the last bullet entering the skull. St. Louis County police then leaked that Wilson had been beaten “severely” in the face and suffered the fracture of an eye socket. Brown’s companion, Dorian Johnson, says Brown was running away when Wilson began to fire. But, according to the autopsies, all of the bullets hit Brown in the front. ABC now reports that Dorian Johnson has previously been charged with filing a false police report.
If the first version is true, Wilson is guilty. If the second is true, Brown committed two felonies before being shot, and Darren Wilson fired his weapon in defense of his life.
People see Facebook as a neutral platform which they can use to have conversations. But in reality it’s a company, and thus it has motivations and strategies and interests involved in the ways its users utilize the network. Facebook has commercial reasons behind its desire to curate specific experiences, happy or otherwise—and of late, it’s been working rather hard to curate a positive, uplifting (or “Upworthy”) experience for its users, as Casey Johnston reports in an Ars Technica post:
As the protests in Ferguson, Missouri over police fatally shooting 18-year-old Mike Brown have raged through the past several nights, more than a few people have noticed how relatively quiet Facebook news feeds have been on the matter. While #Ferguson is a trending hashtag, Zeynep Tufekci pointed out at Medium that news about the violence was, at best, slow to percolate through her own feed, despite people posting liberally about it.
While I’ve been seeing the same political trending tags, my feed is mundane as usual: a couple is expecting a baby. A recreational softball team won a league championship. A few broader feel-good posts about actor Chris Pratt’s ice-bucket challenge to raise awareness and money for ALS, another friend’s ice-bucket challenge, another friend’s ice-bucket challenge… in fact, way more about ice bucket challenges than Ferguson or any other news-making event. In my news feed organized by top stories over the last day, I get one post about Ferguson. If I set it to organize by “most recent,” there are five posts in the last five hours.
This harkens back to Facebook’s recently released news-feed manipulation study, which tracked users’ responses to positive vs. negative content. Facebook manipulated its algorithm to give a different experience to different users, and documented the effect these disparate algorithms had on their emotional wellbeing. All of this, they purport, was in order to curate a better user experience. Johnston adds,
One of the things Facebook thrives on is users posting and discussing viral content from other third-party sites, especially from sources like BuzzFeed, Elite Daily, Upworthy, and their ilk. There is a reason that the content users see tends to be agreeable to a general audience: sites like those above are constantly honing their ability to surface stuff with universal appeal. Content that causes dissension and tension can provide short-term rewards to Facebook in the form of heated debates, but content that creates accord and harmony is what keeps people coming back.
Facebook is not the only company that curates a given experience for commercial gain. Most online tools use algorithms for similar curation—Google, for instance, doesn’t show you all potential search results for the keyword you give it: it shows you the specific results it thinks you will like, ones that accord with its idea of your personality or character. And while this can be a useful service, it also has its drawbacks. All of these tools have a way of perpetuating specific traits or bents (virtuous or otherwise) in users: if you used to search Google for pornography, it’s likely to continue offering you pornography, even if you are trying to avoid it. If you looked at an old ex’s photo album on Facebook, perhaps in a moment of jealousy or curiosity, their postings will continue showing up in your news feed for days or weeks to come. Outside of the personal, we should pay attention to the effect these sites have on our perception of the outside world: as Eli Pariser noted in a March 2011 TED Talk, two users could Google the word “Egypt” and get completely different results—one would get news of protests and the Arab Spring and Morsi, while the other would see only tourism information and pictures. Pariser noted that Facebook does the same.
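The feedback loop being described—past engagement feeding future ranking—can be illustrated with a toy scoring function. The rule and the weights here are purely illustrative assumptions, not Facebook's or Google's actual algorithm:

```python
# Toy model of engagement-weighted feed ranking: topics a user has
# engaged with before get boosted, so past behavior keeps resurfacing.
# The scoring rule and weights are illustrative assumptions only.
def rank_feed(posts, engagement_history):
    def score(post):
        # boost = 1 + number of past engagements with this topic
        return post["popularity"] * (1 + engagement_history.get(post["topic"], 0))
    return sorted(posts, key=score, reverse=True)

posts = [
    {"topic": "ferguson", "popularity": 9},
    {"topic": "ice-bucket", "popularity": 5},
]
history = {"ice-bucket": 3}  # user has clicked ice-bucket posts three times

print([p["topic"] for p in rank_feed(posts, history)])
# ['ice-bucket', 'ferguson'] — 5 * (1 + 3) = 20 outranks 9 * 1 = 9
```

The point of the toy is the loop itself: once the user's history tilts one way, nothing in the scoring rule lets the bigger news story break through, which is exactly the quiet-feed effect Tufekci and Johnston observed.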
It’s time once again to bring out the well-worn quote (from Marx) that history repeats itself, “first as tragedy, then as farce.” No one in my home could take their eyes off the television Monday evening, though all that was on was Jake Tapper marching around Ferguson, Missouri, tracking down rumors of fresh violence. It all seemed so scripted. The outside world may be blowing up—major crises in Ukraine, the Mideast, and the South China Sea, which if escalated and spread could bring the world to the brink of global war. But we couldn’t turn away from the rinky-dink St. Louis suburb. Outside agitators—were there not such people, one would want to put this ’60s-era retro-phrase in quotation marks—have purportedly come to Ferguson from as far as New York and California.
There was real tragedy in the first round of American inner-city riots—provoked in most cases by genuine police brutality. In 1967, the New York Review of Books ran Tom Hayden’s lengthy depiction of the riot in Newark. Hayden, then a well-known New Left figure, knew the city, having been an anti-poverty organizer there. His sympathies were obvious, but Hayden is no fool, and I’m sure most of the facts are correct. One thing that stands out from those days is the way in which law-and-order views were expressed in forms indistinguishable from race-baiting. Can one imagine the Democratic governor of a major state today saying, as New Jersey governor Richard Hughes did before calling out the National Guard, “The line between the jungle and the law might as well be drawn here as any place in America”? Hughes, recall, was not George Wallace but a major progressive figure.
The costs of the insurrection to Newark were brutal. In Hayden’s summary:
In the carrying out of the Governor’s weekend definitions and policies at least twenty Negroes died, nearly all from police shooting, another 1000 were injured and 1000 jailed; more than 100 Negro-owned businesses were attacked by police and troopers; and hundreds of apartments were fired into along the ghetto’s streets.
The outcomes were comparable in Detroit, and in both cities the riots sparked an exodus of white population, with its skills and capital. New York, in great part due to the personal courage of John Lindsay, avoided the hot summer. The riots spurred a major national effort to integrate urban police forces, an effort which evidently bypassed the suburb of Ferguson.
In those days the precipitating incidents were, in ways that the Ferguson killing does not seem to be, clear-cut cases of racist police brutality. No one who saw the video of Michael Brown robbing a convenience store will think it out of the question that the police officer who shot him 15 minutes later feared genuinely for his life. Of course the shooting should not have happened: police officers have to be able to make arrests without using deadly force—and if they can’t, they should in most cases give way—as Brown’s shooter must surely feel today. But it can’t be easy in the heat of confrontation—just as most highly skilled professionals will make errors under duress, so will an average cop.
To read Hayden’s account is to be reminded that though history may in some ways repeat itself, in America race relations are much better, and feel very different.
1999’s “Ravenous” turns up on a lot of lists of underrated horror movies. It’s hard to get people to take you seriously when your plot is, “Guy Pearce eats people in the Old West.” But now that I’ve finally taken advantage of “Ravenous”‘s availability through Netflix streaming, I can tell you: This movie is criminally underrated. It’s not just a creepy, haunting cannibal Western from the producer of “Donnie Darko,” although that sounds great to me; it’s also an exploration of the temptations of power and the acceptance of powerlessness.