State of the Union

Bias and Recusal in Ferguson’s Investigation

Among the demands of the “protesters” in Ferguson is that the investigation and prosecution of police officer Darren Wilson be taken away from St. Louis County Prosecutor Robert McCulloch. McCulloch is biased, it is said. How so? In 1964, his father, a St. Louis police officer, was shot to death by an African-American. Moreover, McCulloch comes from a family of cops. He wanted to be a police officer himself, but when cancer cost him a leg as a kid, he became a prosecutor.

Yet, in 23 years, McCulloch has convicted many cops of many crimes, and has said that if Gov. Jay Nixon orders him off this case, he will comply. Meanwhile, he is moving ahead with the grand jury. As for Gov. Nixon, he revealed his closed mind by demanding the “vigorous prosecution” of a cop who has not even been charged and by calling repeatedly for “justice for [Brown's] family” but not Wilson’s.

What has been going on for two weeks now in Ferguson, with the ceaseless vilification of Darren Wilson and the clamor to arrest him, is anti-American. It is a mob howl for summary judgment, when this case cries out, not for a rush to judgment, but for a long tedious search for the whole truth of what happened that tragic day.

For conflicting stories have emerged. The initial version was uncomplicated. On August 9, around noon, Brown and a companion were walking in the street and blocking traffic when ordered by Wilson to move onto the sidewalk. Brown balked, a scuffle ensued. Wilson pulled out his gun and shot him six times, leaving Brown dead in the street. Open and shut. A white cop, sassed by a black kid, goes berserk and empties his gun.

Lately, however, another version has emerged.

Fifteen minutes before the shooting, Brown was caught on videotape manhandling and menacing a clerk at a convenience store he was robbing of a $44 box of cigars. A woman, in contact with Wilson, called a radio station to say that Brown and Wilson fought in the patrol car and Brown had gone for the officer’s gun, which went off. When Brown backed away, Wilson pointed his gun and told him to freeze. Brown held up his hands, then charged. Wilson then shot the 6’4”, 292-pound Brown six times, with the last bullet entering the skull. St. Louis County police then leaked that Wilson had been beaten “severely” in the face and had suffered a fractured eye socket. Brown’s companion, Dorian Johnson, says Brown was running away when Wilson began to fire. But, according to the autopsies, all of the bullets hit Brown in the front. ABC now reports that Dorian Johnson has previously been charged with filing a false police report.

If the first version is true, Wilson is guilty. If the second is true, Brown committed two felonies before being shot, and Darren Wilson fired his weapon in defense of his life.  Read More…


Will the Panopticon Save Us From the Police?

Since the shooting of Michael Brown by a Ferguson, Missouri, police officer over a week ago, the idea of arming police with personal body cameras to record their on-duty actions has gained fresh currency.

German Lopez at Vox wrote, “If police officers were required to wear body cameras, questions about their conduct — like the ones that have arisen in the wake of the Michael Brown shooting — could be avoided.” Derek Thompson at The Atlantic  noted, “Although military technology has arguably given law enforcement an unreasonable amount of power, there is another piece of technology that could help restrain the militarization of America’s police in the future: a camera.” And Nick Gillespie of Reason insisted that “While there is no simple fix to race relations in any part of American life, there is an obvious way to reduce violent law enforcement confrontations while also building trust in cops: Police should be required to use wearable cameras and record their interactions with citizens.”

The idea is to attach small, portable cameras to police officers’ collars or sunglasses that can then provide objective evidence to be called upon to settle any disputes or complaints about misconduct. The California city of Rialto has been conducting a rigorous and increasingly high-profile police body camera experiment over the past couple of years, and the results have exceeded even what the cameras’ staunchest advocates could likely have expected. As the New York Times reported a year ago,

In the first year after the cameras were introduced here in February 2012, the number of complaints filed against officers fell by 88 percent compared with the previous 12 months. Use of force by officers fell by almost 60 percent over the same period.

Officers had been randomly assigned cameras and instructed to turn them on for any encounter with civilians. The Times continued, “Officers used force 25 times, down from 61 over the previous 12 months. And those wearing cameras accounted for 8 of those incidents.” The Rialto police chief William A. Farrar observed, “When you put a camera on a police officer, they tend to behave a little better, follow the rules a little better … And if a citizen knows the officer is wearing a camera, chances are the citizen will behave a little better.”
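
For readers who want to see how the Times’s headline percentages follow from the raw counts, here is a minimal arithmetic sketch. It uses only the figures quoted above (the complaint counts behind the 88 percent drop were not given, so only use of force is computed); it is illustrative, not code from the Rialto study itself.

```python
# Illustrative check of the Rialto use-of-force decline, using only the figures
# quoted in the Times excerpt above (61 incidents before cameras, 25 after).
force_before = 61
force_after = 25

decline = (force_before - force_after) / force_before
print(f"Use-of-force decline: {decline:.0%}")  # ~59%, i.e. "almost 60 percent"

# Of the 25 remaining incidents, officers wearing cameras accounted for 8.
camera_share = 8 / 25
print(f"Share involving camera-wearing officers: {camera_share:.0%}")  # 32%
```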

Yet even setting aside the natural privacy concerns raised by strapping recording devices to every patrol officer circumambulating their city’s streets, it is worth raising a smaller, subtler, but nevertheless potentially significant concern: the increasingly intermediated cop. One only has to glance in the window of a local patrol car to see the sprawling array of screens, keyboards, and communication devices designed to link the officer to all the information he could need. The problem, of course, is that the most important information the common cop needs still can’t be pulled up within his car: the knowledge gained from building relationships with those in the community he patrols.

That relationship-building is a core component of a police officer’s mission, and may be almost entirely divorced from the work he can get done on his car’s mounted notebook computer. It also requires a certain amount of discretion, getting to know a neighborhood’s warts as well as its virtues. The conversations that give an officer an accurate picture of the seedy but not destructive side of his citizens’ lives could very well be more difficult or awkward should the policeman’s sunglasses be rolling film.

Mark Steyn, in addressing the more expressly dangerous and frightening distortion of the police seen in the militarization of the Ferguson PD, gave some interesting and relevant history:

To camouflage oneself in the jungles of suburban America, one should be clothed in Dunkin’ Donuts and Taco Bell packaging. A soldier wears green camo in Vietnam to blend in. A policeman wears green camo in Ferguson to stand out – to let you guys know: We’re here, we’re severe, get used to it.

This is not a small thing. The point about ‘the thin blue line’ is that it’s blue for a reason. As I wrote a couple of months ago:

‘The police’ is a phenomenon of the modern world. It would be wholly alien, for example, to America’s Founders. In the sense we use the term today, it dates back no further than Sir Robert Peel’s founding of the Metropolitan Police in 1829. Because Londoners associated the concept with French-style political policing and state control, they were very resistant to the idea of a domestic soldiery keeping them in line. So Peel dressed his policemen in blue instead of infantry red, and instead of guns they had wooden truncheons.

So, when the police are dressed like combat troops, it’s not a fashion faux pas, it’s a fundamental misunderstanding of who they are.

When the police are moving around dressed as Google Glassholes, might they also be living a misunderstanding? Body cameras may ultimately be necessary to protect us from the police, and the police from themselves. But the Benthamite logic that keeps our present-day peace will be fundamentally different from that governing the polis-protectors of Sir Robert Peel.



Legalize Opium, Not Heroin

That the war on drugs, in its current form, is a failure is obvious to all but the most blinkered observers. But the proper response to this failure is a matter of contention. Pope Francis, for instance, recently suggested we address the underlying causes of drug abuse (without ending prohibition). Others recommend treatment-based approaches. The more libertarian among us are likely to back complete legalization of all drugs.

I would like to recommend a policy that does not reject any of the above as possibly the ultimate answer to this failure, but takes a measured, experimental step that, while running little risk of making matters significantly worse, holds out, I think, great hope for improving them.

With marijuana, the question is apparently being decided in favor of gradual, piecemeal legalization. But heroin and cocaine legalization has far less support, and with good reason: these drugs are far more addictive than pot. (I am not saying that therefore they should not be legalized, merely that it is understandable that people might be more sanguine about marijuana legalization than about legalizing harder drugs.) I wish to suggest a halfway sort of legalization that I feel offers several potential upsides: let us try legalizing the milder substances from which cocaine and heroin are derived, namely, coca leaves and opium.

Perhaps if we could simply make cocaine and heroin disappear by wishing it were so, it would be the best of all possible solutions. But basing policy on fantasy is generally a poor choice. (Please see the second Iraq war for evidence.) And the current policy of strict prohibition has fueled organized crime and led to the increasing militarization of our police forces. My proposal offers the following advantages over the current situation:

  1. It allows us to test the waters of just how socially damaging full cocaine or heroin legalization might be, without simply plunging in head first. If simply legalizing coca leaves and opium produces droves of drugged-out zombies (which I don’t think it would), we could rule out full cocaine and heroin legalization, and even consider repealing this halfway legalization. If the effects are that bad, we can be sure that they would have been worse if we had legalized the harder forms of these drugs.
  2. A strong libertarian argument for full legalization (I say “strong,” and not “decisive,” because I think there are significant counter-arguments here) is that many people are able to use these drugs in moderation without destroying their lives. (See the work of Jacob Sullum if you doubt this is true.) “Why,” the libertarian asks, “should these people be denied legal access to them simply because others will abuse them?” (And note: while such usage is often referred to as “recreational,” it might often more accurately be described as “medicinal”: such moderate users may suffer from problems in focusing, and find that a mild dose of cocaine alleviates this difficulty, or be in chronic pain, and find that a mild dose of heroin offers them the best relief.) Well, these moderate, responsible users ought to find a milder, safer, and legal form of the drug they use to be a very welcome thing indeed. They could avoid the risks of arrest, of unregulated and adulterated street products that may contain dangerous additives, and of job loss, and would enjoy a much greater ability to control their dosage.
  3. The considerations in point number two indicate what I think would be the greatest potential upside of this idea: its impact upon the economics of the trade in hard drugs. The shift in consumption predicted above would greatly lessen the demand for the more dangerous forms of these drugs. Read More…

Why the Democratic West Dreads Diversity

At the end of the Cold War, Francis Fukuyama famously wrote that our world may be at the “end of history” where “Western liberal democracy” becomes “the final form of human government.” A quarter century on, such optimism seems naive.

Consider the United States, the paragon of liberal democracy. An NBC/Wall Street Journal poll finds that only 14 percent of the people approve of Congress and only 19 percent approve of the GOP. Seventy-one percent believe America is headed in the wrong direction. Nor is this the exceptional crisis of a particular presidency.

JFK was assassinated. LBJ was broken by race riots and anti-war demonstrations. Richard Nixon, facing impeachment, resigned. Gerald Ford was rejected by the electorate. Ronald Reagan was highly successful—like Nixon, he won in a 49-state landslide after his first term—but during the Iran-Contra scandal of 1987 there was a real threat of a second impeachment. And Bill Clinton was impeached.

Our democracy seems to be at war with itself. Now there is talk of impeaching Obama. It will become a clamor should he grant executive amnesty to 5 million illegal immigrants. Political science has long described what seems to be happening.

From the tribal leader comes the monarch, whose reign gives way to an aristocracy that produces a middle class that creates a republic, the degenerative form of which is that pure democracy of which John Adams wrote: “Democracy never lasts long. It soon wastes, exhausts, and murders itself. There is never a democracy that did not commit suicide.” Then comes the strong man again.

Is that our future? Is Western democracy approaching the end of its tether, with the seeming success of authoritarian capitalism in China and Russia? Recent history provides us with examples.

World War I, begun 100 years ago, brought down many of the reigning monarchs of Europe. The caliph of the Ottoman Empire was sent packing by Kemal Ataturk. Czar Nicholas II was murdered on the orders of the usurper Vladimir Lenin.

Fighting off a Bolshevik invasion, Marshal Pilsudski rose to power in Poland. Admiral Miklos Horthy ran the communists out of Budapest and took the helm. Mussolini led the 1922 March on Rome. Hitler’s Beer Hall Putsch in 1923 failed, but his party utilized democracy’s institutions to seize power and murder democracy. Out of the Spanish Civil War came the dictatorship of Gen. Franco. And so it went.

Vladimir Putin may be the most reviled European leader among Western elites today, but, with 80 percent approval for standing up for Russia and Russians everywhere, he is more popular in his own country than any Western ruler is in his. Polls in France say that, were elections held today, Marine Le Pen would replace Francois Hollande in the Elysee Palace.

Eurocrats bewail what is happening, but, inhibited by secularist ideology, fail to understand it. They believe in economism, rule by scholarly global elites, and recoil at the resurgence of nationalism and populism. They do not understand people of the heart because they do not understand human nature. Read More…


To End the Child Migrant Crisis, End the Drug War

As children from Central America continue to pour across the U.S. border, pundits and politicians are still searching for a scapegoat. Some 70,000 unaccompanied minors are expected to cross the border this year, highlighting an urgent need to fix whatever precipitated the problem. Unfortunately, the real culprit that both sides of the aisle continue to ignore is one of the federal government’s own making—the War on Drugs that has incited violence south of America’s border for decades.

Americans shouldn’t expect their elected officials to comprehend the root cause of the current crisis anytime soon. Earlier this month, 33 House Republicans pointed the finger at President Obama in a letter blaming his 2012 executive order deferring court action for child migrants. That explanation falls apart, however, since the program only applies to children who arrived prior to 2007.

Meanwhile, many Democrats are eager to point out that most of the current immigration policies were implemented under the Bush Administration, including a 2008 law giving children from non-contiguous countries the right to a court hearing. Yet even this does not explain why thousands of Central American children are willing to risk their lives crossing a vast desert for the mere chance of a sympathetic court ruling.

Fortunately, there is at least one impartial expert who is not afraid to speak plainly above this bipartisan hullabaloo. Marine Corps Gen. John F. Kelly, head of the United States Southern Command, penned an essay for Military Times earlier this month pointing to the direct cause of the problem: “Drug cartels and associated street gang activity in Honduras, El Salvador and Guatemala, which respectively have the world’s number one, four and five highest homicide rates, have left near-broken societies in their wake.”

Conservatives may be skeptical that drug violence is the immediate source of the problem, since Central American countries have been in a state of unrest for decades, but the most recent crime statistics are hard to ignore. “By U.N. statistics, Honduras is the most violent nation on the planet with a rate of 90 murders per 100,000 citizens,” Gen. Kelly points out. Neighboring Central American countries are not far behind, with Guatemala at 40 and El Salvador at 41. By comparison, the murder rates in current combat zones like Afghanistan and the Democratic Republic of the Congo are starkly lower, at 28 per 100,000 as of 2012.
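
To put those rates on a common scale, here is a minimal sketch that simply restates the figures cited above as multiples of the combat-zone rate; the numbers are the ones quoted in the text, not independently verified.

```python
# Homicide rates per 100,000 people, as cited above (UN figures quoted by Gen. Kelly).
rates = {
    "Honduras": 90,
    "El Salvador": 41,
    "Guatemala": 40,
}
combat_zone_rate = 28  # Afghanistan / DR Congo, 2012, as cited above

for country, rate in rates.items():
    print(f"{country}: {rate} per 100,000 "
          f"({rate / combat_zone_rate:.1f}x the combat-zone rate)")
```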

The dramatic surge in violence in recent years is indisputable. In 2000, the murder rates per 100,000 people in Guatemala and Honduras, at 26 and 51 respectively, were roughly half of what they are today. Given these stark statistics, it’s no surprise that a recent UN survey found violence to be a top reason migrant children cited for leaving their homeland, such as the 57 percent of those from Honduras whose reasons for leaving “raised potential international protection concerns.”

The United States is in large part responsible for precipitating the problem. It should come as no surprise that the production of narcotics, like any other black market good, was pushed into the hands of disreputable characters once the federal government started ramping up prohibition enforcement in the 1970s. In the decades since, thousands of Central American gangsters have been deported upon serving sentences in the United States, allowing violent criminals to dominate the drug trade south of the border. Read More…


Bergdahl, Manning, and the Overextended National Security State


The contretemps over the release of Sgt. Bowe Bergdahl has been occasion for a raft of commentary taking President Obama’s lack of competence as the defining feature of the affair. And while there is certainly ample cause to call into question the merits of the deal with the Taliban, the wisdom of Mr. Obama’s highly misleading press conference with the Sergeant’s parents, and the subsequent reappearance of the wondrous Susan Rice on the Sunday morning talk shows, to my mind the most troubling aspect of the Bergdahl affair has to do with how someone so obviously troubled made his way into the ranks in the first place.

Like the deeply troubled Pfc. Chelsea (née Bradley) Manning before him, Bergdahl should never have been accepted into the ranks. He was admitted into the Army largely because an incompetent President, going against the wishes of the country, decided to double down on an ill-conceived and grossly mismanaged war. The story of how Bergdahl, who was discharged from the Coast Guard for psychological reasons in 2006, found his way back into what we are endlessly told is the greatest military in the history of the world, is profoundly discouraging. The Washington Post reports that by 2008, the year Bergdahl enlisted, the Army was issuing waivers to those with criminal backgrounds, health issues, and “other problems” at the rate of one for every five recruits. This perhaps points to a larger problem, reaching beyond the armed services.

The post-9/11 national security state, which consists of at least 17 federal intelligence agencies and organizations, requires hundreds of thousands of individuals to staff it. In light of the cases of Messrs. Manning, Snowden, and Bergdahl, it has become increasingly clear that the government has created a significant problem for itself. This was bound to happen given the sheer numbers involved. Consider the following from the groundbreaking 2010 report by the Post’s Dana Priest and William Arkin:

  • Some 1,271 government organizations and 1,931 private companies work on programs related to counterterrorism, homeland security, and intelligence in about 10,000 locations across the United States.
  • An estimated 854,000 people, nearly one-and-a-half times as many people as live in Washington, D.C., hold top-secret security clearances.

Four years on, the number of security clearances issued has continued to grow apace. According to a report released by the Office of the Director of National Intelligence this past April, from 2012 to 2013 the number of people deemed “eligible” for access to classified information increased by nearly a quarter of a million. Roughly 5.15 million people currently hold security clearances, of whom around a million are outside contractors, about half of whom hold a top-secret clearance.
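
The scale of these figures is easier to grasp with a little arithmetic. The sketch below uses the numbers cited above plus one assumption of mine: a Washington, D.C. population of roughly 600,000, which is approximately the 2010 census figure.

```python
# Back-of-the-envelope check of the clearance figures quoted above.
top_secret_holders = 854_000   # Priest/Arkin figure from the 2010 Post report
dc_population = 600_000        # assumption: rough 2010 census population of Washington, D.C.
total_cleared = 5_150_000      # ODNI figure cited above
contractors = 1_000_000        # approximate contractor count cited above

print(f"Top-secret holders vs. D.C. residents: {top_secret_holders / dc_population:.2f}x")
print(f"Contractor share of all clearance holders: {contractors / total_cleared:.0%}")
```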

The conversation we need to have is whether the national security structure, as it stands right now, is actually supportable. The Bergdahl affair ought to serve as a warning that as we keep expanding the military and enlarging the intelligence apparatus, the law of diminishing returns will set in, if it has not already. Yet no one in Washington ever thinks to say: enough. It’s past time for Congress to reconsider the efficacy, to say nothing of the desirability, of the post-9/11 national security leviathan.


Reading & the “Attention War”

Focus is a difficult thing to muster these days. As David Brooks wrote in a recent New York Times column, we are all “losing the attention war. … Many of us lead lives of distraction, unable to focus on what we know we should focus on.” This affects many of our daily activities, especially things that require special mental concentration. It creates a minute-to-minute challenge for the reader, who will always feel the pull of the next Twitter story, the latest email, the newest Facebook post, etc. Whether reading news articles or books, we feel the distractions itch at our brain.

Yet despite our society’s general lack of focus, we aren’t necessarily abandoning long books—The Goldfinch, one of the most popular novels on the market right now, is 771 pages long. But this lack of focus does mean that modern books increasingly cater to the short-term attention span. Tim Parks explains at the New York Review of Books:

Never has the reader been more willing than today to commit to an alternative world over a long period of time. But with no disrespect to Knausgaard, the texture of these books seems radically different from the serious fiction of the nineteenth and early-twentieth centuries. There is a battering ram quality to the contemporary novel, an insistence and repetition that perhaps permits the reader to hang in despite the frequent interruptions to which most ordinary readers leave themselves open. Intriguingly, an author like Philip Roth, who has spoken out about people no longer having the “concentration, focus, solitude or silence” required “for serious reading,” has himself, at least in his longer novels, been accused of adopting a coercive, almost bludgeoning style.

The author suggests that, in this world of limited attention spans, the serious writer may have to parcel out his or her writing into shorter sections or volumes if anything like the eloquent, expansive reading of prior eras is to endure. An idea worth considering: could this trend toward short reads give rise to a deeper appreciation of poetry? It’s thought-provoking and often mind-taxing, true—but it’s short. Or at least, some of it is. It would be interesting to see whether poetry makes a comeback.

Another possibility worth considering: what if we brought back the short story, or the serialized novel? The New Yorker publishes short fiction, but few other large journalistic publications do. Yet such stories used to occupy a considerable portion of the news cycle. What if modern novelists, like Charles Dickens in days past, published their novels in outlets like the New York Times or TIME magazine—one chapter at a time? I think the public would love it—and it could also help print publications build an audience that they seem to be steadily losing. Read More…


Handwriting Matters, After All

Handwriting is largely viewed as an outdated skill: typing is simply more efficient for teachers and students alike. The new Common Core standards, adopted in most states, include instruction in legible handwriting only in kindergarten and first grade.

But according to new studies by psychologists, this recent dismissal of handwriting could have unintended consequences: the underrated skill is actually a boon to brain development and memory retention. New York Times reporter Maria Konnikova explained these studies in a Monday article:

When the children composed text by hand, they not only consistently produced more words more quickly than they did on a keyboard, but expressed more ideas. And brain imaging in the oldest subjects suggested that the connection between writing and idea generation went even further. When these children were asked to come up with ideas for a composition, the ones with better handwriting exhibited greater neural activation in areas associated with working memory — and increased overall activation in the reading and writing networks.

… Two psychologists, Pam A. Mueller of Princeton and Daniel M. Oppenheimer of the University of California, Los Angeles, have reported that in both laboratory settings and real-world classrooms, students learn better when they take notes by hand than when they type on a keyboard. Contrary to earlier studies attributing the difference to the distracting effects of computers, the new research suggests that writing by hand allows the student to process a lecture’s contents and reframe it — a process of reflection and manipulation that can lead to better understanding and memory encoding.

Current educational trends tend to emphasize vocational and pragmatic elements of education. Which subjects will help students get the most lucrative jobs? Which will make them the most competitive on a global stage? Which skills guarantee the greatest college-readiness?

Yet in the midst of our quantification, we’ve lost qualitative ground. In the age of numbers, we can’t justify teaching handwriting merely because it is beautiful, fun, and a building block for deeper communication and understanding of language. Instead, we dispose of it—at least until the studies come out, in all their number-crunching glory, to tell us that handwriting is actually worth something. Then, in a rather ironic twist, we discover that these qualitative skills actually hold some quantitative value, after all.

This discovery reflects our larger discussion of the humanities and their role in the modern sphere: we wonder what such studies are worth, when the modern job market seems to demand experiential, pragmatic skill sets. We vest importance in what you can do, not how you can think.

Yet the new data on handwriting seems to have some people, formerly dismissive of handwriting’s importance, conceding ground. One Yale psychologist admitted that “Maybe it [writing by hand] helps you think better.”

Some people have always believed handwriting to be beautiful and important. They know writing by hand has helped them connect meaningfully with information, in addition to helping them communicate clearly with others. But perhaps there is a level of sentiment in such value-based affection. Thankfully, we now have the data to prove that, aside from its qualitative benefits, handwriting serves important quantitative purposes, as well. Hopefully some teachers (and students) will see these truths, and take note.


When Automation Comes for the Professional Class


Last week Michael Strain composed a compelling summary of why technology’s effects on employment are now feared at levels perhaps unseen since the first Luddites of Britain’s Industrial Revolution. As he wrote,

There is no question that technology is already having a major impact on the labor market. Over the last several decades, employment in Western economies grew in both low- and high-skill occupations, but fell in middle-skill occupations. That’s because middle-skill, middle-class occupations are those that can be most easily replaced by technology. (Think of a 1970s-era bank that employed a president, a bank teller, and a custodian. Today, it’s the bank teller who’s gone, replaced by an ATM.)

Many professionals perusing this site will read that paragraph and cluck to themselves with mild concern at the plight of the former bank teller, but reassure themselves that their job is high-skill, and that they couldn’t possibly be replaced by a machine. Middle-skill jobs are for the undereducated, after all. Yet in the latest issue of City Journal, John O. McGinnis makes the case for seeing even such an august profession as the law as being composed of a great many essentially middle-skill jobs, almost all of which are increasingly coming within the competence of computers. As he puts it,

Law is, in effect, an information technology—a code that regulates social life. And as the machinery of information technology grows exponentially in power, the legal profession faces a great disruption not unlike that already experienced by journalism, which has seen employment drop by about a third.

McGinnis surveys all the legal work computers are already starting to take up, and finds that “Discovering information, finding precedents, drafting documents and briefs, and predicting the outcomes of lawsuits—these tasks encompass the bulk of legal practice.” Just as a welder may once have thought his work too exacting to be done by brute robotics, so professionals think of their work as too humane for a computer to understand. This mindset makes the classic mistake of assuming that computers would have to replicate human phenomenology in order to compete with human activities. Instead, all a computer has to do is perform a task with greater speed and in greater volume. An initial lawyer is needed to guide the programmer, and a final lawyer will for the foreseeable future be needed to tidy up the finished product. But the hordes of junior associates laboring in discovery and research can be neatly replaced by search engines powered by the artificial intelligence of IBM’s Watson, the computer with the language skills to vanquish Jeopardy’s human champions.

Just as rapid-turnaround reporting is already starting to be done by algorithm, so too will many traditionally high-skill workers find their skills degrading in the eyes of the marketplace when automation comes for them.



Colorado Makes an End Run Around the FDA

Kianna Karnes was a 41-year-old mother of four who was diagnosed with kidney cancer in 2002. Her doctors prescribed interleukin-2, the only medication approved by the Food and Drug Administration (FDA) at the time to treat the disease, but it proved insufficient to stop the cancer’s spread. Kianna’s family petitioned the FDA to allow her to try an investigational new drug (IND) still stuck in the agency’s approval process. After all, she had nothing to lose. Despite gaining powerful allies, including Congressman Dan Burton (R-Ind.) and the Wall Street Journal editorial board, the family’s effort came too late: the FDA approved Kianna’s IND request the very same day she died.

Kianna’s is just one of countless stories of patients whose lives have been put in jeopardy by the FDA’s dangerously inefficient drug approval process. Now some states are taking it upon themselves to make the necessary reforms. Last week, Colorado became the first state to sign a so-called “Right to Try” bill into law, empowering patients to petition pharmaceutical companies for INDs once they’ve exhausted all other options. While the push for Right to Try reform is compelling, its fate remains uncertain given the FDA’s history of reluctance to allow patients access to INDs.

While Kianna’s case was critical in drawing attention to the FDA’s bureaucratic approval process, it is by no means the first time the problem has gained national attention. In the 1980s, Milton Friedman wrote and spoke about the FDA’s perverse incentives in his book and television series, Free to Choose. Friedman explained that it is much riskier for the FDA to approve a drug than disapprove it, since the agency can be publicly blamed if a bad drug goes to market, but nobody would know if a good drug is rejected. As a result, “we all know of people who have benefited from modern drugs,” yet “we don’t hear much about … the beneficial drugs that the FDA has prohibited.”

Granted, the approval process has marginally improved since the era of Free to Choose. In the late 1980s, the FDA introduced Expanded Access Programs (EAPs) in response to the AIDS crisis, allowing patients access to potentially life-saving INDs. However, the federal agency’s bloated bureaucracy has unsurprisingly crept into this compassionate use program. As Christina Corieri of the Goldwater Institute explains, “[F]rom 1987 until 2002, the FDA approved only 44 treatment IND applications for conditions ranging from AIDS to chronic pain – an average of less than three per year.”

The FDA also grants individual IND applications for specific drugs, but that approval process is backed up as well. Patients must appear before an institutional review board (IRB) at a regional medical facility. However, many IRBs meet only once a month and can be located hundreds of miles from a patient’s home. Consequently, countless lives may have been lost navigating the FDA’s bureaucracy.

Right to Try may soon allow America’s sickest to cut through the red tape. Originally drafted as model legislation by the libertarian Goldwater Institute, the bill has received bipartisan support, having been approved as a ballot measure in Arizona and signed into law in Colorado; it is currently awaiting Gov. Bobby Jindal’s signature in Louisiana. Specifically, the model legislation would allow pharmaceutical companies to provide terminally ill patients with INDs that have passed at least Phase I of the FDA’s approval process, and would prevent the state from bringing legal action against them. Read More…
