I once tutored a student who could write an A+ essay, and then get a D on her multiple-choice tests. In working with that student, I learned that these two exercises required entirely different skills. I learned that not all students test well—an unfortunate trait in this age of testing frenzy. The SAT and ACT rule supreme over the futures of prospective college students across the U.S. Want to attend an Ivy League school? The tests will determine your fate.
Thanks to a new experiment being conducted this year, liberal arts school Bard College is breaking this mold. While students can still submit a standard application, with the traditional list of SAT scores, GPA, extracurriculars, etc., the New York Times reports that students can also opt for a different (and in many ways, more difficult) project:
… Bard for the first time invited prospective freshmen to dispense with all the preamble, and just write four long essays chosen from a menu of 21 scholarly topics. Very scholarly topics, like Immanuel Kant’s response to Benjamin Constant, absurdist Russian literature and prion disorders. The questions, along with the relevant source materials, were all available on the Bard website. As for the four essays, totaling 10,000 words, they were read and graded by Bard professors. An overall score of B+ or better, and the student got in.
So you can send in your reading lists, club activity, academic references, and transcripts. Or you can write 2,500 words on the topic, “What is the Relationship Between Truth and Beauty?” Which exercise, do you think, is more beneficial to the student? Which measures their creativity—and which demonstrates their ability to jump through hoops?
Bard’s president, Leon Botstein, said the experiment is an act of “declaring war on the whole rigmarole of college admissions.” The typical admissions process picks students based on their best set of quantifiable skills. But this essay method requires and reveals students’ resilience, creativity, and erudition.
Not surprisingly, it’s a rigorous exercise, and many students did not complete the process. The Times reports that only 50 people ended up submitting essays—applicants aged 14 through 23, hailing from seven countries and 17 states. Nine submissions were not complete. All three homeschooled applicants were accepted.
However, as awareness of the program grows, it seems likely Bard will receive more applicants—from students who delight in thinking and writing, or perhaps from students who struggled with tests and classes and want a second chance. Of course, this process defies the quantifiable designations of a normal application process, and one must applaud Bard for defying the automated ease of the modern era. This application process, if it grows, will mean more work for all parties.
But it also offers greater goods to those involved: it stretches the application process from a mere filling out of forms, into a learning process itself. As one student essayist told the Times, “I thought about other colleges, but when I started working on the essays, I became sort of obsessed.” Bard’s experiment takes learning out of the classroom, and challenges students at the very outset of their academic career.
While the traditional college application process isn’t wrong, it does leave important knowledge—and important people—out in the cold. Perhaps this experiment will encourage other institutions to look with greater depth at students’ ideas, not just their GPA.
Last Saturday I had the honor of addressing the 50th anniversary meeting of the Philadelphia Society. The title of the meeting was “The Road Ahead—Serfdom or Liberty?” My remarks sought to suggest that conservatives should be more circumspect about their rote incantation of the word “liberty,” and that there may even be something to be said for “serfdom,” properly understood. My remarks in full are printed, below.
“The Road Ahead—Serfdom or Liberty?”
The Philadelphia Society Annual Meeting—50th Anniversary
Patrick J. Deneen, The University of Notre Dame
I would like to begin my remarks by calling to mind two commercials that aired at different points during the last five years. The first aired in 2010, and was produced by the Census Bureau in an effort to encourage Americans to fill out their census forms. It opens with a man sitting in his living room dressed in a bathrobe, who talks directly into the camera in order to tell viewers that they should fill out the census form, as he’s doing from his vantage as a couch potato.
Fill out the census, he says, so that you can help your neighbors—and at this point he gets out of his chair and walks out the front door, past his yard and the white picket fence, and points at his neighbors who are getting into their car—You can help Mr. Griffith with better roads for his daily car pool commute, he says—and then, indicating the kids next door, “and Pete and Jen for a better school,” and continues walking down the street. Now neighbors are streaming into the quaint neighborhood street, and he tells us that by filling out the census, we can help Reesa with her healthcare (she’s being wheeled by on a gurney, about to give birth), and so on… “Fill it out and mail it back,” he screams through a bullhorn from the middle of a crowded street, “so that we can all get our fair share of funding, and you can make your town a better place!”
The other ad was produced in 2012 by the Obama re-election campaign, though it was not aired on television and has since disappeared from the internet. It was entitled “The Life of Julia,” and in a series of slides it purported to show how government programs had supported a woman named Julia at every point in her life, from preschool funding to college loans to assistance for a start-up to healthcare and finally retirement. In contrast to the Census commercial—which portrayed a neighborhood street filled with people who knew each other’s names—“The Life of Julia” portrayed a woman who appeared to exist without any human ties or relationships, except—in one poignant slide—a child who had suddenly appeared but who was about to be taken away on a little yellow school bus and, as far as we’re shown, is never seen again. No parents, no husband, a child who disappears.
The first ad is a kind of Potemkin village behind which stands the second. The first ad shows a thriving community in which everyone knows each other’s names, and as you watch it—if you aren’t duped by what it’s portraying—you are left wondering why in the world we would need government to take care of our neighbors if we knew each other so well. Why is my obligation to these neighbors best fulfilled by filling out the census form? The commercial appeals to our cooperative nature and our sense of strong community ties to encourage us to fill out the census form, but in fact—as the commercial tells us—it does so in order to relieve us of the responsibility of taking care of each other; perhaps more accurately, it reflects a world in which we increasingly don’t know our neighbors’ names, and instead turn to the government for assistance in times of need.
The second commercial is what lies “behind” the Potemkin village of the first. Read More…
After perusing some old bookshelves and boxes, I discovered an old nature diary. These excerpts were written by my 11-year-old self:
On Monday, July 23, my family and I went huckleberry picking in the forest. As we were looking for a good place to start picking, Mom cried, “Look at the baby deer!” Dad stopped the car, and we saw a doe dash behind a bush. The baby deer stayed for about five minutes, and then disappeared.
After picking at least a gallon of huckleberrys [sic], we went back to where we had had a picnic, and washed our hands in the creek. Then we looked for rocks, and I found a piece of petrified wood.
On the way home, Katie looked for more deer, and soon called out, “A buck! A buck!” Dad turned around the car, and we saw the buck run away! After continuing our drive Katie called, “A doe! A doe!” I barely saw her, and I didn’t have enough room to draw her, and so I didn’t include her.
This bed is all a-bloom. I investigated it thouroughly [sic] and not a flower does not have intricate purples, blues, whites, pinks, and vivid strawberry. This bed also, is filled with frilly purple, triumphant yellow, soft strawberry pink, luscious green, and a few hints of snowy white.
I must say spring has come swiftly and sufficiently. When I enter the house, it looks so dark and gloomy compared to the fresh, yellow-white sunlight that streams through branches outside.
It is terrible writing. But it’s also interesting. I see a little girl who loved the outdoors, and loved beauty—and she was trying, desperately, to capture these moments before they faded away. I didn’t want to forget the first time I saw a piece of petrified wood or baby deer. I wanted to capture the beauty of a garden rose. The words are an interesting juxtaposition of almost terse journalistic chronicling, and an exaggerated romantic style.
I didn’t know it then, but I was writing to an audience of selves. Years pass, and a new self visits the journal—someone with greater experience, skepticism, knowledge. This new person reads with fresh eyes and perhaps amusement—but aloof as they may seem, they still feel kinship with the writer of the past.
This is the horror and wonder of writing: we know we will return to ourselves, five or 10 years down the road, and see a face we’ve forgotten. We will look closely at a soul whose face we know, but whose expression and turn of phrase are now old-fashioned (and sometimes cringe-worthy) to our eyes.
There are two ways to “read where you’re from,” and I think both are valuable exercises. The first is to revisit old diary entries, scribbles, photographs, and personal collections, such as the above. Not everyone is a writer, but almost every child collects memories from their childhood in one form or another. Revisiting such materials gives us a window into who we were, and the ways we’ve grown—or stayed the same. Revisiting old notebooks and diaries from childhood requires a sense of humor, perspective, and candor—to see oneself clearly, then and now. Read More…
Andrew Leonard has a fascinating, if discursive, profile in Salon of the novelist Richard Powers. In it, Powers registers a note of discontent with the proliferation of both pop-musical content—the “insane torrent,” as he puts it in his latest novel Orfeo—and all the digital mechanisms that deliver it today.
“Suppose you were born in 1962 and you are coming into your own and music starts to become essential to you,” says Powers. “You are right on the tail end of that sort of folk rock thing, but you are aware historically of how these guys were revolting in a way against the previous generation. And because of the nature of the distribution mechanisms that you talk about, where it’s two radio stations and one record store, there’s a saturation effect for whatever is in vogue and there has to be a countervailing cultural move just to refresh our ears. And that’s the start of punk.
“So you can see these revolutions and counter-revolutions and you can see a historical motion to popular music and it’s thrilling and you want to know what happens next. The state that you just described of permanent wonderful eclectic ubiquitous interchangeable availability — there’s no sense of historical thrust.”
Leonard sums up the point thus: “Universal ubiquity has blown up the narrative.”
There’s something to this, I think. Granted, I’m pushing 40; I’m not as hip to trends as I was when I was a kid, and certainly not as hip as when I was covering them for a daily newspaper. But it’s undeniably the case that it was once fairly easy for a casual music fan to keep abreast of the latest thing. Right up until the early aughts and the passel of “The [monosyllabic plural-noun]” bands playing neo-garage rock: that’s the last time I can remember there being an identifiable “movement” in rock music.
Now there is no one thing.
As a caveat, I will admit it’s somewhat rockist to think along these lines. Just because Pharrell Williams or Bruno Mars or Robin Thicke aren’t trad-rock frontmen, why can’t they be seen as the vanguard of an eclectic dance-pop revival?
And part of the fragmentation that Powers laments isn’t even technologically driven; it’s generational and racial. Much of what we would’ve called rock music 30 years ago now falls under the rubric of country music. It’s Top 40 for white people. It’s blues-based rock tricked out with splashes of fiddle and pedal steel. (Bruce Springsteen astutely noticed this recently: “country music is kind of where rock music has gone, really, at this point … It’s basically kind of pop-rock music … It’s where rock music continues to have a certain currency.”)
So a good chunk of the energy in rock-ish songwriting is devoted to the Nashville machine, which critics don’t take terribly seriously; hence they don’t bother to suss out trends that might fit on the continuum of Powers’s reaction/counter-reaction “historical thrust.”
All that said, I suspect Powers is right. We’ve probably seen the last of swing-turns-to-bop or prog-is-wiped-out-by-punk cultural shifts in popular music. Powers seems to quite openly admit that these shifts had been abetted by self-styled tastemakers, who have seen a drastic decline in influence over the last 15 years. He argues unabashedly that the culture desperately requires those media filters.
I don’t know if I’d go quite that far. Then again, I say that having come of age in a culture that had filters. I say I can do without them. But I was shaped by them. What about my kids’ generation: how will they know what’s good, what’s worth listening to, what (as a Clash fan would have put it) matters?
They’ll be free of a “narrative” imposed from above. Instead they will have the “insane torrent.”
Everything will be at their fingertips.
But perhaps they will miss greatness right under their noses.
Rarely do opinion pieces in college newspapers emerge as subjects of national controversy, but a recent essay by Harvard student Sandra Y.L. Korn has generated widespread denunciation among conservatives. Her essay—entitled “The Doctrine of Academic Freedom”—argues for dispensing with longstanding commitments to “academic freedom” in favor of what she calls “academic justice.” Academic freedom permits the airing and defense of any and all views, but she rightly notes that some views have come to be largely unacceptable in academia today. Since such views are not only socially unacceptable, but often discouraged or even prohibited as a matter of university policy, why should they not also be banned when they are articulated as findings of faculty research?
If our university community opposes racism, sexism, and heterosexism, why should we put up with research that counters our goals simply in the name of ‘academic freedom’? Instead, I would like to propose a more rigorous standard: one of ‘academic justice.’ When an academic community observes research promoting or justifying oppression, it should ensure that this research does not continue.
As might be expected, Ms. Korn’s essay has provoked strenuous criticism, including accusations of “academic totalitarianism,” investigations into her personal background aimed at exposing her as a limousine liberal, and criticism from at least one highly visible blusterer in the conservative media.
The default position of these conservatives is that Ms. Korn is attacking the holy of holies of the academic enterprise—academic freedom. In other words, mainstream conservatives have adopted the view of … John Stuart Mill, the lion of liberalism. The same John Stuart Mill who stated that most stupid people are likely to be conservative. And perhaps he had a point. Because academic freedom is not a particularly conservative principle. Academic freedom has been the vehicle by which the universities have been transformed into the liberal bastions they are today, yet it is now the inviolable principle that conservatives are rallying around in their denunciation of a Harvard undergraduate. True to form, and as I have argued in a previous column, American conservatives tend to be subject to drift, and almost inevitably end up occupying the territory once held by liberals, who have meanwhile moved further leftward. Their rallying to an apparently contentless “academic freedom” is a particularly vivid case in point.
I agree with Ms. Korn—academic institutions inevitably are dedicated to some substantial commitments, and (often with difficulty) attempt to “patrol” those boundaries, if not with sticks, more often by populating their institutions with people who generally share those commitments. “Academic freedom” was the means by which the substantial commitments once held mainly by religious institutions were initially destabilized and eventually rejected, and provided the cover for their replacement with a new set of commitments. “Academic freedom” purports to be an openness to all views and opinions, but itself contains an implicit philosophy that eventually becomes manifest in, well, Sandra Korn (who is entitled to the confident assertion of the victory that has been won on today’s campuses by her teachers). Read More…
Mary Jo Anderson writes about the false compassion of euthanasia in Crisis Magazine:
The truth that lies underneath the “rights” rhetoric is who will decide what constitutes a quality of life and at what cost. Theodore Dalrymple is an English doctor, psychiatrist, and author of Our Culture, What’s Left of It. Dalrymple wrote, “Euthanasia has a tendency to slide from the voluntary to the compulsory, as people increasingly make judgments on behalf of others as to what is a human life worth living.”
This is a temptation that many human rights advocates can be susceptible to: in desiring to help others, we often choose to reform them into our own image. America’s compassionate conservatives often fall prey to this tendency: out of a sincere desire to help people, they seek to re-shape another person’s life or will into their own.
Anderson’s post highlights a new bill passed in Belgium on February 13, one that permits euthanasia for young children. “We can no longer pin a wig over the bald truth of the culture of death,” she writes. For pro-life advocates, the move smacks of the same imperative judgments that characterize so many pro-choice arguments (i.e., that the mother, specifically, is allowed to decide what’s best for the child and her family).
This bill is an example of what Anderson calls “a right to die”: it’s “thought to be a compassionate, advanced policy,” giving people the ability to decide just how much pain and hardship they are willing to suffer. Of course, making such a decision necessitates that we assume a sort of omniscient posture: if we decide whether life is still worth living, we must have a very specific and (we hope) accurate understanding of what life is about, what it’s worth, and how much pain we can handle.
This is how much of the right handles non-life matters. Of course, conservatives may gasp in horror at the thought of aborting or euthanizing babies. But conservatives may also look at homeless people on the street, and tell them, “You should go get a job.” Isn’t this also a case of gross assumption—of putting someone else in your shoes, and commanding them to walk according to your path, willpower, and context? When we decide one country “deserves” democracy, or another should be “liberated” from its oppressors—often such words contain grains of truth. But what we really mean is, “Let us (our military, more often than not) come over and fix your problems, so that you can be just like us.”
Religious thought contains an important truth on this subject: most traditions hold that humans are not omniscient. We were created by Another, a Being with infinite knowledge and discernment. We did not determine our birth. For many, this truth implies that we should not determine our end: our times are in God’s hands, as the Psalmist writes in Psalm 31.
This means we cannot look at our brothers and assume we know their best life (or death) course. Jesus told the story of a tax collector and Pharisee in Luke 18:9-14—the Pharisee thanked God that he was not “like this tax collector, this sinner.” But in reality, it was the tax collector, humble and contrite, who went home justified.
When Jesus had mercy and compassion on an individual, interestingly enough, He often gave them what they asked for. There were cases in which He told people, instead, of sins they needed to leave or truths they needed to confront. But in each and every case, He loved first.
Compassion: it’s difficult to turn such an emotion into right action. So often, we are separated in thought and soul from truly knowing others. We think we know what is right or good, but this lack of omniscience makes it difficult to rightly run people’s lives. Perhaps we should not try—instead, we should just love first.
Why is it that today’s liberals have become the most ardent cheerleaders of arbitrary monarchy? Wasn’t liberalism born of the effort to limit arbitrary rule of a single, unelected ruler?
No, I’m not suggesting that the Left has suddenly decided that they regret the American Revolution. But, in nearly every leading liberal magazine, newspaper and blog, there is a growing excitement and hope that Pope Francis will change the Roman Catholic Church’s “policies” on birth control, male celibate priesthood, homosexuality, gay marriage, divorce and (some, at least, though far fewer) abortion. They have celebrated the appointment of Pope Francis as a sign that the Church is finally going to join the modern world, and fervently hope that he will simply declare that those teachings are no longer valid and embrace today’s accepted orthodoxies. They yearn for executive fiat.
It is striking to witness this palpable longing in juxtaposition with the absence of any real concern on the Left about possible abrogations of the rule of law arising from President Obama’s decision to suspend the “employer mandate” until 2015, and with the general support for the President’s assertion that, in the face of Congressional opposition, he has recourse to the “Pen and the Phone.” And, after a season of accusatory lamentation about Pope Benedict’s authoritarian treatment of the “Nuns on the Bus,” there has been deafening silence from the Left over the Obama administration’s decision to go to court to force the Little Sisters of the Poor to violate their consciences by accepting the provision of contraception, abortifacients, and sterilization.
Liberalism was born, the story goes, as a reaction against arbitrary and unlimited rule by monarchs. Yet, today’s liberals seem to adore executive power when it’s used to effect their preferred ends, even hoping that one of the only remaining “monarchs”—the Pope—will single-handedly change the “rules” of the Church. They wish to exchange “fiat” in the sense of “let it be done” to “fiat” in the sense of “do as I say.”
(Of course, “conservatives” don’t escape from this general inclination—they tend also to be ardent supporters of expansive executive power when one of their own is in office, and it is generally conservative intellectuals who have been most interested in developing theories about active executive power.)
What happened to limited government, you might ask? I answer: exactly what liberalism promised. For, liberalism was never about “limited” government, but the pursuit and exercise of potentially limitless power toward seemingly “limited” ends of securing Rights. Read More…
Few things annoy academics more than being told that their work is irrelevant. So there’s nothing surprising about the backlash against Nicholas Kristof’s column in Sunday’s New York Times. Kristof contended that America’s professors, especially political scientists, have “marginalized themselves” by focusing on technical debates at the expense of real problems, relying too heavily on quantitative methods, and preferring theoretical jargon to clear prose. An outraged chorus of responses (round-up here) rejected Kristof’s generalization as a reflection of the very anti-intellectualism that he intended to criticize.
Some of the responses to Kristof reflect the expectation of public recognition for every contribution to the debate that makes so much academic writing a chore to read. Even so, Kristof ignores fairly successful efforts to make scholarship more accessible—even if you don’t count every blog by every holder of a Ph.D. The Tufts professor and Foreign Policy contributor Daniel Drezner has more than 25,000 Twitter followers—partly due to the success of a book that uses a zombie apocalypse scenario to compare theories of international relations. The Washington Post recently picked up The Monkey Cage (full disclosure: several of my colleagues at George Washington University are contributors) and The Volokh Conspiracy, which are populated by political scientists and legal academics, respectively. Not to mention Kristof’s own employer: just the day before Kristof’s piece ran, the Times hired Lynn Vavreck to contribute to a new site concentrating on social science and public policy.
At least when it comes to political science, then, it’s just not plausible that “there are fewer public intellectuals on American university campuses today than a generation ago.” On the contrary, there are probably more academics who try to communicate with non-specialist audiences than there were in 1994. One difference is that the public intellectuals of past decades were more likely to engage directly with normative and historical Big Questions. That change reflects the declining influence of political theory in comparison with causal analysis, as well as the weakening of the Cold War imperative of justifying liberal democracy.
Of course, writing for non-specialist readers isn’t encouraged in graduate programs, and doesn’t often align with the requirements for hiring and promotion. But there are anecdotal reasons to think that expectations are slowly changing, as departments struggle to prove their ‘relevance’ in a period of financial retrenchment. If Kristof’s piece promotes these changes, it will have served a valuable function whether or not its argument is compelling.
In fairness to Kristof, however, none of these observations refutes his basic claim. That’s because he isn’t actually talking about “public intellectuals.” Rather, he means old-fashioned mandarins, who move easily between Harvard Yard and Washington, usually without encountering many members of the public along the way. In a follow-up on Facebook, Kristof observes that “Mac Bundy was appointed professor of government at Harvard and then dean of the faculty with only a B.A.—impossible to imagine now.” After nearly a decade as dean, Bundy joined the Kennedy administration as national security adviser, where his vast intellectual firepower led him to promote and defend the Vietnam War.
What Kristof really offers, then, is less an argument for public engagement by scholars than a plea for another crop of Wise Men who lend conventional wisdom the authority of the academy. Not coincidentally, he presents as an exception to the trend toward academic self-marginalization the former Princeton professor and State Department official Anne-Marie Slaughter, whose resume is as perfect a reflection of the meritocratic elite that Bundy helped create as Bundy’s own pedigree was of the old Establishment. More professors should learn to participate in public debate—including political theorists frustrated by the increasingly technical orientation of political science. But if ‘relevance’ means becoming mouthpieces of our new ruling class, then Kristof can keep it.
“Use every man after his desert and who shall ‘scape whipping?”
There’s a narrative that comes up whenever addiction is discussed publicly nowadays: the narrative in which the disease of addiction essentially replaces a person’s free will.
The barroom-wisdom version of it is the old line, “First the man takes a drink. Then the drink takes a drink. Then the drink takes the man.” A fairly heartbreaking version of it comes in this interview with author (and father of an addict) David Sheff: “Once I started realizing that my son was not making choices, I went from being angry and judgmental to being able to look at him with compassion: He’s sick; he needs help. It also allowed me to figure out what I needed to do: He’s sick; he needs to be treated.”
This narrative is, to put it mildly, not uncontested. Lots of addicts don’t agree with this description of our problem. Lots of non-addicts also disagree, typically more virulently (but I guess I’m biased), and insist that addicts are doing it on purpose: That’s a narrative I seem to see more on the right than on the left, though these are personal enough matters that ideological categories get blurry. The whole debate over how fully you want to reify the disease metaphor (which is what it is—that’s not bad, metaphors are a normal and poetic part of human understanding) gets tangled up in related debates about Alcoholics Anonymous, metaphors of surrender, treatment and/as/vs. punishment, and the promotion of personal responsibility.
There are five things wrong with an overemphasis on disease at the expense of free will—but the reaction against the disease metaphor often merely serves to strengthen it. First, the five things:
If you couldn’t understand what your family was saying, would you understand them better or worse?
Nina Raine’s “Tribes” opens with four Britons hurling abuse at each other around the kitchen table. I think it’s supposed to be funny, but it’s mostly just crass and painful: Mom, Dad, brother, and sister describing one another’s passions, hopes, beliefs, and sex lives in the most contemptuous terms possible. The fifth member of the family is deaf, and yeah, you do feel that perhaps he’s the lucky one.
As the play moves forward, the younger characters get shades and nuance. (The parents, and especially the cartoonishly self-centered father, remain pretty much the same.) Daniel (Richard Gallagher), the hearing son, shows flashes of haunted vulnerability which reveal a gulf of misery under cover of vituperation. The entire family has raised Billy (James Caverly, who starts off with a beatific smile which is clearly at least partly a mask or role) to read lips rather than to sign. They’ve developed an ideological resistance to anything which smacks of Deaf culture.
They genuinely believe they’re protecting Billy, but they’re also terrified of losing their beloved son and brother to a culture which can promise him a kind of belonging they can’t offer. When that masky smile finally slips and Billy says that they view him as the family mascot, the audience can tell that it’s not true: If anything, he’s the family conscience, the only one they allow to be good, the only one they’ll openly love. Of course, he’s also the only one they never need to listen to.
When the play begins, the family is all trapped together. The hearing children, Daniel and Ruth, have retreated to the family home after a series of romantic and professional defeats in the outside world. (“I feel like a bonsai tree!” Ruth yells, in a line which got big, empathetic laughs.) Billy never left, has never had a job or a girlfriend. One of the major themes of the play is the fact that belonging is rarely chosen; you don’t get to pick the elements which make up your identity, the ties which bind. You can try to leave—and seriously, Daniel at least should do everything in his power to get out of his parents’ house, because they’re actively damaging his psyche; this isn’t a play about the comforts of home—but you will eventually have to return, if only to give an account of yourself.
There are some terrific little moments (the play’s humor eventually does become actually funny), often involving how much impromptu “sign” this resolutely anti-sign-language family uses. There are tough, basically unanswerable questions about how language shapes us and separates us from others: As Billy’s new girlfriend goes deaf, she wonders if she’s losing the ability to understand nuances and ambiguities which can’t be expressed in sign. Read More…