Because most of the traditional pathways to adulthood—marriage, economic independence, stable job—seem out of reach or prove to be reversible, working-class young adults have developed a new definition of maturity. This new pathway relies heavily on therapeutic culture: You become an adult by overcoming the trauma of your past, whether that involved abusive parents, drug addiction, mental illness, or less flamboyant hardships. Young adults who take on this new definition focus on protecting the fragile self, and they reject solidarity and close, committed relationships in favor of individualistic, judgmental competition.
This is the basic thesis of Jennifer M. Silva’s insightful, frustrating new book, Coming Up Short: Working-Class Adulthood in an Age of Uncertainty. “At its core,” Silva writes, “this emerging working-class adult self is characterized by low expectations of work, wariness toward romantic commitment, widespread distrust of social institutions, profound isolation from others, and an overriding focus on their emotions and psychic health.”
Silva interviewed 100 white and black working-class adults, defining “working-class” as native-born Americans with native-born parents who didn’t have college degrees—Silva herself fits this definition. A lot of what she found rang very true to me from my own conversations with young adults in low-wage jobs, although I suspect her findings don’t generalize quite as much as she thinks: The worldview she describes has definitely influenced some of the women I counsel at the pregnancy center, for example, but they’re also shaped by communal ties, religious history and their own religious fervor, and a culture of childbearing.
Silva finds relatively little of the “redemption through procreation” language which men expressed in the recent Doing the Best I Can: Fatherhood in the Inner City, and none of the catastrophic romanticism which drove premarital relationships for the “red-state” young adults in 2011’s Premarital Sex in America. So she’s only capturing part of the picture of working-class or poor young adulthood.
And there were many times when I felt the heavy hand of the medium upon the planchette, as Silva seemed to overinterpret her interviewees’ words to drag them closer to her preexisting theories. She often quotes an interviewee and then summarizes the remarks in terms that don’t really capture what I would mean if I said those things, or what the people I’ve actually heard say them seemed to mean. She talks a lot about “social institutions” and how there should be more of them or how they’ve failed, but the only ones she actually names in a positive context are unions, and she relies almost exclusively on government intervention for signs of hope.
Like I said, it’s a frustrating book. But its primary insights are important and true. There is a working-class therapeutic culture, that culture did arise in large part because of the disruption of older institutions which provided stability and a source of identity, and that culture is intensely prone to self-blame and judgmental “crabs in a barrel” sniping at other working-class people who aren’t sufficiently self-helping.
A plethora of recent studies have delved into the effects social media has on our brains. As part of a widespread neurological fascination in social science, these reports raise various concerns about the effects new technological platforms have on our social consciousness.
The University of Michigan reported last month that Facebook, while cultivating an addictive fixation in its users, also fosters depression. Why? “Based on the responses from the participants,” TIME reports, “the scientists speculated that the Facebook users were comparing themselves with their peers, and many were feeling inferior as a result. Users also reported frustration and a ‘lack of attention’ from having fewer comments, likes and feedback compared with their Facebook friends.”
Social scientists are also musing over the growing “selfie” trend – and trying to determine its ethical implications. Is our pouty-picture-taking symptomatic of typical youthful immaturity, or is it indicative of some deeper cultural phenomenon?
These are just a few minor grace notes in a growing public dialogue on “narcissism”: whether we are more narcissistic than in ages past, how social media affects our self-absorption, and whether we need to change. Dr. Jean Twenge, a professor of psychology at San Diego State University, believes that younger generations are “increasingly entitled, self-obsessed, and unprepared for the realities of adult life,” according to a recent New York Times article. Slate author Katy Waldman felt inclined to agree with her:
Look at the rise of plastic surgery, our painstaking attention to Facebook and Twitter profiles, our selfies, our relentless focus on self-improvement, and “soccer teams that give every kid a trophy” … Could it be that she is onto something? How else to explain that Twenge’s thesis just feels right?
Some believe this “epidemic” of narcissism is a product of our digital era. According to another study, this one conducted by the University of California, Los Angeles, the language used in books shows the trend developing over the past 200 years. “The currently discussed rise in individualism is not something recent but has been going on for centuries as we moved from a predominantly rural, low-tech society to a predominantly urban, high-tech society,” said psychology professor Patricia Greenfield, who conducted the study. Words like “unique,” “individual,” “self,” “feel,” “choose,” and “get” increased significantly over time. Words like “authority,” “belong,” and “pray” have become rarer.
The “selfie” is a commonly cited example of the narcissistic tendency in social media. One psychologist told TIME that the self-captured image could be beneficial, since it allows “young adults and teens to express their mood states and share important experiences.” But not everyone sees the “selfie” as an accurate depiction of the self. Brett McCracken, in a Mere Orthodoxy blog post, argued that the self-projections we present on social media are deceptive both to others and to ourselves:
Social media’s “what are you doing now?” invitation to pose, pontificate and consume conspicuously only amplifies the narcissistic presentism of the generation depicted in The Bling Ring. It makes it easier than ever to tell the world exactly what you want them to know about you. Through a carefully cropped and color-corrected selfie, depicting whatever glamorized “now” we think paints us in the best light, we can construct a public persona as we see fit.
Are Instagram selfies and Twitter posts truly making us more narcissistic? Or are we merely viewing a greater publication of old human tendencies?
I would argue the latter. While the digital era could be making us more narcissistic, it seems more likely that it is doing exactly what it was created to do: showcasing the self in all its glory. As long as humans have roamed the planet, they have had selfish tendencies. But prior to the rise of social media, they did not have a global platform for this grandstanding. High school, with all its intense and sticky drama, was documented profusely in letters and diaries, but it rarely appeared in public discourse. Now, via Facebook, young adults have a communal outlet for their emotionally turbulent lives. Facebook is, in many ways, the new diary.
In addition, arguing that young adults are more narcissistic than ever seems too simplistic. Many young people are immature and tend to be self-interested. It is only with experience and age that we learn the planet does not revolve around us. Once again, this generation has a wider global platform for self-pontification than ever before, which may skew the apparent narcissism balance against them. Not to mention that the young are less adept at disguising their selfishness. With age, we learn how to hide our self-absorption behind facades of one sort or another. Time makes some humble; it makes others clever.
It is true that individualism, and life dissociated from family and community, is more widespread today than in past civilizations. It could be that this individualism has fostered our egotism. The person who lives with family or roommates must learn patience, self-control, and sacrifice. The individual, however, need not reconcile himself to other people’s wants and desires. Personal wants reign supreme. This obviously cultivates a different set of values and perceptions in the individual. But one must guard against the tendency to stereotype and predict behavior based on such perceived “trends.”
The human mind is not one massive mound of ever-growing egotism. It is a complex, varied, wondrous thing — scarred with the pains of experience, spotted with sin and excess, ever shifting with the cadence of human life. While we should be wary of social media’s influence on our mental and spiritual growth, we needn’t discount the online experience because of a few paltry “selfies.” One of the beauties of social media is its ability to connect and share – to teach us more about each other.
Ali Eteraz, author of Children of Dust, wrote on Medium last Tuesday that words are losing their potency and power. He believes the mighty Instagram, with its frozen pixelated memories, is our future. Though perhaps waged unconsciously, he said, “Instagram and its cousins represent an undeclared war on writing. On words.”
Eteraz believes that in the beginning, the Internet encouraged words. Text statuses reigned supreme in the first days of Facebook. But then, a change slowly began to develop:
First, by progressively smaller bursts of text (websites became blogs, became status updates, became 144 character tweets), and then through the enthronement of the image. Whether it is moving pictures (Youtube, Vimeo, Liveleak), or photo-sharing sites like Instagram, Pinterest, and Snapchat, it goes without saying that we are well on our way to communicating with each other by way of pictures.
There is one important difference, however, between the sorts of “words” projected on the Internet and those used by storytellers throughout time. In the print era, newspapers and books told stories of the other: of the powerful in Washington or on Wall Street, of a neighbor’s child who won a spelling bee, of Bobby Jones the golfer or General Robert E. Lee. Outside the private introspection of the diary or the public reflection of the memoir, stories of the self were relatively limited.
But in the world of the Internet and social media, another narrative has begun to reign supreme: self-narration. Facebook and Twitter statuses amount to constant self-publication. This public diary has wooed us away from the storybooks; after all, between General Lee and me, whom will my ego find more fascinating?
With the advent of Instagram, self-describing words became self-describing pictures. Instead of striving to help the reader know and understand “who I am and how I feel” via statuses, the Instagram image asks the viewer merely to see who I am, and to know “me” on that basis. Eteraz does not view this as an alarming trend – except perhaps selfishly, he surmises, as a writer who wants to save his income. For most of society, he supposes, such a trend is normal:
After all, we are descendants of cavemen that told their stories upon stone walls by way of images. And we are descended of societies where the primary language was the hieroglyph, which is nothing more than words represented in imagistic forms. From this perspective we shouldn’t show much concern if our societies transition away from words and move to communicating by way of the image.
But here again, Eteraz does not seem to notice that the very nature of what is communicated has changed. Hieroglyphics and cave paintings did not contain self-musing diary entries. Many contained histories and chronicles of kingdoms and clans, as well as ceremonial and religious messages. The “hieroglyphics” of Instagram rarely contain any of these things. As the image grows omnipotent online, the stories we tell are changing. News has increasingly migrated from text-focused outlets like the New York Times to the image-based commentary of BuzzFeed. Rather than reading “long-winded” titles, we watch long-winded Netflix shows. Eteraz sees this trend toward the image as inevitable:
…Most realists among the wordsmiths already know that short of some massive cataclysm that lays to waste the electronic grid that makes the delivery of images so easy, we are pretty much done for. Whenever I walk past the cinemas and the cafes with flat screen TV’s and look at our children tapping away at pictures on iPads and think about how no one cares about reading the lengths of Proust, or Yukio Mishima, or Qurratulain Hyder, the thoroughness of the wordsmith’s dispossession comes to mind…
But, he surmises, writers’ laments on the subject are merely selfish, as we cannot “live with the thought that [we] are also irrelevant. If there are no words there are no smiths.” This negates all the inherent and transcendent merits of words themselves. He does recognize – and devotes one sentence to – the thought that some writers believe “words have something to contribute to this world, something important.” But he does not develop this thought: What do words contribute to this world? Why do they matter?
There is neither time nor space to fully explore the importance of words in this post. But perhaps this short list will identify some reasons for words’ preeminence throughout time as the highest form of communication:
Hundreds of colleges and universities assign a “common reading” book to their students every summer – often for incoming or honors students, sometimes for the entire student body. The books are usually meant to represent the school’s values and philosophies, and to foster dialogue amongst students for the remainder of the semester. But according to the National Association of Scholars’ latest report, “Beach Books 2012-2013: What Do Colleges and Universities Want Students to Read Outside of Class?,” the reading program often falls short of excellence.
The report found that instead of old books, most chosen works corresponded to current events and cultural issues: environmentalism, bioethics, social justice, information technology, urban poverty, animal rights, the war in Iraq, etc. Out of 309 colleges, only four assigned books qualified as classics. Only nine books were published before 1990. Some would not see this as a problem; however, the report’s authors believe there is a wealth of scholarship purposefully excluded from most reading programs. “Colleges … choose easy, trendy books because they hope that this will induce students to read the book, or at least think favorably of it,” said report author Ashley Thorne in an email interview. “Instead, they could be challenging students to go beyond their comfort zones and raise their sights to books they will find difficult but rewarding. The problem is not that books with mass appeal are bad, but that books for common reading could be much better.”
Interestingly, the most common reading chosen for the 2012-2013 year was The Immortal Life of Henrietta Lacks. The book, published in 2010, centers on Lacks and the controversy over her laboratory-cultured cervical cancer cells. The NAS report saw the book as a compilation of the institutions’ favorite genres:
If colleges were primarily moved by the goal of teaching students something about science, we would expect that other books about science would also have registered strongly in the list of choices. That, however, did not happen. Accordingly it is probably best to conclude that The Immortal Life owes its popularity not to being a book about science but to being a book about science whose subjects—the Lacks family—happen to be black and poor and furnished with a victimhood narrative. In short, The Immortal Life of Henrietta Lacks is popular because it puts a racial grievance spin on science.
NAS President Peter Wood believes the book is an indication of the “soft manipulation” that often goes on at the modern university. Instead of espousing classical literature, schools often turn studies into an opportunity for acculturation. The widespread ideology amongst America’s colleges strives to re-create students according to its own understanding of race, gender, and class theory.
Anthony Esolen, a literature professor at Providence College, sees such a classifying tendency amongst his fellow professors as well, and condemns it strongly:
It does violence to the man to reduce him to such categories. It is an act of contempt for his humanity. It reduces him, not so that we may get to know him, but so that we can manipulate facts about him while not getting to know him at all. It is a study in subhumanity. That is exactly what schoolteachers, professors, and critics do to John’s art when they cram it into the pigeonholes of race, class, and gender. It is an act of violence.
Thorne called for colleges to use common reading programs as an opportunity to grow students’ knowledge of “our intellectual heritage” by including more works of classic literature. NAS published its list of 50 recommended common reading books – replete with both famous and lesser-known works, with authors ranging from Plato to Tom Wolfe.
It seems doubtful that universities already succumbing to “soft manipulation” will stop when they offer Plato’s Republic on their reading list. Rather, any inherent sexism or elitism in the work will immediately be ferreted out in discussion. This is the reductionism that Esolen refers to in his article, and it cannot be stopped by a mere exchange of titles. No book, classic or not, can offer a purely impartial alternative. However, by choosing a challenging and brilliant book, there is hope that even one reader will discover a morsel of truth that would otherwise have gone undiscovered.
On Monday, David M. Shribman accused Americans of ruining August through their cacophony of work, school, and frantic scheduling. He remembers when August was “an idyll of idleness, a time of pure ease” – but nowadays it has given way to work and school obligations:
“Not so long ago—well within the memory of half the American population—August was the vacation month. It was a time, much anticipated and much appreciated, of leisure, languor, lassitude and lingering at the beach well into suppertime… What we’ve done to August has made it the cruelest month: infuriating work and inescapable school obligations amid intoxicating weather.”
The New York Times actually wrote a similar story in August 2006, called “The Rise of Shrinking-Vacation Syndrome.” Mike Pina, a spokesman for AAA, told author Timothy Egan that “The idea of somebody going away for two weeks is really becoming a thing of the past. It’s kind of sad, really, that people can’t seem to leave their jobs anymore.” Egan blamed “the heightened pace of American life, aided by ever-chattering electronic pocket companions,” for crippling people’s ability to escape or just be “slothful.”
Since Alexis de Tocqueville’s famous study of American life in Democracy in America, our superior work ethic has been well documented. This work ethic has become central to our national identity – and, some might argue, our obsession. Social critic Morris Berman has written Why America Failed and Spinning Straw Into Gold, two books aimed at the nation’s work-fixated culture. He told the Atlantic in an interview last week that America is “essentially about hustling, and that goes back more than 400 years.” Americans, for the most part, lack true community or neighborly connection. They see careers, professional ambition, and prestige as means to “the good life.” In this mindset, life and its pleasures become results-obsessed.
Berman does think these trends have escalated in the recent past: he writes that as the U.S. began to “speed up” from about 1965 on, “a kind of industrial, corporate, consumer ‘frenzy’ took over, which meant there was no time for anything except getting and spending.” This fits with Shribman’s description of the new August: a month that is now results-obsessed in its educational and vocational pursuits. Classes for high schoolers can begin as early as August 5 (whereas Shribman documents a time when they began after Labor Day). College students return to campus mid-August, and summer travel has dropped by 30 percent. Americans, it would appear, are eager to achieve – not relax.
This frenzied environment may not stem entirely from technological advances (though gadgets certainly help) or even historical precedent. The country’s current economic situation fosters a sense of vulnerability and job insecurity, and this increases our desire to put in extra hours at the office. In turn, that economic anxiety may push students toward college and a degree with greater alacrity.
The imprudent tweet has endangered the career of many a professional—psychology professor and Congressman, to name two. Last week the University of New Mexico formally censured psychology professor Geoffrey Miller, who had tweeted in June, “Dear obese PhD applicants: if you don’t have the willpower to stop eating carbs, you won’t have the willpower to do a dissertation #truth.” The tweet sparked outrage online that spilled onto the pages of the Atlantic.
(You already know about the ex-Congressman.)
Using Twitter, it seems, involves the following trade-off: in exchange for accessing a low-level stream of nonsense and chatter, you take on the risk of damaging your reputation permanently with the One Bad Tweet.
That’s why professor Brian Leiter advises graduate students, who already face perilous career prospects, to steer clear of public social media of any kind.
“The evidence is that first impressions are ‘sticky,’ and there is way too much risk that a bad first impression will be created by unfortunate or out-of-context remarks on social media, rather than a student’s work,” wrote Leiter on his popular philosophy blog, Leiter Reports, in response to a concerned graduate student’s request for advice.
A hiring committee member might overlook your brilliant, considered work on metaethics if he only recalls a gaffe like “Deontologists are full of crap screw you guys #TeamUtilitarian,” “So hungover during metaphysics lecture #YOLO” or “I think that Mitt Romney is a decent guy.”
Predictably, Leiter’s advice set philosophers atwitter on…Twitter. And, predictably, several of the tweets in response were under-thought and drew Leiter’s (likely permanent) ire, promptly proving his point.
But philosophy professor Rani Lill Anjum, who maintains a list of philosophers on Twitter, sings the platform’s praises. Initially a skeptic, she now encourages more academics to join her: for advice and encouragement, and even for help working through philosophical conundrums.
Twitter thinking is the opposite of philosophical thinking. John Campbell defines philosophy as “thinking in slow motion. It breaks down, describes and assesses moves we ordinarily make at great speed.” Twitter, by contrast, is half-thinking at blinding speed. So why tweet?
The difference between Leiter and Anjum seems to come down to two issues: how well Twitter users are able to exercise restraint, and how well they can sift through the blather to find useful information. Anjum finds Twitter an “utterly friendly and supportive” place for philosophical discussion. But Leiter’s not buying it: “Twitter is (by and large) a lot of childish noise, and I think only in the mind of twitter users is it shaping the world,” he writes.
That’s fair enough, but perhaps Professor Leiter underestimates the ability of academics like Anjum to create a Twitter world largely sheltered from the noise and blather. For Anjum, Twitter is a place where a bunch of philosophers share ideas and encourage each other, with negligible occasion to embarrass themselves. It’s a small but inspiring feat, and it’s worth pondering how other communities might replicate it online.
Forbes contributor and CEI president Fred Smith has a different definition of the “culture war” than most: in a Monday op-ed, he argued that this war is economic at heart—waged between “the forces of economic dynamism and stasis.”
He offers two figures as inspiration for economic dynamism: the economist Deirdre McCloskey and Charles Dickens’s Ebenezer Scrooge, an old miser redeemed through spiritual intervention. According to Smith’s thesis, Scrooge is not a villain; rather, he is a victim of an anti-capitalist society. He references McCloskey’s book Bourgeois Dignity, where she writes that the Industrial Revolution was predicated on a shift in rhetorical values. Capitalism thrived after talk of private property, commerce, and the bourgeoisie became approving and laudatory. Before this point, principles of hierarchy and community undermined the innovating class. Prudence and faithfulness enjoyed higher standing than the virtues necessary for modern capitalism—namely, dignity and liberty:
“I claim here that the modern world was made by a new, faithful dignity accorded to the bourgeois – in assuming his proper place – and by a new, hopeful liberty – in venturing forth. To assume one’s place and to venture, the dignity and the liberty, were new in their rhetorics. And both were necessary.”
Smith argues that Scrooge (and other “penny pinchers” like him) was unfairly vilified because he championed McCloskey’s innovative virtues: “a willingness to break ranks with the cultural tribal norms of their community (courage), confidence that this course was needed (faith), and belief that their actions would eventually be vindicated (hope).” In creating Scrooge, Smith believes Dickens hearkened to a time when “guilds took care of tradesmen, nobles took care of their lands and serfs, and everybody had a station … He seemed not to have understood how the breakthroughs made by the Scrooges of that earlier age had helped transform England from a stagnant feudal society into the industrial powerhouse of his day. Certainly, he found nothing heroic or admirable in such individuals.”
Perhaps it was difficult for Dickens to applaud Scrooge’s capitalist “virtues” when facing the moral depravity and squalor that overwhelmed industrial England at the time. A Christmas Carol is full of characters who are destitute yet generous; Dickens painted the unreformed Scrooge as the antithesis of these poor but happy people.
Wilhelm Röpke’s book A Humane Economy dovetails with the transformed Scrooge. Rather than accepting Smith’s fully liberated individualism, Röpke advocated the principled economics of the “decentrist,” who “thinks in terms of human beings and also knows and respects history.” The decentrist adheres to “established principles; he is swayed more by a hierarchy of norms and values, by reason and sober reflection, than by passions and feelings.” This hierarchy-respecting capitalist is rather divergent from McCloskey’s progressive innovator. But Röpke believed innovation and free market economics have limits:
The market economy is not everything. It must find its place in a higher order of things which is not ruled by supply and demand, free prices, and competition. It must be firmly contained within an all-embracing order of society in which the imperfections and harshness of economic freedom are corrected by law … Man can wholly fulfill his nature only by freely becoming part of a community and having a sense of solidarity with it. Otherwise he leads a miserable existence and he knows it.
Scrooge came to know it. He forsook his icy palace of individualism for a life of community. Dickens and Röpke both seem to suggest that the virtues of dignity and liberty must be tempered and complemented with one more virtue, once called the greatest of them all: love.
“I have done that,” says my memory. “I cannot have done that,” says my pride, and remains inexorable. Eventually—memory yields.
–Friedrich Nietzsche, Beyond Good and Evil
“The Act of Killing” proves Nietzsche was too optimistic. This surreal documentary, which feels more like Variety Hour in Hell, began when filmmaker Joshua Oppenheimer found that it was impossible to get survivors of the brutal 1965-6 anti-Communist campaign in Indonesia to describe their experiences. He settled for what he considered the next best thing: interviews with the perpetrators. And for the reason Jean-Luc Godard gives here, that turned out to be the key to making one of the most eye-opening documentaries I’ve ever seen.
Because the killers were so proud! They boasted–and they didn’t just boast about killing Commies. They boasted about lying, labeling everybody Communist in order to have an excuse to kill them; they boasted about corruption and bare-faced bribery. They boasted about the grubbiest little crimes as well as the atrocities. They call themselves premen, a term which the film translates as “gangster” but which–as they frequently point out–is derived from the English “free man.” It covers everybody from the man who scalps tickets outside the movie theater to the man who slaughters ethnic Chinese.
Oppenheimer asked these men to reenact their killings in whatever way they wished. At first the reenactments are fairly straightforward: This is how I would loop the wire around the guy’s neck, this is where I did it. Then, as Oppenheimer would play their reenactments back for them and ask how they wanted to do it over, the scenes start getting seriously wiggy. There are Western-themed reenactments and noir-themed ones, and a glorious dance scene in front of a waterfall in which a victim thanks his killer for sending him to Heaven. There are interrogation scenes in which the interrogators still have “victim” and “pretty lady” makeup on from previous scenes. It’s a vertiginous experience which makes a lot of points–for example, the gangsters are very up-front about the fact that they are consciously modeling themselves and their techniques after what they saw in movies, the way American mobsters adopted the style of The Godfather–and makes the audience feel like reality itself is up for grabs.
The premen created an identity in which violence, lies, and self-seeking were praiseworthy. “Relax and Rolex!”, as one of them chortles. They use “sadistic” as a neutral-to-positive term. I wondered whether not only fear but also the lack of any corresponding positive identity as a survivor explained the reluctance of their victims to go on the record. The killers are often explicit about their desire to create an internal reality in which their actions were admirable: They work hard to make themselves the heroes, not the villains.
The University of Wisconsin political scientist Andrew Kydd offers an interesting critique of the spread of concealed carry and stand your ground statutes. Departing from a Weberian definition of the state as a monopoly on the legitimate use of force, Kydd suggests that “[t]he United States is now embarked on an unprecedented experiment, in that it is a strong state, fully capable of suppressing private violence, but it is increasingly choosing not to.” Kydd attributes loosening restrictions on the possession and use of guns to a libertarian fantasy that “the absence of the state will lead to a paradise for individuals.” But he follows Hobbes in predicting grimmer consequences: the replacement of violence under law by anarchic clashes between mercenaries, clans, and vigilantes.
I share Kydd’s concern about the decriminalization of gun violence, which looks to me like a risky solution to an exaggerated problem (violent crime has been falling for years). But his thinking about the relation between violence and the state is too Hobbesian to be convincing. For Hobbes, the “state of war” and the juridical state were mutually exclusive; violence was subject either to monopoly control or anarchic diffusion. For Kydd, similarly, the choice is between, say, the modern UK, in which firearms are very tightly regulated, and Afghanistan, where the strong do what they can, and the weak do what they must.
In the history of political thought, however, this is a false alternative. Following Hobbes, Kydd ignores the (small “r”) republican model of organized violence, in which the law is executed by an armed citizen body. The classical republic is not a state in the Weberian sense because it lacks a standing army or regular police force. On the other hand, it is not simply anarchic: citizens who possess the means of coercion cooperate on a relatively informal basis to enforce laws whose authority they all recognize.
The republic, in this sense, has always been more an ideal than a reality. But it is an ideal that has played an important role in the development of American political culture, particularly in connection with guns. For the republican tradition, particularly as transmitted by the Country party in British politics, a well-armed, self-organized citizenry poses less of a threat to safety and liberty than a strong state. That is the reasoning behind the 2nd Amendment.
There are serious and perhaps insurmountable obstacles to the revival of this tradition today. Apart from technological changes since the 18th century, the republican theory of violence presumes a relatively small, mostly agrarian society with a strong conception of public virtue. The contemporary United States, by contrast, is more like a multinational empire: a political form that has historically required much more coercive practices of government. Even so, the republican tradition reminds us that Leviathan is not the only possible source of order. We can acknowledge its necessity while regretting its evil.
An important concept in psychoanalytic theory is castration anxiety, the fear of emasculation. The French theorist Jacques Lacan, one of the titans of 20th-century philosophy, used the imaginary unit i to elucidate this idea:
The erectile organ can be equated with the √-1, the symbol of the signification produced above, of the jouissance [ecstasy] it restores–by the coefficient of its statement–to the function of a missing signifier: (-1).
In Europe, intellectuals such as Lacan, Foucault, and Sartre have traditionally enjoyed a much more prominent place in public life than intellectuals in America. They go on TV shows. Or the cameras come to them and they hold forth, shirtless, in bed. On the Continent, especially in France, philosophers can be celebrities. It’s also true that European philosophers (and those working in “continental” philosophy) are typically more abstruse and obscure than American and British analytic philosophers, who prize clarity of argument.
The Open Culture blog flags the Lacan passage above as part of a fantastic post, which suggests that it’s no coincidence that continental philosophers are both celebrated and ferociously difficult to understand. Instead, the obscurity is part of the reputation. So argues the American philosopher Martha Nussbaum in a critique of Judith Butler, who writes in the French poststructuralist style:
Some precincts of the continental philosophical tradition, though surely not all of them, have an unfortunate tendency to regard the philosopher as a star who fascinates, and frequently by obscurity, rather than as an arguer among equals. When ideas are stated clearly, after all, they may be detached from their author: one can take them away and pursue them on one’s own. When they remain mysterious (indeed, when they are not quite asserted), one remains dependent on the originating authority. The thinker is heeded only for his or her turgid charisma.
Nussbaum and Open Culture are on to something important. Far too many bewildered undergraduates are made to suss out the arguments of thinkers who, in all likelihood, have not made a good faith effort to put forth a coherent argument. If you can’t understand Lacan above, the odds are it’s not your fault: it’s Lacan’s.
Although it may be delightful to see thinkers like Lacan and Slavoj Zizek savaged, it’s also important to note that there is a necessary and proper place for obscurity and difficulty. There’s no rule of reality that says everything should be explicable in simple, precise language. The world is complicated, so our theories will have to be complicated–quantum mechanics comes to mind. But what is important is the good-faith effort to make yourself understandable, to make claims that can be defended or refuted.
Daniel Dennett calls for an appropriate balance between continental showmanship and the austere analytic style in his latest book, Intuition Pumps and Other Tools for Thinking:
There is a time and a place in philosophy for rigorous arguments, with all the premises numbered and the inference rules named, but these do not often need to be paraded in public. We ask our graduate students to prove they can do it in their dissertations, and some never outgrow the habit, unfortunately. And to be fair, the opposite sin of high-flown Continental rhetoric, larded with literary ornament and intimations of profundity, does philosophy no favors either. If I had to choose, I’d take the hard-bitten analytic logic-chopper over the deep purple sage every time. At least you can usually figure out what the logic-chopper is talking about and what would count as being wrong.