As more Americans than ever tuned in to watch the World Cup over the past few weeks, the American media’s quadrennial habit of analyzing soccer’s place in the country raged on. Cranky right-wingers, embodied by Ann Coulter’s now-infamous ramble, put forth common criticisms of soccer: it has an insufficient gender gap, allows scoreless ties, prohibits using hands, is foreign and liberal, prioritizes team effort over individual prowess, and constitutes all-around “moral decay.” In the face of such resistance, soccer fans like Daniel Drezner proposed simply changing the rules of the game to assuage his fellow Americans’ sense of fairness, rather than asking Americans to adapt to the game’s delightful capriciousness like the rest of the world. Meanwhile, Peter Beinart and other commentators on the left celebrated the “soccer coalition” of youth, immigrants, and liberals—the same one that elected President Obama, he recalled—proving that Americanness is not contingent upon the white working-class culture idealized by Coulter. In short, Americans loudly participated in a soccer nation’s rite of passage by reading domestic politics into the sport every chance they could get.
Though the debate largely focused on whether soccer could possibly have a place in accepted American identity, this process of political theorizing and contention mirrors the way soccer has been absorbed into other cultures throughout the sport’s history. Americans who chafe at the sport’s European origins join the long tradition of our southern neighbors who idealized the “creolization” of soccer while forming national identity after the Latin American revolutions of the 19th century. In Argentina, soccer was the manifestation of the “melting pot” where Italian and Spanish immigrants took over British cultural imports, a process crafted in the pages of the magazine El Gráfico. In Brazil, soccer was a place to reconcile racial tensions by highlighting diversity as a source of American ingenuity and creativity, superior to formulaic and homogenous European play. The contemporary American media’s ongoing narratives of soccer are similar not just in their obsessive nature, but in the diverse subcultures they are trying to weld together.
Soccer has always come with class connotations that plague burgeoning sports cultures. The prevailing image of soccer, both in the U.S. now and in Latin America a century ago, is of white urban and suburban elites who use the sport to moralize. Soccer was formalized in British public schools in the 19th century in order to promote Victorian morality and “muscular Christianity”—as well as to simply keep boys busy—but it largely came to the Americas as the pastime of the “gentleman-athletes” among British immigrants to South America. The “amateur era” of early-20th-century soccer parallels the American “soccer mom” values that encourage teamwork and cooperation in children before they move on to more individualist sports as adults, and it is just as widespread, and just as pejoratively viewed, as its predecessor. As American pundits critique this intrusion of foreign collectivist values, they are echoing, among others, the Argentines of the 1920s and 1930s who called for “our own style” (“la nuestra”) to counter and replace British beliefs.
The romantic comedy film is either dying or dead, according to writers at The Atlantic and The Daily Beast. After watching “They Came Together,” a romantic comedy that parodies the genre, the Beast’s Andrew Romano argued that the romcom’s heyday has come to an end, due to shifts in audience targeting and gender preferences, as well as money problems and failed branding.
The Atlantic’s Megan Garber thinks that romcom plots no longer address the “way we live now,” in the age of online dating and delayed marriages. Christopher Orr made a similar argument last year: romcom plots are too outdated for today’s society—we no longer have taboos against premarital sex, nor rigid class divisions, so the romantic conflicts of yesteryear no longer resonate. However, Noah Millman wrote a rebuttal to Orr’s argument, reminding us that the romantic movies of 1940 weren’t popular or good “because there were arranged marriages (there were none) and it isn’t because women couldn’t get a divorce (all the female protagonists of the movies I cited are or get divorced) or couldn’t have sex … they work because they go internal, into character, to find both the conflict and its resolution, and they work because they don’t isolate the world of romantic love from the rest of the social universe.”
The troubles of the modern romcom may have monetary or societal threads, but the genre also has a problem with simplification and homogeneity that we can’t ignore. Most romantic comedies follow either a star-crossed-lovers plot or a “You’ve Got Mail” storyline—the man and woman hate each other, or would never marry each other, but then slowly find out they’re perfect for each other (examples: “When Harry Met Sally,” “How to Lose a Guy in 10 Days,” “Sweet Home Alabama,” “The Switch,” “27 Dresses,” et cetera).
It’s true that both these types are rooted in classics—the star-crossed lovers are classic “Romeo and Juliet,” while the we-hate-each-other-no-wait-we-love-each-other is usually some reincarnation of Pride and Prejudice. But both these classics had greater complexity and depth than most of their modern manifestations. Both told stories of class and family, prejudice and tradition, virtue and vice. Their supporting characters were just as important as their leads—we couldn’t have Pride and Prejudice without Mr. Collins or Mrs. Bennet. Modern films don’t usually give us this rich, colorful tapestry.
As NPR’s Linda Holmes wrote in response to Orr last year, “The best [films] often have other elements, elements of real sadness, like the terrific and underappreciated Hugh Grant-Julia Roberts vehicle Notting Hill, for instance, which touches on not artificial obstacles, but on the way people in difficult circumstances sometimes hurt each other’s feelings and let each other down, not to mention supporting characters struggling with disability and fertility issues.” In contrast, says Holmes, “The [films] that take nothing seriously except dating … rarely work, and they’ve rarely ever worked, because love in life is usually mixed up with all kinds of other nasty stuff.” Millman agrees:
The romantic comedies that suck are the ones that adhere to a formula that none of the great romantic comedies of yore followed. They try to make both protagonists as “relatable” as possible by making them into everymen and everywomen – thereby depriving them of any interest. They focus overwhelmingly on the romance, treating the rest of the universe as so much “business” for low comedy, rather than exploring other themes that might reflect productively on the romance at the center. And they gin up artificial external obstacles instead of persuasive, character-driven internal ones.
Yet these are the films that we keep getting, with increasing regularity. They all tell familiar stories, with familiar conflicts—the plots may change somewhat, but they never surprise us. And romcoms aren’t the only films that suffer from this problem: modern cinema is teeming with stereotypical superhero stories, underdog sports stories, exploding/smashing action films, and their like. We can usually guess exactly how the plot will unfold in the first few minutes of the film.
People increasingly want different, surprising stories—and we’re starting to see some that are new, interesting, and complex. Many explore themes of friendship rather than romance. Disney created an international sensation when it released “Frozen”—and perhaps one of its greatest surprises was that it was mainly about sisterhood, rather than the usual romance. “The Grand Budapest Hotel,” “Saving Mr. Banks,” “The Monuments Men,” “Gravity”: all were primarily stories of friendship, trust, camaraderie, sacrifice. In the realm of television, many people love the BBC’s new “Sherlock” series, and the friendship between Benedict Cumberbatch’s Sherlock and Martin Freeman’s Watson.
We may be tired of films that tell the same old story—but that doesn’t mean we should get rid of the romcom, or the dystopian film, or the action movie. We just need to reconsider the stories we tell, the plots we create, and bring innovation and complexity to these genres once more. We need stories that allow tragedy in their endings, stories with real protagonists and real villains, stories that reflect the complexity and confusion of life. If we get romcoms that reflect these things, then perhaps the genre will be revitalized. But for now, it feels much like a broken record. It isn’t that we’ve run out of stories to tell; we’ve just told the same story too many times.
“Call me Ishmael,” the opening line to one of America’s greatest works of literature, looks very different when rendered in emoji characters.
The Library of Congress accepted data engineer Fred Benenson’s pictorial rewrite of Moby Dick, titled “Emoji Dick,” after Michael Neubert advocated its addition:
“[The book] takes a known classic of literature and converts it to a construct of our modern way of communicating, making possible an investigation of the question, ‘is it still a literary classic when written in a kind of smart phone based pidgin language?’”
Pictorial communication is becoming increasingly widespread as emoji, “the more elaborate cousins of emoticons,” get deployed incessantly across social media. A forthcoming messaging app, Emoji.li, will use emoji characters exclusively. One Tumblr account offers emotional analysis based on emoji use. There is even an art and design show dedicated to the pictorial system.
Hannah Rosenfield took a look at the linguistic possibilities (and impossibilities) of emoji. It has yet to develop syntax or grammar: changes in the placement of emoji within a “sentence” fail to convey any significant change in meaning. It shares many characteristics with pidgin languages, which often arise when two groups without significant linguistic common ground must communicate. “Pidgins typically have a limited vocabulary and lack nuance, a developed syntax, and the ability to convey register,” Rosenfield writes.
But just because pictorial systems are not viable as means of communication in themselves doesn’t mean that they can’t enrich the language—and this goes beyond emoji to other forms of visual media. For example, teachers are using graphic novels to aid reading comprehension in schools, as well as relying more upon digital and visual media to engage young children in their text work.
The key, though, is that these are used to supplement—not replace—traditional language. Camilla Nelson writes that “good transmedia narratives do not merely repeat across media platforms. Rather, each text offers a way to supplement, analyse and evaluate the rest—a bit like pieces of a puzzle that need to be put together through the use of imagination and problem solving.” Indeed, “Emoji Dick” is primarily an exercise in translation: Benenson accompanied the strings of emoji “sentences” with the original text, in order to provide context for the reader and intelligibility to the characters.
Picture languages give us an opportunity to emphasize or complement the language in which we think and speak, be it utilized for the sake of education, to bridge a language gap, or in casual communication. “Emoji, for all its detractors, is about embellishment and added context,” writes Rhodri Marsden for The Independent. “[I]t’s about in-jokes, playfulness, of emphasising praise or cushioning the impact of criticism, of provoking thought and exercising the imagination.”
The idea that pictorial systems could be used to engage language—streamline it, give it further nuance—has been around since before the 1500s. The problem is that visual representation is just that: representation. It refers to something concrete; points, as it were, to something else. Advanced languages derive meaning from context, from the relationships between the words themselves as well as the associations they evoke. Emoji and other visual media are highly contingent, useless without at least some explanation.
Useful? Perhaps, but by no means representative of the eclipse of the written word.
Two recent studies offer up what appear to be contradictory results: one, by the Pew Research Center, indicates that young Americans are less patriotic than their forebears; the other, a recent MTV survey, purports to have found that young Americans far and away surpass past generations’ patriotism.
One explanation for the disparate findings is that the two surveys approach patriotism differently. For MTV’s purposes, patriotism implies zealous adherence to and belief in “American ideals.” For the Pew Research Center, patriotism and exceptionalism are neatly conflated. (For example, one question the Pew survey uses to determine “patriotism” is whether the U.S. “stands above all other countries in the world.”)
Daniel Larison recently looked at the disparity between “old” and “new” exceptionalism. He writes that “believing that the U.S. is exceptional in certain respects because of its political traditions and institutions doesn’t require one to endorse the idea that the U.S. ‘stands above’ all other countries.” Conversely, a commitment—no matter how zealous—to an America of universal platitudes does not mean that you are patriotic.
But it is precisely this zealous commitment to universalist moralism that MTV emphasized in its own survey. As Alyssa Rosenberg of the Washington Post put it:
Rather than trying to boil patriotism down to whether millennials think we are number one, MTV took a look at what young people think constitute American values and how much faith they have that the country can live up to them.
Are you proud of America? Are you inspired by America? The use of emotional, idealistic vocabulary led to paradoxical results. From the MTV press release:
· Nearly 90 percent of young people feel it is “American” to advocate for equality and fairness, yet nearly 7 in 10 believe the country only embraces college-educated, well-off people.
· Over 80 percent of Millennials say America remains the land of opportunity, yet 56 percent also feel the American system has let them down.
· Nearly 7 in 10 Millennials believe “America is the best country in the world,” yet 8 in 10 young people agree that some actions of the American government make it hard to be proud.
As Richard Gamble explained in a 2012 cover article for TAC, contemporary American exceptionalism and its belief that American values are universal values results in an aggressively global perspective; it presumes America “to be a beacon to the world and a liberator on a mission of universal redemption.”
But abstract principle comprises only part of American identity. The Declaration of Independence, while universalist in its rhetoric, was not intended to be an absolute expression of human emancipation.
Charles C.W. Cooke of National Review makes an excellent point when he says that the Founders “sought a restoration of their inheritance, the Continental Congress asserting in 1774 that British subjects in America were ‘entitled to all the rights, liberties, and immunities of free and natural-born subjects, within the realm of England.’”
Cooke writes that, “[w]ith its attestation that all men are created equal, the Declaration of Independence represented a glorious break with all that had gone before.” But the Founders did not seek to march forward into global progress. He continues: “…it is a reflective, rather than a subversive document — one that is predicated upon ancient principle and marinated in a wisdom that had taken centuries to accrue.”
Young Americans are emphatically committed to the principles upon which America was founded, but will sometimes reject the country itself as well as the wisdom and history embodied in its establishment. It is fealty to an idea, not loyalty to a nation, that they profess.
We live in a rapidly urbanizing world. But Brian Chesky, co-founder and CEO of Airbnb, thinks we are also seeing a type of urbanism resurface, one in which trust—and the village—take center stage.
Airbnb’s business model is dependent on principles of trust and friendliness: it enables people to rent out their homes to travelers, thus replacing the more customary and mainstream hotel. At the Aspen Ideas Festival, says Atlantic editor Uri Friedman, Chesky told attendees the Internet is actually moving things back to a local level by enabling people to become “micro-entrepreneurs.” This local economic empowerment then has a seismic impact on urban business and cultural development as a whole:
“At the most macro level, I think we’re going to go back to the village, and cities will become communities again,” he added. “I’m not saying they’re not communities now, but I think that we’ll have this real sensibility and everything will be small. You’re not going to have big chain restaurants. We’re starting to see farmers’ markets, and small restaurants, and food trucks. But pretty soon, restaurants will be in people’s living rooms.”
Not everyone may be comfortable with a model this decentralized—but it is true that online tools like Twitter, Facebook, and mobile apps have changed the way businesses work. Food trucks can tweet their locations to followers, thus building a faithful community as they travel. Hole-in-the-wall restaurants can be found easily via Google maps and Facebook pages. The app I reviewed on Thursday, Huckle & Goose, is another example of the way technology is helping people connect with local entrepreneurs—in this case, local farmers. Companies like Airbnb and Uber take things to another level: they require us to place our faith in the host company and its system of accountability, as well as the entrepreneur whose services we use.
David Brooks affirmed this in his column on Airbnb, “The Evolution of Trust”—he writes that, in today’s world, people “are both hungrier for human contact and more tolerant of easy-come-easy-go fluid relationships.” In this world, apps like Airbnb are perfect catalysts for “a new trust calculus,” a new status quo in which “flexible ad-hoc arrangements” and peer-to-peer commerce are the norms.
But the village mentality that Brooks and Chesky are observing doesn’t necessitate actual geographic villages. To the contrary: these apps and websites are most likely to be used in urban or international places. They help convey the feel of a village amid the rush and clamor of the big city. But perhaps this is where such services are most needed: real villages are, by geographic necessity, connected and close. The city is where we most often feel lost and isolated.
Friedman notes that the rapid urbanization of our world seems to go against the trend Chesky is identifying:
“Chesky sees village-like networks sprouting in cities at a time when urbanization is also going in the polar opposite direction. More than half of the world currently lives in cities, and the United Nations predicts that two-thirds of the global population will be urban-dwellers by 2050. In 2011, there were 23 “megacities” of at least 10 million people around the world. By 2050, there will be 37. It’s possible that as cities balloon to overwhelming sizes, we’re coping by carving out smaller communities. But it’s also possible that the phenomenon Chesky is describing is primarily playing out in Western countries. After all, Asia, where Airbnb has a relatively small presence, will account for most new megacities in the coming decades.”
I think Friedman’s first reason is spot-on, though only time will tell if he’s correct: in the midst of rapid globalization, people seem to be struggling to find a niche, a community. They don’t just want to visit the same chain stores, the same thoroughfares. They don’t want to constantly feel like another face in the crowd. Instead, they’re looking for ways to build community, even as their world becomes more isolated and atomized. Companies like Airbnb seem to provide that.
Some have accused technology of speeding up globalization—of creating a world in which we feel lonely and separated from the little platoons around us. But could it be that, with time, technology will fix the woes it created? Human nature will always yearn for community—Aristotle called us “social animals.” If he was right, then our desire for real closeness with other humans won’t simply go away. Either we’ll abandon the tools that isolate us, or we’ll adapt them to suit our community-craving needs. If Chesky is right, the latter may create the urban community of the future.
According to the Quantified Self movement, data aggregation is the answer to one of mankind’s oldest and most fundamental problems. Its catchphrase, “self knowledge through numbers,” implicitly suggests that objective self-analysis by various means of data collection will allow users to reach their goals, whatever those may be.
Lo! Self knowledge, delivered in an easy-to-read graph and often helpfully accompanied by concrete, attainable steps to self-improvement.
The American public has proven incredibly receptive to the idea. Indeed, the Quantified Self movement is nestled within a much larger industry of self-quantification and description. Josh Bersin for Forbes Magazine points out that we have turned nearly everything into a means to this end:
Not only are we instrumenting our bodies, we are instrumenting everything else. … Facebook itself is an “instrumentation” system – it encourages us to post more and more data about where we are, what we’re doing, and who we are with. And Twitter, which now has around 260 million users, has become a self-description engine.
One man has taken to coding himself a daily itinerary. HTML prompts appear at various intervals, reminding him to get up and stretch, tell his wife he loves her, evaluate his energy level, take 10 minutes to think, etc. Another man used the massive amounts of data he self-collected to lose 100 pounds and launch himself into a successful business career. And then there’s the man who just got drunk.
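The self-scripted itinerary described above is easy to picture in code. This is a minimal, purely illustrative sketch of the idea, not the coder’s actual program; the prompt texts and intervals here are invented for the example.

```python
# Hypothetical sketch of a self-coded daily itinerary: prompts that
# fire whenever their interval (in seconds) divides the elapsed time.
SCHEDULE = [
    (60 * 50, "Get up and stretch."),
    (60 * 90, "Rate your energy level from 1 to 10."),
    (60 * 180, "Take 10 minutes to think."),
]

def next_prompts(elapsed_seconds):
    """Return every prompt due at this moment of the day."""
    return [msg for interval, msg in SCHEDULE
            if elapsed_seconds % interval == 0]
```

A real version would render these as timed HTML prompts, as the man in question did; the scheduling logic itself is the whole trick.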
A look into the ways consumers use the data collection industry gives insight into why they are self-aggregating in the first place: they are telling a story. (Take, for example, the tendency of Facebook and Twitter users to thoroughly chronicle and distribute their existential minutiae.) The “narrative” of the Age of Information is a torrential data stream.
While this transformation seems radical, American society has experienced rapid shifts in the quantity and speed of shared data before. A City Journal article points out that the printing press, telegraph, and telephone each vaulted data distribution to what were then-unprecedented levels. The authors, Mark P. Mills and M. Anthony Mills, make the distinction between two prior stages of the information revolution and the peculiar nature of this latest development:
The first information revolution spawned mass printing and telecommunications; the second saw the ascendance of mass computing and the Internet. … We are now witnessing the emergence of a new type of data derived from every aspect of human interaction and behavior, from commercial exchanges to biological processes.
The distribution of this information seems to require no interpretation; only mediation, passive transmission of fact. It is tempting to lament that the qualitative self—man as storyteller, as interpreter—has been largely supplanted by the quantitative self: a man of data and metadata, of fact and sleek efficiency.
Though our imagination may very well have shifted, our nature has not. We are intrinsically subjective, incapable of complete objectivity. The image of mankind rendered by quantification, though ever more extensive, is by no means comprehensive. Scientific description cannot now and never will alter fundamental facts of human nature. Relentless reduction to objective fact cannot somehow circumscribe mankind.
“Know thyself” is an exhortation as profound and relevant in the Age of Information as it was in the age of Classical Greece: there is more to know, more to be known by, than numbers.
A lot of Americans are beginning to express interest in the idea of eating local, seasonal food. But at the same time, most of us spend our days working (at least) 40 hours a week, cramming social, extracurricular, and athletic events into evenings and weekends. Who has time to cook—let alone go to the farmers’ market, pick out produce, and plan meals?
This was the motivation behind Huckle & Goose, a new cooking app and website for people who want to eat local, but don’t have the time to create meal plans on their own. The app provides weekly curated recipe plans, specific to U.S. region, complete with an automated shopping list. Farmers across the U.S. email Huckle & Goose their harvest schedule, and the plans are then tailored for each region based on what’s available.
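The tailoring step described above amounts to filtering recipes by what is in season for a region, then flattening the result into a shopping list. This is a minimal sketch of that idea, not Huckle & Goose’s actual code; all the data and names below are invented for illustration.

```python
# Illustrative only: regional availability as reported by farmers.
AVAILABLE = {
    "northeast": {"kohlrabi", "beets", "kale"},
    "southeast": {"okra", "tomatoes", "peaches"},
}

RECIPES = [
    {"name": "Kohlrabi slaw", "produce": {"kohlrabi", "kale"}},
    {"name": "Peach salad", "produce": {"peaches"}},
]

def weekly_plan(region):
    """Keep recipes whose produce is fully available in the region."""
    in_season = AVAILABLE[region]
    return [r for r in RECIPES if r["produce"] <= in_season]

def shopping_list(plan):
    """De-duplicated, sorted shopping list for the chosen recipes."""
    return sorted(set().union(*(r["produce"] for r in plan))) if plan else []
```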
The company was started by sisters-in-law Christine Lucaciu and Anca Toderic. Both are Romanian: Toderic was born there, and Lucaciu is a first-generation American citizen. This Romanian heritage gave them a love of local, seasonal food. Toderic remembers canning tomato sauce and going to the market with her grandmother. There was an appreciation and awareness of produce’s seasonality: Toderic and her siblings would each get one orange at Christmastime, as a special treat. You can imagine, then, Toderic’s astonishment when she first walked into an American grocery store and saw mountains of oranges in the produce section. This was “the land of milk, honey, and processed foods,” says Lucaciu. But several years ago, Lucaciu and Toderic encountered the locavore movement—and they adopted the idea wholeheartedly: “Now we don’t have to wait until summer trips to Romania to taste grass-fed meats and vegetables full of flavor,” Lucaciu said.
However, buying local has its challenges—Lucaciu and Toderic found it difficult to plan the meals they had envisioned when they bought their fresh produce. Kohlrabis and beets rotted in the back of their fridge, while they searched for recipes that were feasible to create on a busy schedule. This challenge inspired Huckle & Goose. The name refers to huckleberries and gooseberries—two berries that aren’t grown conventionally, and are only available a few months a year. “Not being able to buy them at the grocery store whenever the mood strikes cultivates a deeply rooted sense of gratitude and patience that’s so rare in our Western food culture,” Lucaciu and Toderic say on their website. In addition to the weekly meal plans, Huckle & Goose offers an archive of searchable recipes and a blog with additional cooking tips and ideas.
I signed up for a trial version of the app, and have been using it over the past couple weeks. Thus far, I’ve been impressed by the ease of the app and the versatility of the recipes. Some are more complicated than others—but Lucaciu and Toderic didn’t want to make the recipes too easy. Cooking, they said, is supposed to take time: it’s a ritual that we can enjoy. Though they offer some easy recipes, they also encourage people to try new and challenging ones.
Lucaciu and Toderic are also conscious of the cost that often accompanies buying local—and for this reason, they offer a lot of recipes with shorter ingredient lists. This is the secret and the joy of buying local: fresh things taste great on their own, so you don’t have to add much.
Facebook has taken a lot of heat in the past few days for toying with users’ emotions—albeit for scientific purposes. In January 2012, 700,000 Facebook users were the subjects of a scientific study, in which data scientists tweaked the website’s news feed algorithm to skew content in either a positive or negative emotional direction. The news feed posts were chosen based on the amount of positive or negative words in each post. Users were then observed, to see whether these emotionally charged news feeds had any effect on their own moods. Sure enough: by the end of the week, study subjects were likely to post more positive or negative words, based on the information they’d been shown throughout the past seven days.
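The mechanism was crude: a post counted as positive or negative if it contained at least one word from a sentiment word list (the study used LIWC word lists), and the altered feed suppressed posts of one valence. A rough sketch of that logic, with tiny placeholder word lists standing in for LIWC:

```python
# Placeholder word lists; the real study used the much larger LIWC sets.
POSITIVE = {"happy", "love", "great"}
NEGATIVE = {"sad", "hate", "awful"}

def classify(post):
    """Label a post by which word list it matches, if any."""
    words = set(post.lower().split())
    has_pos = bool(words & POSITIVE)
    has_neg = bool(words & NEGATIVE)
    if has_pos and not has_neg:
        return "positive"
    if has_neg and not has_pos:
        return "negative"
    return "neutral"

def skew_feed(posts, direction):
    """Drop posts of the opposite valence, as the altered
    News Feed reportedly did for study subjects."""
    opposite = "negative" if direction == "positive" else "positive"
    return [p for p in posts if classify(p) != opposite]
```

Nothing here measures meaning; it counts words. That crudeness is part of what made the reported effect, small as it was, so unsettling.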
The study has been greeted with rampant disapproval and alarm. Writers have questioned the data privacy issues at stake and discussed their fear that Facebook is treating them like “lab rats” in a grand social experiment. Wired called it a “blatant violation of trust.” As Laurie Penny wrote at the New Statesman,
Nobody has ever had this sort of power before. No dictator in their wildest dreams has been able to subtly manipulate the daily emotions of more than a billion humans so effectively. There are no precedents for what Facebook is doing here. Facebook itself is the precedent. What the company does now will influence how the corporate powers of the future understand and monetise human emotion.
But additionally, one must question why the study was allowed in the first place. Adam D. I. Kramer, a Facebook data scientist, said they conducted the study because “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” according to a post on his Facebook page. “At the same time,” he continued, “we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
What did Facebook intend to do (or what have they, perhaps, already done) as a result of this fear? Skew our news feeds in a more positive direction, to shield our fragile egos and comparison-prone selves? Do they intend to shield us from our worst selves by only giving us the information they deem important, worthwhile, positive, happy?
This is not Facebook’s job, and this is not part of providing a “better service,” as Kramer said was their intent (“The goal of all of our research at Facebook is to learn how to provide a better service.”). Doing research to provide “better service” involves shielding users from hackers, bugs, and glitches in the system. It involves creating a platform (not an information-sorting station) for users to interact with friends and family, without having to fear subtle manipulation.
One might think that having a stronger cultural bias toward religion would lead to less discrimination against religious affiliations. Two sociological studies, one of New England and one of the South, indicate that in the workplace, at least, that may not be true.
The first study, the survey of New England, measured employer responses to fictitious resumes that listed various religious affiliations (including a made-up “Wallonian” faith), against a control group that listed no such affiliation. The second study was a replication of the first, performed in the South. Both studies found that including an overt religious affiliation caused a significant drop in follow-up contacts from prospective employers.
The original study was focused on New England because, in the authors’ own words, “New Englanders express the lowest levels of religiosity in the country,” and the “notoriously taciturn New Englanders are not typically prone to flamboyant displays of religious fervor.” The New England study was then replicated in the South, a region known instead for being strongly religious. (The introduction to the survey of the South calls it the “most devout,” in contrast to New England’s label of “least religious.”)
The results? In both studies, Muslims were by far the least likely applicants to receive a follow-up contact, receiving 38 percent fewer e-mails and 41 percent fewer phone calls than the control group in New England. In the South, the numbers were 38 and 54 percent, respectively.
The New England study found that Catholics, pagans, evangelicals, and atheists were also subject to discrimination, if considerably less than Muslims. Each group received roughly 27-29 percent fewer phone calls: close to the effect that religious affiliation itself had on contact returns. (The reaction against these minority groups, save evangelicals, was similar but more pronounced in the South.)
Even cultures that are highly religious—such as the South—will discriminate against expressed religious affiliation in entry-level hirings. From the conclusion of the study conducted in New England:
This suggests that the secularization has developed a normative aspect … prescribing when and where it is “acceptable” to express one’s religiousness. … As such, secularization implies not just independence from religion but normatively enforced separation from it — even to the point of religious discrimination.
The key to overcoming prejudice seems to lie, oddly enough, with Southern Jews. They were even given preferential treatment by employers, despite the fact that their culture is unfamiliar to the regnant strain of Protestantism. From the conclusion of the survey conducted in the South by Wallace, Wright, and Hyde:
While Jews are culturally different from evangelicals in many respects, Southern Jews have deep historical roots in the South and have more successfully assimilated into mainstream culture than Jews in other regions. … In short, Jews thrived in the South, not by brandishing their religious differences but by embracing key aspects of Southern evangelical culture.
American culture has indeed secularized and, as this study shows, become more wary of overt religious affiliation. The separation between public and private freedom of religion is more and more strictly defined. Many Christians chafe at the idea of such restriction. But perhaps, as the success of Southern Jews has demonstrated, the answer is not to “brandish religious differences” but, as Samuel Goldman describes in the latest issue of The American Conservative, to make peace, within limits, with the culture in which we live, for the sake of common harmony.
Hobby Lobby has won. The big box craft store’s lawsuit challenging the contraceptive mandate regulation issued by the Department of Health and Human Services in its implementation of Obamacare had come to be seen as a high-stakes test case determining the future balance between the religious liberty of conservative Christians and the mandates of sexual modernity. When the case was first heard back in March, Patrick Deneen weighed in here at TAC with a decidedly unorthodox argument. To Deneen, Christians relying on a big box retailer to represent them against a secular leviathan signaled “the culminating absurdity of what Polanyi called the ‘utopia’ of our modern economic disembedding”:
The dominant narrative—religious liberty against state-mandated contraception—altogether ignores the economic nature of the case, and the deeper connections between the economy in which Hobby Lobby successfully and eagerly engages and a society that embraces contraception, abortion, sterilization, and, altogether, infertility. Largely ignored is the fact Hobby Lobby is a significant player in a global economy that has separated markets from morality. Even as it is a Christian-themed brand, it operates in a decisively “secular” economic world. …
Hobby Lobby—like every chain store of its kind—participates in an economy that is no longer “religious” or even “moral.” That is, it participates in an economy that arose based on the rejection of the subordination of markets embedded within, and subject to, social and moral structures. This “Great Transformation” was detailed and described with great acuity by Karl Polanyi in his masterful 1944 book of that title. … As he succinctly described this “transformation,” previous economic arrangements in which markets were “embedded” within moral and social structures, practices, and customs were replaced by ones in which markets were liberated from those contexts, and shorn of controlling moral and religious norms and ends. …
How delicious he would doubtless find the irony of a “religious corporation” seeking to push back against the State’s understanding of humans as radically autonomous, individuated, biologically sterile, and even hostile to their offspring. For that “religious corporation” operates in an economic system in which it has been wholly disembedded from a pervasive moral and religious context. Its “religion” is no less individuated and “disembedded” than the conception of the self being advanced by the State. …
I hope Hobby Lobby wins its case. But we should not deceive ourselves for a minute that what we are seeing is the contestation between a religious corporation and a secular State. We are seeing, rather, the culminating absurdity of what Polanyi called the “utopia” of our modern economic disembedding—the absurdity of a chain store representing the voice of religion in the defense of life amid an economy and polity that values turning people and nature into things. Our entire economy is an education in how to be “pro-choice.” What it most certainly is not in any way, shape or form, is about helping us to understand our true condition as embedded human beings.
Scott Galupo responded to distinguish his position as “a counterweight to the despair of both moral traditionalists like Deneen and Dreher and market purists-slash-declinists like Kevin Williamson.” On the Hobby Lobby case in particular, he invoked Yuval Levin and Garry Wills to contrast the Obama administration’s drive with a more appropriately modest accommodation:
Here, Levin calls to mind Garry Wills’s distinction between the progressive-liberal “order of justice” and the “order of convenience.” To sum up a complex essay, Wills believed it should not be the aim of the state to dispense “raw justice” (Chesterton’s phrase), but rather to facilitate convenience (in the John Calhoun sense of the word—to “convene” or “concur” or bring about social peace). …
A proper order of convenience would be able to accommodate Hobby Lobby’s religious objections. On this matter and others, the Obama administration seeks an order of justice. I hope, in this case, that it loses.