We live in a rapidly urbanizing world. But Brian Chesky, co-founder and CEO of Airbnb, thinks we are also seeing a type of urbanism resurface, one in which trust—and the village—take center stage.
Airbnb’s business model is dependent on principles of trust and friendliness: it enables people to rent out their homes to travelers, thus replacing the more customary and mainstream hotel. At the Aspen Ideas Festival, says Atlantic editor Uri Friedman, Chesky told attendees the Internet is actually moving things back to a local level by enabling people to become “micro-entrepreneurs.” This local economic empowerment then has a seismic impact on urban business and cultural development as a whole:
“At the most macro level, I think we’re going to go back to the village, and cities will become communities again,” he added. “I’m not saying they’re not communities now, but I think that we’ll have this real sensibility and everything will be small. You’re not going to have big chain restaurants. We’re starting to see farmers’ markets, and small restaurants, and food trucks. But pretty soon, restaurants will be in people’s living rooms.”
Not everyone may be comfortable with a model this decentralized—but it is true that online tools like Twitter, Facebook, and mobile apps have changed the way businesses work. Food trucks can tweet their locations to followers, building a faithful community as they travel. Hole-in-the-wall restaurants can be found easily via Google Maps and Facebook pages. The app I reviewed on Thursday, Huckle & Goose, is another example of the way technology is helping people connect with local entrepreneurs—in this case, local farmers. Companies like Airbnb and Uber take things to another level: they require us to place our faith in the host company and its system of accountability, as well as in the entrepreneur whose services we use.
David Brooks affirmed this in his column on Airbnb, “The Evolution of Trust”: he writes that, in today’s world, people “are both hungrier for human contact and more tolerant of easy-come-easy-go fluid relationships.” In this world, apps like Airbnb are perfect catalysts for “a new trust calculus,” a new status quo in which “flexible ad-hoc arrangements” and peer-to-peer commerce are the norms.
But the village mentality that Brooks and Chesky are observing doesn’t require actual geographic villages. To the contrary: these apps and websites are most likely to be used in cities and across borders. They help convey the feel of a village amid the rush and clamor of the big city. But perhaps this is where such services are most needed: real villages are, by geographic necessity, connected and close. The city is where we most often feel lost and isolated.
Friedman notes that the rapid urbanization of our world seems to go against the trend Chesky is identifying:
“Chesky sees village-like networks sprouting in cities at a time when urbanization is also going in the polar opposite direction. More than half of the world currently lives in cities, and the United Nations predicts that two-thirds of the global population will be urban-dwellers by 2050. In 2011, there were 23 “megacities” of at least 10 million people around the world. By 2050, there will be 37. It’s possible that as cities balloon to overwhelming sizes, we’re coping by carving out smaller communities. But it’s also possible that the phenomenon Chesky is describing is primarily playing out in Western countries. After all, Asia, where Airbnb has a relatively small presence, will account for most new megacities in the coming decades.”
I think Friedman’s first explanation is spot-on, though only time will tell if he’s correct: in the midst of rapid globalization, people seem to be struggling to find a niche, a community. They don’t just want to visit the same chain stores, the same thoroughfares. They don’t want to constantly feel like another face in the crowd. Instead, they’re looking for ways to build community, even as their world becomes more isolated and atomized. Companies like Airbnb seem to provide that.
Some have accused technology of speeding up globalization—of creating a world in which we feel lonely and separated from the little platoons around us. But could it be that, with time, technology will fix the woes it created? Human nature will always yearn for community—Aristotle called us “social animals.” If he was right, then our desire for real closeness with other humans won’t simply go away. Either we’ll abandon the tools that isolate us, or we’ll adapt them to suit our community-craving needs. If Chesky is right, the latter may create the urban community of the future.
According to the Quantified Self movement, data aggregation is the answer to one of mankind’s oldest and most fundamental problems. Its catchphrase, “self knowledge through numbers,” suggests that objective self-analysis by various means of data collection will allow users to reach their goals, whatever those may be.
Lo! Self-knowledge, delivered in an easy-to-read graph and often helpfully accompanied by concrete, attainable steps to self-improvement.
The American public has proven incredibly receptive to the idea. Indeed, the Quantified Self movement is nestled within a much larger industry of self-quantification and description. Josh Bersin, writing for Forbes, points out that we have turned nearly everything into a means to this end:
Not only are we instrumenting our bodies, we are instrumenting everything else. … Facebook itself is an “instrumentation” system – it encourages us to post more and more data about where we are, what we’re doing, and who we are with. And Twitter, which now has around 260 million users, has become a self-description engine.
One man has taken to coding himself a daily itinerary. HTML prompts appear at various intervals, reminding him to get up and stretch, tell his wife he loves her, evaluate his energy level, take 10 minutes to think, etc. Another man used the massive amounts of data he self-collected to lose 100 pounds and launch himself into a successful business career. And then there’s the man who just got drunk.
A look at how consumers use the data collection industry gives insight into why they are aggregating data about themselves in the first place: they are telling a story. (Take, for example, the tendency of Facebook and Twitter users to thoroughly chronicle and distribute their existential minutiae.) The “narrative” of the Age of Information is a torrential data stream.
While this transformation seems radical, American society has experienced rapid shifts in the quantity and speed of shared data before. A City Journal article points out that the printing press, telegraph, and telephone each vaulted data distribution to then-unprecedented levels. The authors, Mark P. Mills and M. Anthony Mills, distinguish two prior stages of the information revolution from the peculiar nature of this latest development:
The first information revolution spawned mass printing and telecommunications; the second saw the ascendance of mass computing and the Internet. … We are now witnessing the emergence of a new type of data derived from every aspect of human interaction and behavior, from commercial exchanges to biological processes.
The distribution of this information seems to require no interpretation, only mediation: the passive transmission of fact. It is tempting to lament that the qualitative self—man as storyteller, as interpreter—has been largely supplanted by the quantitative self: a man of data and metadata, of fact and sleek efficiency.
Though our imagination may very well have shifted, our nature has not. We are intrinsically subjective, incapable of complete objectivity. The image of mankind rendered by quantification, though ever more extensive, is by no means comprehensive. Scientific description cannot now and never will alter fundamental facts of human nature. Relentless reduction to objective fact cannot somehow circumscribe mankind.
“Know thyself” is an exhortation as profound and relevant in the Age of Information as it was in the age of Classical Greece: there is more to know, more to be known by, than numbers.
A lot of Americans are beginning to express interest in the idea of eating local, seasonal food. But at the same time, most of us spend our days working (at least) 40 hours a week, cramming social, extracurricular, and athletic events into evenings and weekends. Who has time to cook—let alone go to the farmers’ market, pick out produce, and plan meals?
This was the motivation behind Huckle & Goose, a new cooking app and website for people who want to eat local, but don’t have the time to create meal plans on their own. The app provides weekly curated recipe plans, specific to U.S. region, complete with an automated shopping list. Farmers across the U.S. email Huckle & Goose their harvest schedule, and the plans are then tailored for each region based on what’s available.
The company was started by sisters-in-law Christine Lucaciu and Anca Toderic. Both are Romanian: Toderic was born there, and Lucaciu is a first-generation American citizen. This Romanian heritage gave them a love of local, seasonal food. Toderic remembers canning tomato sauce and going to the market with her grandmother. There was an appreciation and awareness of produce’s seasonality: Toderic and her siblings would each get one orange at Christmastime, as a special treat. You can imagine, then, Toderic’s astonishment when first walking into an American grocery store, and seeing mountains of oranges in the produce section. This was “the land of milk, honey, and processed foods,” says Lucaciu. But several years ago, Lucaciu and Toderic encountered the locavore movement—and they adopted the idea wholeheartedly: “Now we don’t have to wait until summer trips to Romania to taste grass-fed meats and vegetables full of flavor,” Lucaciu said.
However, buying local has its challenges—Lucaciu and Toderic found it difficult to plan the meals they had envisioned when they bought their fresh produce. Kohlrabis and beets rotted in the back of their fridge, while they searched for recipes that were feasible to create on a busy schedule. This challenge inspired Huckle & Goose. The name refers to huckleberries and gooseberries—two berries that aren’t grown conventionally, and are only available a few months a year. “Not being able to buy them at the grocery store whenever the mood strikes cultivates a deeply rooted sense of gratitude and patience that’s so rare in our Western food culture,” Lucaciu and Toderic say on their website. In addition to the weekly meal plans, Huckle & Goose offers an archive of searchable recipes and a blog with additional cooking tips and ideas.
I signed up for a trial version of the app and have been using it over the past couple of weeks. Thus far, I’ve been impressed by the app’s ease of use and the versatility of the recipes. Some are more complicated than others—but Lucaciu and Toderic didn’t want to make the recipes too easy. Cooking, they said, is supposed to take time: it’s a ritual we can enjoy. Though they offer some easy recipes, they also encourage people to try new and challenging ones.
Lucaciu and Toderic are also conscientious of the cost that often accompanies buying local—and for this reason, they offer a lot of recipes with shorter ingredient lists. This is the secret and the joy of buying local: fresh things taste great on their own, so you don’t have to add much.
Facebook has taken a lot of heat in the past few days for toying with users’ emotions—albeit for scientific purposes. In January 2012, 700,000 Facebook users were the subjects of a scientific study in which data scientists tweaked the website’s news feed algorithm to skew content in either a positive or negative emotional direction. The news feed posts were chosen based on the number of positive or negative words in each post. Users were then observed to see whether these emotionally charged news feeds had any effect on their own moods. Sure enough: by the end of the week, study subjects were more likely to post positive or negative words themselves, depending on the skew of the content they’d been shown throughout the past seven days.
The study has been greeted with rampant disapproval and alarm. Writers have questioned the data privacy issues at stake and voiced their fear that Facebook is treating them like “lab rats” in a grand social experiment. Wired called it a “blatant violation of trust.” As Laurie Penny wrote at the New Statesman,
Nobody has ever had this sort of power before. No dictator in their wildest dreams has been able to subtly manipulate the daily emotions of more than a billion humans so effectively. There are no precedents for what Facebook is doing here. Facebook itself is the precedent. What the company does now will influence how the corporate powers of the future understand and monetise human emotion.
But one must also question why the study was allowed in the first place. Adam D.I. Kramer, a Facebook data scientist, said they conducted the study because “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” according to a post on his Facebook page. “At the same time,” he continued, “we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
What did Facebook intend to do (or what have they, perhaps, already done) as a result of this fear? Skew our news feeds in a more positive direction, to shield our fragile egos and comparison-prone selves? Do they intend to shield us from our worst selves by only giving us the information they deem important, worthwhile, positive, happy?
This is not Facebook’s job, and this is not part of providing a “better service,” as Kramer said was their intent (“The goal of all of our research at Facebook is to learn how to provide a better service.”). Doing research to provide “better service” involves shielding users from hackers, bugs, and glitches in the system. It involves creating a platform (not an information-sorting station) for users to interact with friends and family, without having to fear subtle manipulation.
One might think that having a stronger cultural bias toward religion would lead to less discrimination against religious affiliations. Two sociological studies, one of New England and one of the South, indicate that in the workplace, at least, that may not be true.
The first study, the survey of New England, measured employer response to fictitious résumés listing various religious affiliations (including a made-up “Wallonian” faith), against a control group with no affiliation listed. The second study was a replication of the first, performed in the South. Both studies found that including an overt religious affiliation caused a significant drop in follow-up contacts from prospective employers.
The original study was focused on New England because, in the authors’ own words, “New Englanders express the lowest levels of religiosity in the country,” and the “notoriously taciturn New Englanders are not typically prone to flamboyant displays of religious fervor.” The New England study was then replicated in the South, a region known instead for being strongly religious. (The introduction to the survey of the South calls it the “most devout,” in contrast to New England’s label of “least religious.”)
The results? In both studies, Muslims were by far the least likely applicants to receive a follow-up contact, receiving 38 percent fewer e-mails and 41 percent fewer phone calls than the control group in New England. In the South, the numbers were 38 and 54 percent, respectively.
The New England study found that Catholics, pagans, evangelicals, and atheists were also subject to discrimination, though considerably less so than Muslims. Each group received roughly 27-29 percent fewer phone calls: close to the average effect that listing any religious affiliation had on contact returns. (The reaction against these minority groups, save evangelicals, was similar but more pronounced in the South.)
Even cultures that are highly religious—such as the South—will discriminate against expressed religious affiliation in entry-level hiring. From the conclusion of the study conducted in New England:
This suggests that the secularization has developed a normative aspect … prescribing when and where it is “acceptable” to express one’s religiousness. … As such, secularization implies not just independence from religion but normatively enforced separation from it — even to the point of religious discrimination.
The key to overcoming prejudice seems to lie, oddly enough, with Southern Jews. They were even given preferential treatment by employers, despite the fact that their culture is unfamiliar to the regnant strain of Protestantism. From the conclusion of the survey conducted in the South by Wallace, Wright, and Hyde:
While Jews are culturally different from evangelicals in many respects, Southern Jews have deep historical roots in the South and have more successfully assimilated into mainstream culture than Jews in other regions. … In short, Jews thrived in the South, not by brandishing their religious differences but by embracing key aspects of Southern evangelical culture.
American culture has indeed secularized and, as this study shows, become more wary of overt religious affiliation. The separation between public and private freedom of religion is more and more strictly defined. Many Christians chafe at the idea of such restriction. But perhaps, as the success of Southern Jews has demonstrated, the answer is not to “brandish religious differences” but, as Samuel Goldman describes in the latest issue of The American Conservative, to make peace, within limits, with the culture in which we live, for the sake of common harmony.
Hobby Lobby has won. The big-box craft store’s lawsuit challenging the contraceptive-mandate regulation issued by the Department of Health and Human Services in its implementation of Obamacare had come to be seen as a high-stakes test case determining the future balance between the religious liberty of conservative Christians and the mandates of sexual modernity. When the case was first heard back in March, Patrick Deneen weighed in here at TAC with a decidedly unorthodox argument. To Deneen, Christians relying on a big-box retailer to represent them against a secular leviathan signaled “the culminating absurdity of what Polanyi called the ‘utopia’ of our modern economic disembedding”:
The dominant narrative—religious liberty against state-mandated contraception—altogether ignores the economic nature of the case, and the deeper connections between the economy in which Hobby Lobby successfully and eagerly engages and a society that embraces contraception, abortion, sterilization, and, altogether, infertility. Largely ignored is the fact Hobby Lobby is a significant player in a global economy that has separated markets from morality. Even as it is a Christian-themed brand, it operates in a decisively “secular” economic world. …
Hobby Lobby—like every chain store of its kind—participates in an economy that is no longer “religious” or even “moral.” That is, it participates in an economy that arose based on the rejection of the subordination of markets embedded within, and subject to, social and moral structures. This “Great Transformation” was detailed and described with great acuity by Karl Polanyi in his masterful 1944 book of that title. … As he succinctly described this “transformation,” previous economic arrangements in which markets were “embedded” within moral and social structures, practices, and customs were replaced by ones in which markets were liberated from those contexts, and shorn of controlling moral and religious norms and ends. …
How delicious he would doubtless find the irony of a “religious corporation” seeking to push back against the State’s understanding of humans as radically autonomous, individuated, biologically sterile, and even hostile to their offspring. For that “religious corporation” operates in an economic system in which it has been wholly disembedded from a pervasive moral and religious context. Its “religion” is no less individuated and “disembedded” than the conception of the self being advanced by the State. …
I hope Hobby Lobby wins its case. But we should not deceive ourselves for a minute that what we are seeing is the contestation between a religious corporation and a secular State. We are seeing, rather, the culminating absurdity of what Polanyi called the “utopia” of our modern economic disembedding—the absurdity of a chain store representing the voice of religion in the defense of life amid an economy and polity that values turning people and nature into things. Our entire economy is an education in how to be “pro-choice.” What it most certainly is not in any way, shape or form, is about helping us to understand our true condition as embedded human beings.
Scott Galupo responded, distinguishing his position as “a counterweight to the despair of both moral traditionalists like Deneen and Dreher and market purists-slash-declinists like Kevin Williamson.” On the Hobby Lobby case in particular, he invoked Yuval Levin and Garry Wills to contrast the Obama administration’s drive with a more appropriately modest accommodation:
Here, Levin calls to mind Garry Wills’s distinction between the progressive-liberal “order of justice” and the “order of convenience.” To sum up a complex essay, Wills believed it should not be the aim of the state to dispense “raw justice” (Chesterton’s phrase), but rather to facilitate convenience (in the John Calhoun sense of the word—to “convene” or “concur” or bring about social peace). …
A proper order of convenience would be able to accommodate Hobby Lobby’s religious objections. On this matter and others, the Obama administration seeks an order of justice. I hope, in this case, that it loses.
Simon Preston noticed that most areas of Britain don’t have a vibrant food culture. Aside from obviously place-tied dishes like Cornish pasties, few dishes carry a distinctive regional trademark. In an article at the Guardian, Preston writes that many Brits have developed a rather globally encompassing attitude toward food:
We’re a population that grazes dishes from across the world and, for the most part, we feel no more connected to a local dish than we do to a curry. When travelling abroad, we’re quite taken with the regional dishes that appear again and again, but closer to home, local food culture is still a fairly new idea, mostly driven by the trend-led efforts of creative chefs and encouraged by food hobbyists.
Eating international cuisine isn’t a problem—but, as Preston points out, there are benefits to having a local food culture, as well. So he asks this interesting question: is it possible to invent a food culture in the 21st century? He decided to try and create one in the rural Aberdeenshire town of Huntly:
I set up a dining table and chairs in the supermarket and used tea and cake to entice shoppers to join me. Chattering families, reminiscing pensioners and bemused workers who had raced in for a ready meal shared their stories: how they came to Huntly, why they stayed, places they had loved and lost, ghost stories and tall tales.
… A huddle of local chefs gathered and soon, my dossier of local anecdotes became dishes. The ancient standing stones in the town square were represented in the positioning of prize-winning local haggis bonbons on a plate. Barley appeared in a risotto, which in turn referenced the Italian connection found in so many Scottish towns. A schoolgirl’s tale of a JK Rowling manuscript locked in the local police station safe inspired a Huntly Mess, made with local raspberries and whisky, and the Deveron river – a place where the town goes to play, to think, to celebrate and to court – brought local trout to one dish and a river bend slick of sauce to another.
The dishes began to catch on as local restaurants and pubs served them. Customers were delighted to see their stories and memories take gastronomic form. The food culture can, it seems, be invented from scratch.
It’s an interesting idea, especially for the many Americans who have lost the culinary cultures of their past amid the burgeoning influence of other cultures and chain restaurants. Excepting certain cities with distinctive gastronomic traditions, like New York City or Philadelphia, many American towns don’t have dishes to call their own. But as Preston points out, it’s never too late to begin examining local ingredients again: our states, counties, and cities offer us a wealth of history, terrain, crops, and animals with which to build a local food culture.
In the Idaho town where I grew up, corn and onion fields had a distinctive presence. Farmers grew a lot of alfalfa and mint, and there were orchards scattered here and there. We got fresh goat’s milk from one farmer and fresh beef from another. My brothers raised chickens. There’s a local coffee roaster in the town beside us. There are a couple of nearby lakes for fishing, and the Salmon River isn’t far off. It’s only a couple hours’ drive into the mountains, if you want fresh huckleberries.
There are also some incredible recipes, handed down over time, jotted on note cards in spidery script, that I would add to my local food culture: my grandmother’s baked beans, my aunt’s “mile-high biscuits,” grandpa’s barbecued chicken, my great-grandmother’s brown bread, and her much-coveted recipe for peach pie.
Many chefs throughout the Northern Virginia and Washington, D.C., area love to use local produce—and they have ample resources to work with. Farms in NOVA and Maryland feature high-quality, fantastic-tasting produce. With these resources, it’s entirely possible to create and curate a local flavor, to showcase the parts of your culture that are distinctively local.
Of course, this isn’t meant to demean the rich international traditions that influence our various cities—in Idaho’s capital, Boise, there’s an entire Basque district, with its own distinctive (and incredible) food culture. Outside D.C., in Annandale, Virginia, there’s a significant Korean immigrant population, and the restaurants there are fantastic. New York City’s immigrants are part of what give the city such incredible food. The point isn’t that imported ingredients and recipes are bad—to the contrary, they help form a vibrant local food culture. Without them, our regions wouldn’t have as much culinary color and vibrancy.
But foods that are chosen from local ingredients also have a distinctive story. Whether invented or preserved, local foods help define, and give flavor, to our places. That’s why Preston invented a food culture, and why I have the beginnings of mine.
What’s your local food culture?
Traditional marriage has experienced a shift in popular culture from “cornerstone” to “capstone” of adult achievement. As a result, many young men and women are delaying marriage; indeed, today’s generation is marrying later than ever before.
For poor women, or those with low levels of education, late marriage has resulted in demographically unprecedented nonmarital birthrates. But for young female professionals, children early in life are often out of the question. And, unlike their counterparts of lower socioeconomic status, they have the means both to willfully prevent conception and to push back the hands of the proverbial biological clock.
The odd thing is that early childbearing outside of marriage is culturally and economically incentivized for those of the lower class, whereas the same mechanisms operate to discourage childbearing at all for women of wealth and high social standing.
Among the poor, childbearing is still seen as a rite of adulthood, a chance to achieve some form of success and personal fulfillment. Olga Khazan quotes a Johns Hopkins sociologist, Andrew Cherlin, in an Atlantic article:
‘Many young women think they will be able to care for the kid—they have a mother who can help, a sister they can rely on,’ Cherlin said. Particularly among the very poorest Americans, ‘this is a way a woman or man can be a successful adult when all other paths are blocked.’
For the wealthy, however, children born outside of wedlock present a significant social and financial burden. Educational and career successes are the milestones used to judge success as an adult—meaning that young female professionals often choose to delay marriage and children.
Because marriage is no longer a moral prerequisite for reproduction, economics and personal preference are left to determine whether or not marriage occurs. As a result, those stricken by financial hardship have all but abandoned marriage. It demands resources they simply do not have: money, time, and long-term commitment.
Affluent women, however, reap disproportionate financial benefits from delaying marriage and child rearing. Eleanor Barkhorn, in an article for the Atlantic, says that women “who marry later make more money per year than women who marry young.” College-educated women who marry after 30 see a 56 percent increase in income relative to those in the same group who marry before age 20.
Thus, we have a boom of single mothers among the lower classes and a scarcity of mothers altogether among our professional and upper classes. The poor are unable to manipulate biology as effectively and end up having children at roughly the same point in their life as they always have, albeit outside the stability of a marriage. But the affluent can afford to cling to the normative marriage-then-children pattern, using money and technology to delay childbearing until they have found a suitable spouse.
The pattern is cruelly self-reinforcing. Young women raised in the broken homes so common among low-income social circles often grow up to perpetuate the same destructive cycle. Young women raised among the elite, on the other hand, feel overwhelming pressure to postpone childbearing for the sake of professional success.
In both instances, marriage has been relegated to secondary status. There is no longer any moral force behind the institution and, as a result, it is discarded or delayed for the sake of financial interests. Families are not a commodity; a household is more than a mere microcosm of our capitalistic society. But until something changes, the rift between the childless elite and spouseless poor will only continue to grow.
In her essay “Can Marriage Be Saved?,” Boston College Scholar in Residence Laura L. Garcia suggests we may be looking at marriage from the wrong perspective:
I conclude that given the values dominating our culture today, the ideal marriage might be that between two homosexuals. There is no possibility that children will enter the picture unexpectedly to create burdens on the couple’s time or money or freedom. Partners are free to leave whenever the relationship no longer suits them, with no repercussions on children and little financial impact. There are likely to be few financial difficulties, in fact, since both partners are likely to be working and in general handle their accounts separately. Sexual desires are gratified without risk of pregnancy. If children are seen as a desirable addition, perhaps they can be adopted or artificially produced—poster babies for Planned Parenthood’s slogan ‘Every child a wanted child.’
Perhaps only someone like her, who views marriage as a sacred union created by God between one man and one woman, can see the essential issues. Well before homosexual marriage became a public policy issue, heterosexual America had already redefined marriage. In the modern dispensation, the purpose of marriage was no longer lifetime mutual support whose goal in love, if not necessarily its actual fruit, was biological children; it had morphed into an alliance of two individuals maximizing their own interests in any way that suited them, dissoluble anytime either party desired. Transitioning from men and women to same-sex partners was a small step once marriage was so redefined.
The facts are hardly in contention. As far as the prospects for traditional marriage go, Richard Reeves of the Brookings Institution recently outlined them in The Atlantic:
It is too late. Attitudes to sex, feminist advances, and labor market economics have dealt fatal blows to the traditional model of marriage. Sex before marriage is the new norm. The average American woman now has a decade of sexual activity before her first marriage at the age of 27. The availability of contraception, abortion, and divorce has permanently altered the relationship between sex and marriage. As Stephanie Coontz, the author of Marriage, A History and The Way We Never Were, puts it, “marriage no longer organizes the transition into regular sexual activity in the way it used to.” Feminism, especially in the form of expanded opportunities for women’s education and work, has made the solo-breadwinning male effectively redundant. Women now make up more than half the workforce. A woman is the main breadwinner in 40% of families. For every three men graduating from college, there are four women. Turning back this half century of feminist advance is impossible (leaving aside the fact that it is deeply undesirable). There is a class gap here, however. Obsolete attitudes towards gender roles are taking longest to evolve among those with the least education.
His solution is to promote parenting as the rationale for marriage, plus stay-at-home dads for the lower classes since most of these breadwinners are women anyway. But why should liberated moderns accept either sacrifice? As the great economic historian Joseph Schumpeter predicted almost a century ago, once
men and women learn the utilitarian lesson and refuse to take for granted the traditional arrangements that their social environment makes for them … and … as soon as they introduce into their private life a sort of inarticulate system of cost accounting – they cannot fail to become aware of the heavy personal sacrifices that family ties and especially parenthood entail under modern conditions and of the fact that…children cease to be economic assets.
No one can complain that moderns have been slow to learn the lesson, with childbearing collapsing in Europe except among mostly Muslim immigrants, and barely holding on at replacement levels for European- and African-Americans; it’s even abating among Hispanic-Americans.
Gay marriage is hardly immune from the same rationalizing process; indeed, it illuminates it.
American Christianity today is largely schismatic and divided. Rod Dreher pinpointed this problem in a recent article, “The Problem of Truth and Triumphalism”: he writes that Christians are fragmented, and often sneering, about their denominational differences and preferences. Rather than fostering unity, this division breeds resentment, anger, and derision amongst Christians. Dreher references an article by Alan Jacobs, in which Jacobs critiques Jody Bottum’s book on post-Protestant America:
As I commented earlier today on Twitter, in the last twenty years I’ve seen theologically-serious Protestants become more and more respectful of and interested in Catholicism — but I have simultaneously seen many serious Catholics withdraw completely into a purely Catholic world, with little interest in other Christian traditions except to critique them…
Dreher says he has experienced this as well: “I remember a professor telling me years ago at a conference that he might have left Protestantism for Catholicism, except for the fact that his Catholic convert friends were so intellectually haughty in their newfound Catholicism that they kept him away from the Roman church,” he writes. “What that man experienced is a constant temptation for intellectual converts to Catholicism.”
But I would argue that this is a temptation for any intellectually serious Christian: the more we pour ourselves into Christian study and thought, the more likely we are to become profoundly convicted and impassioned by a specific vein of study or a denominational tradition. And whether you’re Baptist or Orthodox, you will be a dogged supporter of your church.
This passion, by itself, is good. Love and devotion to the church is part of what creates a healthy and functioning “body,” so to speak—a congregation that works for the good of the whole, and believes in the mission of the whole. But as Rod points out, “the way in which we are faithful to the truth, as we understand it, is almost as important as the truth itself.” He sees the greatest danger in Christian intellectual thought in what he calls “triumphalism”: an attitude of superiority that “blinds us to the faults within ourselves and our tradition. It also blinds us to what is good within other traditions, misguided though they might ultimately be … [it will] likely blind the Other to the truth within our tradition, and may well, in the end, keep the Other from embracing the truth as we know it.”
This reminded me of another article on kindness and the “other,” written by Emily Esfahani Smith for The Atlantic last week. She writes that the greatest destroyer of marriages is contempt, whereas the greatest builder of marriages is kindness:
Contempt, [researchers] have found, is the number one factor that tears couples apart. People who are focused on criticizing their partners miss a whopping 50 percent of positive things their partners are doing and they see negativity when it’s not there. People who give their partner the cold shoulder—deliberately ignoring the partner or responding minimally—damage the relationship by making their partner feel worthless and invisible, as if they’re not there, not valued.
In contrast, she writes, “If you want to have a stable, healthy relationship, exercise kindness early and often.” Smith lists several ways to be more consciously kind, but one of the primary ways is to be “generous about your partner’s intentions … The ability to interpret your partner’s actions and intentions charitably can soften the sharp edge of conflict.”
This simple advice should be applied to more than just a marital relationship.