What’s the problem with Ivy League schools these days? According to William Deresiewicz, their problems are legion. In an article for The New Republic, he cautions parents and students against pursuing an Ivy League institution. The schools are mere machines, he writes, drawing students who are “smart and talented and driven,” yet also “trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.” He explains further:
So extreme are the admission standards now that kids who manage to get into elite colleges have, by definition, never experienced anything but success. The prospect of not being successful terrifies them, disorients them. The cost of falling short, even temporarily, becomes not merely practical, but existential. The result is a violent aversion to risk. You have no margin for error, so you avoid the possibility that you will ever make an error. Once, a student at Pomona told me that she’d love to have a chance to think about the things she’s studying, only she doesn’t have the time. I asked her if she had ever considered not trying to get an A in every class. She looked at me as if I had made an indecent suggestion.
Peter Lawler has recognized this problem in the past, and points to our technification of education as a large part of the problem. “I’ve long believed that the main threat to liberal education—real higher education, in my view—is our tendency to judge the success of academics in technical terms,” he wrote recently at Minding the Campus. “… The art of teaching is becoming a technology defined by skills, competencies, machine-based grading, ‘smart’ classrooms, rubrics, and expert-generated ‘best practices.’” Lawler calls professors back to a mode of teaching in which history, philosophy, and literature are not lost in the quantifiable and scientific: in which professors mind their students’ “souls” and virtues.
In his article, Deresiewicz points to some problems in the admissions process, and offers some suggestions on how to fix it. A lot of his propositions are intriguing:
The education system has to act to mitigate the class system, not reproduce it. Affirmative action should be based on class instead of race, a change that many have been advocating for years. Preferences for legacies and athletes ought to be discarded. SAT scores should be weighted to account for socioeconomic factors. Colleges should put an end to résumé-stuffing by imposing a limit on the number of extracurriculars that kids can list on their applications. They ought to place more value on the kind of service jobs that lower-income students often take in high school and that high achievers almost never do.
… More broadly, they need to rethink their conception of merit. If schools are going to train a better class of leaders than the ones we have today, they’re going to have to ask themselves what kinds of qualities they need to promote. Selecting students by GPA or the number of extracurriculars more often benefits the faithful drudge than the original mind.
Deresiewicz is right: admissions departments are more likely to select students based on their technical abilities and measurable gifts than on their more abstract (yet often more virtuous or innovative) talents. Even the colleges that acknowledge this problem are struggling to change. They are accustomed to using numbers to measure and make all decisions, and their desire for quantifiable control can turn almost humorous. As Eric Hoover wrote for Nautilus, …
As people begin to develop a renewed interest in where their food comes from, many young people and urbanites are seeking out agricultural lifestyles, giving up desk jobs for tractors and field work. But it’s difficult to kickstart a profitable farm, especially as a primary career.
A new initiative in Virginia is striving to help these new farmers—even while encouraging them not to quit their day job. Created through a partnership between Virginia Tech, the Virginia Cooperative Extension’s Loudoun Office, and the Loudoun Department of Economic Development, the new program targets Loudoun County residents who are launching second careers in agriculture. Program coordinator Jim Hilleary explained to the Washington Post:
‘Across the nation, there’s this recognition that there is a new type of farmer emerging, and that is generally a second-career farmer,’ he said. ‘Virginia Tech realized that, and they drafted a curriculum for beginning farmers. And what we’ve done here locally is to take part of that statewide curriculum, localize it and apply it to the residents here in Loudoun County.’
These second-career farmers, says the Post, now “account for the majority of new agricultural business owners in the county.” This model will probably continue to grow in popularity: even as many mid-sized farms are suffering, there is a “growing army,” as the New York Times put it, of small local farms springing up in response to the sprouting market for organic and locavore foods. But many of these aspiring agriculturists don’t know what they’re getting themselves into—and this is where Hilleary’s program steps in:
Rather than delve into the technical elements of farming, the worksheet urges aspiring farmers to think more broadly about what they hope to accomplish and to thoroughly consider what a new agricultural venture will demand of spouses, children and other family members.
“That’s where I’d say that this is distinct from other introduction-to-farming programs,” Hilleary said. “It doesn’t teach you how to be a swine producer; it doesn’t teach you how to raise cattle. . . . Rather, it helps you develop a mind-set for the challenges that are to come. And if people say ‘This is not for us,’ then that’s a success, because we just saved them a lot of time and money.”
Modern farming, bombarded by federal regulations and certification requirements, can be an expensive endeavor—even if you only own a small farm. Aspiring farmers need a program like Hilleary’s to help them grapple with the real costs involved in their chosen vocation.
The article reminded me of a piece I read last year about the newest generation of farmers, and how they’re faring: Narratively published a feature about married couple Dan and Kate Marsiglio, who left their teaching jobs in 2005 to start an organic farm. Though they’ve made great improvements over the years, they’ve also found farming to be more difficult than hoped:
In mainstream food magazines and agricultural journals alike, tales of city kids and hedge fund managers trading suits and ties for overalls have many forecasting a future of yeomanry in America. To be sure, new farmers remain hopeful that moment will come. But they’re also the first to report that in beginning farming, the honeymoon period is brief. It is almost a matter of course that regardless of how mentally and physically prepared a new farmer is for long, sweaty days of toil and winters of debt, farming will deliver more stress and heartache than expected.
Eight years after they launched their farm, the Marsiglios are still barely breaking even, and all thought of retirement remains in the murky unknown. Meanwhile, the gritty everyday work of farming grows more wearing with every year.
It will be interesting to see how these new second-career farmers cope with the difficulties of the modern industry—and how they’re received by more established producers in their area. Hilleary mentioned the “raised eyebrows” that these young farmers can get from veteran family farmers, even while “newcomers might have misconceptions about established, conventional farmers.” Hilleary hopes the initiative will bind both groups together: “We want to help them understand that they are tied together by common goals, and they shouldn’t allow themselves to be in categories like old versus new or organic versus conventional,” he said.
The way Americans farm seems to be evolving: current growth represents a more decentralized mode of agriculture that looks popular and promising. It may be years or even decades before such endeavors turn into full-time work. But through initiatives like Hilleary’s, perhaps we will build a band of farmers who can confront these challenges head-on.
I wanted to get off Facebook—to deactivate my account entirely. It seemed like such a waste of time, a distraction from real-life interactions and relationships. If Facebook no longer pulled at my attention, I thought, perhaps I would be a better friend, and invest in those people who are truly closest to me. I could invest time in the place I live, rather than in a virtual world full of acquaintances and people I barely know.
So I decided to take a break from Facebook to see whether my social relationships would improve or change at all. Before logging off, I let friends know that I’d be away, and gave them my email. It wasn’t really a full “unplugging” experiment, since I use the computer so much for work. But it meant that, in the evenings, I spent much less time online. I occasionally worked on writing projects or wrote emails—but not much else. I wrote long email letters or made phone calls to my closest friends and family members. I continued to use Instagram, but tried to send direct-message pictures to my family, rather than simply using the “public” feature. I marked friends’ birthdays on my personal calendar before deactivating my Facebook account, and tried to email or call them on their birthdays, rather than leaving the prosaic “Happy birthday!” wall post.
Leaving Facebook showed me how much I do, in fact, rely on it to fill moments of pause. When I sat in the car, waited for the metro, or stood in line, social media was the first thing I turned to. Without Facebook, my fingers itched. What else could I browse—Instagram? Twitter? Anything to feel connected. Anything to pass the time. I realized how frenzied and information-obsessed my brain can become, and made an effort to cultivate quiet and to appreciate the moments of stillness.
However, despite these advantages, my month away from Facebook wasn’t a time of great awakening, social revitalization, or spiritual growth. Though it did serve a few good purposes, there were also strong disadvantages to leaving Facebook—primarily, the sense of disconnection from family and friends. The world didn’t pause its social media usage when I did: friends would ask me why I hadn’t responded to messages or event invites, whether I had seen this picture or that link. I realized how much I relied on Facebook to get updates from more distant family members or old friends in my home state; though Instagram provided some information, I hadn’t thought about the fact that relationships, engagements, weddings, and graduations are primarily announced (and commented upon) via Facebook.
I began to evaluate my experiment. The thing I craved most about non-Facebook interactions was their closeness, their intimacy and depth. I was tired of the self-aggrandizing statuses, the public displays of affection between couples (or even friends) that would have been more meaningful, at least in my eyes, if shared privately.
But Facebook also does one thing very well—better, perhaps, than any other social media tool: it enables us to form and cultivate little platoons. And this, I realized, was what I had missed in the last month. Though I was able to invest in individual friendships, my lack of Facebook presence made it harder to host events, or to check up on the groups of people who meant so much in my life. I realized that not everyone checks email with the same rapidity I do—but everyone checks their Facebook notifications. I tried to coordinate a dinner with friends via email a few days ago—and only received one reply over the course of the next 48 hours. Then I created a Facebook event, and invited all the same people. They RSVP’d within 10-15 minutes.
A mother lets her daughter play in the park unaccompanied. A mother leaves her son in the car for a few moments, while she runs into a store to buy headphones. Many parents would consider these actions to be unwise—but are they criminal? According to recent stories in the news, yes.
In the first case, Debra Harrell, a resident of North Augusta, South Carolina, allowed her daughter to play at the park while she worked at a local McDonald’s. She gave her daughter a cell phone. Lenore Skenazy noted in Reason that the park is “so popular that at any given time there are about 40 kids frolicking … there were swings, a ‘splash pad,’ and shade.” But on her second day at the park, an adult asked the girl where her mother was. When the little girl said she was working, the adult called the cops, who declared the girl “abandoned,” and arrested Harrell.
The second story was shared by Kim Brooks in Salon back in June: her four-year-old son insisted on accompanying her to the grocery store for a quick errand, but then refused to go inside the store. After noting that it was a “mild, overcast, 50-degree day,” and that there were several cars nearby, Brooks agreed and quickly ran into the store. Unbeknownst to her, an adult nearby saw her leave her son, and proceeded to record the whole incident on his phone, watched Brooks return and drive away, and then called the police. The police issued a warrant for her arrest.
These are only a few recent stories in which parents have faced arrest after leaving their children unsupervised. As Radley Balko notes at the Washington Post, these incidents seem to signal the “increasing criminalization of just about everything and the use of the criminal justice system to address problems that were once (and better) handled by families, friends, communities and other institutions.”
This latter point hearkens back to Robert Nisbet’s excellent book The Quest for Community: Nisbet predicted that, in a society without strong private associations, the State would take their place—assuming the role of the church, the schoolroom, and the family, asserting a “primacy of claim” upon our children. “It is hard to overlook the fact,” he wrote, “that the State and politics have become suffused by qualities formerly inherent only in the family or the church.” In this world, the term “nanny state” takes on a very literal meaning.
The romantic comedy film is either dying or dead, according to writers at The Atlantic and The Daily Beast. After watching “They Came Together,” a romantic comedy that parodies the genre, the Beast’s Andrew Romano argued that the romcom’s heyday has come to an end, due to shifts in audience targeting and gender preferences, as well as money problems and failed branding.
The Atlantic’s Megan Garber thinks that romcom plots no longer address the “way we live now,” in the age of online dating and delayed marriages. Christopher Orr made a similar argument last year: he said romcom plots are too outdated for today’s society—we no longer have taboos against premarital sex, nor do we have societal class divisions. The romantic conflicts of yesteryear are outdated in today’s society. However, Noah Millman wrote a rebuttal to Orr’s argument, reminding us that the romantic movies of 1940 weren’t popular or good “because there were arranged marriages (there were none) and it isn’t because women couldn’t get a divorce (all the female protagonists of the movies I cited are or get divorced) or couldn’t have sex … they work because they go internal, into character, to find both the conflict and its resolution, and they work because they don’t isolate the world of romantic love from the rest of the social universe.”
The troubles of the modern romcom may have monetary or societal threads, but the genre also has a problem with simplification and homogeneity that we can’t ignore. Most romantic comedies follow either a star-crossed-lovers plot or a “You’ve Got Mail” storyline—the man and woman hate each other, or would never marry each other, but then slowly find out they’re perfect for each other (examples: “When Harry Met Sally,” “How to Lose a Guy in 10 Days,” “Sweet Home Alabama,” “The Switch,” “27 Dresses,” et cetera).
It’s true that both these types are rooted in classics—the star-crossed lovers are classic “Romeo and Juliet,” while the we-hate-each-other-no-wait-we-love-each-other is usually some reincarnation of Pride and Prejudice. But both these classics had greater complexity and depth than most of their modern manifestations. Both told stories of class and family, prejudice and tradition, virtue and vice. Their supporting characters were just as important as their leads—we couldn’t have Pride and Prejudice without Mr. Collins or Mrs. Bennet. Modern films don’t usually give us this rich, colorful tapestry.
As NPR’s Linda Holmes wrote in response to Orr last year, “The best [films] often have other elements, elements of real sadness, like the terrific and underappreciated Hugh Grant-Julia Roberts vehicle Notting Hill, for instance, which touches on not artificial obstacles, but on the way people in difficult circumstances sometimes hurt each other’s feelings and let each other down, not to mention supporting characters struggling with disability and fertility issues.” In contrast, says Holmes, “The [films] that take nothing seriously except dating … rarely work, and they’ve rarely ever worked, because love in life is usually mixed up with all kinds of other nasty stuff.” Millman agrees:
The romantic comedies that suck are the ones that adhere to a formula that none of the great romantic comedies of yore followed. They try to make both protagonists as “relatable” as possible by making them into everymen and everywomen – thereby depriving them of any interest. They focus overwhelmingly on the romance, treating the rest of the universe as so much “business” for low comedy, rather than exploring other themes that might reflect productively on the romance at the center. And they gin up artificial external obstacles instead of persuasive, character-driven internal ones.
Yet these are the films that we keep getting, with increasing regularity. They all tell familiar stories, with familiar conflicts—the plots may change somewhat, but they never surprise us. And romcoms aren’t the only films that suffer from this problem: modern cinema is teeming with stereotypical superhero stories, underdog sports stories, exploding/smashing action films, and their like. We can usually guess exactly how the plot will unfold in the first few minutes of the film.
People increasingly want different, surprising stories—and we’re starting to see some that are new, interesting, and complex. Many explore themes of friendship, rather than romance. Disney created an international sensation when they released “Frozen”—and perhaps one of its greatest surprises was that it was mainly about sisterhood, rather than the usual romance. “The Grand Budapest Hotel,” “Saving Mr. Banks,” “The Monuments Men,” “Gravity”: all were primarily stories of friendship, trust, camaraderie, sacrifice. In the realm of television, many people love BBC’s new “Sherlock” series, and the friendship between Benedict Cumberbatch’s Sherlock and Martin Freeman’s Watson.
We may be tired of films that tell the same old story—but that doesn’t mean we should get rid of the romcom, or the dystopian film, or the action movie. We just need to reconsider the stories we tell, the plots we create, and bring innovation and complexity to these genres once more. We need stories that allow tragedy in their endings, stories with real protagonists and real villains, stories that reflect the complexity and confusion of life. If we get romcoms that reflect these things, then perhaps the genre will be revitalized. But for now, it feels much like a broken record. It isn’t that we’ve run out of stories to tell; we’ve just told the same story too many times.
“Does GPS kill curiosity?” That’s the title of a piece David Sturt and Todd Nordstrom wrote over at Forbes yesterday. Ironically, their article doesn’t really give an answer—their points on geocaching and the criminalization of curiosity are metaphors for a discussion of creative innovation in the workplace. They encourage businesspeople to “go somewhere ‘in-between’,” to “lose your map,” but all in a figurative sense. So despite the options offered to innovators in the workplace, we’re left with this question:
… Has the true explorer died within our culture? Have the pathways, roadmaps, proven strategies and GPS units killed off the spirits of great explorers like Magellan, Lewis and Clark, and Marco Polo?
It is good to consider the effect GPS systems have had on our culture. They have greatly enhanced the ease of travel—the ability to get from point A to point B—but they also make it more difficult to “go somewhere ‘in-between,’” as Sturt and Nordstrom write. We journey, most often, on freeways that are disconnected from the social and architectural fabric of passing communities. We are often too busy noting our estimated time of arrival and upcoming traffic patterns to enjoy passing landscapes. The GPS always promotes the most efficient route for drivers to take—but it doesn’t take note of scenic or historic importance. This is great when you’re in a rush, but potentially damaging for road trips, when we’re meant to see and savor.
GPS-navigated travel can also encourage a sort of mental laziness on the part of the driver. We aren’t forced to fully remember which turns we take, or which roads we’re driving on. We merely follow the GPS’s step-by-step instructions. Contrast this with traveling by maps (even a printed-out Google map): while in the former scenario we’re fed baby bites of navigation, the latter forces us to pay careful attention to every sign that passes, every twist and turn of the road. We recognize landmarks and road signs, and can easily find our way a second time, sans map.
Maps, of the smartphone and printed variety, are incredibly useful tools. I’m not saying we should stop using them. But there are times when, perhaps, we should consider taking Sturt and Nordstrom’s advice more literally—when we should seek out “in-between” places, rather than focusing on “getting from destination to destination.” Some of the best places are “in-between” larger places: small towns nestled up on mountain roads, scenic hikes tucked away from urban bustle, hole-in-the-wall restaurants hiding from more bustling thoroughfares. Our explorations of place should involve a desire for detours, and a willingness to stop.
We should also be willing to put down the maps. As Sturt and Nordstrom put it, “All of us have grown up in a world where our outcomes have been directed, and expected … we are suggesting you let your curiosity be your guide into some unknown territory.” Following maps—whether a smartphone GPS system, or merely our own travel-worn steps in a familiar town—can prevent us from discovering new and beautiful things. Setting down these guides enables us to discover beauty and mystery, both on the road and in our most familiar places.
We live in a rapidly urbanizing world. But Brian Chesky, co-founder and CEO of Airbnb, thinks we are also seeing an older type of urbanism resurface, one in which trust—and the village—take center stage.
Airbnb’s business model is dependent on principles of trust and friendliness: it enables people to rent out their homes to travellers, thus replacing the more customary and mainstream hotel. At the Aspen Ideas Festival, says Atlantic editor Uri Friedman, Chesky told attendees the Internet is actually moving things back to a local level by enabling people to become “micro-entrepreneurs.” This local economic empowerment then has a seismic impact on urban business and cultural development as a whole:
“At the most macro level, I think we’re going to go back to the village, and cities will become communities again,” he added. “I’m not saying they’re not communities now, but I think that we’ll have this real sensibility and everything will be small. You’re not going to have big chain restaurants. We’re starting to see farmers’ markets, and small restaurants, and food trucks. But pretty soon, restaurants will be in people’s living rooms.”
Not everyone may be comfortable with a model this decentralized—but it is true that online tools like Twitter, Facebook, and mobile apps have changed the way businesses work. Food trucks can tweet their locations to followers, thus building a faithful community as they travel. Hole-in-the-wall restaurants can be found easily via Google Maps and Facebook pages. The app I reviewed on Thursday, Huckle & Goose, is another example of the way technology is helping people connect with local entrepreneurs—in this case, local farmers. Companies like Airbnb and Uber take things to another level: they require us to place our faith in the host company and its system of accountability, as well as the entrepreneur whose services we claim.
David Brooks affirmed this in his column on Airbnb, called “The Evolution of Trust”—he writes that, in today’s world, people “are both hungrier for human contact and more tolerant of easy-come-easy-go fluid relationships.” In this world, apps like Airbnb are perfect catalysts for “a new trust calculus,” a new status quo in which “flexible ad-hoc arrangements” and peer-to-peer commerce are the norms.
But the village mentality that Brooks and Chesky are observing doesn’t necessitate actual geographic villages. To the contrary: these apps and websites are most likely to be used in urban or international settings. They help convey the feel of a village amid the rush and clamor of the big city. But perhaps this is where such services are most needed: real villages are geographically, necessarily, connected and close. The city is where we most often feel lost and isolated.
Friedman notes that the rapid urbanization of our world seems to go against the trend Chesky is identifying:
“Chesky sees village-like networks sprouting in cities at a time when urbanization is also going in the polar opposite direction. More than half of the world currently lives in cities, and the United Nations predicts that two-thirds of the global population will be urban-dwellers by 2050. In 2011, there were 23 “megacities” of at least 10 million people around the world. By 2050, there will be 37. It’s possible that as cities balloon to overwhelming sizes, we’re coping by carving out smaller communities. But it’s also possible that the phenomenon Chesky is describing is primarily playing out in Western countries. After all, Asia, where Airbnb has a relatively small presence, will account for most new megacities in the coming decades.”
I think Friedman’s first reason is spot-on, though only time will tell if he’s correct: in the midst of rapid globalization, people seem to be struggling to find a niche, a community. They don’t just want to visit the same chain stores, the same thoroughfares. They don’t want to constantly feel like another face in the crowd. Instead, they’re looking for ways to build community, even as their world becomes more isolated and atomized. Companies like Airbnb seem to provide that.
Some have accused technology of speeding up globalization—of creating a world in which we feel lonely and separated from the little platoons around us. But could it be that, with time, technology will fix the woes it created? Human nature will always yearn for community—Aristotle called us “social animals.” If he was right, then our desire for real closeness with other humans won’t simply go away. Either we’ll abandon the tools that isolate us, or we’ll adapt them to suit our community-craving needs. If Chesky is right, the latter may create the urban community of the future.
A lot of Americans are beginning to express interest in the idea of eating local, seasonal food. But at the same time, most of us spend our days working (at least) 40 hours a week, cramming social, extracurricular, and athletic events into evenings and weekends. Who has time to cook—let alone go to the farmer’s market, pick out produce, and plan meals?
This was the motivation behind Huckle & Goose, a new cooking app and website for people who want to eat local, but don’t have the time to create meal plans on their own. The app provides weekly curated recipe plans, specific to U.S. region, complete with an automated shopping list. Farmers across the U.S. email Huckle & Goose their harvest schedule, and the plans are then tailored for each region based on what’s available.
The company was started by sisters-in-law Christine Lucaciu and Anca Toderic. Both are Romanian: Toderic was born there, and Lucaciu is a first-generation American citizen. This Romanian heritage gave them a love of local, seasonal food. Toderic remembers canning tomato sauce and going to the market with her grandmother. There was an appreciation and awareness of produce’s seasonality: Toderic and her siblings would each get one orange at Christmastime, as a special treat. You can imagine, then, Toderic’s astonishment when first walking into an American grocery store, and seeing mountains of oranges in the produce section. This was “the land of milk, honey, and processed foods,” says Lucaciu. But several years ago, Lucaciu and Toderic encountered the locavore movement—and they adopted the idea wholeheartedly: “Now we don’t have to wait until summer trips to Romania to taste grass-fed meats and vegetables full of flavor,” Lucaciu said.
However, buying local has its challenges—Lucaciu and Toderic found it difficult to plan the meals they had envisioned when they bought their fresh produce. Kohlrabis and beets rotted in the back of their fridge, while they searched for recipes that were feasible to create on a busy schedule. This challenge inspired Huckle & Goose. The name refers to huckleberries and gooseberries—two berries that aren’t grown conventionally, and are only available a few months a year. “Not being able to buy them at the grocery store whenever the mood strikes cultivates a deeply rooted sense of gratitude and patience that’s so rare in our Western food culture,” Lucaciu and Toderic say on their website. In addition to the weekly meal plans, Huckle & Goose offers an archive of searchable recipes and a blog with additional cooking tips and ideas.
I signed up for a trial version of the app, and have been using it over the past couple weeks. Thus far, I’ve been impressed by the ease of the app and the versatility of the recipes. Some are more complicated than others—but Lucaciu and Toderic didn’t want to make the recipes too easy. Cooking, they said, is supposed to take time: it’s a ritual that we can enjoy. Though they offer some easy recipes, they also encourage people to try new and challenging ones.
Lucaciu and Toderic are also conscientious of the cost that often accompanies buying local—and for this reason, they offer a lot of recipes with shorter ingredient lists. This is the secret and the joy of buying local: fresh things taste great on their own, so you don’t have to add much.
Facebook has taken a lot of heat in the past few days for toying with users’ emotions—albeit for scientific purposes. In January 2012, 700,000 Facebook users were the subjects of a scientific study, in which data scientists tweaked the website’s news feed algorithm to skew content in either a positive or negative emotional direction. The news feed posts were chosen based on the number of positive or negative words in each post. Users were then observed, to see whether these emotionally charged news feeds had any effect on their own moods. Sure enough: by the end of the week, study subjects were likely to post more positive or negative words, based on the information they’d been shown throughout the past seven days.
The study has been greeted with rampant disapproval and alarm. Writers have questioned the data privacy issues at stake and discussed their fear that Facebook is treating them like “lab rats” in a grand social experiment. Wired called it a “blatant violation of trust.” As Laurie Penny wrote at the New Statesman,
Nobody has ever had this sort of power before. No dictator in their wildest dreams has been able to subtly manipulate the daily emotions of more than a billion humans so effectively. There are no precedents for what Facebook is doing here. Facebook itself is the precedent. What the company does now will influence how the corporate powers of the future understand and monetise human emotion.
But additionally, one must question why the study was allowed in the first place. Adam D.I. Kramer, a Facebook data scientist, said they conducted the study because “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” according to a post on his Facebook page. “At the same time,” he continued, “we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
What did Facebook intend to do (or what have they, perhaps, already done) as a result of this fear? Skew our news feeds in a more positive direction, to shield our fragile egos and comparison-prone selves? Do they intend to shield us from our worst selves by only giving us the information they deem important, worthwhile, positive, happy?
This is not Facebook’s job, and this is not part of providing a “better service,” as Kramer said was their intent (“The goal of all of our research at Facebook is to learn how to provide a better service.”). Doing research to provide “better service” involves shielding users from hackers, bugs, and glitches in the system. It involves creating a platform (not an information-sorting station) for users to interact with friends and family, without having to fear subtle manipulation. Read More…
Simon Preston noticed that most areas of Britain don’t have a vibrant food culture. Besides obvious place-tied dishes—things like Cornish pasties—few other dishes had a distinctive regional trademark. In an article at the Guardian, Preston writes that many Brits have developed a rather globally encompassing attitude toward food:
We’re a population that grazes dishes from across the world and, for the most part, we feel no more connected to a local dish than we do to a curry. When travelling abroad, we’re quite taken with the regional dishes that appear again and again, but closer to home, local food culture is still a fairly new idea, mostly driven by the trend-led efforts of creative chefs and encouraged by food hobbyists.
Eating international cuisine isn’t a problem—but, as Preston points out, there are benefits to having a local food culture, as well. So he asks this interesting question: is it possible to invent a food culture in the 21st century? He decided to try and create one in the rural Aberdeenshire town of Huntly:
I set up a dining table and chairs in the supermarket and used tea and cake to entice shoppers to join me. Chattering families, reminiscing pensioners and bemused workers who had raced in for a ready meal shared their stories: how they came to Huntly, why they stayed, places they had loved and lost, ghost stories and tall tales.
… A huddle of local chefs gathered and soon, my dossier of local anecdotes became dishes. The ancient standing stones in the town square were represented in the positioning of prize-winning local haggis bonbons on a plate. Barley appeared in a risotto, which in turn referenced the Italian connection found in so many Scottish towns. A schoolgirl’s tale of a JK Rowling manuscript locked in the local police station safe inspired a Huntly Mess, made with local raspberries and whisky, and the Deveron river – a place where the town goes to play, to think, to celebrate and to court – brought local trout to one dish and a river bend slick of sauce to another.
The dishes began to catch on as local restaurants and pubs served them. Customers were delighted to see their stories and memories take gastronomic form. The food culture can, it seems, be invented from scratch.
It’s an interesting idea, especially for the many Americans who have lost the culinary cultures of their past, due to the burgeoning influences of other cultures and food chains in their homelands. Excepting certain cities with distinctive gastronomic traditions, like New York City or Philadelphia, many American towns don’t have dishes to call their own. But as Preston points out, it’s never too late to begin examining local ingredients again: our states, counties, and cities offer us a wealth of history, terrain, crops, and animals with which to build a local food culture.
In the Idaho town where I grew up, corn and onion fields had a distinctive presence. Farmers grew a lot of alfalfa and mint, and there were orchards scattered here and there. We got fresh goat’s milk from one farmer, and fresh beef from another. My brothers raised chickens. There’s a local coffee roaster in the town beside us. There are a couple nearby lakes for fishing, and the Salmon River isn’t far off. It’s only a couple hours’ drive into the mountains, if you want fresh huckleberries.
There are also some incredible recipes, handed down over time, jotted on note cards in spidery script, that I would add to my local food culture: my grandmother’s baked beans, my aunt’s “mile-high biscuits,” grandpa’s barbecued chicken, my great-grandmother’s brown bread, and her much-coveted recipe for peach pie.
Many chefs throughout the Northern Virginia and Washington, D.C. area love to use local produce—and they have ample resources to work with. Farms in NOVA and Maryland feature high-quality, fantastic-tasting produce. With these resources, it’s entirely possible to create and curate a local flavor, to showcase the parts of your culture that are distinctively local.
Of course, this isn’t meant to demean the rich international traditions that influence our various cities—in Idaho’s capital, Boise, there’s an entire Basque district, with its own distinctive (and incredible) food culture. Outside D.C., in Annandale, Virginia, there’s a significant Korean immigrant population, and the restaurants there are fantastic. New York City’s immigrants are part of what give the city such incredible food. The point isn’t that imported ingredients and recipes are bad—to the contrary, they help form a vibrant local food culture. Without them, our regions wouldn’t have as much culinary color and vibrancy.
But foods made from local ingredients also have a distinctive story. Whether invented or preserved, local foods help define our places and give them flavor. That’s why Preston invented a food culture, and why I have the beginnings of mine.
What’s your local food culture?