Christopher Beha’s Arts & Entertainments is built around a classic morality-tale structure: the devil’s bargain, the spiraling consequences, the choice between a good reputation and a good conscience.
Eddie Hartley is an acting teacher of the “those who can’t do” type, whose marriage is being ground down by his wife Susan’s longing for the children they can’t conceive on their own. To pay for another round of IVF treatments, Hartley sells a sex tape from his youthful relationship with an actress who has gone on to become a megastar. He plans to keep his face off the tape and his name out of the press, but that plan turns out as well as literally all of his previous ones.
Hartley becomes a celebrity: what we have instead of Punch and Judy shows, or lives of the saints. He’s part Wile E. Coyote, the battered villain whose pain provokes the audience’s laughter, and part scapegoat, punished and outcast for our sins as much as for his own. By the end of the novel he’s been fired, kicked out by his wife, pelted with eggs, even drained of blood; a shadowy reality-TV king (albeit a king enslaved by his audience) orchestrates a reunion with Susan which is as unsettling as the drugged-up “happy ending” of A Midsummer Night’s Dream.
Facebook has taken a lot of heat in the past few days for toying with users’ emotions—albeit for scientific purposes. In January 2012, nearly 700,000 Facebook users were the subjects of a scientific study in which data scientists tweaked the website’s news feed algorithm to skew content in either a positive or negative emotional direction, sorting posts by the number of positive or negative words each contained. Users were then observed to see whether these emotionally charged news feeds had any effect on their own moods. Sure enough: by the end of the week, study subjects were more likely to post positive or negative words themselves, depending on which feed they had been shown over the previous seven days.
The study has been greeted with rampant disapproval and alarm. Writers have questioned the data privacy issues at stake and discussed their fear that Facebook is treating them like “lab rats” in a grand social experiment. Wired called it a “blatant violation of trust.” As Laurie Penny wrote at the New Statesman,
Nobody has ever had this sort of power before. No dictator in their wildest dreams has been able to subtly manipulate the daily emotions of more than a billion humans so effectively. There are no precedents for what Facebook is doing here. Facebook itself is the precedent. What the company does now will influence how the corporate powers of the future understand and monetise human emotion.
Beyond that, one must ask why the study was allowed in the first place. Adam D. I. Kramer, a Facebook data scientist, said the team conducted the study because “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” according to a post on his Facebook page. “At the same time,” he continued, “we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
What did Facebook intend to do (or what have they, perhaps, already done) as a result of this fear? Skew our news feeds in a more positive direction, to shield our fragile egos and comparison-prone selves? Do they intend to shield us from our worst selves by only giving us the information they deem important, worthwhile, positive, happy?
This is not Facebook’s job, and this is not part of providing a “better service,” as Kramer said was their intent (“The goal of all of our research at Facebook is to learn how to provide a better service.”). Research in the service of a “better service” means shielding users from hackers, bugs, and glitches in the system. It means building a platform (not an information-sorting station) where users can interact with friends and family without having to fear subtle manipulation.
Bowe Bergdahl was “an American prisoner of war captured on the battlefield” who “served the United States with distinction and honor,” asserted Susan Rice, the president’s national security adviser.
Rice was speaking to ABC’s George Stephanopoulos the morning after Barack Obama’s Rose Garden celebration of Bergdahl’s release. When she spoke last Sunday, could Rice have been ignorant of the widespread reports that Bergdahl had deserted?
Before last Sunday, her credibility was already in tatters.
Five days after Ambassador Chris Stevens and three other Americans were killed in Benghazi, Rice went on five Sunday shows to describe the terrorist attack as a spontaneous riot ignited by an anti-Muslim video.
Not only has her credibility now suffered a second near-lethal blow, her competence as a presidential adviser is open to question. How could she let the president strut into the Rose Garden to celebrate the release of a soldier whose reported desertion triggered a province-wide search that may have cost the lives of half a dozen American soldiers?
As The Hill reported, a Pentagon investigation in 2010 concluded Bergdahl had walked out on his unit and left a note in his tent saying he was disillusioned with the Army and no longer supported the war. Was Rice ignorant of this? Did she think it not relevant, when she approved the president’s hosting of Bergdahl’s parents in the Rose Garden? Is Rice not responsible for the humiliation President Obama has endured all week and the fiasco that diverted national and international attention from his trip to Warsaw, Brussels and Normandy?
Forty-eight hours after Obama celebrated Bergdahl’s release, the chairman of the Joint Chiefs was promising an investigation of the soldier on the charge of desertion and related allegations that he may have defected and collaborated. If Gen. Martin Dempsey was aware that an investigation into charges so serious they carry the death penalty lay ahead for Bergdahl, why did he not flag the White House before the president went before the nation to celebrate Bergdahl’s return?
Despite The Atlantic’s elegy for Twitter last week, the real-time social network simply isn’t dead. Far from it—instead of dying a natural death, it may be undergoing a similarly natural process: evolving with the times to maintain its relevance. The half-life of the Internet grows shorter and shorter, forcing would-be social media companies to either adapt or fade into obscurity.
Part of the false perception of Twitter’s demise stems from its near-constant comparison to Facebook. From Wall Street investors to the layman who utters “Twitter” in the same breath as “Facebook” as though the two companies were partners, the two are paired as the twin icons of social media. But, as Will Oremus at Slate correctly points out, that comparison is misleading. Facebook focuses on connecting you with people you know in real life; Twitter helps you share content with strangers. You can’t share photo albums, invite people to events, or instant-message people on Twitter; it’s a largely impersonal space. Twitter is a portal into public discourse, a tool that offers a glimpse into groupthink and a platform for building your own public persona.
Twitter is indicative of a new mode of communication; its real-time format has expanded the reach of conversations from a handful of people offline to dozens or hundreds at a time. You can even be made or broken by a single tweet: the seemingly innocuous tweet by communications professional Justine Sacco cost not only Sacco her job but her company’s shareholders their wealth, at least temporarily. Ellen DeGeneres’s Oscar selfie drew over 3 million favorites and 1.5 million retweets. News anchors refer to tweets on the air to bolster their claims or to get immediate feedback on their shows.
A social media platform that has become an indispensable part of the public discourse will remain alive as long as people value its service. It may have lost its “little platoon” feel, but that is the result of an increase in users, an unfortunate byproduct of expansion. In order to maintain the “good neighbors” aspect of Twitter, users may have to put up some good fences to protect their conversations from the wake of larger-scale shouting matches.
This past weekend, the White House Correspondents’ Dinner had a formidable Twitter presence, and the hashtag #BringBackOurGirls has been gaining momentum, prompting the Nigerian government to take more action to rescue hundreds of kidnapped schoolgirls. These are not the symptoms of a dead or dying organism. They are the signs of a growing, changing, complex system.
New York Congressman and potential 2016 presidential candidate Peter King put points on the soundbite scoreboard yesterday with a fiery appearance on MSNBC’s “Morning Joe.” Host Joe Scarborough was happy to escalate the intra-party war between the hawkish Republican establishment and Sen. Rand Paul.
“Let’s start with Rand Paul,” Scarborough said. “I take it that you would not be comfortable with Rand Paul being the commander-in-chief.”
“His views would be disastrous, Joe. I think he appeals to the lowest common denominator,” King said of the man who has assumed a contentious front-runner status for the Republican presidential nomination.
For advocates of a limitless global American military presence, Paul’s rising popularity is an existential threat to the GOP as we have known it for generations. The suddenness of Paul’s rise has unleashed a flood of conservative-on-conservative invective. Paul’s views have been described as “nakedly unacceptable … unhinged … lunacy” (Bret Stephens in the Wall Street Journal), “fringe … extremism … hooey” (Jennifer Rubin at the Washington Post) and “dewy-eyed foolishness” that is “more appropriate to a dorm-room bull session than the Situation Room” (Rich Lowry at National Review).
And that was just last week.
Last month, former Bush press secretary Ari Fleischer told Time that Paul has “a naiveté that’s going to be a problem.” In the midst of Paul’s 13-hour filibuster protesting President Obama’s drone powers last spring, Sen. John McCain uttered what remains Paul’s most enduring label, calling his younger Senate colleague a “wacko bird.”
Paul’s policy deviations are significant, but the “wacko bird” wasn’t taken seriously until the polls showed that his message was catching on with voters. Now that he represents an unquestionable insurgency within the GOP rank and file, the establishment has finally roused itself into an organized counter-attack. Rubin is leading the media charge (she wrote four Paul hit pieces in the last four days), while anonymous donors huddled around a Time reporter to deliver a mafioso message in Las Vegas—they will bury Rand Paul’s candidacy in its infancy.
Paul is marching on, wrangling over the Reagan legacy even as his foes tar him as a strident leftist for remarks he made in a 2009 video in which he seemed to implicate former Vice President Dick Cheney in wanton war profiteering. On ABC’s “This Week” this weekend, Paul said that Cheney “loves his country.” And while he said he does not question Cheney’s motives, he reiterated that the former veep’s role at Halliburton, a major defense contractor for Operation Iraqi Freedom, created “a chance for a conflict of interest.”
Paul is being careful with this messaging, but only to a point. He is not holding back his belief in what constitutes “realism” in foreign policy, a worldview that he maintains is the more authentic conservative and Reaganesque model. In the weekend interview, he pointed to what he sees as the hypocrisy and recklessness of diplomatic bellicosity. To hawks who claim that the United States will not accept a nuclear Iran under any circumstances, Paul offered a historical comparison.
“We woke up one day and Pakistan had nuclear weapons,” he said. “If that would have been our policy towards Pakistan, we would be at war with Pakistan.”
In an op-ed in Wednesday’s Washington Post, he distilled the point:
“If, after World War II, we had preemptively announced that containment of nuclear powers would never be considered, the United States would have trapped itself into nuclear confrontations with Russia, China, Pakistan, India and North Korea.”
To Peter King, these are the views of “an isolationist wing from the 1930s.” In his bulldog appearance Wednesday morning, King said that Paul’s views on foreign policy, drones, and domestic surveillance are at “a hysterical level” that is “feeding into paranoia.”
This is the same Congressman King who, in a 2004 radio interview with Sean Hannity, said that “80 to 85 percent of mosques in this country are controlled by Islamic fundamentalists.” American Muslims, King said at the time, are “an enemy living amongst us.”
It seems apparent that, of the two men, King has engaged in the greater fear-mongering. While Paul may have entertained some far-fetched hypotheticals about drones wiping out a Starbucks, the Pentagon would survive a Paul presidency. He is far more moderate on defense spending than his father ever was, and skepticism of government overreach is arguably a core American value. One of these men may be prone to paranoid hysteria, but it isn’t Rand Paul.
Michael Ames reports on politics, culture, and land issues. He lives in Brooklyn, New York. Follow him on Twitter @mirkel
Andrew Leonard has a fascinating, if discursive, profile in Salon of the novelist Richard Powers. In it, Powers registers a note of discontent with the proliferation of both pop-musical content—the “insane torrent,” as he puts it in his latest novel Orfeo—and all the digital mechanisms that deliver it today.
“Suppose you were born in 1962 and you are coming into your own and music starts to become essential to you,” says Powers. “You are right on the tail end of that sort of folk rock thing, but you are aware historically of how these guys were revolting in a way against the previous generation. And because of the nature of the distribution mechanisms that you talk about, where it’s two radio stations and one record store, there’s a saturation effect for whatever is in vogue and there has to be a countervailing cultural move just to refresh our ears. And that’s the start of punk.
“So you can see these revolutions and counter-revolutions and you can see a historical motion to popular music and it’s thrilling and you want to know what happens next. The state that you just described of permanent wonderful eclectic ubiquitous interchangeable availability — there’s no sense of historical thrust.”
Leonard sums up the point thus: “Universal ubiquity has blown up the narrative.”
There’s something to this, I think. Granted, I’m pushing 40; I’m not as hip to trends as I was when I was a kid, and certainly not as hip as when I was covering them for a daily newspaper. But it’s undeniably the case that it was once fairly easy for a casual music fan to keep abreast of the latest thing, right up until the early aughts and the passel of “The [monosyllabic plural-noun]” bands playing neo-garage rock. That’s the last time I can remember there being an identifiable “movement” in rock music.
Now there is no one thing.
As a caveat, I will admit it’s somewhat rockist to think along these lines. Just because Pharrell Williams or Bruno Mars or Robin Thicke aren’t trad-rock frontmen, why can’t they be seen as the vanguard of an eclectic dance-pop revival?
And part of the fragmentation that Powers laments isn’t even technologically driven; it’s generational and racial. Much of what we would’ve called rock music 30 years ago now falls under the rubric of country music. It’s Top 40 for white people. It’s blues-based rock tricked out with splashes of fiddle and pedal steel. (Bruce Springsteen astutely noticed this recently: “country music is kind of where rock music has gone, really, at this point … It’s basically kind of pop-rock music … It’s where rock music continues to have a certain currency.”)
So a good chunk of the energy in rock-ish songwriting is now devoted to the Nashville machine, which critics don’t take terribly seriously, and hence they don’t bother to suss out the trends that would fit on the continuum of Powers’s reaction/counter-reaction “historical thrust.”
All that said, I suspect Powers is right. We’ve probably seen the last of swing-turns-to-bop or prog-is-wiped-out-by-punk cultural shifts in popular music. Powers seems to quite openly admit that these shifts had been abetted by self-styled tastemakers, who have seen a drastic decline in influence over the last 15 years. He argues unabashedly that the culture desperately requires those media filters.
I don’t know if I’d go quite that far. Then again, I say that having come of age in a culture that had filters. I say I can do without them. But I was shaped by them. What about my kids’ generation: how will they know what’s good, what’s worth listening to, what (as a Clash fan would have put it) matters?
They’ll be free of a “narrative” imposed from above. Instead they will have the “insane torrent.”
Everything will be at their fingertips.
But perhaps they will miss greatness right under their noses.
It’s that time of year again: the glitz, the glamor, the gowns. Who will go home with a coveted statue, and who will go home empty-handed? This year’s Oscar-nominated films were particularly heartfelt and inspiring (or about as close as Hollywood can manage), and TAC’s culture critic Noah Millman has seen most of them. He can tell you which ones are worth watching—or rewatching:
American Hustle
David O. Russell takes a crack at screwball dramedy, loosely basing his film on the Abscam scandal, with mixed results. Millman writes: “Russell wants it both ways – he wants you to enjoy the Scorsesean roller-coaster even as at every turn he’s showing you that his real pleasure is the tilt-a-whirl. And it turns out you can’t quite have it both ways.”
12 Years a Slave
The film, whose undiluted portrait of slavery had audiences sobbing in the theater, is nominated for Best Picture—and Chiwetel Ejiofor and Michael Fassbender are nominated for Best Actor and Best Supporting Actor, respectively. Lupita Nyong’o is also nominated for Best Supporting Actress. Millman criticized director Steve McQueen for declining to end the film on a positive note: “McQueen doesn’t give us that uplifting twist… McQueen could have shown us a determined Northrup engaged in t[he] pursuit [of his captors], vowing never to rest, and ended his movie on an ‘up’ note. He chose not to.”
Her
The genre-bending romantic drama of a man who falls in love with his operating system is a thought-provoking tale of humans’ dependency on their machines. Millman describes the film as “…a particularly clever Pygmalion story, one that is more attuned to what a modern man might actually want in a fantasy companion, as opposed to a mere sexual fantasy.”
Nebraska
Millman compares Alexander Payne’s newest film to his 2002 work “About Schmidt”, a rambling, dour film about an unhappy old man: “Payne’s new movie, ‘Nebraska,’ has a lot in common with ‘About Schmidt.’ Both are set primarily in Nebraska; both deal with elderly men who feel they have missed life somehow (and associate that missing out with having married June Squibb), and who go on a quixotic road trip in a roundabout way of trying to resolve their existential dilemmas.”
Gravity
Critics have raved about the gorgeous cinematography and complained about the nail-biting twists and turns this film makes. Millman offers praise for the visual component of the film: “Enormous effort has been put into getting the physics right, and that effort pays off magnificently. The film is stunningly beautiful – more than that, it is sublime (to use the Burkean distinction).”
Captain Phillips
The film is based on the true story of a commercial ship hijacked by Somali pirates. Millman praises director Paul Greengrass’s ability to depart from the classic thriller structure to weave a more complex narrative: “The structure he’s chosen, which takes real risks in terms of pacing, allows him to draw that straight line between Captain Phillips’s resourcefulness and the might of the U.S. Navy, while also showing what, and who, lies on the other side of that line.”
Philomena and The Wolf of Wall Street
Rod Dreher doesn’t comprehensively review these two films, but he sheds important light on the religious and moral undertones of both, bringing their messages into stark relief. Noah Millman, in his Oscar post, calls Philomena “a sweet little film, well-written and well-structured.” He gives faint praise to “The Wolf of Wall Street” but argues that too much time is given to its protagonist, who, in Millman’s estimation, “just isn’t a very interesting person.”
Dallas Buyers Club
While TAC did not review this film, the New Yorker’s review is more than apt, appropriately highlighting Matthew McConaughey’s transformation from romantic-comedy beach bum to serious dramatic actor.
What is “mindfulness”? TIME Magazine recently released a cover story on “The Mindful Revolution,” outlining a return to conscious living emphasized by modern enthusiasts. Many in the “mindfulness” crowd want us to step away from the Internet, unplug, and take a break. But Evgeny Morozov put the trend into perspective in his article, “The Mindfulness Racket”:
If it takes an act of unplugging to figure out how to do it [“mindfulness”], let’s disconnect indeed. But let us not do it for the sake of reconnecting on the very same terms as before. We must be mindful of all this mindfulness.
Morozov’s piece considers calls to take a digital “sabbath” or “detox,” to disconnect temporarily “so that we can resume our usual activities with even more vigor upon returning to the land of distraction.” But all this talk of disconnecting and becoming mindful raises the question: What is “mindfulness” for? Why should we be mindful, and how can we cultivate a proper mindfulness?
I would suggest that there are two types of mindfulness: one inwardly focused, the other outwardly focused. When people draw upon the terms “fasting” and “detoxing” for technology disconnections, they bring up two interesting examples of how mindfulness affects our inward and outward health.
In religious fasting, a lack of eating is not focused on dietary benefits. It is meant to be a time of focus on larger, more important things than bodily craving. In fasting, we center on the spiritual, and step away temporarily from the physical. Yet in that stepping away, we learn to appreciate the goodness of food and drink. Fasting brings us to thankfulness. We eventually return to savor the goodness of the things we left.
Detoxes focus on inward cleansing, looking at the incongruities and junk in one’s own system. A “digital detox” would require that, in stepping away from digital tools, one also look inward at personal cravings and inclinations. How is digital media shaping the way we live? Is it fostering unhealthy habits within us? Have we allowed it to build calluses of anger, discrimination, or fear in our souls? Studies have reported that Facebook fosters depression in some of its users. We must examine the online habits, desires, and emotions we’ve inculcated. A detox may also help us consider what we haven’t dwelt on or considered enough—a person, situation, or decision we’ve neglected in the technological mayhem.
But a digital “fast” would be a time of mindfulness toward religion, family, friends, and society. One could consider relationship priorities outside the constraints of Facebook and Twitter, even the telephone and Skype. Such a fast would prompt us to analyze the ways such tools shape societal and work interactions, and consider how they benefit (or detract from) the spiritual, soul-fostering work of relationship. This fast would, one hopes, enable the abstainer to return to his or her computer with a renewed sense of purpose and limits.
A proper understanding of human nature is vital in our consumption. We must keep our own inclinations in mind. Pride is a natural susceptibility of the human soul. In every tweet and Facebook status, our own pride will venture to the surface. We must look carefully at our motivations, and consider what sort of online (and inward) presence we’re cultivating. This is the sort of consideration that might go into a digital “detox.”
Stepping away from technology will not make us more or less mindful—just as a day of detoxing or fasting will not automatically change our perception of food. In all things, we want to cultivate Aristotle’s virtuous mean, searching for that place between excess and defect where excellence dwells. Mindfulness does not necessitate pure abstention from iPhones, Twitter, and the like—it is not about neglecting certain platforms. Rather, it defines how we use those platforms. The reason for a “fast,” “detox,” or what-have-you is to help us see the big picture, to give us greater purpose, understanding, and discretion.
American viewers expressed outrage Sunday after NBC reporter Christin Cooper interviewed U.S. skier Bode Miller, driving him to tears with questions about his deceased brother. Miller tied for a bronze medal in Sunday’s competition and mentioned his brother in connection with the win, but Cooper’s subsequent prodding on the subject was too much.
Twitter and other social media exploded indignantly, accusing Cooper of malicious indifference. But Miller defended the reporter Monday: “I’ve known Christin for a long time and she’s a sweetheart of a person,” he said on NBC’s Today Show. “I know she didn’t mean to push. I don’t think she really anticipated what my reaction was going to be. I think by the time she sort of realized, then I think it was too late.”
It’s kind of Miller, and it sheds a compassionate light on the incident. Nonetheless, the episode should prompt journalists (and their audiences) to consider the consequences of our sensationalistic reporting.
This sort of journalism is not abnormal. The media’s normal attitude toward celebrities is often just as uncouth and intrusive. Our reality shows regularly exploit viewers’ emotions by drawing on any and every tragedy experienced by their contestants. We’ve become a culture that feeds on sensationalism and misfortune: we love the underdog, the tragic hero. This often becomes a form of emotional exploitation, targeting the audience, the protagonist, or the people in the protagonist’s life. As long as a semblance of fiction or separation remains, we feel okay with this exploitation. But when a man breaks into tears, and the camera hovers in front of his bowed form, we begin to feel guilty. Yes, we—for it is our fault too, not just Christin Cooper’s. The media gives us what we want. It’s a simple case of supply and demand: Americans usually eat up the media’s shameless emotional pandering.
It’s also important to note that Cooper was not the only one who seemed to rub pain in Miller’s face. The cameraman zoomed in on his face and held the camera close to his grief. He wouldn’t go away. It was an inexcusable breach of propriety. Had the cameraman turned away when Miller began crying, perhaps the incident wouldn’t have been as painful. Nor is this the only time such shameless recording has taken place in Sochi: other contestants who fell short, when pushed to tears, have immediately been dogged by cameras whose operators want to see the tears fall, to capture the moment of grief for viewers at home.
When Cooper and her cameraman zoomed in on Miller’s grief, they detracted from the true story of the moment: his victory, hard work, and resilience. That should have been the focus of the interview. They ought to have adopted a more professional manner. Indeed, beyond professionalism, mere courtesy would suggest that a person, in moments of grief and pain, should be granted greater privacy.
But American audiences should also consider the role they play in shaping the news: until they demand otherwise, this sensationalism will not stop.
What happens when an artist finds himself above reproach? Woody Allen, it would seem, finds himself in this rather fortunate position, even as new allegations about his conduct toward his adopted daughter Dylan Farrow emerge. In a New York Times piece published Saturday, she described the sexual abuse she says Allen inflicted on her as a seven-year-old girl. She says the abuse has haunted her throughout her life: “Each time I saw my abuser’s face—on a poster, on a t-shirt, on television—I could only hide my panic until I found a place to be alone and fall apart.”
If any politician or pundit faced these accusations, one would hope he or she would lose the position, or at least be subjected to careful scrutiny and investigation. Yet in Allen’s case, many are willing to shrug off the immensity of the alleged crime because of his great artistic talent. Rod Dreher wrote in a blog post a few days ago, “I completely agree that Allen is a pig—if there were no Dylan Farrow accusations at all, his unrepented-of conduct with Soon-Yi Previn was enough to establish his swinishness—but that does not detract from his accomplishments as a filmmaker.”
Perhaps these allegations do not diminish Allen’s accomplishments. But even so, one must carefully consider the ethics of this position. In 1992, Maureen Orth gathered substantive accounts supporting claims of inappropriate behavior by Allen toward Dylan Farrow when she was little. Allen has denied all the accusations and has never been held culpable. Does one continue to watch his movies, to view and enjoy his art? Andrew Sullivan thinks so, though he seems to believe Farrow is telling the truth:
Perhaps with less essential talents, the sins may more adequately define the artist. But that, in many ways, only makes the injustice worse. Those with the greatest gifts can get away with the greatest crimes.
We can and should rail against this, while surely also being realistically resigned to it. It struck me, for example, as rather apposite that as the blogosphere is debating whether to boycott Woody Allen’s films in the future because of this horrifying story, exponentially more people are tuning into the Super Bowl to watch a game we now know will render many of its players mentally incapacitated in their middle ages and beyond. We know that this spectacle is based on the premise of brain damage for many of its participants, but we watch anyway.
This argument strikes me as flawed and alarming for a couple of simple reasons. First, the Super Bowl comparison does not hold, because tackle football’s dangers and horrors are all known to the athletes involved, and they engage in the sport voluntarily. If Peyton Manning’s concern over his future health and wellbeing ever trumped his desire to play football and make money, he could retire in a heartbeat. He plays football on his own prerogative, and he bears the consequences of his own actions. How do the choices of a 37-year-old man with power, fame, and athletic prowess compare to the involuntary and frightened oppression of a seven-year-old girl?