I just banned a commenter from posting here after he tried to post a comment that personally insulted me. As you longtime readers know, I try to have a liberal attitude towards comments. I’m pleased to post comments that are quite critical of me or anybody else who posts here, but I draw the line at personal insults and invective. Because I’m the only person who moderates comments, I sometimes slip up and let things through that shouldn’t be there. If you see this, please let me know via e-mail (rod — at — amconmag — dot — com), or otherwise, and I’ll be happy to review the comment and take it down if warranted.

I also reserve the right to refuse to post any comment, for any reason at all. For example, I post comments critical of Israel, but when I judge a comment crosses the line into anti-Semitism, I don’t post it. This is a difficult call to make when readers criticize Jews as Jews. Again, I try to be as liberal-minded as I can, and allow wide discussion, but there is a line. Similarly, I want there to be a robust discussion about race and race relations, but when I get the sense that a particular comment on race is being made from a place of bad faith, or simply that it would make useful discussion harder, I won’t post it.

For the record, I usually don’t post comments that are nothing more than long blocks of text copied from another source. And there are times when I don’t post a comment because it’s so dopey I don’t want it junking up the comments section.

Please do not assume that because I post somebody’s comment, that indicates that I agree with them. You should also not assume that because your comment did not appear, it’s because I sent it to the spam bucket. Alas for us all, the WordPress software has a quirky habit of marking some perfectly fine comments by longtime users as … spam. Your jaw would drop if you saw how much actual spam the filter nets from bots. No kidding, we must get 300 or more spam comments generated by bots for every one comment from a real person the filter nabs. Sometimes those real-person comments are just fine, and I can’t find any reason why they would have gone into the spam bucket. For example, to my recollection, I have never deep-sixed anything by the reader Turmarion, even though he can be a sharp critic of mine. Yet the spam filter seems to grab an unusual number of his comments. I can’t explain it. He’ll write me privately, and I’ll go digging through spam, find his comment, and free it. I feel bad for him, but if I could make this not happen, I would.

Y’all don’t see the things that I don’t post, so you (understandably) don’t know how hard I work to try to keep this forum interesting, readable, and a place where all kinds of people can find something worth reading. That requires pruning. I like it when I have time to add an NFR (Note From Rod), but there are times when I’d love to do that, but I’m too busy. There is no rhyme or reason to why I add NFRs. All it means is that I had the time to do it right then.

Anyway, I bring this all up after having just read Scott Alexander’s long explanation for why he stopped the Culture War thread, a subreddit affiliated with his popular Slate Star Codex blog. Alexander (a pseudonym) is a psychiatrist who is generally liberal in his politics and cultural outlook, but whose blog attracts all kinds of readers because it’s really smart and interesting. He had a far, far more difficult job moderating his comments section (and he had volunteer helpers!) than I do, but his explanation for why you have to moderate them is quite good, and I urge you to read it.

In this part, he talks about why a blogger really has to be careful about what he allows into his comments section. It can destroy the blog and the brand of the publication that hosts the blog:

The thing about an online comment section is that the guy who really likes pedophilia is going to start posting on every thread about sexual minorities “I’m glad those sexual minorities have their rights! Now it’s time to start arguing for pedophile rights!” followed by a ten thousand word manifesto. This person won’t use any racial slurs, won’t be a bot, and can probably reach the same standards of politeness and reasonable-soundingness as anyone else. Any fair moderation policy won’t provide the moderator with any excuse to delete him. But it will be very embarrassing for the New York Times to have anybody who visits their website see pro-pedophilia manifestos a bunch of the time.

“So they should deal with it! That’s the bargain they made when deciding to host the national conversation!”

No, you don’t understand. It’s not just the predictable and natural reputational consequences of having some embarrassing material in a branded space. It’s enemy action.

Every Twitter influencer who wants to profit off of outrage culture is going to be posting 24-7 about how the New York Times endorses pedophilia. Breitbart or some other group that doesn’t like the Times for some reason will publish article after article on New York Times‘ secret pro-pedophile agenda. Allowing any aspect of your brand to come anywhere near something unpopular and taboo is like a giant Christmas present for people who hate you, people who hate everybody and will take whatever targets of opportunity present themselves, and a thousand self-appointed moral crusaders and protectors of the public virtue. It doesn’t matter if taboo material makes up 1% of your comment section; it will inevitably make up 100% of what people hear about your comment section and then of what people think is in your comment section. Finally, it will make up 100% of what people associate with you and your brand. The Chinese Robber Fallacy is a harsh master; all you need is a tiny number of cringeworthy comments, and your political enemies, power-hungry opportunists, and 4channers just in it for the lulz can convince everyone that your entire brand is about being pro-pedophile, catering to the pedophilia demographic, and providing a platform for pedophile supporters. And if you ban the pedophiles, they’ll do the same thing for the next-most-offensive opinion in your comments, and then the next-most-offensive, until you’ve censored everything except “Our benevolent leadership really is doing a great job today, aren’t they?” and the comment section becomes a mockery of its original goal.

If you think that I have been too quick to spam-bucket your comment on “race realism,” Jewish influence, and suchlike, I want you to understand that there’s a reason for that, and Alexander explains why. On the other hand, if you think that I allow too much material that comes right up to the line, understand that there’s a reason for that too (that is, I don’t want to run a comments section that turns into an Our Benevolent Leadership forum). This is not easy work. I tell you that not because I want you to give me a cookie, but because I want you to know that I take seriously my job curating these comments, and I want to get better at it. But I know that I can’t please everybody, and often can’t even please myself.

Anyway, though Alexander is a secular liberal, he listens to and takes seriously people who don’t agree with him. In the post I’m quoting from here, he shows a pie chart of a survey he took of SSC subreddit readers. Most of them are left-wing, but not a big majority. This is entirely understandable, given how intelligent and interesting Alexander’s comments are. But, he says, eventually the most contentious categories (“Trump, gender, race, the communist menace, the fascist menace, etc”) began taking over the comments sections.

And then the trolls got involved. Alexander writes:

People settled on a narrative. The Culture War thread was made up entirely of homophobic transphobic alt-right neo-Nazis. I freely admit there were people who were against homosexuality in the thread (according to my survey, 13%), people who opposed using trans people’s preferred pronouns (according to my survey, 9%), people who identified as alt-right (7%), and a single person who identified as a neo-Nazi (who as far as I know never posted about it). Less outrageous ideas were proportionally more popular: people who were mostly feminists but thought there were differences between male and female brains, people who supported the fight against racial discrimination but thought there could be genetic differences between races. All these people definitely existed, some of them in droves. All of them had the right to speak; sometimes I sympathized with some of their points. If this had been the complaint, I would have admitted to it right away. If the New York Times can’t avoid attracting these people to its comment section, no way r/ssc is going to manage it.

But instead it was always that the thread was “dominated by” or “only had” or “was an echo chamber for” homophobic transphobic alt-right neo-Nazis, which always grew into the claim that the subreddit was dominated by homophobic etc neo-Nazis, which always grew into the claim that the SSC community was dominated by homophobic etc neo-Nazis, which always grew into the claim that I personally was the most homophobic etc neo-Nazi of them all. I am a pro-gay Jew who has dated trans people and votes pretty much straight Democrat. I lost distant family in the Holocaust. You can imagine how much fun this was for me.

People would message me on Twitter to shame me for my Nazism. People who linked my blog on social media would get replies from people “educating” them that they were supporting Nazism, or asking them to justify why they thought it was appropriate to share Nazi sites. I wrote a silly blog post about mathematics and corn-eating. It reached the front page of a math subreddit and got a lot of upvotes. Somebody found it, asked if people knew that the blog post about corn was from a pro-alt-right neo-Nazi site that tolerated racists and sexists. There was a big argument in the comments about whether it should ever be acceptable to link to or read my website. Any further conversation about math and corn was abandoned. This kept happening, to the point where I wouldn’t even read Reddit discussions of my work anymore. The New York Times already has a reputation, but for some people this was all they’d heard about me.

Some people started an article about me on a left-wing wiki that listed the most offensive things I have ever said, and the most offensive things that have ever been said by anyone on the SSC subreddit and CW thread over its three years of activity, all presented in the most damning context possible; it started steadily rising in the Google search results for my name. A subreddit devoted to insulting and mocking me personally and Culture War thread participants in general got started; it now has over 2,000 readers. People started threatening to use my bad reputation to discredit the communities I was in and the causes I cared about most.

Some people found my real name and started posting it on Twitter. Some people made entire accounts devoted to doxxing me in Twitter discussions whenever an opportunity came up. A few people just messaged me letting me know they knew my real name and reminding me that they could do this if they wanted to.

Some people started messaging my real-life friends, telling them to stop being friends with me because I supported racists and sexists and Nazis. Somebody posted a monetary reward for information that could be used to discredit me.

One person called the clinic where I worked, pretended to be a patient, and tried to get me fired.

And then Scott Alexander, psychiatrist, had a nervous breakdown. I’m not kidding. Read his post to learn more about it.

Please understand that I don’t face anything like the pressures Alexander did, because he’s a much bigger deal than I am. Still, I’m on that spectrum. When I meet readers of this blog in person, they usually say, “How do you deal with all the nasty stuff that comes at you in the comments section?” I tell them truthfully that I don’t notice it. I’ve been an opinion journalist for almost all of the last 30 years, and I have developed dinosaur hide. If you see it posted here, rest assured that it does not bother me. Mostly I laugh at it. And I confess to taking malicious pleasure in sending to oblivion the comments, and the ability to comment, of anonymous keyboard would-be badasses who think they’re standing up to The Man by insulting me in comments that will never see the light of day. They think they’re being all Jack Nicholson (“You can’t handle the truth!”), but I see them as more along the lines of Elmer Fudd, and you see them … not at all.

You’re welcome.

On the up side, I hear from people all the time who praise this comments section, and who say it’s the only one they read. I’m talking about national print and broadcast journalists. One even asked me to put him in touch with commenter Matt in VA, so he could invite Matt to be on TV (Matt had to decline, to protect his job, which depends on his anonymity). Point is, you who work hard to play by the rules and offer good comments should take heart that lots of people read them. This blog gets well over a million unique readers each month; only the tiniest fraction of those readers ever post here … but a lot of those readers do read you regular commenters too. I want to thank you for what you do here, and for helping to make this blog so popular. It’s your blog too.

I love how big you’ve made this blog by your loyalty and your good comments over the years, but when the day comes when I have to spend more time moderating comments than I do writing them, it’s either going to be time to get an intern to help out, or I’m going to have to shut down the comments section, for the same reasons that led Scott Alexander to cut off his connection with the Culture War subreddit. Happily, I don’t see that day coming soon. But note this passage from Alexander’s long post about the effect trolls had on him:

Second, I wanted there to be at least one of these “here’s why we’re removing your ability to comment” articles that was honest, not made of corporate-speak, and less patronizing than “we’re removing the comment section because we value your speech so much and want to promote great conversations”. Hopefully this will be the skeleton key that helps you understand what all those other articles would have said if they weren’t run through fifty layers of PR teams. I would like to give people another perspective on events like Tumblr banning female-presenting nipples or Patreon dropping right-wing YouTubers or Twitter constantly introducing new algorithms that misfire and ban random groups of people. These companies aren’t inherently censorious. They’re just afraid. Everyone is afraid.

That made me rethink my own outrage over Facebook’s having banned for several hours my TAC blogpost about Jussie Smollett’s hate hoax. Yes, it was wrong of them to ban it (as Facebook itself recognized by re-instating the ability of users to share the post a few hours later), but whatever algorithm snared the post was one no doubt designed to stop authentically bad people from sharing evil material. What happened to me was probably about the same thing as this blog’s spam filter capturing a legitimate comment because some word in it triggered the algorithm.

Still, I’m glad the whole thing spurred me to at long last delete Facebook. I will miss the opportunity to reconnect with friends I haven’t been in touch with for ages, but mostly, it wasn’t worth it. If I didn’t need Twitter to do my job, I’d be off of it too. Social media is not good for us. A (secular liberal middle-aged) friend e-mailed this morning to say:

Here’s a serious question/hypothesis absolutely impervious to any kind of scientific-method investigation: How much do digital native kids form their entire identity—the reflexive condescension, the automatic politicization, the ahistorical two-dimensional opinions, the insolent sanctimony, etc.—on social media? From the first, embryonic stages of self-awareness, they have to forge a presentational self that is both distinctive and conformist simultaneous—or even prior—to the organic, fumbling process of constructing an identity IRL, interacting with real people, parents, adults, peers. The result is that these kids are walking, talking Twitter avatars. They are uninterested in listening, closed to temperate persuasion, because they are looking for followers and likes, even in person.

What a great observation. Most everybody who comments in my comments section uses a handle, and that’s fine with me. Given how dangerous it can be to state your actual opinion in public, even if it’s an anodyne one, I don’t blame people for wanting to hide behind fake names. Still, I wonder how our dialogues here would go if people had to say the things they say here face to face. I know for a fact that I would state things differently, or not at all, because when I’m actually with people, I don’t want to do much more than have a drink and a laugh. Partly because of the way I was raised (Southerners are taught to do whatever is necessary to put your guests at ease), and partly because personal conflict exhausts me, I’m a different person in real life than I am on this blog. I say things here that I wouldn’t say to people’s faces, and I’ve gotten much more like that, in part because of having to deal with all the extreme nastiness over the years (i.e., I don’t want to talk about controversial things in conversation with strangers, because dealing with anonymous people in comments sections and on social media makes one EXTREMELY guarded, for reasons Scott Alexander eloquently illuminates).

It’s always a balancing act, having to toggle between being honest in offering my opinions and thoughtful in remembering that actual human beings read those opinions, so I should resist the temptation to be a bloggy version of Addison DeWitt. Anyway, I’ve said enough.

UPDATE: Ah, the moment after I posted this, I checked my e-mail, and a reader had sent in this Verge piece about “the secret life of Facebook moderators”. Here’s how it starts:

The panic attacks started after Chloe watched a man die.

She has spent the past three and a half weeks in training, trying to harden herself against the daily onslaught of disturbing posts: the hate speech, the violent attacks, the graphic pornography. In a few more days, she will become a full-time Facebook content moderator, or what the company she works for, a professional services vendor named Cognizant, opaquely calls a “process executive.”

For this portion of her education, Chloe will have to moderate a Facebook post in front of her fellow trainees. When it’s her turn, she walks to the front of the room, where a monitor displays a video that has been posted to the world’s largest social network. None of the trainees have seen it before, Chloe included. She presses play.

The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking.

Returning to her seat, Chloe feels an overpowering urge to sob. Another trainee has gone up to review the next post, but Chloe cannot concentrate. She leaves the room, and begins to cry so hard that she has trouble breathing.

No one tries to comfort her. This is the job she was hired to do. And for the 1,000 people like Chloe moderating content for Facebook at the Phoenix site, and for 15,000 content reviewers around the world, today is just another day at the office.

My God, this piece. The conditions that the Facebook moderators have to work under, and the impossibility of getting it right all the time, given the ever-changing rules. And having to live with the things you see on Facebook daily. More:

But over time, he grew concerned for his mental health.

“We were doing something that was darkening our soul — or whatever you call it,” he says. “What else do you do at that point? The one thing that makes us laugh is actually damaging us. I had to watch myself when I was joking around in public. I would accidentally say [offensive] things all the time — and then be like, Oh shit, I’m at the grocery store. I cannot be talking like this.”

Jokes about self-harm were also common. “Drinking to forget,” Sara heard a coworker once say, when the counselor asked him how he was doing. (The counselor did not invite the employee in for further discussion.) On bad days, Sara says, people would talk about it being “time to go hang out on the roof” — the joke being that Cognizant employees might one day throw themselves off it.

One day, Sara said, moderators looked up from their computers to see a man standing on top of the office building next door. Most of them had watched hundreds of suicides that began just this way. The moderators got up and hurried to the windows.

The man didn’t jump, though. Eventually everyone realized that he was a fellow employee, taking a break.

Read it all.

The poor soul who flagged my TAC piece about Jussie Smollett’s hate hoax was probably just applying the Facebook community standard that says you can’t question a hate crime. It’s a stupid standard, especially in the case of someone like Smollett, who, by the time I posted that, had been exposed as a likely hoaxer (though he had not yet been charged). The FB censor should have let my comment through, but I can see why he or she made the call they did, out of fear that I might have legitimately broken the community standard rule. That person might have been fired. As Scott Alexander said, “They’re just afraid. Everyone is afraid.”

I was right to complain about FB’s banning my post. But I wonder now if some badly paid SOB working under harsh conditions lost his job because of it.

UPDATE: I really appreciate the kind compliments from you readers.
