I’m going to withdraw from commenting this afternoon, because I have a lot of work to do. I’ll still be checking in on the comments regularly, though. Here’s a thought experiment for the room. Nassim Nicholas Taleb writes:
The narrative fallacy addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship upon them. Explanations bind facts together. They make them all the more easily remembered; they help them make more sense. Where this propensity can go wrong is when it increases our impression of understanding.
I think about the way I used to go round and round with my mom and dad, and with my sister, over a couple of big issues. They had in their mind the narrative they wished to believe to explain the world, and they simply refused to admit any facts that challenged that narrative. It was the most frustrating thing in the world, because they were epistemically closed to anything that undermined a narrative that made sense to them, and that felt emotionally true.
The thing is, we all do this, but we are not aware that we do it, because the minute we become aware that our views on this or that issue could be subject to the Narrative Fallacy, we subject our views to critical analysis, even if that analysis doesn’t go beyond, “Maybe I’m wrong about this; let me think about it.”
I am reminded of myself in 2002, during the run-up to the Iraq War. I lived in New York City, and had watched the first of the Twin Towers collapse. I was utterly convinced that we needed to go to war in Iraq, even though I believed that there was no real reason to think that Iraq had anything to do with 9/11. I thought anti-war critics were operating in bad faith; they were either moral cowards, or in some other way bad people. I went to watch the big antiwar demonstration in New York, and had everything I wanted to believe about the antiwar people confirmed by what I saw there (e.g., weirdos walking around with Bush = Hitler signs).
Of course, I was wrong about the war, and later came to see how my own rage over 9/11 allowed me to filter out, unconsciously, information that challenged what I wanted to believe. The thing is, I didn’t realize what I was doing at the time. To me, the narrative I chose wasn’t a choice; it was simply a fact, and if those antiwar people had had the moral courage or presence of mind to see the plain facts of the case, they would have agreed with me.
This has a lot to do with why I am today morbidly fascinated with the question of epistemology — that is, how we know what we know.
So, a thought experiment for the room. Consider your own “side” — political, religious, cultural, whatever. The people and/or the cause with which you most identify. What are the components of the Narrative Fallacy to which you and your side are most susceptible? That is, which elements of the Story You Use To Explain The World are most likely to mislead you into thinking you understand the world, when in fact they are precisely the elements that keep you from understanding the world as it really is?
It’s a harder thought experiment than it seems, because the most dangerous (in the sense Taleb means) part of the Narrative Fallacy is that it is wholly concealed from us. Still, I’m interested in your thoughts: which parts of your own side’s Narrative (or your own personal Narrative) have you begun to have second thoughts about?