In one study, the researchers were interested in how people jump to conclusions based on limited information. Previous work had shown that people are “radically insensitive to both the quantity and quality of information that gives rise to impressions and intuitions,” so the researchers knew that we humans don’t do a particularly good job of weighing the pros and cons. But to what degree? Just how bad are we at assessing all the facts?
The key part of the experiment was that the participants were fully aware of the setup; they knew whether they were hearing only one side or the whole story. But this didn’t stop the subjects who heard one-sided evidence from being more confident and biased in their judgments than those who heard both sides. That is, even when people knew they were missing part of the story, they jumped to conclusions after hearing only one side of it.
The good news is that … simply prompting participants to consider the other side’s story reduced their bias … but it certainly did not eliminate it. Their study shows us that people are not only willing to jump to conclusions after hearing only one side’s story, but that even when they have additional information at their disposal that would suggest a different conclusion, they are still surprisingly likely to do so. The scientists conclude on a somewhat pessimistic note: “People do not compensate sufficiently for missing information even when it is painfully obvious that the information available to them is incomplete.”
… in reality – especially in the Internet era – people have access to a limitless amount of information that they could consider. As a result, we rely on rules of thumb, or heuristics, to take in information and make decisions. These mental shortcuts are necessary because they lessen the cognitive load and help us organize the world – we would be overwhelmed if we were truly rational.
This is one of the reasons we humans love narratives; they summarize the important information in a form that’s familiar and easy to digest. It’s much easier to understand events in the world as instances of good versus evil, or any one of the seven story types. As Daniel Kahneman explains, “[we] build the best possible story from the information available… and if it is a good story, [we] believe it.” The implication here is that it’s how good the story is, not necessarily its accuracy, that’s important.
But narratives are also irrational because they sacrifice the whole story for one side of a story that conforms to one’s worldview. Relying on them often leads to inaccuracies and stereotypes. This is what the participants in Brenner’s study highlight: people who take in narratives are often blinded to the whole story. Rarely do we ask, “What more would I need to know before I can have a more informed and complete opinion?”
That is, we tend to remain incompletely informed, or ignorant, especially when left to ourselves.
On the other hand, a question like “What more could I know?” is abstract and sophisticated. It’s reasonable to expect that forming such questions requires some degree of training or education, or else clearly above-average curiosity and cleverness, and perhaps skepticism as well.
The shortcomings of our rationality have been thoroughly exposed to the lay audience. But there’s a peculiar inconsistency about this trend. People seem to absorb these books uncritically, ironically falling prey to some of the very biases they should be on the lookout for: incomplete information and seductive stories. That is, when people learn about how we irrationally jump to conclusions, they form new opinions about how the brain works from the little information they recently acquired. They jump to conclusions about how the brain jumps to conclusions, and fit their newfound knowledge into a larger story that romantically and naively describes personal enlightenment.
“Jumping to conclusions” here seems to mean the same as prematurely assimilating some new information into one’s current framework of understanding. Again, “prematurely” is relative, since we can rarely, if ever, know absolutely everything about something, but it’s easy to take for granted how sophisticated a skill it is to reflect on one’s own process of assimilation and understanding. If a person were purely rational, they would never reach the point of thinking, “Now I have enough information to form an understanding.”
Tyler Cowen made a similar point in a TED lecture a few months ago. … The crux of the problem, as Cowen points out, is that it’s nearly impossible to understand irrationalities without taking advantage of them. And, paradoxically, we rely on stories to understand why they can be harmful.
(I’m getting the sense that the author is a little over-excited, feeling high and mighty about discovering these shortcomings of the human mind.) ‘Using irrationalities to understand irrationalities’ is as reasonable and predictable a statement as ‘using our (limited) minds to understand our minds’, ‘using our current (incomplete) information to understand new information’, and ‘using our current tools to learn more about and build new, better, more accurate, more effective tools’.
To be sure, there’s an important difference between the bias that comes from hearing one side of an argument and (most) narratives. A corrective like “consider the other side” is unlikely to work for narratives because it’s not always clear what the opposite would even be. So it’s useful to avoid jumping to conclusions not only by questioning narratives (after all, just about everything is plausibly a narrative, so avoiding them can be pretty overwhelming), but by exposing yourself to multiple narratives and trying to integrate them as well as you can.
Having multiple narratives in one’s experience/memory gives one at least the raw material to ask oneself questions about any one particular narrative. The raw material is in the form of reference points. It’s still up to the individual to form questions and observations with those reference points, to examine and refine one’s own narrative.
The “integrating” mentioned above is likewise a task left to the individual alone, and it forces one to deal with the various reference points. Some common examples: points that are similar but not quite the same, so one can work out whether they are truly the same, and how they are alike or different; points that seem contradictory, so one can work out how they are non-contradictory within their respective narratives, or contexts, which brings one to examine aspects of each context as well; and points that exist, or are emphasized, in one narrative but not another, so one can examine what role or function they play in their respective narratives, and imagine how it would be if they were imported into other contexts.
… readers of popular psychology books on rationality must recognize that there’s a lot they don’t know, and they must beware of how seductive stories are. The popular literature on cognitive biases is enlightening, but let’s not be irrational about irrationality: exposure to X is not knowledge and control of X. Reading about cognitive biases, after all, does not free anybody from their nasty epistemological pitfalls.
In the realms of martial arts and psychology, one encounters many individuals who haphazardly confuse understanding with control, or mastery. Furthering one’s knowledge is usually easier and more comfortable than furthering one’s mastery, so it’s understandable why we’d lean toward one more than the other. We just have to beware of this tendency to lean.
The Greek playwright Euripides was right: Question everything, learn something, answer nothing.