Recently I had some dialogue with a person on the blog, and it quickly became obvious that this person had almost exclusively read material written by one side of a debate. Not only was he unaware of arguments and evidence on the other side, but he was way overconfident in the conclusions he had drawn from his reading.
While I was attending seminary, the idea that we must read the other side in an argument was drummed into us constantly. One of my seminary professors even told us that he would read atheist writers as devotional material in order to constantly remind himself what atheists think.
It turns out that there is a good psychological reason to do this as well. Our minds have a strong tendency to jump to conclusions with little evidence. Psychologist Daniel Kahneman describes this tendency in his book Thinking, Fast and Slow. The first problem is that our minds tend to only offer up ideas that are fresh in our memory.
An essential design feature of the associative machine is that it represents only activated ideas. Information that is not retrieved (even unconsciously) from memory might as well not exist. System 1 excels at constructing the best possible story that incorporates ideas currently activated, but it does not (cannot) allow for information it does not have.
Recall from earlier blog posts that System 1 is the part of the human mind that is automatic and unconscious. It is constantly working behind the scenes to support System 2, which is the part of our mind that actually does intense thinking and analysis. Kahneman continues:
The measure of success for System 1 is the coherence of the story it manages to create. The amount and quality of the data on which the story is based are largely irrelevant. When information is scarce, which is a common occurrence, System 1 operates as a machine for jumping to conclusions.
Without reading the other side in a debate, System 1 will simply serve up coherent stories from the data it has from one side and jump to conclusions.
And there also remains a bias favoring the first impression. The combination of a coherence-seeking System 1 with a lazy System 2 implies that System 2 will endorse many intuitive beliefs, which closely reflect the impressions generated by System 1. Of course, System 2 also is capable of a more systematic and careful approach to evidence, and of following a list of boxes that must be checked before making a decision—think of buying a home, when you deliberately seek information that you don't have. However, System 1 is expected to influence even the more careful decisions. Its input never ceases.
Because System 2 is lazy (we don’t want to think if we don’t have to), System 1 just keeps on serving up conclusions based on the one-sided evidence it has received. Kahneman offers up an abbreviation for this phenomenon:
Jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking, and comes up so often in this book, that I will use a cumbersome abbreviation for it: WYSIATI, which stands for what you see is all there is. System 1 is radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions.
Why are we humans programmed with WYSIATI?
WYSIATI facilitates the achievement of coherence and of the cognitive ease that causes us to accept a statement as true. It explains why we can think fast, and how we are able to make sense of partial information in a complex world. Much of the time, the coherent story we put together is close enough to reality to support reasonable action.
However, one of the problems WYSIATI causes is overconfidence.
As the WYSIATI rule implies, neither the quantity nor the quality of the evidence counts for much in subjective confidence. The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. We often fail to allow for the possibility that evidence that should be critical to our judgment is missing—what we see is all there is. Furthermore, our associative system tends to settle on a coherent pattern of activation and suppresses doubt and ambiguity.
This is why we must read the other side. Without doing so, we become overconfident in our views and we actively suppress doubt and ambiguity. One of the great lessons in life is learning to be more humble in our viewpoints, and to live with less confidence and more ambiguity. Otherwise, we are simply jumping to conclusions.