Tag Archives: Daniel Kahneman

Why Must You Read the Other Side’s Arguments?

Recently I had a dialogue with a visitor to the blog, and it quickly became obvious that he had read material almost exclusively from one side of a debate. Not only was he unaware of the arguments and evidence on the other side, but he was wildly overconfident in the conclusions he had drawn from his reading.

While I was attending seminary, the idea that we must read the other side of an argument was drummed into us constantly. One of my seminary professors even told us that he would read atheist writers as devotional material in order to constantly remind himself of what atheists think.

It turns out that there is a good psychological reason to do this as well. Our minds have a strong tendency to jump to conclusions with little evidence. Psychologist Daniel Kahneman describes this tendency in his book Thinking, Fast and Slow. The first problem is that our minds tend to only offer up ideas that are fresh in our memory.

An essential design feature of the associative machine is that it represents only activated ideas. Information that is not retrieved (even unconsciously) from memory might as well not exist. System 1 excels at constructing the best possible story that incorporates ideas currently activated, but it does not (cannot) allow for information it does not have.

Recall from earlier blog posts that System 1 is the part of the human mind that is automatic and unconscious. It is constantly working behind the scenes to support System 2, which is the part of our mind that actually does intense thinking and analysis. Kahneman continues:

The measure of success for System 1 is the coherence of the story it manages to create. The amount and quality of the data on which the story is based are largely irrelevant. When information is scarce, which is a common occurrence, System 1 operates as a machine for jumping to conclusions.

Without reading the other side in a debate, System 1 will simply serve up coherent stories from the data it has from one side and jump to conclusions.

And there also remains a bias favoring the first impression. The combination of a coherence-seeking System 1 with a lazy System 2 implies that System 2 will endorse many intuitive beliefs, which closely reflect the impressions generated by System 1. Of course, System 2 also is capable of a more systematic and careful approach to evidence, and of following a list of boxes that must be checked before making a decision—think of buying a home, when you deliberately seek information that you don’t have. However, System 1 is expected to influence even the more careful decisions. Its input never ceases.

Because System 2 is lazy (we don’t want to think if we don’t have to), System 1 just keeps on serving up conclusions based on the one-sided evidence it has received. Kahneman offers up an abbreviation for this phenomenon:

Jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking, and comes up so often in this book, that I will use a cumbersome abbreviation for it: WYSIATI, which stands for what you see is all there is. System 1 is radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions.

Why are we humans programmed with WYSIATI?

WYSIATI facilitates the achievement of coherence and of the cognitive ease that causes us to accept a statement as true. It explains why we can think fast, and how we are able to make sense of partial information in a complex world. Much of the time, the coherent story we put together is close enough to reality to support reasonable action.

However, one of the problems WYSIATI causes is overconfidence.

As the WYSIATI rule implies, neither the quantity nor the quality of the evidence counts for much in subjective confidence. The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. We often fail to allow for the possibility that evidence that should be critical to our judgment is missing—what we see is all there is. Furthermore, our associative system tends to settle on a coherent pattern of activation and suppresses doubt and ambiguity.

This is why we must read the other side. Without doing so, we become overconfident in our views and actively suppress doubt and ambiguity. One of the great lessons of life is learning to be more humble in our viewpoints and to live with less confidence and more ambiguity. Otherwise, we are simply jumping to conclusions.

Are We Just Meat Robots?

Post Author: Bill Pratt 

I have an interest in psychology and behavioral economics research. That’s why I write posts on books like Predictably Irrational and Thinking, Fast and Slow. But as I read these kinds of books, I am always keeping an eye on the big question.

Will the author say that human beings are completely irrational in our thoughts and behavior, completely rational in our thoughts and behavior, or a combination of the two? By rational I mean the use of reason, evidence, and logic to draw true conclusions about reality.

There are atheist thinkers such as Alex Rosenberg who argue that humans are completely irrational, that we are meat robots whose thoughts and behavior are 100% determined by physics. Physics knows nothing of reason, evidence, and logic. Rosenberg says that once you take science seriously, there is no other possible conclusion.

The problem with Rosenberg’s position is that it is hopelessly self-contradictory. He is saying, in essence, “I know rationally that nobody knows anything rationally.” If he knows that rationally, then his statement is false. If he doesn’t know anything rationally, then his statement is irrational and can be safely ignored.

In the psychology and behavioral economics literature, though, I see flirtation with Rosenberg’s position. Let’s look at two examples.

Michael Sliwinski, an entrepreneur and creator of the software application Nozbe, said this about the latest issue of the online magazine Productivity:

Reading this issue provoked deep reflection within myself, and I hope it will do the same for you, too. Practically every article shows the well-known and scientifically proved (but often forgotten) fact that psychological mechanisms—usually unconscious—rule the human world.

Sliwinski says it is a fact that “psychological mechanisms—usually unconscious—rule the human world.” Is he taking Rosenberg’s position? Probably not, but he’s leaning in that direction. If he said that only psychological mechanisms rule the human world, his position would be self-contradictory as well. There is enough ambiguity to avoid the charge of Rosenberg-ism.

Dan Ariely concludes his book Predictably Irrational:

IF I WERE to distill one main lesson from the research described in this book, it is that we are pawns in a game whose forces we largely fail to comprehend. We usually think of ourselves as sitting in the driver’s seat, with ultimate control over the decisions we make and the direction our life takes; but, alas, this perception has more to do with our desires—with how we want to view ourselves—than with reality.

Let’s stop here. Ariely is skating on the edge of self-contradiction. He says that we are pawns without ultimate control over our decisions. But clearly Ariely believes that he was not a pawn when he wrote this sentence or the rest of his book. He believes that he did have control over the decisions he made to write about predictable irrationality, did he not? If he was able to pull this off, then why can’t we?

He continues:

The point is that our visual and decision environments are filtered to us courtesy of our eyes, our ears, our senses of smell and touch, and the master of it all, our brain. By the time we comprehend and digest information, it is not necessarily a true reflection of reality. Instead, it is our representation of reality, and this is the input we base our decisions on. In essence we are limited to the tools nature has given us, and the natural way in which we make decisions is limited by the quality and accuracy of these tools.

Ariely almost goes Rosenberg on us again. He says that when we comprehend and digest information, “it is our representation of reality” and not “necessarily a true reflection of reality.” If he is saying that every time we digest information, we are not getting at true reality, then how is it that Ariely has managed to bypass this problem and get at true reality?

It is helpful to rewrite his statement in the following way: “By digesting information, I have arrived at the true reality that nobody who digests information can arrive at true reality.” If his statement is true, then it is false, unless he exempts himself from the population of all human beings.

In closing, consider the following:

  • If we are irrational meat robots, then we can’t know rationally that we are meat robots.
  • If unconscious psychological mechanisms control our every thought, then we can’t consciously (or rationally) think that unconscious psychological mechanisms control our every thought.
  • If we have no control over our decisions, then we can’t control the decision to think that we have no control over our decisions.
  • If we can’t comprehend true reality, then we can’t comprehend that true reality is incomprehensible.

Rosenberg self-contradiction syndrome is always lurking. We have to be careful not to stretch the findings of psychology and behavioral economics beyond where they should go. If you ever want to make any claim about reality that you think is true, then you cannot hold that we are merely meat robots. That, my friend, is a flagrant contradiction.

What Are the Implications of the Halo Effect?

Post Author: Bill Pratt 

In the previous post, we looked at the halo effect, as explained by psychologist Daniel Kahneman, in his book Thinking, Fast and Slow. We saw that the halo effect causes us to overweight our first impressions of a person so that subsequent impressions are largely influenced by those first impressions.

If we like a person when we first meet them, then we will consistently look for reasons to like everything about them as time goes on. If we don’t like a person when we first meet them, then we will consistently look for reasons to not like anything about them as time goes on.

The halo effect has many implications for apologetics and evangelism. Say you want to discuss the gospel with someone. If that person already sees you as likable, based on their positive initial impressions of you, then when you present the gospel message, they will most likely be receptive.

If, however, the person with whom you want to discuss the gospel dislikes you, based on their initial negative reactions to you, then they will most likely reject anything you say to them about Christianity. They will simply assume that you are wrong about everything because of the halo effect.

I have had many skeptical visitors to the blog over the years who, after interacting with me initially, decide that they just don’t like me. In their minds, I lie, I don’t understand evidence and rational thinking, and I’m just not someone who can be trusted. How do I know? Because they tell me. Once these people have formed their initial opinions, I know that no matter what I say to them, no matter how I say it, they will never accept anything coming from me. This is the halo effect.

On the other hand, there are people who interact with me and immediately like me; they find me to be trustworthy and reasonable. With those people, the halo effect works in my favor. They are quite willing to hear what I have to say, even when we don’t agree on everything.

If a person doesn’t like me, for whatever reason, they are not going to listen to what I have to say about the gospel. I can rest assured, however, that God will bring along someone else who that person does like. There is usually no point in me banging my head against the halo effect to change that person’s impression of me. They have formed their opinion and it is probably not going to change, at least not without substantial effort on my part and theirs.

I think the halo effect is one reason that Billy Graham was such an amazing evangelist. Most people, after first seeing or listening to him for just a few minutes, immediately like him. There is just something about him that people like. The halo effect, undoubtedly, helped him bring thousands and thousands of people to Christ.

Alas, we can’t all be Billy Graham. This is a hard pill to swallow for an apologist or evangelist, but swallow it we must. Most of us know at least some people, even in our own families, who simply don’t like us very much. The fact is, we probably cannot reach those people, but we do need to reach those who do like and trust us. They are ready to hear what we have to say.

What Is the Halo Effect?

Post Author: Bill Pratt 

It has nothing to do with the popular video game, but like confirmation bias, the halo effect is a concept you need to understand because it impacts all of us, and we are mostly unaware of it.

Psychologist Daniel Kahneman, in his book Thinking, Fast and Slow, describes the halo effect.

If you like the president’s politics, you probably like his voice and his appearance as well. The tendency to like (or dislike) everything about a person—including things you have not observed—is known as the halo effect. The term has been in use in psychology for a century, but it has not come into wide use in everyday language. This is a pity, because the halo effect is a good name for a common bias that plays a large role in shaping our view of people and situations. It is one of the ways the representation of the world that System 1 generates is simpler and more coherent than the real thing.

Kahneman provides a concrete example:

You meet a woman named Joan at a party and find her personable and easy to talk to. Now her name comes up as someone who could be asked to contribute to a charity. What do you know about Joan’s generosity? The correct answer is that you know virtually nothing, because there is little reason to believe that people who are agreeable in social situations are also generous contributors to charities.

But you like Joan and you will retrieve the feeling of liking her when you think of her. You also like generosity and generous people. By association, you are now predisposed to believe that Joan is generous. And now that you believe she is generous, you probably like Joan even better than you did earlier, because you have added generosity to her pleasant attributes.

Real evidence of generosity is missing in the story of Joan, and the gap is filled by a guess that fits one’s emotional response to her. In other situations, evidence accumulates gradually and the interpretation is shaped by the emotion attached to the first impression.

Impressions of a person are gained over a period of time, but the halo effect causes us to overweight first impressions relative to later ones. Here is the problem as Kahneman states it:

The sequence in which we observe characteristics of a person is often determined by chance. Sequence matters, however, because the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted.

Early in my career as a professor, I graded students’ essay exams in the conventional way. I would pick up one test booklet at a time and read all that student’s essays in immediate succession, grading them as I went. I would then compute the total and go on to the next student. I eventually noticed that my evaluations of the essays in each booklet were strikingly homogeneous. I began to suspect that my grading exhibited a halo effect, and that the first question I scored had a disproportionate effect on the overall grade.

The mechanism was simple: if I had given a high score to the first essay, I gave the student the benefit of the doubt whenever I encountered a vague or ambiguous statement later on. This seemed reasonable. Surely a student who had done so well on the first essay would not make a foolish mistake in the second one! But there was a serious problem with my way of doing things. If a student had written two essays, one strong and one weak, I would end up with different final grades depending on which essay I read first. I had told the students that the two essays had equal weight, but that was not true: the first one had a much greater impact on the final grade than the second.
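To make the order effect concrete, here is a minimal sketch of first-impression anchoring in grading. The model and its anchor_weight value are my own illustrative assumptions, not anything from Kahneman; the point is simply that the same two essays yield different totals depending on which one is read first.

```python
# Toy model of the halo effect in sequential grading. Illustrative only:
# the anchoring weight is an assumed parameter, not a figure from Kahneman.

def grade_booklet(essay_qualities, anchor_weight=0.4):
    """Grade essays in order; each later score is pulled toward the
    impression formed by the first essay."""
    first_impression = essay_qualities[0]
    scores = [first_impression]
    for quality in essay_qualities[1:]:
        # Later judgments blend the essay's true quality with the
        # first impression.
        scores.append((1 - anchor_weight) * quality
                      + anchor_weight * first_impression)
    return sum(scores)

strong, weak = 9.0, 4.0
print(grade_booklet([strong, weak]))  # strong essay first -> total 15.0
print(grade_booklet([weak, strong]))  # weak essay first   -> total 11.0
```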

In the next post, we will look at some implications of the halo effect.

What Is Confirmation Bias?

Post Author: Bill Pratt 

Confirmation bias is a concept you need to understand because it impacts all of us, and we are mostly unaware of it.

Psychologist Daniel Kahneman, in his book Thinking, Fast and Slow, describes confirmation bias in the context of a person being presented with a statement that they can choose to believe or not believe. Kahneman begins, “The initial attempt to believe is an automatic operation of System 1, which involves the construction of the best possible interpretation of the situation. Even a nonsensical statement . . . will evoke initial belief.” (emphasis added)

Kahneman explains that unbelieving is an operation of System 2, but we already know that System 2 requires additional cognitive energy to get engaged. So what does this mean?

The moral is significant: when System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy. Indeed, there is evidence that people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted.

And now comes the concept of confirmation bias:

The operations of [System 1] associative memory contribute to a general confirmation bias. When asked, “Is Sam friendly?” different instances of Sam’s behavior will come to mind than would if you had been asked “Is Sam unfriendly?” A deliberate search for confirming evidence, known as positive test strategy, is also how System 2 tests a hypothesis.

Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold. The confirmatory bias of System 1 favors uncritical acceptance of suggestions and exaggeration of the likelihood of extreme and improbable events.

Unless we are paying close attention and engaging System 2, our bias is to believe what we are told. System 1 will pull memories and ideas out of our mind to confirm whatever is being presented to us. It is only when we pause, think, and consider what is being said that System 2 can start to methodically test it.
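As a rough illustration of that positive test strategy, here is a toy sketch (my own, not code from Kahneman). The same store of memories about Sam, queried under opposite framings of the question, retrieves different evidence, and the verdict simply follows whichever sample came to mind.

```python
# Toy sketch of the positive test strategy: the framing of the question
# determines which instances of Sam's behavior come to mind.

memories_of_sam = [
    ("helped me move apartments", "friendly"),
    ("snapped at a waiter", "unfriendly"),
    ("remembered my birthday", "friendly"),
    ("ignored my email for a week", "unfriendly"),
]

def ask(frame):
    # Associative retrieval favors instances that match the question's
    # frame rather than sampling Sam's behavior evenly.
    evidence = [event for event, tone in memories_of_sam if tone == frame]
    print(f"Is Sam {frame}? Recalled {evidence} -> 'apparently yes'")

ask("friendly")    # retrieves only the friendly episodes
ask("unfriendly")  # same memory, opposite retrieval and verdict
```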

As someone who reads a tremendous amount of anti-Christian material, I am aware of this process happening to me all the time. I will read statements that say, in effect, “This aspect of the Christian worldview is totally wrong,” and my initial reaction, if I don’t have my mind really engaged, is almost always to agree! In fact, if I just uncritically read any author, I will find myself wanting to agree with most of what the author is saying.

I don’t think this reaction is all bad, though. The best way to understand another person’s viewpoint is to immerse yourself in their ideas as best you can, and try to see the world as they see it. If you stop to critically analyze every sentence, you will quickly exhaust yourself and never see as the other person sees.

So my recommendation is to let System 1 have its way when you are reading new material, at least for a while. Once you’ve uncritically read enough to understand the main point of the author, then go back and bring System 2 into the game. Analyze, critique, question what you’ve read.

The situation where System 1 can really be dangerous is when a person only reads material that confirms their existing beliefs, and reads without ever engaging System 2 to analyze, critique, and question what they’ve read. If this happens over and over again for years, you have the makings of a dogmatic and stubborn individual, someone who rarely thinks about what they believe.

Why Do We See Causality All Around Us?

Post Author: Bill Pratt 

Psychologist Daniel Kahneman, in his book Thinking, Fast and Slow, describes the concept of intentional causality. According to Kahneman,

Your mind is ready and even eager to identify agents, assign them personality traits and specific intentions, and view their actions as expressing individual propensities. Here again, the evidence is that we are born prepared to make intentional attributions: infants under one year old identify bullies and victims, and expect a pursuer to follow the most direct path in attempting to catch whatever it is chasing.

Intentional causality is contrasted with physical causality. Physical causality is perceived when we see physical objects interacting with each other, such as one billiard ball hitting another and causing it to move.

Kahneman assigns the ability of human beings to see both kinds of causality to System 1, and he believes there may be an evolutionary reason why System 1 is so ready and adept at seeing both intentional and physical causality in the world around us.

The experience of freely willed action is quite separate from physical causality. Although it is your hand that picks up the salt, you do not think of the event in terms of a chain of physical causation. You experience it as caused by a decision that a disembodied you made, because you wanted to add salt to your food. Many people find it natural to describe their soul as the source and the cause of their actions.

The psychologist Paul Bloom, writing in The Atlantic in 2005, presented the provocative claim that our inborn readiness to separate physical and intentional causality explains the near universality of religious beliefs. He observes that “we perceive the world of objects as essentially separate from the world of minds, making it possible for us to envision soulless bodies and bodiless souls.”

The two modes of causation that we are set to perceive make it natural for us to accept the two central beliefs of many religions: an immaterial divinity is the ultimate cause of the physical world, and immortal souls temporarily control our bodies while we live and leave them behind as we die. In Bloom’s view, the two concepts of causality were shaped separately by evolutionary forces, building the origins of religion into the structure of System 1.

These two kinds of causality are important to understand, for they stand in the center of the battle between two major worldviews: atheism and theism. Atheists affirm physical causality, but deny intentional causality (they claim it is just an illusion and that only physical causality is really operating). Theists affirm both physical and intentional causality.

Almost every debate about the origin of the universe, or the fine-tuning of the physical constants in the universe, or the design of biological organisms, comes down to whether you believe that intentional causality is real or illusory. There is no doubt that most human beings believe that both are real, and that this belief is hard-wired into us, but that doesn’t settle the debate.

For those who want to claim that intentional causality is not real because the concept is produced by evolution, that argument doesn’t fly. Where the ability to see intentional causality came from is not directly relevant to whether there really are intentional causes. Pressing this claim would be a case of the genetic fallacy: the source of an idea cannot tell you whether the idea is true or false.

And besides, if you believe evolution caused human beings to see intentional causality, then you must also believe that evolution caused human beings to see physical causality, and almost nobody wants to say that physical causality is unreal.

Why Do We Answer Questions We Weren’t Asked?

Post Author: Bill Pratt 

Psychologist Daniel Kahneman, in his book Thinking, Fast and Slow, has introduced us to the two processes going on inside our minds: System 1 and System 2. We’ve already seen that System 1 kicks in first when we encounter an unfamiliar situation. System 1 has lots of shortcuts it likes to take instead of dealing completely rationally and thoughtfully with what is being presented (that’s System 2’s job, after all).

One of these shortcuts is that System 1, instead of answering the question that is being posed, will substitute an easier question and answer that instead. Kahneman explains:

The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it. Whether you state them or not, you often have answers to questions that you do not completely understand, relying on evidence that you can neither explain nor defend.

How can this be? How can we have answers ready for everything that comes our way, without even giving the questions much thought?

I propose a simple account of how we generate intuitive opinions on complex matters. If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. I call the operation of answering one question in place of another substitution. I also adopt the following terms:

The target question is the assessment you intend to produce.

The heuristic question is the simpler question that you answer instead.

The technical definition of heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions.

Kahneman is arguing that when we are presented with a complex or abstract question, instead of slowly thinking about it, our minds immediately offer up a solution by answering a simpler and different version of the question. The table below gives some examples.

Target question → Heuristic question
How much would you contribute to save an endangered species? → How much emotion do I feel when I think of dying dolphins?
How happy are you with your life these days? → What is my mood right now?
How popular is the president right now? → How popular will the president be six months from now?
How should financial advisers who prey on the elderly be punished? → How much anger do I feel when I think of financial predators?
This woman is running for the primary. How far will she go in politics? → Does this woman look like a political winner?
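In code, substitution might be pictured as a silent fallback from a hard target question to an easier heuristic one. This is only a sketch under my own assumptions (the question pairs reuse Kahneman's examples, but the mechanics are invented for illustration):

```python
# Minimal sketch of substitution: when the target question cannot be
# answered quickly, System 1 silently answers an easier related one.

SUBSTITUTIONS = {
    "How happy are you with your life these days?":
        "What is my mood right now?",
    "How popular is the president right now?":
        "How popular will the president be six months from now?",
}

def answer(target_question, system2_engaged=False):
    if system2_engaged:
        return f"Deliberate System 2 analysis of: {target_question}"
    # System 1's shortcut: swap in the heuristic question, unnoticed.
    heuristic = SUBSTITUTIONS.get(target_question, target_question)
    return f"Quick intuitive answer to: {heuristic}"

print(answer("How happy are you with your life these days?"))
# -> Quick intuitive answer to: What is my mood right now?
```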

Kahneman points out that

System 2 has the opportunity to reject this intuitive answer, or to modify it by incorporating other information. However, a lazy System 2 often follows the path of least effort and endorses a heuristic answer without much scrutiny of whether it is truly appropriate. You will not be stumped, you will not have to work very hard, and you may not even notice that you did not answer the question you were asked. Furthermore, you may not realize that the target question was difficult, because an intuitive answer to it came readily to mind.

As a Christian sharing the gospel, along with the evidences and arguments that show Christianity is true, I have to be aware that substitution is going on. Here is another table that illustrates what I’m talking about.

Target question → Heuristic question
Do you believe Christianity is true? → Do I like the Christians I know?
Are you convicted by your sins? → Am I basically a good person?
What do you think of the historical evidence of the resurrection? → Do I think that miracles can ever occur?
Would you consider following Christ? → Do I want to be associated with the Christians I know?

It sometimes takes great effort to convince your friend to actually answer the questions you’re posing to him. Be aware of what is going on and keep bringing your friend back to the real question, not the question he simply substitutes because it’s easier for him to answer.

How Do We React When We Encounter Something New? (Not Rationally)

Post Author: Bill Pratt 

Have you ever noticed the reactions you get when you present someone with a new concept, a new argument, a new piece of unexpected data? Unless the person with whom you are speaking is already familiar with the material, you often get an emotional or irrational response indicating that the person is not really getting what you’re saying.

Why is this? I see it happen in person and online all the time. Psychologist Daniel Kahneman explains what happens in these circumstances in his book Thinking, Fast and Slow. The first responder to our environment is System 1 (see the previous blog post for an explanation of System 1 and System 2). So what does System 1 do?

Kahneman gives an example of a typical System 1 reaction by presenting the reader with the following words side by side: bananas vomit. Take a minute and note your reaction to those words. Then read on.

The events that took place as a result of your seeing the words happened by a process called associative activation: ideas that have been evoked trigger many other ideas, in a spreading cascade of activity in your brain. The essential feature of this complex set of mental events is its coherence.

Each element is connected, and each supports and strengthens the others. The word evokes memories, which evoke emotions, which in turn evoke facial expressions and other reactions, such as a general tensing up and an avoidance tendency. The facial expression and the avoidance motion intensify the feelings to which they are linked, and the feelings in turn reinforce compatible ideas. All this happens quickly and all at once, yielding a self-reinforcing pattern of cognitive, emotional, and physical responses that is both diverse and integrated—it has been called associatively coherent.

Kahneman continues:

In a second or so you accomplished, automatically and unconsciously, a remarkable feat. Starting from a completely unexpected event, your System 1 made as much sense as possible of the situation—two simple words, oddly juxtaposed—by linking the words in a causal story; it evaluated the possible threat (mild to moderate) and created a context for future developments by preparing you for events that had just become more likely; it also created a context for the current event by evaluating how surprising it was. . . .

An odd feature of what happened is that your System 1 treated the mere conjunction of two words as representations of reality. Your body reacted in an attenuated replica of a reaction to the real thing, and the emotional response and physical recoil were part of the interpretation of the event. As cognitive scientists have emphasized in recent years, cognition is embodied; you think with your body, not only with your brain. The mechanism that causes these mental events has been known for a long time: it is the association of ideas.

Kahneman then explains what he means by “ideas” in the mind. An idea can be

concrete or abstract, and it can be expressed in many ways: as a verb, as a noun, as an adjective, or as a clenched fist. Psychologists think of ideas as nodes in a vast network, called associative memory, in which each idea is linked to many others. There are different types of links: causes are linked to their effects (virus → cold); things to their properties (lime → green); things to the categories to which they belong (banana → fruit).
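The “nodes and links” picture can be sketched as a small graph with activation spreading outward and fading with each hop. This is my own toy rendering of the idea (the decay and threshold numbers are invented), not code from the book:

```python
# Toy spreading-activation sketch over an associative network.
from collections import defaultdict

links = defaultdict(list)
for a, b in [("virus", "cold"), ("lime", "green"), ("banana", "fruit"),
             ("banana", "yellow"), ("fruit", "healthy")]:
    links[a].append(b)
    links[b].append(a)  # associations run in both directions

def activate(seed, decay=0.5, threshold=0.2):
    """Activate a seed idea and let activation spread to neighbors,
    weakening with each hop; weakly activated ideas never 'register'."""
    activation = {seed: 1.0}
    frontier = [seed]
    while frontier:
        nxt = []
        for node in frontier:
            for neighbor in links[node]:
                spread = activation[node] * decay
                if spread >= threshold and spread > activation.get(neighbor, 0.0):
                    activation[neighbor] = spread
                    nxt.append(neighbor)
        frontier = nxt
    return activation

print(activate("banana"))
# -> {'banana': 1.0, 'fruit': 0.5, 'yellow': 0.5, 'healthy': 0.25}
```

Notice that one seed idea activates several others at once, and that the weakly activated ones (below the threshold) never surface, which matches Kahneman's point that most associative work is hidden from consciousness.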

Psychologists and philosophers used to believe that ideas followed one after another in your mind, chronologically. Kahneman says that this view no longer holds:

In the current view of how associative memory works, a great deal happens at once. An idea that has been activated does not merely evoke one other idea. It activates many ideas, which in turn activate others. Furthermore, only a few of the activated ideas will register in consciousness; most of the work of associative thinking is silent, hidden from our conscious selves. The notion that we have limited access to the workings of our minds is difficult to accept because, naturally, it is alien to our experience, but it is true: you know far less about yourself than you feel you do.

Whenever a person is confronted with new data, System 1 takes over and delivers the first response. This response is largely unconscious and automatic, and it is based on all of the ideas in your mind that are unconsciously associated with the new data you’ve just been presented. Thus the strange reactions we often get when we present new ideas to someone.

At first, they are not able to think completely rationally and carefully about what you’re saying. They are just reacting based on their life experiences. Kahneman is not saying that we can never think clearly and rationally. System 2 can be brought to bear on any situation, but until it is, you are having to deal with a whole list of associations in the other person of which you are completely ignorant (unless you know that person really well).

Why Don’t People Listen to Your Reasoning?

Post Author: Bill Pratt 

Christian apologists try to convince other people that Christianity is true (all Christians are supposed to be doing this, by the way). We have excellent arguments and we have powerful evidence from philosophy, science, and history to support those arguments. That is why Christian apologetics is in a golden age. Yet, more often than not, these arguments fall on deaf ears. Why?

Meet Daniel Kahneman. He is a world-renowned, Nobel Prize-winning psychologist who wrote a book called Thinking, Fast and Slow. The book argues that there are two systems operating in your mind: System 1 and System 2. Kahneman describes the two systems as follows:

System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.

System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

Here are some of the activities attributed to System 1:

  • Detect that one object is more distant than another.
  • Orient to the source of a sudden sound.
  • Complete the phrase “bread and…”
  • Make a “disgust face” when shown a horrible picture.
  • Detect hostility in a voice.
  • Answer to 2 + 2 = ?
  • Read words on large billboards.
  • Drive a car on an empty road.
  • Find a strong move in chess (if you are a chess master).
  • Understand simple sentences.
  • Recognize that a “meek and tidy soul with a passion for detail” resembles an occupational stereotype.

Here are some activities attributed to System 2:

  • Brace for the starter gun in a race.
  • Focus attention on the clowns in the circus.
  • Focus on the voice of a particular person in a crowded and noisy room.
  • Look for a woman with white hair.
  • Search memory to identify a surprising sound.
  • Maintain a faster walking speed than is natural for you.
  • Monitor the appropriateness of your behavior in a social situation.
  • Count the occurrences of the letter a in a page of text.
  • Tell someone your phone number.
  • Park in a narrow space (for most people except garage attendants).
  • Compare two washing machines for overall value.
  • Fill out a tax form.
  • Check the validity of a complex logical argument.
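As a loose programming analogy (mine, not Kahneman's), System 1 behaves like a cheap cache of ready-made responses, while System 2 is an expensive computation that must be deliberately invoked:

```python
# Loose analogy only: System 1 as an instant lookup table, System 2 as
# costly deliberate work. The cache contents are invented examples.
import time

system1_cache = {"bread and...": "butter", "2 + 2": "4"}

def respond(prompt):
    if prompt in system1_cache:   # fast, automatic, effortless
        return system1_cache[prompt]
    time.sleep(0.5)               # stand-in for effortful mental work
    return f"System 2 slowly works out an answer to {prompt!r}"

print(respond("2 + 2"))     # instant System 1 reply
print(respond("17 x 24"))   # falls through to effortful System 2
```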

Before I proceed, I want to point out that most apologists are trying to interact with System 2 and not System 1. All of our arguments generally require the person we are communicating with to activate their System 2.

So what’s the problem? System 2 requires effort and System 1 does not. More specifically, Kahneman notes that “it is now a well-established proposition that both self-control and cognitive effort are forms of mental work.”

Kahneman cites the work of Roy Baumeister and his team:

The most surprising discovery made by Baumeister’s group shows, as he puts it, that the idea of mental energy is more than a mere metaphor. The nervous system consumes more glucose than most other parts of the body, and effortful mental activity appears to be especially expensive in the currency of glucose. When you are actively involved in difficult cognitive reasoning or engaged in a task that requires self-control, your blood glucose level drops. The effect is analogous to a runner who draws down glucose stored in her muscles during a sprint.

Listening to and trying to understand an argument that is new to you requires significant self-control and cognitive effort. That effort depletes your energy; it actually makes you tired.

Here is a big takeaway: human beings will tend to use System 1 whenever we possibly can in order to avoid mental effort. We use System 2 far less than we’d like to believe. Kahneman describes this in the following way:

A general “law of least effort” applies to cognitive as well as physical exertion. The law asserts that if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action. In the economy of action, effort is a cost, and the acquisition of skill is driven by the balance of benefits and costs. Laziness is built deep into our nature.

At this point, you may be thinking, “Big deal. I already know that thinking is hard and people are lazy.” But there is so much more to the interplay of System 1 and System 2. Kahneman spends the next 38 chapters of the book detailing experimental research into their interaction.

He looks into what happens when a person is confronted with new concepts and asked to make quick decisions about unfamiliar topics. He also digs into the kinds of decisions System 1 is actually good at making, which is important since System 1 is the mind’s default way of thinking.

I hope you can see why a Christian apologist would want to gain an understanding of these concepts. Kahneman’s research (and the research of other behavioral economists and psychologists) is providing us with a bountiful set of new concepts and data that can help us make our case. We want non-believers to know the truth, and that is what this research can help us do.