What a Fool Believes (Confirmation Bias)


Here’s the transcript for episode 2 of season 3 of our podcast, titled ‘What a Fool Believes (Confirmation Bias)’. You can listen to it here on our website or on your favourite podcast player.

Ken: Hi there. This is How to Choose, the show that helps you make better decisions and improve your judgment. Thanks for joining us. I’m Ken.

Tessa: And I’m Tessa. Welcome to season three. In this, our third season of How to Choose, we’re exploring the topic of Thinking Problems.

Ken: Yes, that’s correct. We’re focusing on several common thinking problems that undermine our ability to judge well and to make good decisions. We’re going to tell you some ways to identify when you’re succumbing to these errors and how you can reduce their impact in your life.

Tessa: So what’s our topic today, Ken?

Ken: Well, today we’re looking at confirmation bias. Confirmation bias is one of the most common thinking problems. We all succumb to it and it can have disastrous implications. But let’s start with a fairly trivial example to illustrate. Have you ever said to yourself something like, I always get red lights when I am running late? What is it that confirms that belief in your mind? Well, you’re noticing the times that you’re forced to stop at red lights, and every time that happens, it reinforces your belief, but what you’re not doing is paying attention to the number of times that you get a green light. Now, I asked some of our listeners this question. What is one annoying thing that always seems to happen to you more than to other people? Let’s have a listen to what they said.

Peter: I’m a terrible sleeper, so random sounds or changes in temperature or disturbances will wake me up and don’t seem to wake anyone else up.

Nick: That’s an interesting question. I would say sometimes when I go shopping, when you pick up the bung trolley, you seem to feel that you’ve got the one trolley that doesn’t work, that has the wheel that doesn’t roll very well. I’d say that’s probably the closest thing that happens to me. And you look around and everybody else has a beautifully functional trolley and you’re the one there with the wheel that keeps catching on things.

Isha: I don’t know, maybe I just have like an outlook that excludes those parts of my life, but I don’t think that there’s anything that happens to me particularly. I think I’ve really lucked out in the friends department. I’ll say that I think it’s really nice that, wherever I go, like I’ve lived in different countries, whenever I move, I feel like I’m able to make good friends, which is really nice.

Brian: It always seems to happen to me at work that electronics go down, computers go down, or there’s a printing crisis.

Erin: So despite my best efforts, the thing that seems to happen to me, despite all the attempts and all the research and all the practice, is I am a terrible, terrible gardener. So I’m known to be able to kill mint through a great combination of tough love and too much love. So between the over watering and the gross negligence, it tends to just curl up and die.

Lucy: The first thing that popped into my head was, I am always locking my keys in my car. I don’t have one of those clicky cars where you can lock it with the button. I have to manually lock my car. And because I do that, I just push a button down and close my door, and my keys always get left in the ignition. And I’m always calling roadside assist or Mum to come and help me get my keys out of my car. I have such a shocking memory, and it’s happened to me so many times. You’d think I’d learn by now.

Becca: Parking tickets, undeserved parking ticket.

Ken: I like the qualification!

Ken: There were some very interesting answers to that question. Was there one that particularly grabbed your attention?

Tessa: I was curious about the person who doesn’t seem to notice negative things. I think it’s wonderful that she’s an optimist, and I think we both have that bent, Ken. But maybe if you take it to the extreme, you might become oblivious to some of the negative things or things that might be a warning to you in your life.

Ken: Good point. Can I tell you a story?

Tessa: I’d love to hear one.

Ken: So the story is about a journalist named Bethany Brookshire. One morning, Bethany Brookshire sat down at her computer, and she noticed that she had received two replies from scientists that she had requested to interview. The female scientist had responded, addressing her as Dr. Brookshire. The male scientist addressed her as Ms. Brookshire. Now, Brookshire shared this on Twitter with a comment to the effect of: ‘My signature block clearly indicates that I have a PhD. Men respond “Dear Bethany” or “Dear Ms. Brookshire”, whereas women respond “Dear Dr. Brookshire”. It’s not 100%, but it’s a very clear division.’ Her tweet was very well received, and many women indicated that they were unsurprised and had experienced similar treatment. But as the expressions of sympathy and solidarity piled up, Brookshire started to question herself: ‘Was I actually accurate in my claim? It was my impression, but I haven’t checked the evidence, and the evidence is sitting right there in my inbox.’ Now, when she ran the numbers, she found, interestingly enough, that she was wrong. 8% of men called her Doctor. That’s not very many. But only 6% of women called her Doctor. So her assertion was incorrect. But there’s another part to this story. Having realised she was wrong, what did she do? Well, I think what she did was quite impressive. She jumped onto Twitter, acknowledged her error, and demonstrated to us that she has a lot of integrity and a commitment to letting the data shape her views. Now, this is a story told in one of my favourite books, The Scout Mindset by Julia Galef. And Julia Galef points out that this doesn’t indicate that there is no gender bias in science. But it does show that Brookshire’s claim that women were much more likely to address her as Dr. Brookshire was inaccurate in this instance.

Tessa: This is such a great example, Ken, and I think we’ve all been in that situation where we’ve already formed an opinion, and the next thing that comes along that supports it only serves to strengthen our view. But the part of this story that impresses me the most is what you just described: she then publicly corrected herself and admitted that her data didn’t support her claim.

Ken: Yeah, it’s very impressive. And I think it demonstrates that she was really interested in that question of ‘Why do we reach the conclusions that we do? Are we really looking for the data or are we sort of jumping to conclusions?’

Tessa: It’s so easy to form a view and then look for and find information that supports our beliefs, especially with the Internet at our fingertips, while we ignore or dismiss the evidence that says we’re wrong.

Ken: And this is the tendency that we call confirmation bias. Rather than carefully examining our beliefs to see if they’re accurate or not, we tend to search for information that confirms what we believe. And by information, I don’t necessarily mean reliable information or carefully gathered data. It could just be anecdotes or opinions from dubious sources. Now, at this point, I’d like to reference some particularly insightful thinkers from the 1970s. This group of men reflected carefully on this confirmation bias and its impact in relationships. And they were, of course, the Doobie Brothers.

Tessa: I’ve got a confession to make, Ken. I don’t think I could name a single Doobie Brothers song.

Ken: I am horrified, Tessa. But you’ll be able to name one after this episode. In their 1979 hit ‘What a Fool Believes’, the Doobie Brothers made this profound observation about a lovestruck man who refuses to accept that his love for a woman is unrequited.

Clip plays from song ‘What a Fool Believes’

Ken: Oh, how good was that? Well, I hope everyone enjoys that earworm for the rest of the day. Let’s just reflect on those lyrics again: “As he rises to her apology, anybody else would surely know he’s watching her go. But what a fool believes he sees, no wise man has the power to reason away. What seems to be is always better than nothing.” I think the original working title for the song might have been ‘Confirmation Bias as Demonstrated in Romantic Relationships’. But some senior producer who was so confident in their belief that snappy titles would sell more records insisted that they change it to ‘What a Fool Believes’.

Tessa: This really is an example, though, isn’t it, of what we’re talking about today? We get an idea fixed in our minds, and then we subsequently find evidence, and I say evidence in quotation marks there, that supports that belief, ignoring the data that shows our belief is incorrect. Perhaps in the Doobie Brothers’ song, repeated rejection was not enough evidence.

Ken: Yeah, that’s right. Yeah, we see it everywhere. Look, one example, and it’s just one example, is in a church context. Church leaders – and I know this from personal experience – can have a tendency to search for what are called proof texts. Essentially a proof text is a verse or a passage that supports a point that you want to make. For example, if a preacher wants to communicate the message that God is patient, and if that preacher isn’t particularly patient and is in a big hurry, then they might decide to do a word search for ‘patient’ or ‘patience’. And then they’re going to find a whole bunch of biblical passages that seem to support their message. What often happens is that the proof text is taken out of context and quoted to prove their point.

Tessa: And I’m sure you could do the opposite of that very easily by searching for vengeful and strike and find things that might challenge that exact sermon.

Ken: And look, we know that we do this in many areas of life, and we don’t have to think super hard to realise why this is a problem.

Tessa: Not at all. And I think it’s obvious. I mean, this is the definition of prejudice. We’ve made up our minds already, or we’ve formed a prejudgment, and that impacts how we view every bit of information we encounter. If we’re not careful, we become blinded to the truth because we refuse to listen to the facts when we encounter them. And inevitably, we’ll make incorrect and probably often harmful judgments, won’t we?

Ken: Yeah, that’s right. Look, I’ll give you a silly example, but imagine there’s a scientist who decides to prove her theory that all birds like to eat birdseed. What does she do? She sets up a bird feeder out in a field. She fills it up with bird seed, and then she sets up some cameras to record the birds that come along. And just as she expected, every bird that comes to the feeder eats some seed. Her theory has been proven correct! Well, obviously not. And immediately we can see how absurd this is because she’s only looking at the birds that are drawn to the bird seed. What about all the other birds in the area that maybe are not interested, that fly past and go and eat something else?

Tessa: This example actually calls to mind an episode of The Simpsons. I don’t know if you’re a big fan, Ken, but Lisa hands Homer a rock and tells him that it keeps tigers away. Then she looks around and says there are no tigers, as evidence that the rock is working. She might even charge him a few dollars for it, from memory, too. But this is actually what I love about the scientific method: everything is a theory that needs to be tested. And even when you’ve seemingly proven your theory, you should still apply rigorous skepticism.

So what do we do about this?

Ken: Well, look, here’s a couple of things that we can do that I think will help us.

The first one is not easy, but try to train yourself to notice when you make an assertion or a generalisation. And then when you notice that, take a close look at it. What are you actually saying and what data is this based on? Are you actually looking for data, or at least looking carefully at data that disproves your belief?

And a good question that you can ask yourself, and you know this as an analyst Tessa, is ‘What would convince me that I was wrong?’

So let’s go back to that assumption that we always hit red lights when we’re driving to work and we’re running late. What would be a way that we could test that theory?

Tessa: So I guess you could leave home at the same time every day and count the number of red lights versus green lights. And actually, to make it even more fair, maybe get someone else to do the same as well and see if a pattern forms.

Ken: And something else to think about is what are some of the other factors that might be contributing to your experience of red lights and green lights in the morning when you’re driving to work? Well, we kind of know that if you’re on the main road and there are other roads intersecting it, the traffic on those roads is going to trigger a red light. So, yes, okay, in the morning when you’re travelling to work, you will hit more red lights than you would if you were travelling in the middle of the night.
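If you’d like to actually run Tessa’s tallying test, here’s a minimal sketch in Python. The counts below are hypothetical, purely for illustration; the idea is simply to compare the proportion of red lights on days you’re running late with the proportion on days you’re not, instead of trusting memory.

```python
# A minimal sketch of the red-light tally described above (hypothetical data).
from collections import Counter

# Hypothetical logs: one entry ("red" or "green") per intersection passed.
late_days = ["red", "green", "red", "green", "green", "red", "green", "green"]
on_time_days = ["red", "green", "green", "red", "green", "green", "red", "green"]

def red_light_rate(observations):
    """Return the fraction of observed lights that were red."""
    counts = Counter(observations)
    total = counts["red"] + counts["green"]
    return counts["red"] / total if total else 0.0

print(f"Red-light rate when running late: {red_light_rate(late_days):.0%}")
print(f"Red-light rate when on time:      {red_light_rate(on_time_days):.0%}")

# If the two rates come out similar, the belief "I always get red lights when
# I'm late" is probably confirmation bias: we remember the reds, not the greens.
```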

So look, here’s another thing too. I think it’s very important to think about how you go about asking other people for input and advice when you are making a decision. It’s often the case that we will demonstrate confirmation bias in the way that we ask for input. So ask yourself, what is your motivation?

  • Are you objectively looking for more data to inform your decision, even if it challenges what you believe and even if it forces you to rethink your decision? Or…
  • Are you just really looking for people to support and endorse a decision that you’ve already made?

Now, that can become obvious when you look at how we phrase the question. If you want input to help with a difficult decision, don’t ask people leading questions that will push them to try and find data that supports your point of view. This is called a positive test strategy and it encourages people to have biased thinking. So, for example, let’s say I’m thinking about whether I should go and join the public service, and I’m kind of inclined to think it might be a good idea. So I could come and ask you the question, Tessa: ‘Do public service jobs offer better pay and conditions than in the private sector?’ That might be a valid question, but it’s kind of leading you to think of examples that might support what I’ve just said. Instead, I could ask a more neutral question, such as: ‘Tess, what do you think are the pros and cons of a job in the public service?’ Right. It’s going to open it up and get you to think about both sides of the equation.

Tessa: That’s a good example, Ken. And let’s try and illustrate how confirmation bias might manifest itself in the home. Let’s imagine a scenario where your friends and family are all expressing concerns about your partner, who they say treats you really badly. You disagree, and you tell them that he’s actually very kind and thoughtful. And to support your belief, you think of a few examples of where he’s unexpectedly bought you little gifts this year. And it’s easy in this situation to just remember the examples that support your belief. If you’re able to remember that we are all susceptible to confirmation bias, then it might help you to pause and ask yourself a different question and maybe even something going down the opposite line, such as, ‘How often does my partner speak to me in a way that I think is unacceptable?’ Starting with a different question and acknowledging your biases can sometimes help you see things differently.

Ken: Yeah, we could be describing a pretty serious scenario there, couldn’t we? It’s easy to get trapped in a fairly negative relationship and be unable to see the problems that exist there. But we can also see this at work. Let’s imagine that your opinion is that you have been consistently overlooked for opportunities at work, and to support that belief, you cite a few examples of times where others were given opportunities that you had hoped to get. But stop and ask yourself, are you being objective? How about thinking of good opportunities that you’ve been given? And maybe ask a trusted colleague for their opinion and listen to what they have to say: and ask them an open question, not a leading question. It’s very hard to be objective when we’re feeling aggrieved about being treated unfairly. We’re emotionally engaged, and again, as we talked about in episode one, the part of our brain that does the more reasoned, rational thinking, the frontal cortex, is not engaged. Some of us fundamentally believe that we don’t get a fair go in life, and that will colour how we experience all parts of life, including work. We notice the times we miss out, and we overlook the times that we’re rewarded.

Tessa: Yeah, it’s not a nice way to go through life, Ken, is it?

Ken: No.

Tessa: So really understanding ourselves is the first step here, isn’t it? Reflecting, being honest with yourself and seeking feedback from people who have insight, who will be objective and who aren’t scared to tell you what they think. They’re very hard to find, but once you do, latch onto them. And maybe the conclusion is that your partner does generally treat you really well, or that you are being treated unfairly at work. But be willing to test your beliefs before you become dogmatic about them. And before asking for someone’s input, ask yourself honestly: am I really willing to accept that I might be wrong and to change my beliefs? If not, then it’s questionable how much use there is in asking for other people’s opinions at all.

Ken: Yeah, I couldn’t agree more. That’s an excellent point, Tess. So what is your key takeaway then, from today’s episode?

Tessa: For me, the point about a positive test strategy encouraging biased thinking was really useful to keep in mind. So next time I seek advice, I’m going to double-check that I’m not really just asking someone to confirm my belief by the way I sneakily phrase the question. What was yours, Ken?

Ken: Yeah, look, I agree with that one. That’s one I find really helpful. I think another one is also training ourselves to ask questions like, ‘What would convince me that I was wrong?’

Tessa: Yeah, some easy takeaways for our listeners. And Ken, we know that evidence clearly shows that being able to teach a theory to someone else will help it stick. So please do yourself a favour and tell someone about something from this episode. It’ll help you learn it and put it to use.

Ken: Yeah, if we think back to that story, Julia Galef’s story about Bethany Brookshire, it shows the willingness to voluntarily prove ourselves to have been wrong. Sometimes I think we can be reticent to do that because we’re worried about our reputation. But in my experience, people are more likely to respect you if they know that you’re willing and able to identify your errors and acknowledge them.

Well, listen, thanks so much for joining us for this episode of season three of How to Choose. Make sure you jump in and listen to episode three, where we’ll be talking about anchoring bias.

Tessa: Bye for now.
