Grab your FREE CHEAT SHEET summarizing the Four-Point Case for Christianity (scroll to the bottom)
CHECK OUT OUR YOUTUBE CHANNEL WITH OVER 614,000 SUBSCRIBERS!!!
DON'T FORGET TO SUBSCRIBE TO OUR WEEKLY NEWSLETTER!!!
JOIN THE CROSSEXAMINED COMMUNITY FOR BONUS RESOURCES, LIKE-MINDED PEOPLE, AND MONTHLY COMMUNITY ZOOM MEETINGS!

By Jonathan McLatchie

Have you ever wondered why some people are able to think about the world more clearly, forming more balanced and nuanced views about controversial topics, than others? Have you ever pondered what thinking patterns are most conducive to good reasoning and well-supported conclusions, and how one might avoid the pitfalls of confirmation bias and self-deception? In her book The Scout Mindset: Why Some People See Things Clearly and Others Don't, Julia Galef (host of the podcast "Rationally Speaking" and co-founder of the Center for Applied Rationality) attempts to answer these questions.[i] In the first half of this essay, I shall summarize Galef's insights; in the latter half, I shall discuss what lessons we as Christian scholars and apologists can glean from the book.

A Summary of The Scout Mindset

Galef distinguishes between what she dubs "the soldier mindset" and "the scout mindset." According to Galef, the soldier mindset, also known as motivated reasoning, leads us to loyally defend the stronghold of our belief commitments against intellectual threats, come what may. This involves actively seeking out data that tends to confirm our beliefs, while rationalizing or ignoring contrary data that tends to disconfirm them. On the other hand, the scout mindset attempts to honestly determine how the world really is – as Galef defines it, the scout mindset is "the motivation to see things as they are, not as you wish they were," (p. ix).

For the one in soldier mindset, argues Galef, reasoning is like defensive combat – "it's as if we're soldiers, defending our beliefs against threatening evidence," (p. 7). For the soldier, to change one's mind – to admit that one was wrong – is seen as surrender and failure, a sign of weakness. One's allegiance is to one's cherished beliefs rather than to the truth, even if those beliefs conflict with the balance of evidence. For the soldier, determining what to believe is done by asking oneself "Can I believe this?" or "Must I believe this?", depending on one's motives. For the one in scout mindset, by contrast, reasoning may be likened to mapmaking, and discovering that you are wrong about one or more of your beliefs simply means revising your map. Thus, scouts are more likely to seek out and carefully consider data that tends to undermine their own beliefs (thereby making their map a more accurate reflection of reality), deeming it more fruitful to pay close attention to those who disagree with their opinions than to those whose thinking aligns with them.

The prevalence of soldier mindset in society today is aptly demonstrated by a sobering study, cited by Galef, in which participants' "scientific intelligence" was tested with a set of questions.[ii] Questions were divided into four categories: basic facts, methods, quantitative reasoning, and cognitive reflection. Remarkably, when conservative Republican and liberal Democrat participants were also asked whether they affirmed the statement that there is "solid evidence" of recent global warming due "mostly" to "human activity such as burning fossil fuels," there was a positive correlation between "scientific intelligence" and divergence of opinion. That is to say, the higher one's scientific intelligence, the more likely a liberal Democrat was to affirm the statement and the more likely a conservative Republican was to disagree with it. This is not the only study to reveal the tendency for more educated people to diverge in opinion on controversial topics. Another study surveyed people's views on ideologically charged topics, including stem cell research, the Big Bang, human evolution, and climate change.[iii] Its finding was that "Individuals with greater education, science education, and science literacy display more polarized beliefs on these issues," though the authors found "little evidence of political or religious polarization regarding nanotechnology and genetically modified foods." Galef summarizes the implications of these studies: "This is a crucially important result, because being smart and being knowledgeable on a particular topic are two more things that give us a false sense of security in our own reasoning. A high IQ and an advanced degree might give you an advantage in ideologically neutral domains like solving math problems or figuring out where to invest your money. But they won't protect you from bias on ideologically charged questions," (p. 48).

Though there is an element of scout and soldier in all of us, Galef argues, "some people, in some contexts, are better scouts than most," being "more genuinely desirous of the truth, even if it's not what they were hoping for, and less willing to accept bad arguments that happen to be convenient. They're more motivated to go out, test their theories, and discover their mistakes. They're more conscious of the possibility that their map of reality could be wrong, and more open to changing their mind," (pp. 14-15). On the flip side of the coin, often "[w]e use motivated reasoning not because we don't know any better, but because we're trying to protect things that are vitally important to us – our ability to feel good about our lives and ourselves, our motivation to try hard things and stick with them, our ability to look good and persuade, and our acceptance in our communities," (p. 26). For example, if we are being honest, how often do we, when considering a claim, "implicitly ask ourselves, 'What kind of person would believe a claim like this, and is that how I want other people to see me?'" (p. 23). Such thinking fuels soldier mindset. In practice, we cannot eliminate soldier mindset from our reasoning processes entirely. After all, it is our default mentality. By nature, we like having our beliefs confirmed. But we can take intentional steps towards cultivating more of a scout mindset.

What are some of the key characteristics that distinguish scout from soldier mindset? In chapter four, Galef gives five features that define a scout. The first is the ability to tell other people when you realize that they were right. Galef caveats this quality by noting that "Technically, scout mindset only requires you to be able to acknowledge to yourself that you were wrong, not to other people. Still, a willingness to say 'I was wrong' to someone else is a strong sign of a person who prizes the truth over their own ego." The second quality is reacting well to criticism. Galef explains, "To gauge your comfort with criticism, it's not enough just to ask yourself, 'Am I open to criticism?' Instead, examine your track record. Are there examples of criticism you've acted upon? Have you rewarded a critic (for example, by promoting him)? Do you go out of your way to make it easier for other people to criticize you?" (p. 52). The third quality that marks out a scout is the ability to prove oneself wrong. Galef asks, "Can you think of any examples in which you voluntarily proved yourself wrong? Perhaps you were about to voice an opinion online, but decided to search for counterarguments first, and ended up finding them compelling. Or perhaps at work you were advocating for a new strategy, but changed your mind after you ran the numbers more carefully and realized it wouldn't be feasible," (p. 54). The fourth feature of scout mindset is to avoid biasing one's information. "For example," writes Galef, "when you ask your friend to weigh in on a fight you had with your partner, do you describe the disagreement without revealing which side you were on, so as to avoid influencing your friend's answer? When you launch a new project at work, do you decide ahead of time what will count as a success and what will count as a failure, so you're not tempted to move the goalposts later?" (p. 56).
The fifth feature that Galef lists is being able to recognize good critics. Galef comments, "It's tempting to view your critics as mean-spirited, ill-informed, or unreasonable. And it's likely that some of them are. But it's unlikely that all of them are. Can you name people who are critical of your beliefs, profession, or even choices who you consider thoughtful, even if you believe they're wrong? Or can you at least name reasons why someone might disagree with you that you would consider reasonable (even if you don't happen to know of specific people who hold those views)?" (p. 57). In summary, Galef notes, "Being able to name reasonable critics, being willing to say 'The other side has a point this time,' being willing to acknowledge when you were wrong – it's things like these that distinguish people who actually care about truth from people who only think they do," (p. 57).

Chapter 5 of the book offers five tests of bias in our reasoning. The first test is the double standard test, which essentially asks whether we apply the same standards to ourselves that we would apply to others. The second test is the outsider test, which attempts to determine how you would assess the same situation or data if you had no vested interest in the outcome. The third test is the conformity test, which attempts to discern the extent to which one's opinion is in fact one's own. Galef explains, "If I find myself agreeing with someone else's viewpoint, I do a conformity test: Imagine this person told me that they no longer held this view. Would I still hold it? Would I feel comfortable defending it to them?" (p. 66). The fourth test is the selective skeptic test – "Imagine this evidence supported the other side. How credible would you find it then?" (p. 68). The final test is the status quo bias test – "Imagine your current situation was no longer the status quo. Would you then actively choose it? If not, that's a sign that your preference for your situation is less about its particular merits and more about a preference for the status quo," (p. 69).

Another thing that marks out a scout, according to Galef, is one's attitude towards being wrong. Scouts, explains Galef, "revise their opinions incrementally over time, which makes it easier to be open to evidence against their beliefs," (p. 144). Further, "they view errors as opportunities to hone their skill at getting things right, which makes the experience of realizing 'I was wrong' feel valuable, rather than just painful," (p. 144). Galef even suggests that we should drop the whole "wrong confession" altogether and instead talk about "updating." Galef explains, "An update is routine. Low-key. It's the opposite of an overwrought confession of sin. An update makes something better or more current without implying that its previous form was a failure," (p. 147). Galef points out that we should not think about changing our minds as a binary thing – rather, we should think of the world in "shades of grey," and think about changing our mind in terms of an "incremental shift" (p. 140). Galef notes that thinking about revising one's beliefs in this way makes "the experience of encountering evidence against one of your beliefs very different" since "each adjustment is comparatively low stakes" (p. 140). For example, "If you're 80 percent sure that immigration is good for the economy, and a study comes out showing that immigration lowers wages, you can adjust your confidence in your belief down to 70 percent," (p. 140).

Galef also points out that, when it comes to intentionally exposing ourselves to content representing the 'other side' of a debate in which we are interested, people tend to make the mistake of always ending up "listening to people who initiate disagreements with us, as well as the public figures and media outlets who are the most popular representatives of the other side," (p. 170). However, as Galef explains, "Those are not very promising selection criteria. First of all, what kind of person is most likely to initiate a disagreement? A disagreeable person. ('This article you shared on Facebook is complete bullshit – let me educate you...') Second, what kind of people or media are likely to become popular representatives of an ideology? The ones who do things like cheering for their side and mocking or caricaturing the other side – i.e., you," (pp. 170-171). Instead, Galef suggests, "To give yourself the best chance of learning from disagreement, you should be listening to people who make it easier to be open to their arguments, not harder. People you like or respect, even if you don't agree with them. People with whom you have some common ground – intellectual premises, or a core value that you share – even though you disagree with them on other issues. People whom you consider reasonable, who acknowledge nuance and areas of uncertainty, and who argue in good faith," (p. 171).

Lessons We Can Draw from The Scout Mindset

To what extent are we, as Christian scholars and apologists, cultivating a scout mindset? Too often debates between theists and atheists devolve into tribalism, an 'us vs. them' mentality, and a smug condescension towards those who disagree with us. But what if we saw those with whom we disagree not as enemies but as colleagues in our quest to attain a better map of reality? Our critics are those best placed to discover flaws in our reasoning that may be invisible to us. We ignore them at our peril. By listening carefully to our critics, we can construct a more nuanced, more robust worldview. And which critics of our faith are we seeking out to represent the dissenting view? Are we primarily engaging with popular but less-than-nuanced critics of Christianity, or are we actively seeking out the very best, most erudite and well-informed critics of our faith, even if they are less well known? Can we name some of our critics as honest and thoughtful? How are we positioning ourselves to be in the best place possible to find out we are wrong, if we are in fact wrong? If we are wrong about one or more of our beliefs, can we honestly say that we value truth enough to want to know? How do our answers to the foregoing questions bear on that last question?

Perhaps at this juncture it should be clarified what exactly apologetics is, since there is regrettably much confusion surrounding this word, both inside and outside of the Christian community. It is commonly thought that the exercise of apologetics is contrary to open-ended inquiry, in which the conclusion is not stipulated a priori. However, this view is quite mistaken. While apologetics is not identical to open-ended inquiry, it is continuous with it, in the sense that apologetics is what happens after the results of open-ended inquiry are in and the time has come to publicize our interpretation of the data. Thus, though the term is seldom used in this context, every publication of a scientific paper is an exercise in apologetics. Charles Darwin's Origin of Species was an exercise in apologetics, since he sought to sell his interpretation of the observations that he had made on the Galapagos Islands. It is common to think of apologists as playing the role of a criminal defence attorney who is committed to defending his client, come what may. In reality, however, a more apt parallel is to an investigative journalist, reporting for popular consumption the results of a fair and balanced inquiry.

Being an apologist of the gospel is no light responsibility. We are asking people to pledge their allegiance to Jesus Christ and dedicate every aspect of their life to His service. This may cost them greatly – even their life. The weight of this responsibility is emphasized by the apostle Paul himself, who stated that, if Jesus was not in fact raised, "We are even found to be misrepresenting God, because we testified about God that he raised Christ, whom he did not raise if it is true that the dead are not raised," (1 Cor 15:15). We therefore owe it to those to whom we preach to study diligently the facts and arguments on both sides of the debate, to ensure that the gospel is in fact true. We also owe it to those with whom we share the gospel to inform them as fully and completely as possible concerning the facts of the case. Too often I have seen apologists present popular arguments for Christianity but omit relevant facts that undermine the force of their argument. For some examples of this, see my recent conversation with Wesley Huff on arguments Christians should NOT use.[iv] Whenever you encounter an argument that supports a position you like, you should always, before publicly repeating the argument, conduct a thorough search for any relevant data that might reduce its evidential force. At the very least, you should determine whether any academic publications, especially those critical of your beliefs, have already addressed the argument. This is but one of several ways in which you can reduce the negative effects of confirmation bias on your reasoning.

What other steps can we take to mitigate confirmation bias? I try to make it my habit to expose myself to more material – whether books, articles, podcasts, videos, or other media – that argues against my beliefs than material that argues for them. This reduces the likelihood that I will fool myself, forces me to think more deeply and carefully about my beliefs, and pushes me to develop a more nuanced expression of them. It also puts me in a strong position to find out that I am wrong, if I am in fact wrong about any of my beliefs. A first step towards stepping outside of your intellectual echo chamber is recognizing that smart people can argue in good faith and yet disagree with you.

I am sometimes asked how a newcomer to religious debates may discern which apologists to listen to and whom to disregard. Of course, the difficulty here is that, in order to discern which apologists can be trusted to give reliable content, one must have already attained a certain level of knowledge about the subject. But in order to arrive at that threshold of knowledge, one must first determine from whom to receive information. How might we escape this dilemma? One criterion of several that I often give is to be wary of anyone who asserts that all of the evidence supports their own personal view and that there is none which tends to disconfirm it. Whenever anyone tells me, concerning any complex topic (whether that be theism, Christianity, evolution, or anything else), that all of the evidence is on the side of their own personal view, it leads me to reduce my confidence in their objectivity in handling the data, and I begin to suspect that confirmation bias is particularly prominent in that individual's reasoning process. It is an intellectual virtue to be able to admit that one or more pieces of evidence tends to disconfirm your own view. Of course, presumably you also maintain that the evidence that tends to confirm your view is stronger, on balance, than that which tends to disconfirm it. Nonetheless, recognizing the existence of difficult or anomalous data is a mark of scout mindset. And how might we go about determining whether a given datum confirms or disconfirms our Christian beliefs? For each piece of data we encounter, we should ask ourselves whether that datum, considered in isolation, is more probable given Christianity or given its falsehood. If the former, then it is evidence that confirms Christianity; if the latter, then it is evidence against it. Too often I see people reason that, if a set of data can be made compatible with their beliefs, then they have neutralized the objection to their beliefs.
However, this approach is quite simplistic. It is nearly always possible to make discordant data compatible with your beliefs. But that does not mean that the data are not better predicted on the supposition that your beliefs are false than on the supposition that they are true, or that you should not lower your confidence in those beliefs. The appropriate response, when confronted with discordant data, is not to ask "Can I believe I am still right?" Galef rightly points out that "Most of the time, the answer is 'Yes, easily,'" (p. 141). Rather, we should ask to what extent our confidence in our beliefs needs to be updated in response to the new data.
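The question of how much to update can be made precise with Bayes' theorem, which underlies Galef's 80-to-70-percent immigration example: compare how probable the new data are if your belief is true versus if it is false, and shift your confidence by that ratio. The following sketch is merely illustrative; the function name and the particular likelihood ratio of 0.5 are my own assumptions, not drawn from the book.

```python
def update_confidence(prior: float, likelihood_ratio: float) -> float:
    """Return the posterior probability of a belief after observing evidence.

    likelihood_ratio = P(evidence | belief true) / P(evidence | belief false).
    A ratio below 1 means the evidence disconfirms the belief; above 1, it confirms it.
    """
    # Convert probability to odds, multiply by the likelihood ratio (Bayes' rule
    # in odds form), then convert back to a probability.
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Galef's example: 80 percent confident, then a disconfirming study appears.
# Suppose (hypothetically) the study is twice as probable if the belief is
# false, i.e. a likelihood ratio of 0.5.
posterior = update_confidence(0.80, 0.5)
print(round(posterior, 2))  # prints 0.67
```

The point of the exercise is not the precise numbers, which are rarely knowable, but the discipline: disconfirming data should lower one's confidence by some amount, rather than being merely explained away as "compatible" with one's view.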

Another criterion of a credible apologist is that he or she is willing to offer critiques of arguments presented by others on his or her own side of the debate. Are they even-handed in subjecting arguments for their own view to the same scrutiny as those put forward by the other side of the debate? This reveals that they are discerning and have a genuine concern for factual accuracy. How one responds to criticism, both friendly critique and critique from dissenting voices, is also a measure of one's concern for the correct representation of information. An ability to publicly retract false or misleading statements and issue corrections goes a long way towards establishing one's credibility. When we encounter a new contributor to the debate, with whose work we have not hitherto interacted, we should also fact-check their statements, going back, if possible, to the primary sources – especially when they stray into territory outside our own domain of expertise. If they are able to sustain a track record of being reliable in their reportage of information and of fully informing the audience about the relevant facts, one ought to be more inclined to trust them as a credible authority. If, on the other hand, they have a habit of getting things factually incorrect, one should be very hesitant to take anything they say at face value.

One should also be wary of apologists who exaggerate the strength of their arguments, pushing the data beyond what it is able to support. It is always better to understate the merits of one's arguments and pleasantly surprise by overdelivering than to overstate them and disappoint by underdelivering. This is why in my writing and public speaking I prefer to use more cautious-sounding statements like "this tends to confirm" or "this suggests" rather than bolder statements like "this proves" or "this demonstrates." Similarly, I will speak of being "confident" rather than "certain" of my conclusions.

My enthusiastic advocacy for integrity and nuance in apologetics, together with my insistence on subjecting arguments advanced in support of Christianity to the same scrutiny to which we would subject contrary arguments, has on occasion been misconstrued – by atheists as well as by Christians – as an indication that I am losing confidence in the truth of Christianity. However, this does not at all follow and, frankly, it saddens me that Christian apologetics has come to be associated, in the minds of many, with a soldier rather than a scout mindset. Clearly, it is possible to be convinced by the evidence that Christianity is true and yet still be committed to the honest presentation of information. It is also possible to believe that Christianity is well supported while also maintaining that many of the arguments advanced in its support are fundamentally flawed or dramatically overstated. I believe it is a virtue rather than a vice to recognize one's own confirmation bias and thus take steps towards reducing its negative effects on one's reasoning. The principles that I have advocated in this essay are germane to apologists of any position, regardless of how convinced of that position they are. Otherwise, it is too easy to deceive ourselves, apply double standards, cherry-pick data, and inoculate ourselves against finding out that we are mistaken regarding one or more of our beliefs.

One may of course object to the principles advocated in this essay on the grounds that, if unsound data or overstated arguments lead people to embrace the gospel, then the end justifies the means. I recall complaining, on more than one occasion, about the presentation of factually erroneous information in defence of Christianity at a university-affiliated Christian society in the United Kingdom. The response with which I was met, lamentably, was that it was very unlikely that any of the other attendees would know enough about the subject to pick up on the errors in the presentation, and that we should rejoice that they heard the gospel. This thinking, however, is flawed for at least two reasons. First, we claim to represent the one who identified Himself as truth itself (Jn 14:6). Plenty of Biblical texts condemn the employment of deceptive methods (e.g. Exod 20:16; Ps 24:3-5; 101:7; Prov 10:9; 11:3; 12:22; 24:28; Col 3:9; Eph 4:25). It is therefore not honouring to God when we perpetuate misinformation, even in defence of the gospel. Second, if someone with whom we have shared the gospel later does his or her own research to determine whether the things we have said are in fact true, much as the Bereans are commended for doing in regard to Paul's preaching (Acts 17:11), we are responsible for having placed another obstacle between them and the gospel. This is a grave thing to be responsible for.

In summary, cultivating a scout mindset, and minimizing soldier mindset, can help us to think more clearly and with greater intellectual honesty about our beliefs and our reasons for holding them. I cannot recommend Julia Galef's book The Scout Mindset highly enough. I would also recommend her TEDx talk, "Why 'scout mindset' is crucial to good judgment."[v]

Footnotes

[i] Julia Galef, The Scout Mindset: Why Some People See Things Clearly and Others Don't (New York: Portfolio, 2021).

[ii] Dan M. Kahan, "'Ordinary science intelligence': a science-comprehension measure for study of risk and science communication, with notes on evolution and climate change," Journal of Risk Research 20, no. 8 (2017): 995-1016.

[iii] Caitlin Drummond and Baruch Fischhoff, "Individuals with greater science literacy and education have more polarized beliefs on controversial science topics," Proceedings of the National Academy of Sciences 114, no. 36 (Sep. 2017): 9587-9592.

[iv] https://www.youtube.com/watch?v=rVad8BE5A6c

[v] https://www.youtube.com/watch?v=3MYEtQ5Zdn8

Recommended resources related to the topic:

I Don't Have Enough Faith to Be an Atheist (Paperback), and (Sermon) by Norman Geisler and Frank Turek

Stealing From God by Dr. Frank Turek (Book, 10-Part DVD Set, STUDENT Study Guide, TEACHER Study Guide)

_____________________________________________________________________________________________________________________________________________________

Dr. Jonathan McLatchie is a Christian writer, international speaker, and debater. He holds a Bachelor's degree (with Honors) in forensic biology, a Master's (M.Res.) degree in evolutionary biology, a second Master's degree in medical and molecular bioscience, and a Ph.D. in evolutionary biology. Currently, he is an assistant professor of biology at Sattler College in Boston, Massachusetts. Dr. McLatchie is a contributor to various apologetics websites and is the founder of the Apologetics Academy (Apologetics-Academy.org), a ministry that seeks to equip and train Christians to persuasively defend the faith through regular online webinars, as well as assist Christians who are wrestling with doubts. Dr. McLatchie has participated in more than thirty moderated debates around the world with representatives of atheism, Islam, and other alternative worldview perspectives. He has spoken internationally in Europe, North America, and South Africa promoting an intelligent, reflective, and evidence-based Christian faith.

Original Blog Source: https://bit.ly/3iKor6w
