5 Barriers to Critical Thinking
What holds us back from thinking critically in day-to-day situations?
Posted January 18, 2019 | Reviewed by Davia Sills
Quite often, discussions of Critical Thinking (CT) revolve around tips for what you or your students should be doing to enhance CT ability. However, it seems that there’s substantially less discussion of what you shouldn’t be doing—that is, barriers to CT.
About a year ago, I posted "5 Tips for Critical Thinking" to this blog, and after thinking about it in terms of what not to do, along with more modern conceptualizations of CT (see Dwyer, 2017), I’ve compiled a list of five major barriers to CT. Of course, these are not the only barriers to CT; rather, they are five that may have the most impact on how one applies CT.
1. Trusting Your Gut
"Trust your gut" is a piece of advice often thrown around when we are in doubt, but relying on intuitive judgment is actually the last thing you want to do if critical thinking is your goal. Intuitive judgment has been described as "the absence of analysis" (Hamm, 1988); it is a form of automatic cognitive processing that generally lacks effort, intention, awareness, or voluntary control, and is usually experienced as perceptions or feelings (Kahneman, 2011; Lieberman, 2003).
Given that intuitive judgment operates automatically and cannot be voluntarily "turned off," associated errors and unsupported biases are difficult to prevent, largely because reflective judgment has not been consulted. Even when errors appear obvious in hindsight, they can only be prevented through the careful, self-regulated monitoring and control afforded by reflective judgment. Such errors and flawed reasoning include cognitive biases and logical fallacies.
Going with your gut—experienced as perceptions or feelings—generally leads the thinker to favor perspectives consistent with their own personal biases and experiences or those of their group.
2. Lack of Knowledge
Analysis, evaluation, and inference are the core skills of CT, and in order to think critically, one must know what these skills are and how to use them. Not knowing them is, of course, a major barrier to applying CT. However, the problem of lacking knowledge does not end with knowledge of CT skills.
Let's say you know what analysis, evaluation, and inference are, as well as how to apply them. The question then becomes: Are you knowledgeable in the topic area to which you have been asked to apply CT? If not, intellectual honesty and reflective judgment should be engaged so that you can consider the nature, limits, and certainty of the knowledge you do have, and evaluate what you need to learn in order to make a critically thought-out judgment.
However, the barrier here may not be a lack of topic knowledge per se, but rather believing that you have the requisite knowledge to make a critically thought-out judgment when this is not the case, or lacking the willingness to gain additional, relevant topic knowledge.
3. Lack of Willingness
In addition to skills, disposition towards thinking is also key to CT. Disposition towards thinking refers to the extent to which an individual is willing or inclined to perform a given thinking skill, and is essential for understanding how we think and how we can make our thinking better, in both academic settings and everyday circumstances (Norris, 1992; Siegel, 1999; Valenzuela, Nieto, & Saiz, 2011; Dwyer, Hogan & Stewart, 2014).
Dispositions can't be taught, per se, but they play a large role in determining whether or not CT will be performed. Simply put, it doesn't matter how skilled people are at analysis, evaluation, and inference; if they're not willing to think critically, CT is not likely to occur.
4. Misunderstanding of Truth
Truth-seeking is one such disposition towards thinking, which refers to a desire for knowledge; to seek and offer both reasons and objections in an effort to inform and to be well-informed; a willingness to challenge popular beliefs and social norms by asking questions (of oneself and others); to be honest and objective about pursuing the truth, even if the findings do not support one’s self-interest or pre-conceived beliefs or opinions; and to change one’s mind about an idea as a result of the desire for truth (Dwyer, 2017).
Though this is something many of us strive for, or even just assume we do, the truth is that we all succumb to unwarranted assumptions from time to time: that is, beliefs presumed to be true without adequate justification. For example, we might make a judgment based on an unsubstantiated stereotype or a commonsense belief that has no empirical evidence to justify it. When using CT, it's important to distinguish facts from beliefs and, also, to dig a little deeper by evaluating "facts" with respect to how much empirical support validates them as fact (see "The Dirtiest Word in Critical Thinking: 'Proof' and its Burden").
Furthermore, sometimes the truth doesn't suit people, and so they might choose to ignore it or try to manipulate knowledge or understanding to accommodate their bias. For example, some people may engage in wishful thinking, in which they believe something is true simply because they wish it to be; others might engage in relativistic thinking, in which the truth is treated as subjective or just a matter of opinion.
5. Closed-Mindedness
In one of my previous posts, I laid out "5 Tips for Critical Thinking," one of which was to play Devil's Advocate, which refers to the consideration of alternatives. There's always more than one way to do or think about something, so why not engage in such consideration?
The willingness to play Devil’s Advocate implies a sensibility consistent with open-mindedness (i.e., an inclination to be cognitively flexible and avoid rigidity in thinking; to tolerate divergent or conflicting views and treat all viewpoints alike, prior to subsequent analysis and evaluation; to detach from one’s own beliefs and consider, seriously, points of view other than one’s own without bias or self-interest; to be open to feedback by accepting positive feedback, and to not reject criticism or constructive feedback without thoughtful consideration; to amend existing knowledge in light of new ideas and experiences; and to explore such new, alternative, or "unusual" ideas).
At the opposite end of the spectrum, closed-mindedness is a significant barrier to CT. By this stage, you have probably recognized that bias is inherent in our thinking, and the first step of CT is always to evaluate that bias. However, a person's bias may be so strong that it renders them closed-minded and unwilling to consider any other perspectives.
Another way in which someone might be closed-minded is by properly researching and critically thinking about a topic and then deciding that this perspective will never change, as if their knowledge will never need to adapt. Critical thinkers, however, know that knowledge can change and adapt. An example I've used in the past is quite relevant here: growing up, I was taught that there were nine planets in our solar system; based on further research, however, Pluto has since been reclassified, and only eight of those bodies are now considered planets.
Being open-minded is a valuable disposition, but so is skepticism (i.e., the inclination to challenge ideas; to withhold judgment before engaging all the evidence or when the evidence and reasons are insufficient; to take a position and be able to change position when the evidence and reasons are sufficient; and to look at findings from various perspectives).
Note, however, that one can be both open-minded and skeptical: skepticism and closed-mindedness are distinct dispositions, and it is closed-mindedness that is the barrier to CT.
References
Dwyer, C. P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press.
Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43–52.
Hamm, R. M. (1988). Clinical intuition and clinical analysis: Expertise and the cognitive continuum. In J. Dowie & A. Elstein (Eds.), Professional judgment: A reader in clinical decision making (pp. 78–105). Cambridge: Cambridge University Press.
Kahneman, D. (2011). Thinking, fast and slow. London: Penguin.
Lieberman, M. D. (2003). Reflexive and reflective judgment processes: A social cognitive neuroscience approach. Social Judgments: Implicit and Explicit Processes, 5, 44–67.
Norris, S. P. (Ed.). (1992). The generalizability of critical thinking: Multiple perspectives on an educational ideal. New York: Teachers College Press.
Siegel, H. (1999). What (good) are thinking dispositions? Educational Theory, 49(2), 207–221.
Valenzuela, J., Nieto, A. M., & Saiz, C. (2011). Critical thinking motivational scale: A contribution to the study of relationship between critical thinking and motivation. Journal of Research in Educational Psychology, 9(2), 823–848.