The Maddening Inconsistency of Human Rationality
Our species is infuriating: at once so intelligent and yet so gullible.
Posted October 18, 2021 | Reviewed by Devon Frye
Key points
- We tend to have more accurate beliefs about the immediate physical zone around us than about the world beyond our direct experience.
- Human reasoning is often more swayed by winning an argument than by representing reality.
- Our intuitions, emotions, cognitive biases, and generally abysmal grasp of scientific and statistical thinking easily lead us astray.
- People overestimate their knowledge and understanding of complex issues.
“We children of the Enlightenment embrace the radical creed of universal realism: we hold that all our beliefs should fall within the reality mindset... But as desirable as that creed is, it is not the natural human way of believing... The human mind is adapted to understanding remote spheres of existence through a mythology mindset... Submitting all one’s beliefs to the trials of reason and evidence is an unnatural skill, like literacy and numeracy, and must be instilled and cultivated. And for all the conquests of the reality mindset, the mythology mindset still occupies swaths of territory in the landscape of mainstream belief.”
—Steven Pinker, Rationality1

In case you hadn’t previously realized the scope of the problem, the last couple of years of wildly popular conspiratorial belief movements will likely have left you in no doubt that vast numbers of people believe some pretty weird things.2 Weird beliefs such as implausible conspiracy theories are not a fringe phenomenon, and most of the time they are not symptoms of mental illness.
It’s easy to become depressed or cynical about human irrationality. But the Harvard psychologist Steven Pinker takes a more charitable view. Pinker is well-known for his enduring optimism about humankind and progress. Now, in his latest book, Rationality: What It Is, Why It Seems Scarce, Why It Matters, he argues that despite the rampant prevalence of irrational beliefs and the ubiquitous instances of irrational, illogical, and cognitively biased thinking to which we are all prone, we humans are not as stupid as we might appear. After all, we’re pretty good practical problem-solvers—even the least intellectually-minded of us. And collectively, as Pinker notes, we’ve “discovered the laws of nature, transformed the planet, lengthened and enriched our lives, and not least, articulated rules of rationality that we so often flout.”3
Pinker defines rationality as the use of knowledge to attain a goal. He argues that we are adept at “ecological” rationality, which is to say: our rationality functions best in the natural environment in which our species evolved and in which we as individuals grew up and learned—the environment in which we carry out our day-to-day problem-solving and social interactions.
Reality and Mythology
Pinker explains that we think in ways that are sensible in the low-tech contexts in which we live most of our lives. We tend to be more rational and have more accurate, realistic beliefs about the immediate physical zone of the objects around us and the people we interact with face to face. Our survival depends on maintaining a “reality mindset” in this zone.
But our beliefs about the abstract world beyond our immediate experience—the zone of “the distant past, the unknowable future, faraway peoples and places, remote corridors of power, the microscopic, the cosmic, the counterfactual, the metaphysical”4—are more susceptible to erroneous assumptions. In trying to understand that more inscrutable world, most people believe some strange, irrational things.
"Beliefs in these zones are narratives, which may be entertaining or inspiring or morally edifying. Whether they are literally 'true' or 'false' is the wrong question. The function of these beliefs is to construct a social reality that binds the tribe or sect and gives it a moral purpose. Call it the mythology mindset."5 Conspiracy theories are merely one example. Even most intelligent people have at least a few bizarre beliefs. In the quote at the top of this blog post, Pinker’s next line (not included there) was: “The obvious example is religion.”
It’s not just mainstream religion that illustrates this very human tendency. In the last few decades, it has been fashionable to call oneself spiritual but not religious, and to presume that this identity somehow conveys intellectual gravitas and skepticism. However, the spiritual-but-not-religious movement has been closely associated with a kaleidoscope of paranormal mysticism, parapsychological pseudoscience, and the folk psychology idea that consciousness exists "out there," independent of a physical nervous system (just requiring a brain to receive and transmit it—like a radio in relation to radio waves). Paranormal beliefs appear to be on the rise in America.6 Even astrology is making a comeback among young people.7
Health and Politics
Another broad arena in which the majority of us are prone to believe things contradicted by scientific evidence is health/medicine. This is an area where it is absolutely crucial to mistrust subjectivity and eliminate sources of bias.
Widespread lack of scientific literacy among the general public has resulted in much unfounded faith in “alternative” medical treatments—a massive source of profiteering in the hugely lucrative alternative/natural/holistic health industry. This faith stems from a basic failure to appreciate the severe limitations of subjective perception or to understand how easy it is to misperceive cause-effect relationships. The results are often tragic for vulnerable, suggestible people whose illnesses could be cured or controlled by scarier-sounding but evidence-based conventional pharmaceuticals. This happens all too often in a field like cancer care.8
During the COVID-19 pandemic, widespread vaccine hesitancy and outright anti-vax belief have illustrated on a magnified scale just how poorly people understand and appraise relative risk.
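To make the arithmetic concrete, here is a minimal sketch in Python, using purely hypothetical numbers of my own (not real epidemiological data): once most of a population is vaccinated, vaccinated patients can account for a large share of hospitalizations even when vaccination sharply cuts each individual’s risk.

```python
# Hypothetical illustration only: all numbers below are invented, not real data.
population = 1_000_000
vaccinated_share = 0.90   # assume 90% of people are vaccinated
baseline_risk = 0.01      # assumed hospitalization risk if unvaccinated
vaccine_efficacy = 0.90   # assume vaccination cuts that risk by 90%

vaccinated = population * vaccinated_share
unvaccinated = population - vaccinated

hosp_vaccinated = vaccinated * baseline_risk * (1 - vaccine_efficacy)
hosp_unvaccinated = unvaccinated * baseline_risk

print(f"Hospitalized, vaccinated:   {hosp_vaccinated:,.0f}")    # 900
print(f"Hospitalized, unvaccinated: {hosp_unvaccinated:,.0f}")  # 1,000
# Nearly half of hospitalized patients are vaccinated here, yet any given
# vaccinated person is 10x less likely to be hospitalized.
```

Reading raw counts like these as proof that “the vaccines don’t work” is exactly the relative-risk error: the meaningful comparison is each group’s rate of hospitalization, not the headline totals.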
And then there’s politics, which in the pandemic has become entwined with beliefs about health/medicine. Pinker attributes this in part to the “myside bias,” which is all about identity, group affiliation, and loyalty to a cause. And let’s be clear: beliefs running counter to scientific evidence are by no means limited to the political right-wing.
Human reasoning is often more swayed by winning an argument than by representing reality. The cognitive scientists Hugo Mercier and Dan Sperber, in their argumentative theory of reasoning,9 posit that human powers of reason evolved in a thoroughly social context: not as a solitary process toward achieving more accurate beliefs and making better decisions, but rather as a way of justifying our beliefs and actions to others, convincing them through argumentation, and evaluating the arguments that others present to us. Pinker suggests that “we evolved not as intuitive scientists but as intuitive lawyers.”10, 11
When Smart People Believe Unsmart Things
It would be easy to suggest that the extreme distance between the impressive feats of human rationality and the abundant cringe-worthy examples of irrationality (“rationality inequality,” as Pinker puts it) is accounted for merely by the fact that individual humans vary greatly in their intelligence. But while intelligence and education certainly correlate with rationality, the correlation is imperfect and incomplete.
Smart people are also susceptible to believing unsmart things. Michael Shermer, an expert in the methods of rigorous scientific skepticism who has written extensively about why people believe weird things, explains the phenomenon this way: “Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.”12 Smart people may also tend to be overconfident about the beliefs they’ve rationalized, falling prey to the same kinds of biases to which the rest of us are susceptible, such as the notorious confirmation bias. As the social psychologist Tom Gilovich puts it, when people want to believe something, they ask themselves, “Can I believe this?” But when they don’t want to believe something, they ask, “Must I believe this?”13 We use motivated reasoning to rationalize our beliefs.
Shermer adds that another problem is that smart people might be smart in only one field—their knowledge and expertise are domain-specific. Underlining this point are a number of embarrassing examples of Nobel Prize winners going on to adopt some pretty implausible beliefs about things outside of the field in which they won their prize.14
Logic and Reasoning
Pinker tries patiently and good-naturedly to teach us how to be on our guard for the many slippery ways in which our intuitions, emotions, cognitive biases, and generally abysmal grasp of scientific and statistical thinking lead us astray.15 About two-thirds of the chapters in Rationality consist of lucid and entertaining explanations of some of the most important rules of sound reasoning. A lot of the material will be very familiar to scientific skeptics16 and followers of the cognitive bias literature and its trailblazer Daniel Kahneman. Still, I have to admit that I fell for quite a few of the logic traps.
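One of the most notorious of those traps is the Monty Hall problem, which Pinker discusses. For readers who would rather see the answer than be told, here is a short Monte Carlo simulation (a sketch of my own, not code from the book) showing that switching doors wins about two-thirds of the time:

```python
import random

def play(switch: bool) -> bool:
    """One round of Monty Hall: returns True if the final pick wins the prize."""
    doors = [0, 1, 2]
    prize = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that hides no prize and isn't the contestant's pick.
    opened = random.choice([d for d in doors if d != pick and d != prize])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

trials = 100_000
print(f"Stick:  {sum(play(False) for _ in range(trials)) / trials:.3f}")  # ~0.333
print(f"Switch: {sum(play(True) for _ in range(trials)) / trials:.3f}")   # ~0.667
```

Almost everyone’s intuition insists the two remaining doors must be a 50/50 bet, which is precisely what makes running the numbers so instructive.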
Hubris and Humility
Most people want to think of themselves as rational and as having arrived at their beliefs through reason and evidence. With the information explosion and its easy accessibility on the internet, we have come to believe that we can all independently develop informed opinions about complex subjects. It’s easy for people to mistakenly believe that they have all the necessary data and the ability to interpret that data—"I've done my own research." People overestimate their knowledge and understanding of complex issues.
When my wife was diagnosed with life-threatening cancer some years ago, despite 14 years of medical training I knew that as a psychiatrist I was hopelessly out of my depth. I took the oncologist’s advice to stop trying to research her condition and treatment options. We trusted his recommendation to go with the experimental treatment drug whose probability of saving her substantially exceeded its risk of killing or disabling her through its toxic adverse effects.
Mistrust of mainstream experts and public institutions is widespread. When evaluating scientific information, many people have difficulty differentiating credible from non-credible experts, falling prey to the illusion that there are "two sides" to a debate in which fringe quacks and mavericks are pitted against the mainstream consensus, or imagining that those outliers are the next Galileo or Einstein.
More fundamentally, most people lack a deep understanding of how scientific knowledge constantly evolves and self-corrects through meticulous empirical testing and a careful process of peer review and independent replication. Science is harder than most people think.
Perhaps it would be too much to expect most people to be rigorous critical thinkers—Pinker’s lofty ideal. A somewhat more achievable, though still difficult, goal might be to help people learn to distinguish credible from non-credible sources of information.
A little self-conscious uncertainty and humility about the limitations of our understanding of complex subjects might help too.
References
1. Steven Pinker, Rationality: What It Is, Why It Seems Scarce, Why It Matters (New York: Viking, 2021), p. 301.
2. For example:
- Fifteen percent of Americans agreed with the QAnon statement that the U.S. government, media, and financial worlds “are controlled by a group of Satan-worshipping pedophiles who run a global child sex trafficking operation” (nbcnews.com).
- Nearly nine months after the 2020 U.S. presidential election, two-thirds of Republicans still believed that “the election was rigged and stolen from Trump” (Yahoo News/YouGov poll).
- Twenty percent of Americans believed the conspiracy theory that microchips are inside the COVID-19 vaccines (YouGov/The Economist study). The theory that vaccines are a tool to implant trackable microchips into people has been widespread across the globe.
- Three-quarters of Americans believed in at least one phenomenon that defies the laws of science, e.g., psychic healing (55%), extrasensory perception (41%), and haunted houses (37%) (Rationality, p. 6, citing 2005 Gallup data).
3. Pinker, Rationality, p. xiv.
4. Pinker, Rationality, pp. 299-300.
5. Ibid.
6. "Paranormal America 2018," Chapman University Survey of American Fears, October 16, 2018. https://blogs.chapman.edu/wilkinson/2018/10/16/paranormal-america-2018/. Similarly: Pew Forum on Religion and Public Life 2009 https://www.pewforum.org/2009/12/09/many-americans-mix-multiple-faiths/.
7. Julie Beck, "The New Age of Astrology," The Atlantic, January 16, 2018. https://www.theatlantic.com/health/archive/2018/01/the-new-age-of-astrology/550034/. Granted, for some, the renewed interest in astrology is mostly entertainment, but there is a great deal of credulous belief too, and even partial or uncertain belief in astrology should be a reason to blush.
8. Adapted from: Ralph Lewis, Finding Purpose in a Godless World: Why We Care Even If The Universe Doesn’t (Amherst, NY: Prometheus Books, 2018), pp.62-63. As I explained there, I am often consulted by my oncologist colleagues to assess the factors contributing to a patient’s irrational treatment decisions. Typically, these involve refusal of potentially lifesaving treatments such as chemotherapy, based on fears about toxicity, in favor of alternative treatments supported by glowing testimonials that amount to little more than quackery. It can be exasperating to attempt (with as much sensitivity as possible) to impress upon these patients the gravity of the decision they are making.
9. Hugo Mercier and Dan Sperber, "Why Do Humans Reason? Arguments for an Argumentative Theory," Behavioral and Brain Sciences 34, no. 2 (2011): 57-74; discussion 74-111. https://doi.org/10.1017/S0140525X10000968.
10. Pinker, Rationality, p. 291.
11. From an evolutionary point of view, part of the adaptive function of beliefs is to forge our sense of identity and social cohesion in our group, rather than just to accurately model external reality. People’s beliefs are shaped by others in their group—people they trust and people in authority in their group. Both social group identity (tribal identity) and accurate modeling of external reality are crucial to our survival as social animals, and there’s often a trade-off between these two functions. Certainly, when beliefs that affect survival stray too far from reality, it does matter. We’re seeing this play out tragically as COVID-19 death rates soar in “red states” with low vaccination rates. As Pinker says, reality exerts a powerful selection pressure (on human reasoning abilities).
12. Michael Shermer, Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time, rev. ed. (New York: Holt Paperbacks, 2002), Ch. 18: Why Smart People Believe Weird Things. https://michaelshermer.com/weird-things/excerpt/
13. Julie Beck, "This Article Won’t Change Your Mind: The facts on why facts alone can’t fight false beliefs," The Atlantic, March 13, 2017. https://www.theatlantic.com/science/archive/2017/03/this-article-wont-change-your-mind/519093/
14. Candice Basterfield, Scott O. Lilienfeld, Shawna M. Bowes, and Thomas H. Costello, "The Nobel Disease: When Intelligence Fails to Protect against Irrationality," Skeptical Inquirer, May/June 2020. https://skepticalinquirer.org/2020/05/the-nobel-disease-when-intelligence-fails-to-protect-against-irrationality/
15. Pinker explains that we often do badly at formal reasoning tasks because our rational faculty is not an abstract logical skill. That kind of skill is not intuitive and has to be explicitly learned and trained. Furthermore, humans are terrible statisticians—people have to be taught how to understand and follow the data and how to override the intuitive power of anecdote, narrative and cognitive bias (e.g. confirmation bias, hindsight bias, availability bias, anchoring bias, base rate neglect, conjunction fallacy, gambler’s fallacy, hot hand fallacy, framing effect, loss aversion, and many others).
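As a worked illustration of just one of these, base rate neglect, consider a small Python calculation (with invented numbers, purely for illustration): even a seemingly accurate test for a rare condition yields mostly false positives.

```python
# Base rate neglect, with invented numbers: a 1% base rate swamps
# a test that "feels" 90% accurate.
base_rate = 0.01            # 1% of people actually have the condition
sensitivity = 0.90          # P(positive test | condition)
false_positive_rate = 0.09  # P(positive test | no condition)

# Bayes' theorem: P(condition | positive test)
p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
posterior = (sensitivity * base_rate) / p_positive
print(f"P(condition | positive test) = {posterior:.2f}")  # ~0.09
```

Most people intuit an answer near 90 percent; Bayes’ theorem puts it under 10.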
16. Note that we are talking here about scientific skepticism—synonymous with critical thinking applying the scientific method. This is not to be confused with pseudoscience posing as skepticism, e.g., "climate skeptics," "vaccine skeptics," and conspiracy-mongers calling themselves skeptics. Skepticism requires substantial training in the scientific method; otherwise, it easily descends into conspiracy-mindedness. Lacking methodological rigor, the conspiracy theorist overconfidently becomes convinced that he or she alone has seen through the official version of events, believing that everyone else is gullible.