
Verified by Psychology Today


Why People Believe What They Are Predisposed to Believe

Assumptions and preformed beliefs can lead to erroneous conclusions.

Key points

  • Preformed beliefs, shaped by years of social, family, and media influences, create mental templates.
  • People often favor information that confirms their beliefs and disregard facts that contradict them.
  • Humans are not as objective in interpreting information or reaching conclusions as we'd like to think.
  • People can take steps to prevent thinking errors, starting with becoming aware of their own biases.
Source: carlosalvarenga / Pixabay

Human beings tend to favor, interpret, or selectively seek information (even erroneous information) that confirms or supports their preexisting beliefs or worldviews, while discounting and disregarding facts and evidence contrary to those beliefs (1, 2). Even when surrounded by contrary facts and evidence, the brain may resist them. You may have noticed that when you share facts with someone, that person sometimes draws interpretations that only confirm their preformed beliefs, even when those beliefs are erroneous and opposite to the facts you just shared.

Preformed beliefs, formed through years of social, family, and media influences, create mental templates. Moreover, the mind gives disproportionate weight to the first information it receives and uses it to make subsequent judgments, a phenomenon known as the anchoring effect. An anchor can be anything, such as a comment someone makes about another person. Assumptions based on secondhand information are rarely accurate. Once the brain has a hypothesis in mind, it tries to confirm it. It may then make further assumptions to fill in the blanks and reach a conclusion that confirms its initial hypothesis and matches its preformed mental templates.

These beliefs and flawed assumptions can lead to errors in how the mind interprets conversations, behavior, or situations, hurting interpersonal interactions and potentially causing harm. The tendency to make assumptions, interpretations, and judgments that fit one's preformed mental templates can make it difficult to accept facts or new information once an initial interpretation has been made. When subsequently presented with facts that challenge that interpretation, people may defend and adhere to it even more strongly, a phenomenon known as the backfire effect. This happens because new information that challenges previously held beliefs can feel like a threat, activating an emotional reaction.

Many smart people take pride in being objective and logical. The prospect of being wrong, or of having reached an erroneous conclusion or judgment, is not only uncomfortable and unpleasant; it can feel like a threat to one's self-concept or identification with a group. There can be an emotional investment in wanting those beliefs to be right.

Studies also show that this resistance is greater when people are highly confident in their beliefs or hold them with strong conviction (4). This level of confidence makes people less likely to change their minds when presented with facts that don't match their beliefs.

Another reason it may be harder to accept new information is that the brain prefers the status quo, as several studies have shown. It is not fond of change. New information requires reprocessing and re-sorting, which can feel tedious to the brain, especially under time pressure and an emotional pull toward wanting to be right.

Relying on erroneous initial interpretations and assumptions can negatively affect how a person treats another (e.g., through put-downs) and can lead to self-fulfilling prophecies; the person on the receiving end may be caught off guard by these interactions and may suffer as a result.

Assumptions about people can also be positive, especially if one's preformed templates and past or early experiences have been positive. This may result in unconsciously attributing positive or favorable characteristics to people who match the template of past positive experiences.

All human beings, including well-intentioned people, are prone to these hidden thinking errors or decision-making traps. Studies show that those who think they have no thinking errors are more likely to act on them, because they are less likely to self-reflect or remain open to the possibility that they hold such errors.

Evolutionarily, the brain likes to categorize or group people quickly as "safe" or a "threat." It must make hundreds of quick judgments in short spans of time, and it does so through the lens of its preformed mental templates, maps, and associations. This can at times lead to inadvertently dismissing or ignoring facts that do not align with one's preformed beliefs, while lending more weight to any sliver of information that seems to support them. A history of trauma may exacerbate these tendencies, as a hypervigilant brain is on greater lookout for any possible threat.

In a 1954 case study, Hastorf and Cantril examined responses from students at two schools who watched the same football game and found that each person perceived a different version of events. Out of the many occurrences in the game, participants from each school selected and recalled more instances that showed their own school in a favorable light and the other school in a poor one (3). The authors concluded that participants' perceptions were skewed by their motivation to believe that their group was better (3).

Despite the best intentions, the human brain has an unconscious tendency to favor those it identifies with or shares more commonalities with, assigning those people more positive and desirable traits and characteristics than others with whom it sees little in common (2, 5). This has been shown in neuroscience research (6). A more ventral part of the anterior cingulate cortex (ACC) is thought to be involved in dispositional preference toward in-group members and in self-referential reasoning related to in-group members, while a more dorsal part of the ACC is thought to be involved in regulating emotional tensions related to members of the out-group (6).

Commonalities and familiarity signal safety to a brain that is evolutionarily geared toward predictability. When encountering someone unfamiliar, the brain may automatically treat that person with doubt. It then tries to categorize them by searching its preformed templates for matches, which may lead to incorrectly categorizing people, drawing hasty, erroneous conclusions about them, and attributing unfavorable characteristics to them. Once a negative assumption is made, it may lead to other unfavorable assumptions by association, in a domino-like effect. These thinking errors can amplify when one surrounds oneself only with familiar people who share the same beliefs and do not challenge them.

Why care about these thinking errors?

These thinking errors create hidden traps in decision-making. Because they lead to conclusions based on wrong information, they can have a detrimental impact on one's decisions, judgments, and interpersonal interactions. When one approaches a situation with preformed beliefs in mind, one might selectively and unconsciously look for any piece of information that reinforces those beliefs in subsequent interactions with an individual, thereby negatively affecting those interactions. The person on the receiving end may be completely unaware of these preformed beliefs or assumptions and may be surprised by the interactions. This can erode interpersonal trust and relationships. Believing the wrong information can even lead to poor health decisions.

How to prevent and address thinking errors

  1. Awareness and noticing: Being aware that our brains tend to create these cognitive errors is the first step towards catching and reducing the likelihood of acting on them.
  2. Self-reflection: It can be helpful to take a mindful pause and ask oneself questions such as: How did I reach this conclusion? Did someone else suggest it? Do I know this for a fact? Did I ask this person questions that would disprove, rather than confirm, my current hypothesis? Do I not like this person for some reason? (7)
  3. Consider the alternative perspective: Encouraging oneself to think of reasons why one's preferred hypothesis may be incorrect, and allowing oneself to consider the alternative perspective and why it might be true, are steps one can take to prevent acting on these errors. This can soften the rigidity with which one is attached to one's own ideas.
  4. Emotion: If there seems to be a block in considering the alternative perspective, it can be helpful to consider if there is an emotional need one may have for wanting one's perspective to be true and where that need might be coming from.
  5. Humility: Examining one's assumptions and beliefs with openness and genuine humility makes it less likely that one will act on mistaken assumptions.
  6. Do not assume: Assumptions are often incorrect and can be dangerous in decision-making and interpersonal relationships. It is wise to be wary of secondhand information based on hearsay that one has not verified oneself, and to notice when one is about to make an assumption.
  7. Avoid jumping to conclusions: Cultivating a habit of taking a mindful pause and considering various perspectives before arriving at any conclusion can be helpful. Practicing empathy (putting oneself in the shoes of others and imagining how it would feel if others were to jump to similar conclusions about oneself) can further reduce these errors.
  8. Exposure: Increasing one's awareness about and exposure to different people can help one be more attuned with what's going on and recognize one's blindspots, reducing the likelihood of getting stuck in thinking errors.
  9. Active listening: Listening with the goal of improving awareness and understanding, without preconceived opinions or agendas, can help.
  10. Empathy and compassion: If you become aware of your thinking errors, treat yourself (and others) with gentleness, kindness, and forgiveness. It's human to err. Depending on the situation, it may be possible to make repairs with the person on the receiving end; consider doing so when appropriate. Positive regard for others, empathy, and kindness can go a long way in helping individual and collective well-being.

Copyright Richa Bhatia 2023 Note: The views expressed in this article do not represent the views of any organization that the author may be affiliated with. This article is for informational purposes only and is not intended to provide medical or psychiatric advice or recommendations, or diagnostic or treatment opinion. This is not a complete review or description of this subject.

If you or someone you love is contemplating hurting self or others, seek help immediately. For help 24/7 dial 988 for the National Suicide Prevention Lifeline, or reach out to the Crisis Text Line by texting TALK to 741741. To find a therapist near you, visit the Psychology Today Therapy Directory.


1. Accessed online on Nov 19, 2023.

2. Accessed online on Nov 19, 2023.

3. Hastorf AH, Cantril H. They saw a game; a case study. J Abnorm Soc Psychol. 1954;49(1):129-134.

4. Rollwage M, Loosen A, Hauser TU, Moran R, Dolan RJ, Fleming SM. Confidence drives a neural confirmation bias. Nat Commun. 2020 May 26;11(1):2634. doi: 10.1038/s41467-020-16278-6. PMID: 32457308; PMCID: PMC7250867.

5. Leyens JP, et al. Psychological essentialism and the differential attribution of uniquely human emotions to ingroups and outgroups. Eur J Soc Psychol. 2001;31(4):395-411.

6. Shkurko AV. Is social categorization based on relational ingroup/outgroup opposition? A meta-analysis. Soc Cogn Affect Neurosci. 2013 Dec;8(8):870-7. doi: 10.1093/scan/nss085. Epub 2012 Jul 30. PMID: 22847948; PMCID: PMC3831554.

7. Gopal DP, Chetty U, O'Donnell P, Gajria C, Blackadder-Weinstein J. Implicit bias in healthcare: clinical practice, research and decision making. Future Healthc J. 2021 Mar;8(1):40-48. doi: 10.7861/fhj.2020-0233. PMID: 33791459; PMCID: PMC8004354.
