
5 Reasons Why People Don’t Think More Psychologically

It’s only human to assume we know more than we actually do.

Key points

  • The more we comprehend another’s thoughts and behavior, the more we can build a secure connection with them.
  • If we exhibit the Dunning-Kruger Effect, we’ll assume we know more about something than we do.
  • Having a confirmation bias makes our decisions more emotional and less rational and objective than we imagine.
  • Motivated reasoning involves accepting evidence that supports our viewpoint while rejecting what doesn’t.
Confused by Another's Motives. Source: AI generated/123RF

We all think psychologically, however naively. Moreover, all of us understand implicitly that knowing what makes a person tick—how their psyche operates—will assist us in dealing with them effectively.

And we employ this knowledge both to maximize our influence over them and to protect ourselves from their possibly negative influence on us.

But in exploring the more intricate, or convoluted, patterns of another’s behavior, many people put the brakes on—declining to delve any deeper. As substantial research on decision-making has demonstrated, relatively few people are willing to invest the mental energy to reflect on the feelings, interests, motives, and values of those whose words and deeds significantly affect them. And their myriad reasons for such avoidance mirror human tendencies that are more or less universal.

Here are the major obstacles to psychological thinking—or at least to dispassionate, nonpartisan psychological thinking—worth considering. (Some overlap among them is inevitable.)

1. Ignorance of Basic Psychological Concepts. Undeniably, the fields of psychology and psychotherapy are complex, involving innumerable suppositions, theories, and (often contradictory) research findings. Many people in occupations having little or nothing to do with mental health simply aren’t comfortable enough—or don’t trust themselves enough—to learn about and apply psychological principles in grasping and assessing others’ behavior.

Moreover, they may perceive devoting the time needed to develop more sophistication about such principles as simply not relevant enough to their lives to be worth the effort.

2. The Dunning-Kruger Effect. This extremely inaccurate, self-flattering bias revolves around individuals’ believing they’re more knowledgeable and competent than they really are. Largely unaware of their inadequacies, they’re also unable to recognize the skill level and competence of others, gratuitously regarding themselves as more skilled and capable.

Their false sense of superiority makes them incapable of learning from their mistakes, which from their one-sided perspective aren’t really mistakes at all. And because they’re not emotionally intelligent enough to look at subjective matters objectively, educating themselves further about the social sciences generally holds little interest for them.

It feels unnecessary, redundant—even below their grade level. Feeling they already know a lot about the dynamics of human behavior, they’re unaware of the illuminating insights deeper study could afford them.

It’s something like taking an introductory course in a subject and then thinking you’re now an expert in it. Although your knowledge is only rudimentary, you harbor the illusion that it’s advanced, and that learning more about it is superfluous.

3. Confirmation Bias. Oddly similar to the Dunning-Kruger effect, this bias ensures that when we bring frequently outdated but firmly entrenched preconceptions to judging others’ behavior, our conclusions can’t be objective. Instead, our considerations, mostly unconscious and automatic, will be governed—or rather, distorted—by preexisting beliefs.

To think about someone psychologically is quite different from thinking about them prejudicially. So when your thought processes are determined primarily by never-reevaluated cognitive habits, the results won’t offer you new, potentially valuable insights into them.

In an article entitled “Why Facts Don’t Change Our Minds,” Elizabeth Kolbert defines this bias as fundamentally “the tendency people have to embrace information that supports their beliefs and reject information that contradicts them.” And she adds:

Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments. (E. Kolbert, 2017)

Hence people’s resistance to further exploring psychological findings contrary to their prejudices accounts for their dismissal of what, if better apprehended, could help them more deeply understand why others—and they themselves—do what they do.

Finally, Sara and Jack Gorman’s book Denying to the Grave (Oxford, rev. ed., 2021) discusses the physiological component of confirmation bias, citing research indicating that people experience a pleasurable release of dopamine when processing data that (seemingly, at least) validates their viewpoint. As the authors conclude: “It feels good to ‘stick to our guns’ even if we are wrong.”

4. Societal Bias Against Psychology. We’ll be less inclined to evaluate things psychologically if we live in a society that, however subtly, betrays skepticism toward the discipline. The way we’re educated and socialized (both formally and informally) materially affects our existential perspective.

We may have learned to be suspicious of psychological theories—say, because they’ve been disparaged to us as pretentious “psychobabble.” In that case, we’ll hardly be disposed to delve into the abundant psychological literature for answers to questions we have about someone’s personality, incentives, or propensities.

So you might ask yourself to what extent psychology was presented to you positively—as an intellectually valuable, and practical, source of information—or whether the messages you received were that it had little of substance to offer you and was therefore a waste of your cognitive resources.

5. Motivated Reasoning. Overlapping with confirmation bias is the more inclusive concept of motivated reasoning—a term almost identical to what the layperson familiarly knows as rationalization, and one that social scientists have on the whole seen fit to adopt in its place.

For the newer designation highlights the essential irrationality of how a “congealed” belief system can easily defeat our rational faculties.

In various contexts, people who seem to be inspecting data objectively are liable to reach skewed or flagrantly illogical conclusions. And one explanation given for this anomaly is based on human evolution.

In their book The Enigma of Reason (2019), Mercier and Sperber speculate that the biggest advantage humans had over other species was their ability to cooperate to advance the interests of the group.

It’s their contention that reason evolved not to solve abstract problems logically but, pragmatically, to avoid the difficulties that come from taking exception to one’s group’s collective will. So conformist thought processes (which account for so much motivated reasoning) are adaptive, whereas more creative, individualistic thinking can imperil a group’s cohesiveness.

Here are a couple of quotations illuminating this inclination to ignore, discard, or dismiss as irrelevant any facts that, if examined earnestly, would threaten our core beliefs—indeed, our very sense of self. For at bottom, such facts oppose the moral standards we zealously (though typically unawares) feel bound to:

Motivation, identity and ideology combine to undermine human judgment. . . . We rely on a biased set of cognitive processes to arrive at a given conclusion or belief. This natural tendency to cherry pick, and twist the facts to fit with our existing beliefs, is known as motivated reasoning—and we all do it. (Weir, 2017)

People are capable of being thoughtful and rational, but our wishes, hopes, fears and motivations often tip the scales to make us more likely to accept something as true if it supports what we want to believe. (Ditto et al., 2019)

And, as regards the unfortunate ramifications of all this self-interested “fact-finding,” consider this quote centering on today’s social media:

These are wonderful times for motivated reasoners. The internet provides an almost infinite number of sources of information from which to choose your preferred reality. There’s an echo chamber out there for everyone. (Hornsey & Fielding, 2017)

Science fosters a distrustful stance toward arguments based much more on subjective feelings than on objective evidence. But considering the contrary positions we’re prone to take toward capital or corporal punishment, immigration, vaccinations, abortion, and climate change, it’s clear that our thought processes are influenced more by our emotions and personal history than by science's deservedly lofty ideals.

© 2023 Leon F. Seltzer, Ph.D.

References

Atir, S. (2018). Thinking about self and others in the context of knowledge and expertise. https://ecommons.cornell.edu/items/838c483c-e712-42ce-b553-3e04f5de1052

Ditto, P. H., Clark, C. J., Liu, B. S., Wojcik, S. P., Chen, E. E., Grady, R. H., Celniker, J. B., & Zinger, J. F. (2019). Partisan bias and its discontents. Perspectives on Psychological Science, 14(2), 304–316. https://doi.org/10.1177/1745691618817753

Ehrlinger, J., & Dunning, D. (2003). How chronic self-views influence (and potentially mislead) estimates of performance. Journal of Personality and Social Psychology, 84(1), 5–17. https://pubmed.ncbi.nlm.nih.gov/12518967/

Friesen, J. P., Campbell, T. H., & Kay, A. C. (2015). The appeal of untestable religious and political ideologies. Journal of Personality and Social Psychology, 108(3), 515–529. https://doi.org/10.1037/pspp0000018

Gorman, S. E., & Gorman, J. (2021). Denying to the grave: Why we ignore the science that will save us (rev. ed.). New York: Oxford University Press.

Hornsey, M. J., & Fielding, K. S. (2017). Attitude roots and Jiu Jitsu persuasion: Understanding and overcoming the motivated rejection of science. American Psychologist, 72(5), 459–473. https://doi.org/10.1037/a0040437

Kolbert, E. (2017, February 27). Why facts don’t change our minds. The New Yorker. https://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our…

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. https://doi.org/10.1037/0022-3514.77.6.1121

Liu, B. S., & Ditto, P. H. (2013). What dilemma? Moral evaluation shapes factual belief. Social Psychological and Personality Science, 4(3), 316–323. https://doi.org/10.1177/1948550612456045

McIntosh, R. D., Fowler, E. A., Lyu, T., & Della Sala, S. (2019). Wise up: Clarifying the role of metacognition in the Dunning–Kruger effect. Journal of Experimental Psychology: General, 148(11), 1882–1897. https://doi.org/10.1037/xge0000579

Mercier, H., & Sperber, D. (2019). The enigma of reason. Cambridge, MA: Harvard University Press.

Pennycook, G., Ross, R. M., Koehler, D. J., & Fugelsang, J. A. (2017). Dunning–Kruger effects in reasoning: Theoretical implications of the failure to recognize incompetence. Psychonomic Bulletin & Review, 24(6), 1774–1784. https://doi.org/10.3758/s13423-017-1242-7

Seltzer, L. F. (2008, August 8). Trust your feelings? . . . Maybe not. Psychology Today. https://www.psychologytoday.com/us/blog/evolution-the-self/200808/trust…

Sommer, J., Musolino, J., & Hemmer, P. (2023). Updating, evidence evaluation, and operator availability: A theoretical framework for understanding belief. Psychological Review. Advance online publication. https://doi.org/10.1037/rev0000444

Weir, K. (2017, May). Why we believe alternative facts. Monitor on Psychology, 48(5). https://www.apa.org/monitor/2017/05/alternative-facts
