
100 Brief Tips and Findings Regarding Critical Thinking

Celebrating 100 ‘Thoughts on Thinking’ posts.

Key points

  • There are three core critical thinking skills: analysis, evaluation, and inference.
  • The knowledge we store in our heads isn’t necessarily correct; it's just how we understood something.
  • We can’t always be politically correct if we want to think critically.

In celebrating the 100th "Thoughts on Thinking" post on Psychology Today, let’s focus on the bread and butter of this blog. Here are 100 tips and findings regarding critical thinking (CT) and higher-order cognition:

  1. CT is a metacognitive process (i.e., thinking about thinking).
  2. CT consists of skills and dispositions, which work in conjunction with reflective judgment.
  3. Reflective judgment refers to taking one’s time with a decision while engaging epistemological understanding.
  4. Epistemological understanding refers to one’s understanding of the nature of knowledge, the limits and certainty of knowing, and how this affects related reasoning.
  5. There are three core CT skills: analysis, evaluation, and inference.
  6. Having CT skills alone is not sufficient; one must have a positive disposition toward CT.
  7. CT disposition refers to an inclination, tendency, or willingness to perform a given thinking skill.
  8. CT dispositions include concepts like open-mindedness, organisation, truth-seeking, and skepticism.
  9. Even educators have a tough time defining CT.
  10. Critical thinking can be enhanced through appropriate training.
  11. Explicit CT training is necessary if educators want to see CT improve and flourish across domains.
  12. There are many types of illogical argumentation and fallacious reasoning that can disrupt appropriate thinking.
  13. Play "devil’s advocate" to truly see "both sides of the story."
  14. If we truly care about a topic or decision, we should apply CT.
  15. Likewise, we probably should only apply CT when we care about the topic or decision.
  16. It is far from "virtuous" to force emotion-based opinions of virtue/value onto others who do not necessarily subscribe to the same ideology.
  17. "Leave emotion at the door"—it clouds your thinking.
  18. Caring about a topic/decision is distinct from being passionate about it. The former is important for CT; the latter can hinder it.
  19. Application of CT can be categorised into five general areas: argument analysis, verbal reasoning, hypothesis testing, judging likelihood and uncertainty, and problem-solving.
  20. Humans are poor "natural" statisticians. Learn statistical analysis if you have an interest or find you use statistics often (see the brief worked example after this list).
  21. People often don’t know what they don’t know.
  22. People with low ability in an area typically overestimate their ability in it, whereas people with high ability in an area often underestimate their ability (Dunning–Kruger Effect).
  23. There is no such thing as "proof," per se—we can only disprove things (through falsification). The word you’re looking for is "evidence" or "justification."
  24. The knowledge you store in your head isn’t necessarily correct; it's just how you understood something.
  25. Understanding refers to how a schema is constructed, not necessarily the accuracy of the information.
  26. Knowledge, in terms of what we know as a society, is theoretical.
  27. Evidence or justification for said knowledge may be debunked at a later time.
  28. Creativity is not necessary for CT, but if you conceptualise it as "synthesis," it can be a core facilitator.
  29. People love to be right, but they’re likely to hate being wrong more.
  30. People often dislike change and, more often, dislike changing their minds.
  31. Changing your mind requires schema (re)construction.
  32. Changing your mind might make you question long-held beliefs, which can disrupt your worldviews.
  33. Disruption of worldviews might yield uncertainty…which can frighten people.
  34. People generally do not like to be frightened or confused.
  35. People develop odd, vague, and/or over-simplified "sayings" to explain away uncertainties.
  36. People develop odd, vague, and/or over-simplified belief systems to explain away uncertainties.
  37. People often like things, such as information, simplified and organised into nice neat little packages (e.g., TL;DR).
  38. Just because you believe or wish something was true doesn’t make it so.
  39. Changing your mind might make you look weak to others in certain situations, but if you emphasise the strength it takes to do so, that perception can be overturned.
  40. Changing one's mind requires a positive disposition toward CT.
  41. Trying to change someone’s mind is difficult and often backfires—reinforcing their previously held belief.
  42. There is no such thing as "good" CT—you either applied it or you didn’t.
  43. If you applied CT wrong, then is it actually CT?
  44. Just because you often apply CT in requisite situations doesn’t mean you do it in all requisite situations.
  45. It is difficult to measure CT ability. Simply self-reporting that you are "good at it" or "often do it" does not make it so.
  46. "It depends" is an acceptable answer—just ensure you know a few examples of upon what it depends.
  47. Numbers don’t lie, but people do—and not even intentionally. It takes a human to interpret numbers, and the interpretation may be incorrect.
  48. All of our decisions are made with some level of bias; try your best to curtail it as much as possible.
  49. It’s OK to say "I don’t know," and it is actually a good indicator of intellectual honesty.
  50. Be intellectually honest.
  51. A person said what they said, not what you interpreted them as saying—if clarity is lacking, ask for clarification.
  52. You can’t always be politically correct if you want to think critically—controversial topics often require the most CT!
  53. Argument mapping can facilitate CT.
  54. Active learning is a fundamental component of CT instruction.
  55. Only worry about things you can change.
  56. Keep perspective and be thankful for the things you have.
  57. Cynicism is not the same as skepticism.
  58. It’s ill-conceived and dangerous to treat perspectives that you value as global virtues or a moral code that everyone else should value, too.
  59. Don’t trust your gut—intuition can often be correct, but when it’s off, it’s way off. Instead, engage reflective judgment (see #3).
  60. Despite what Oprah says, you do not have your own truth. Truth isn’t relativistic in a shared reality.
  61. There is a need for general, secondary-school training in CT-related processes.
  62. People often ignore truths that don’t suit them, or manipulate those truths to accommodate their biases.
  63. Relying on personal experience to make decisions is lazy thinking.
  64. People overestimate the value of their experience in decision-making, which is particularly scary when they have a lot of experience doing things wrong.
  65. Your mistakes are often unacknowledged, so you may not know you’ve been doing something wrong.
  66. It’s OK to be wrong—it’s a learning experience. Own up to it.
  67. Do not underestimate the effect social media is having on your cognitive processing.
  68. Be open-minded toward others.
  69. Dispositions of open-mindedness and skepticism complement each other; they do not contradict.
  70. People don’t recognise their own irrationality.
  71. Intelligence and rationality are distinct traits.
  72. "Learning styles" are a debunked myth.
  73. "Do your research" is not a thing—20 hours of Googling is not putting anyone on par with expert perspectives.
  74. Nevertheless, be aware of empirical evidence and broaden your knowledge on topics that matter to you.
  75. We rationalise poor decisions because we don’t want to look irrational and/or because they, in some way, yield an outcome we actually want.
  76. When engaging in argumentation, be sure of how key issues are defined.
  77. Difficulty in definition yields difficulty in evaluation.
  78. Draw and report your conclusions with caution—you could be wrong. Accept that.
  79. Don’t just read a headline—dig deeper. Read the full article and assess the sources of the claims.
  80. Ask yourself, are all the reasons presented to you for believing something actually relevant to the central claim?
  81. Question an author’s intentions and ask, "What is the purpose of this piece?"
  82. We are cognitively lazy as a default—put time and effort into your thinking.
  83. Information can evoke and breed emotions like fear and anger in the reader or listener. If you’re emotional, you’re not thinking rationally.
  84. The more we have been exposed to certain information, the more likely we are to believe that information—regardless of truth (i.e., the illusory truth effect).
  85. Peer pressure isn’t just for teens; we all amend our perspectives to align with those around us (e.g., for social desirability or as a result of the illusory truth effect).
  86. Question your perspectives.
  87. Don’t jump to conclusions, regardless of how interesting, confirming, or comforting they might be.
  88. Consider the more likely and simpler solutions first—they often provide nice starting points for your decision-making.
  89. Thinking "outside the box" isn’t always helpful—it often contradicts CT and produces ideas that lack feasibility and logic.
  90. When you evaluate information, assess its credibility, relevance, logical strength, balance, and bias.
  91. A theory isn’t an educated guess; it’s an established model for how a phenomenon occurs following many observed replications (e.g., gravity).
  92. Belief in conspiracy theories might stem from a desire for closure in a complex scenario, lower ability in specific cognitive processes, and/or demographic factors.
  93. CT requires practice—engage with opportunities that require it.
  94. Sometimes, even when you know you’re right, it’s better just to leave it—Is this particular argument really worth the aggravation?
  95. Emotional intelligence—conceptualised as a maturity that facilitates the management of emotions, with respect to their appraisal and expression—can be a useful self-regulatory tool for CT.
  96. Prioritise the things that matter in your life—your thinking will follow once you block out the noise.
  97. Context is key for all decision-making.
  98. Cognitive reframing can be difficult, but it is often necessary for CT and maintaining mental well-being.
  99. Heuristics, schemas, biases, and intuitions are all of a kind—automatic, gut-level sources of decision-making that are risky for decisions requiring CT.
  100. Read more.
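
As a brief illustration of point 20, consider the classic base-rate problem. The numbers below (a hypothetical screening test with 1% prevalence, 90% sensitivity, and a 9% false-positive rate) are illustrative assumptions rather than figures from this post; the short Python sketch simply applies Bayes’ theorem to show how far intuition can drift from the actual probability.

```python
# Minimal sketch of a base-rate problem, using hypothetical numbers:
# a condition affecting 1% of people, a test that detects it 90% of the
# time, and a 9% false-positive rate among people without the condition.

prevalence = 0.01        # P(condition)
sensitivity = 0.90       # P(positive | condition)
false_positive = 0.09    # P(positive | no condition)

# Bayes' theorem: P(condition | positive test)
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_condition_given_positive = (sensitivity * prevalence) / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.2f}")
# Prints roughly 0.09 — about 9%, far below the 90% many people intuit.
```

Most people intuit an answer near 90%; the arithmetic puts it closer to 9%. That gap is exactly the kind of error formal statistical training helps to close.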