

How Decision-Makers Can Handle Uncertainty

Analytical methods don't fare well amid confusion and ambiguity.

Key points

  • Uncertainty is the norm for decision-makers.
  • Uncertainty can stem from missing information, ambiguity, unreliable data, or contextual complexity.
  • Analytical methods that work well with clearly defined data become less useful in the face of uncertainty.

Business schools and other professional programs teach powerful analytical methods for using information to make decisions. These methods are important and need to be learned.

But what happens when uncertainty dominates? Strong analytical methods often run into trouble.

We most often think of uncertainty as resulting from a lack of information. Missing information certainly creates uncertainty, but so can ambiguous, conflicting, unreliable, or overly complex information. Whatever the cause, the bottom line is confusion about what is going on and how to react. It then becomes much harder to apply analyses that are only as good as the amount and quality of the information driving them.

What can decision-makers do?

Here is an example. My colleague John Schmitt often runs decision-making exercises for military officers. He uses tactical decision games to present a scenario demanding an immediate decision. Schmitt deliberately introduces uncertainty, usually in the various forms mentioned above.

The young officers will ask for additional information and clarifications about the enemy. Schmitt provides ambiguous intelligence that is a few hours old. They complain that they really can’t plan well if they don’t have the necessary information. Schmitt agrees, with a smile on his face. “And yet,” he says. A lot of the information they ask for is not even relevant to the decision they have to make, but they ask for it anyway. “It’s like a security blanket,” Schmitt says.

The young officers, trained to plan by making calculations using clear and abundant information, are paralyzed in the face of uncertainty. They are gripped by “rationalist fever dreams”: the mindset that all difficult problems can be decomposed and analyzed so that responses can be calculated (D. E. Klein, Woods, G. Klein, & Perry, 2018).

But the good news is that the young officers adapt after several tactical decision games. They get used to the uncertainty. They learn to appreciate what information is reasonable to expect and what information is not. And they develop strategies for coping with uncertainty.

What lessons can business leaders and others draw from an example like this?

1. Accept uncertainty as the norm. Don’t think everything will be clear-cut, with occasional outbreaks of uncertainty. Expect to assess and plan and act under conditions of uncertainty. Especially in two-sided settings, recognize that the other side is dealing with uncertainty also, and use that to your advantage.

2. Don’t keep an open mind about what is going on. Yes, I know people are advised to keep an open mind to prevent fixation, but an open mind won’t tell you which information matters. Don’t just pull in all the information you can; search for the specific pieces that will corroborate or invalidate your working hypotheses. You will need to speculate to get any traction.

3. Don’t fall into the information-gathering trap. Most people naturally seek to reduce uncertainty by collecting more data, and some additional data can be helpful. But too often we collect new data rather than sorting out the data we already have, and too often we collect data we don’t actually need. Data collection can become an excuse for avoiding sense-making. Besides, the value of additional data diminishes the more you collect; many people keep collecting past the point at which the marginal value of the data falls below the costs (e.g., time and effort) of collecting it.

4. Don’t fall into the fixation trap. It’s a good idea to speculate right off the bat, but it’s not a good idea to get stuck on that initial speculation. One way to escape fixation is to stay alert to anomalies that don’t fit your beliefs instead of explaining them away. If your initial ideas are wrong, the anomalies should keep mounting. They are your ticket out.

5. Look for anomalies. They are the things you weren’t expecting, and they can lead to insights that cut through the uncertainty.

6. Engage in story-building. The stories you construct will help you speculate and navigate the uncertainty; they provide a logical explanation that bridges the gaps between the known points. However, you will also have to gauge the plausibility of the stories so you don’t deceive yourself.

7. Develop flexible plans that do not depend on one specific set of circumstances but will work in various situations. Develop plans that include a hedge against uncertainty, like holding some resources in reserve to deal with unexpected events, whether crises or opportunities. In general, the greater the uncertainty, the fewer resources you should commit initially and the more you should keep in reserve.

8. Take action to reduce uncertainty. We typically think that sense-making comes first and that action follows. But sometimes you can take actions designed to make the situation clarify itself. Do something that forces the situation to respond, thereby revealing itself.

9. Strengthen your expertise. There are a few ways to do this. You can learn from the experiences of others, questioning them about how they handled uncertain conditions. You can learn from your own experiences, reflecting on what you did and, in hindsight, what you could have done. And you can use scenario-based training, like the tactical decision games John Schmitt runs for young officers, so that you get used to making decisions in the face of uncertainty.

10. Wake up from rationalist fever dreams. Follow these lessons and you will be on your way to shifting your mindset from dreading uncertainty to expecting it, enjoying the challenge of managing it, and maybe even using it to your advantage.

References

Klein, D. E., Woods, D., Klein, G., & Perry, S. (2018). EBM: Rationalist fever dreams. Journal of Cognitive Engineering and Decision Making, 12(3), 227–230.
