Dealing with Risk and Uncertainty


Newsletter 170 - November 1, 2023

 

From the Directors, Heidi Burgess and Guy Burgess

This is the fourth in a series of posts on the nature of facts, sources of uncertainty, and strategies for making the most of available scientific and technical expertise, while also dealing sensibly with the limitations of that expertise. Other posts in this series include Sharp vs. Fuzzy Feedback — The Distinction That Explains Why Society Can Be Both Astonishingly Smart and Incredibly Stupid; Living with Uncertainty in the COVID-19 Era; and Fact, Value, Lie, or Uncertainty? How Do We Tell?

 

Understanding the Nature of Risk and Uncertainty

We all have a difficult time balancing risk and accepting uncertainty. Former Senator Ed Muskie is famously quoted as saying "Will somebody find me a one-handed scientist?!!" when the scientists testifying before his Senate committee kept hedging their testimony with statements like "on the other hand..." [something else might be true]. As Hank Campbell says, "The one-handed side [of a debate] often wins because they have a [definitive] answer, and scientists too often say, 'it depends.'" That is because many people have a cognitive bias that favors those who confidently proclaim that they know the answer (even when they don't). This is, of course, especially pronounced when they are telling us what we want to hear.

But science studies complex phenomena that we can only measure inexactly. When we are talking about social science, things get even more difficult, because people aren't nearly as predictable as atoms and molecules. So scientists can do their best to measure what is going on, and to predict what will happen in the future based on those measurements and their knowledge of causal relationships and trends, but they often cannot be certain enough to put their reputations on the line — or people's lives at risk — because they don't fully understand how the system works.

In May of 2020 (eons ago in pandemic time), we wrote a blog post entitled "Living with Uncertainty in the Covid-19 Era." Scientists have learned a lot about COVID since that was written, but the guidelines we offered then are still valid, both with respect to COVID and to many other issues in which uncertainty is significant (including the efficacy of various strategies for limiting climate change or the escalation pathways associated with the world's many ongoing military confrontations).

In that article, we said that successfully living with uncertainty "will require us to develop the skills needed to: 1) distinguish trustworthy from untrustworthy information; 2) understand the practical implications of that information; and 3) encourage and support those who produce trustworthy, understandable, and useful information." While we can't expect to be completely successful, we can do much to protect ourselves by taking steps to avoid a few simple cognitive traps that relate to the ways we think about risk and uncertainty. Those include:

The Narrowcasting Trap

This is the trap we talked about earlier: getting all our information from a narrowly focused cluster of sources all catering to the cognitive biases and social prejudices of a particular group. In the earlier post we argued "We would all be a lot better off if we would look outside of these narrowcast information bubbles and seek out and honestly consider credible views offered from other perspectives."
 

The Contradictory Expert Trap

There are now so many experts saying so many different and contradictory things that it's easy to conclude that experts don't really know anything, so you might as well go with your gut-level (and generally self-serving) assessment of the situation. Unfortunately, once you do this, you pretty much lose all of the potential benefits of genuine expertise. Much better to listen to experts from different, credible sources and compare what they say. What makes sense to you? What doesn't? (This is why it is so very important that experts write and talk in language ordinary people can understand.) What evidence do the scientists use in forming their opinions? Who do others find most trustworthy? In the many cases where competing arguments are too complex for you, as a layperson, to follow, focus on experts who seriously engage competing views and explain the rationale behind their conclusions. And stay away from those who simply demonize and scoff at anyone who offers a dissenting opinion. Be open to the possibility that your previous understandings might be wrong, and that even things "experts told you" in the past might be wrong now. As we said before, science develops. Knowledge gets better. People learn. So don't immediately dismiss scientists — or politicians, for that matter — who "flip-flop." Rather, trust them more for being willing to learn.

The Disinformation Warfare Trap

As we discussed in several previous posts, many "bad-faith actors" are using the strategy Steve Bannon described as "flooding the zone with sh*t." The RAND Corporation has used the term "firehose of falsehood." Practitioners of this kind of information warfare use all sorts of high-tech tools (and now AI) to flood our newsfeeds with often completely fictitious stories of experts saying a full range of contradictory things. They are not trying to sell a particular point of view; they are trying to discredit the whole idea that there is an objectively determinable reality. The goal of this approach is to totally discredit both experts and the media, leaving political leaders free to spin whatever fiction they think their constituents will find most attractive. Combating this sort of thing is proving to be extremely difficult at the society-wide level. But at the individual level, things are easier. All one has to do is understand the game that is being played, discount information coming from those sources, and seek out more reliable sources of information — sources that, as a general rule, score pretty high on the Media Bias Chart.

The Risk-Balancing Traps

This is actually a whole set of traps. One is the "zero-risk bias," which arises from our preference for absolute safety. We tend to opt for situations that promise to completely eliminate risk, even when that promise is not credible. We also tend to underestimate the risk of activities over which we think we have control (such as driving a car) in comparison to activities in which we are dependent on others (such as riding in a plane). This is why many people, like me, get nervous flying, but not driving, even when accident statistics show driving is far riskier.

This cognitive bias is exacerbated by a gigantic "worry-industrial complex" that specializes in injecting into our newsfeeds lots of click-bait stories about things that aren't really all that dangerous, but are made out to be so. The fear mongering about vaccines — not only COVID, but flu, measles, and others that have been shown to be safe for years — is one example. So is the fear of nuclear power, which is so extreme that a very promising approach to managing climate risk is completely off the table — so much so that existing nuclear power plants are being decommissioned and replaced with much dirtier coal-burning plants.

Another risk-balancing trap is the "cautious shift," in which, when faced with uncertainty, decision makers make the most cautious decision possible. Sometimes this makes sense. But other times, elected officials try to prove their superiority by demonstrating that they will do more than their rivals to protect their constituents. Trouble arises when this escalates into a bidding war that ratchets up protective measures to the point where they cost enormous amounts of money while producing modest, or sometimes insignificant, levels of risk reduction. They may even do significant damage. In the US, a particularly spectacular example of this phenomenon is the race to appear tougher on crime than one's political opponents. During the 1980s and 90s, this produced a "mass incarceration society" with a 500% increase in prison populations. This greatly harmed both the inmates (who might not have been incarcerated in an earlier era) and their families. In addition, the whole society lost potentially productive members, turning them instead into wards of the state whose upkeep costs taxpayers tens of thousands of dollars a year.

However, it is also possible to fall into the trap of the "risky shift," when people compete by trying to show that they are braver than their opponent (who is painted as a coward).  This is the kind of bravado that can easily lead to unnecessary and tragic war.

A third risk-balancing trap is what we call "tunnel vision." This involves focusing entirely on one risk, while ignoring others. There are stories, for example, about people who were so worried about catching COVID that they didn't go to the hospital when they had a heart attack. There are also people who see climate change, racial reckoning, deteriorating support for traditional Christian values, Chinese expansionism, or a host of other issues as the defining issue of our time — an issue to which all other issues must be subordinated. But successful societies have to be able to balance efforts to deal with these often conflicting issues simultaneously. This is an inevitably messy process that, at best, produces imperfect solutions.

Still, focusing on only one issue leaves you exposed to a lot of other dangers that are proceeding apace while you are focused entirely on something else. (Israel's focus on its internal conflict over judicial reform and on attacks on settlers in the West Bank, for example, led it to fail to adequately address the growing threat in Gaza.) While we should expect different individuals and institutions to focus on the types of risks that they care most about, we, as a society, need to think carefully about how best to balance those risks in particular situations.

The Delay/Default Trap

Another common and frequently problematic response to uncertainty is to delay decisions about how best to respond until we learn enough to reduce uncertainties to the point where the wisest course of action becomes clear. The problem here is that these decisions aren't really being deferred. Instead, people are making an unconscious decision to pursue the default alternative — the continuation of business-as-usual practices, which can often be worse than any of the response options under consideration. For example, as we wrote in 2021 (and it is still true), some people delayed getting the COVID vaccine until it was firmly proven to be safe and effective. While the desire to further evaluate the safety and effectiveness of the vaccines is understandable, it needs to be balanced against the known and quite substantial risk of illness, disability, and death posed by COVID.

A similar issue revolves around responding to climate change. Many on the right argue that we are still not sure about the effects of climate change, and that until we are sure, we shouldn't do anything. Of course, the longer we delay, the worse we almost certainly make those effects. So delaying is actually a policy choice, and often a worse one than any of the alternatives under consideration. There are many times when society's decision-makers have to make important decisions on the basis of incomplete information — information that further investigation may well prove to be erroneous. We have no choice but to look at the available evidence and make a good-faith judgment.

Uncertainty, Not Risk 

It is important to distinguish between risk and uncertainty. With risk, the possible outcomes and the probabilities associated with each outcome are known with sufficient accuracy to make it possible to generate actuarial tables. This is what makes insurance a viable business. In uncertain situations, by contrast, we don't know the probabilities associated with each possible outcome, or even all of the possible outcomes. The truth is that most of society's big conflicts involve a high degree of uncertainty — what former Defense Secretary Donald Rumsfeld used to call "unknown unknowns." This is, to a considerable degree, what makes "wicked problems" wicked.
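For readers who find a worked example helpful, here is a minimal sketch of the distinction in Python. It is our own illustration, not anything from the insurance industry; all of the outcomes, probabilities, and dollar figures are hypothetical.

```python
# RISK: outcomes and their probabilities are known, so an insurer can
# compute an expected loss and price a premium around it.
outcomes = {          # hypothetical annual claim sizes (dollars) -> probability
    0: 0.95,          # no claim
    10_000: 0.04,     # moderate claim
    100_000: 0.01,    # severe claim
}
expected_loss = sum(loss * p for loss, p in outcomes.items())
print(f"Expected annual loss under known risk: ${expected_loss:,.0f}")

# UNCERTAINTY: we don't know the probabilities, and may not even know all
# the possible outcomes. The best we can do is bound the damage across the
# scenarios we can imagine -- knowing that the list is incomplete.
scenarios = {"mild": 5_000, "moderate": 50_000, "severe": 500_000}
print(f"Known scenarios range from ${min(scenarios.values()):,} "
      f"to ${max(scenarios.values()):,}; unknown unknowns may fall "
      "outside that range entirely.")
```

The first half is what an actuarial table makes possible; the second half is all that remains once the probabilities (and some of the outcomes) are unknown.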

Much of this uncertainty is due to the fact that the social system is so big and so complex, with so many things happening at once, that it is very difficult to figure out what causes what. This results in the "fuzzy feedback" problem that we talked about in an earlier post in this series. Still, we are not powerless in the face of such uncertainty. We do know enough to significantly improve the odds of positive outcomes. We just need to apply that knowledge. As Kenneth Boulding used to say, "while we should be prepared to be surprised about the future, we don't need to be dumbfounded."

Flexibility

Beyond avoiding the above traps, the most important strategy for dealing with uncertainty is staying flexible.  The further you look into the future, the more uncertain things become. This means that decisions that permanently lock you into a particular course of action can be particularly dangerous.  Long-term decisions should be made with the understanding that you will monitor the situation carefully and, if things start going in an unexpected direction, you can make appropriate adjustments.  This enables you to take advantage of new information as it becomes available and respond to unexpected actions taken by others (which may create unanticipated threats or opportunities).  

As one example, systems thinkers have argued that providing such flexibility is one of the big keys to improving the effectiveness of peacebuilding efforts. In Making Peace Last, Rob Ricigliano talks about the PAL (planning-acting-learning) cycle, in which you study the conflict's structure, attitudes, and transactions (which he calls the SAT model) and, based on that, you plan an intervention that you think will work, and you carry it out. (That's the planning-acting part.) Then you watch what happens, and you learn more about the system (part 3 of the PAL approach). You see what worked as expected and what didn't, and on the basis of that new knowledge, you tweak your original plan or develop an entirely different response, as needed. He urges funders to give peacebuilders sufficient flexibility in their grant requirements that they can make such adjustments.
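For those who think in code, here is a minimal sketch of the PAL cycle as an adaptive feedback loop. This is our own gloss, not code from Making Peace Last; the function names and the toy "tension" numbers are hypothetical stand-ins for a real conflict assessment.

```python
def plan(model):
    """Plan: based on the current understanding, pick an intervention."""
    return {"intensity": model["estimated_tension"] * 0.5}

def act(intervention):
    """Act: carry out the intervention; the system responds imperfectly."""
    return {"observed_tension_drop": intervention["intensity"] * 0.6}

def learn(model, results):
    """Learn: update the understanding of the system from what happened."""
    model["estimated_tension"] -= results["observed_tension_drop"]
    return model

model = {"estimated_tension": 10.0}   # initial, imperfect picture of the conflict
for cycle in range(1, 4):
    results = act(plan(model))        # Plan, then Act
    model = learn(model, results)     # Learn, then re-plan next cycle
    print(f"Cycle {cycle}: estimated tension now {model['estimated_tension']:.1f}")
```

The point of the loop structure is the one made above: the plan is never locked in; each cycle re-plans against an updated understanding of the system.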

A last point of advice from our May 2020 blog post still applies: "We must resist the temptation to evaluate decision-makers (and ourselves) based on outcomes. Instead, we should judge decision-makers on the basis of the information that they had when they made their decision. To do otherwise risks placing too much confidence in reckless decision-makers who 'luck out' and not enough confidence in prudent decision-makers who just weren't so lucky."


Please Contribute Your Ideas To This Discussion!

In order to block bots, spammers, and other sources of malicious content, we are asking contributors to send their contributions to us directly. If your idea is short, with simple formatting, you can put it directly in the contact box. However, the contact form does not allow attachments. So if you are contributing a longer article, with formatting beyond simple paragraphs, just send us a note using the contact box, and we'll respond via an email to which you can reply with your attachment. This is a bit of a hassle, we know, but it has kept our site (and our inbox) clean. And if you are wondering, we do publish essays that disagree with or are critical of us. We want a robust exchange of views.

Contact Us


About the MBI Newsletters

Once a week or so, we, the BI Directors, share some thoughts, along with new posts from the Hyper-polarization Blog and useful links from other sources. We used to put this all together in one newsletter which went out once or twice a week. We are now experimenting with breaking the Newsletter up into several shorter newsletters. Each Newsletter will be posted on BI, and sent out by email through Substack to subscribers. You can sign up to receive your copy here and find the latest newsletter here or on our BI Newsletter page, which also provides access to all the past newsletters, going back to 2017.

NOTE! If you signed up for this Newsletter and don't see it in your inbox, it might be going to one of your other email folders (such as Promotions, Social, or Spam). Check there or search for beyondintractability@substack.com, and if you still can't find it, first go to our Substack help page, and if that doesn't help, please contact us.

If you like what you read here, please ....

Subscribe to the Newsletter