Decisions, decisions

To quote a former Secretary of Defense notable for a lack of insight and introspection, there are always ‘unknown unknowns’ when going into a situation. But a retrospective analysis sometimes shows clear signs of error, hubris, or willful incompetence, leading us to ask: “What were you thinking?”

From the world of backcountry skiing comes the term ‘heuristic traps’, coined to describe how humans become ensnared in a cascade of avoidable mistakes. Such traps seduce us into accepting risks unwisely. The six key characteristics of such traps related to outdoor adventures can be found on the websites of Wilderlife magazine (New Zealand) and this outdoors shop in the UK.

Avalanches are not the only way to get into trouble outdoors, but in popular backcountry skiing areas avalanche conditions are monitored closely, making it possible to check what specific signs of risk were present prior to an accident. The reference work on this was done by Ian McCammon in “Heuristic Traps in Recreational Avalanche Accidents: Evidence and Implications” (Avalanche News, No. 68, Spring 2004).

McCammon observes that we are prone to skew our judgement because other humans are around. Studying groups of skiers involved in avalanches, McCammon found that risk ‘exposure’ was more likely when experts led groups on familiar terrain and ignored evident risks; when groups were led by someone with avalanche awareness but little training or experience; when groups included both men and women; and when groups lacked expertise or effective decision-validation rules.

Given a choice, most of the avalanche victims in this study apparently opted for the quick decision tool, even though it was not universally correct. Thus the challenge for educators is to offer practical alternatives to heuristic traps.

Ian McCammon, “Heuristic Traps in Recreational Avalanche Accidents”

People fall into heuristic traps because ‘knowledge-based decision strategies’ (in McCammon’s words) require careful evaluation and judgement, but we tend to like fast, simple decisions. It should be noted that he focused on accidents that took place where multiple signs (sometimes literal) of avalanche risk were present. The preponderance of objective evidence should have been clear.

Below I’ve reinterpreted McCammon’s ‘traps’ or warning signs for more general application:

  • Familiarity or prior knowledge contributes to normalizing the abnormal (‘I’ve seen things similar to this and it’s always been OK’).
  • Reluctance to give up on an objective causes us to ‘stay the course’ (good money goes after bad).
  • Fear of looking like a wimp, particularly among men.
  • Tendency to believe someone who seems sure of themselves, or a drive to behave decisively (we like to leave decisions to someone else, leaders fail to seek input/consensus, someone feels compelled to be decisive).
  • Seeing others do it means it must be OK.
  • Competitive instinct roused by scarcity (‘this is the last chance’) or bragging rights (‘we can’t let them be first’).
  • Our excitement/anxiety regarding performance is exaggerated in groups.

Rephrased this way, heuristic traps seem logical and commonsensical. Attention, status, and group connections are resources we prioritize all the time, but human habits and frailty can make our choices unlucky.

McCammon’s study is compelling because of the consequences of choosing wrongly. Studying the decision-making process prior to an avalanche accident is thrilling in the way one might read about the lead-up to an airplane crash or the wildly inappropriate invasion of another country. Stories of this type tend to unfold as a series of missed cues and opportunities.

In 2012 there was a graphically rich article in The New York Times called “Snow Fall”. Outside did one on a 2013 avalanche at Loveland Pass, Colorado. And then another piece appeared in the NYT called “What I Learned in Avalanche School”, specifically about ‘heuristic traps’.

Even though several members of the eight-person party had previously triggered avalanches and even, in one case, witnessed a fatality, they had ignored the many obvious dangers, suggesting that these skiers had become dazzled by their own expertise, falsely brightened by luck.

“What I Learned in Avalanche School” by Heidi Julavits

In McCammon’s study, group size played a role, with groups of four doing statistically better than groups of two or groups of six to ten. One might infer that groups need a minimum number of viewpoints to see risks clearly, but that decision-making becomes subject to group pressure in larger parties.

Interesting stories to read, not so much fun to live through.

Take a beat to…

Trust but verify

Russian proverb

Consider the heuristic trap as an epistemic problem: how do we actually know what we think we know? It’s all about verifying or validating knowledge, judgement, and belief.

A guest blog post on Wildsnow offers actionable ways to improve decisions while skiing in groups. It might not check all the boxes that Ian McCammon would recommend, but one of the key things it does is open up decision-making to a form of validation by the members of the group. I think it’s practical to do on snow with a group, and certainly better than not reading the team at all.

Whether heuristic or epistemic, this is about what goes on inside our heads, so naturally it’s studied by psychologists. A couple of examples:

“Beyond Fiasco: A Reappraisal of the Groupthink Phenomenon and a New Model of Group Decision Processes” (Psychological Bulletin, 1993, Vol. 113, No. 3, 533-552) reviews the model of vulnerabilities in ‘groupthink’ developed by Irving L. Janis. The vulnerabilities in group decisions are group cohesiveness, lack of organized leadership, and a provocative situational context (something that drives urgency).

Groupthink refers to a deterioration of mental efficiency, reality testing, and moral judgment that results from in-group pressures

Janis, I. L. (1982). Groupthink (2nd ed.). Houghton Mifflin.

“Conditions for Intuitive Expertise: A Failure to Disagree” (American Psychologist, 2009) is a kind of ‘shoot-out’ between two models of expert judgement: ‘heuristics and biases’ versus ‘naturalistic decision making’. Among the conclusions: humans have the best chance to acquire expert intuition in conditions of ‘high validity’ (where there is a clear and predictable linkage between external cues and the outcome of actions); experts often cannot articulate their intuition; and it’s difficult for experts to know when conditions have changed and their expertise no longer has validity. One of the authors is the Nobel laureate Daniel Kahneman.

Subjective confidence is therefore an unreliable indication of the validity of intuitive judgments and decisions.

“Conditions for Intuitive Expertise: A Failure to Disagree”, Daniel Kahneman and Gary Klein, American Psychologist, September 2009

Clearly there’s a tension between what we believe about our judgement and the methods we use to validate that belief. Heuristic traps don’t exactly match what I know of Kahneman’s ‘fast vs. slow thinking’, but both are about how we determine the urgency of risk and make guesses, good and bad.

There are seductive and reductive motivations at the base of the heuristic trap, fitting it into a zone between ‘fast’ and ‘slow’ thinking. As the word ‘heuristic’ implies, we learn to reduce the complexity of decisions for the sake of efficiency. And the more seductive the answer, the greater the motive to confirm the least costly-seeming judgement.

Imperfect by nature

Humans devote about 20% of our energy just to supporting our brains. And what do we get for it? McCammon and Kahneman imply that people can be bad judges of how well we’re thinking. On top of that, taking time to ponder (Kahneman’s ‘slow thinking’) is costly, and people don’t like doing it, especially when there’s fresh powder in that bowl beyond the dangerously overhanging cornice. As McCammon drily notes in his conclusion, “most of the avalanche victims in this study apparently opted for the quick decision tool, even though it was not universally correct.”

McCammon’s point is that the failure to recognize how we’re reacting to the people around us can be a hazard just as real as an avalanche condition.

Our ability to act as a group organism can enhance our survival through the addition of individual skills. At the same time, that advantage is subverted by an inability to know the state of our own minds, or the extent to which our reasoning is affected by the mere presence of others. We want to make an impression on others and we’re willing to cut corners, statistics be damned.

Did you play well?

Our desire not to disappoint others or ourselves can and does backfire. Whether at the group or personal level, statistics tells us that in some percentage of cases we’ll fail. In the case of backcountry skiers in an avalanche zone, the cost could be death. For a professional ski racer, prize money is involved, but the most painful loss is to self-esteem. Sometimes we put too much weight on achieving a specific outcome, or on outplaying a rival.

Alexander Bolshunov after failing to win the World Championship 50k classic in Oberstdorf, March 2021 (via YouTube)

Competition is at its most dramatic when there is a heightened risk of failure. It is regrettable that reality TV and sporting events purposely dial up that potential. An alternative might be to practice a process of accepting a range of outcomes. For example, Jessie Diggins and the US Ski Team try to emphasize how well the team is working even when that work doesn’t result in a podium. Having holistic goals seems like a sound backup plan.

Not to go all Buddhist about it, but desire does seem to be at the root of unhappiness. For me there’s a unique form of satisfaction when I’m skiing in the moment, without expectation of a particular outcome. That’s not to say any random outcome is fine, but I’ve chosen a scenario for skiing in which a range of outcomes is possible. I can be happy with it as circumstances arise, or as feelings dictate.