Risky Business

Risk Assessment and Operational Decision-Making in the Navy

Risk Assessment in the Navy

Spotting and seizing risk is central to the art of war. Yet like fire, risk misjudged, misunderstood, or misperceived can be catastrophic. The recent mishaps in Seventh Fleet illustrate this tension and prompted critical reflection on risk assessment within the Navy. The “Comprehensive Review of Recent Surface Fleet Incidents” identified “Human Performance Factors” as a systemic problem that contributed to the incidents.[i] In particular, risk assessment was a recurring issue, with the report noting deficiencies “in the leader’s ability to identify, mitigate, and accept risks.”[ii] As these incidents have painfully reminded the public, naval operations are inherently dangerous and constantly require leaders to make decisions under risk and uncertainty. Such decisions push the limits of human cognition and judgment. Overseeing the incident response, Vice Chief of Naval Operations (VCNO) Admiral William Moran argued that the central challenge moving forward will be “how do we train people to assess risk and take the appropriate action given the circumstances?”[iii] This is a challenge shared by all warfare communities. We must look beyond these tragic events, seek new concepts, and leverage the lessons of history to improve risk assessment across the force.

Contemporary research in behavioral psychology can help us meet this challenge. Researchers have developed reliable models that explain decision-making under risk and uncertainty. Two in particular, the Prospect Model and the Rubicon Model, predict when, how, and why human judgment will fail. They are powerful concepts reshaping the field of economics—work that earned the 2002 and 2017 Nobel Prizes in Economics. Integrating these concepts into our existing risk management programs, training, and planning is a first step in answering the VCNO’s challenge.

Two Models of Decision-Making

The notion that human decision-making is degraded in specific circumstances is not unfamiliar. Aspiring aviators long have been taught to constantly self-assess how stress, fatigue, and illness may impede their judgment and ability to safely fly the plane. Aviation history is littered with examples of flight crews misjudging risk and making seemingly irrational decisions that end in disaster. Unable to handle the sheer complexity of a given situation, the human mind uses mental shortcuts to lessen the cognitive load and come to actionable decisions. In daily life, these shortcuts help us preserve precious mental resources.[iv] In extreme situations, however, these cognitive shortcuts can introduce predictable errors in judgment and risk assessment. Such errors become more acute under stress. Two models are most relevant to operational decision-makers in the Navy: the Prospect Model and the Rubicon Model.

Prospect Model: The Prospect Model explains individual choice under conditions of risk. It posits that people are more sensitive to changes in value than to absolute amounts of value. People are “loss averse,” meaning they care more about a loss than about an equivalent gain. Losses and gains are determined relative to an arbitrary reference point. Placement of the reference point depends heavily on how a choice is presented or a decision is framed, rather than on data about the actual costs, rewards, or consequences. This means risk assessments are influenced by the framing of an issue by seniors or subordinates. When probability is factored in, people tend to overweight outcomes they regard as certain and underweight those that are merely probable. The cumulative effect is that people will take significant risks in the hope of avoiding a minor certain loss, even if that choice exposes them to greater potential loss.[v]
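For readers who want the formal statement, here is a minimal sketch of the standard formalization from Tversky and Kahneman’s cumulative prospect theory, on which the Prospect Model draws. Outcomes are valued relative to the reference point by an asymmetric value function, and stated probabilities are distorted by a weighting function:

```latex
% Subjective value of an outcome x measured from the reference point
v(x) =
  \begin{cases}
    x^{\alpha}              & x \ge 0 \quad \text{(gains)} \\
    -\lambda\, (-x)^{\beta} & x < 0   \quad \text{(losses)}
  \end{cases}
\qquad
% Decision weight attached to a stated probability p
w(p) = \frac{p^{\gamma}}{\bigl(p^{\gamma} + (1-p)^{\gamma}\bigr)^{1/\gamma}}
```

Tversky and Kahneman’s median parameter estimates (α ≈ β ≈ 0.88, λ ≈ 2.25, γ ≈ 0.61 for gains) capture the effects described above: λ > 1 makes losses loom roughly twice as large as equivalent gains, exponents below 1 produce diminishing sensitivity the farther an outcome sits from the reference point, and the weighting function overweights small probabilities while underweighting moderate and large ones.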

To illustrate the Prospect Model, consider the following hypothetical: a ship is scheduled to depart port at 0800, but a critical piece of navigation equipment is inoperable. Repairing the equipment would delay departure. To the decision maker, the certain, acutely felt, quantifiable lost time will loom larger than the potential, diffusely felt, and marginal gains in capability or safety of the repair. The Prospect Model predicts that individuals will take greater (perhaps even unacceptable) risks, such as operating without the navigation equipment, than they would otherwise in order to avoid certain losses in time.
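A back-of-the-envelope calculation shows how the model produces this prediction. The sketch below, in Python, uses Tversky and Kahneman’s published median parameter estimates; the outcome magnitudes and probabilities are invented purely for illustration:

```python
# Prospect-theory sketch of the departure hypothetical. The value-function
# parameters are Tversky & Kahneman's (1992) median estimates; the outcome
# magnitudes and probabilities below are hypothetical, for illustration only.

ALPHA = 0.88   # diminishing sensitivity: the value curve flattens away from 0
LAMBDA = 2.25  # loss aversion: a loss weighs ~2.25x an equivalent gain

def value(x: float) -> float:
    """Subjective value of outcome x, measured from the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

def prospect_value(option):
    """Sum of probability-weighted subjective values over (p, outcome) pairs.
    (Raw probabilities are used here; the full model would also pass them
    through a weighting function.)"""
    return sum(p * value(x) for p, x in option)

def expected_outcome(option):
    """Plain expected value of the outcomes, for comparison."""
    return sum(p * x for p, x in option)

# Reference point: an on-time 0800 departure with all equipment working.
# Option A: delay and repair -- a certain, modest schedule loss.
repair = [(1.00, -10.0)]
# Option B: sail without the nav gear -- usually costs nothing, but carries
# a small chance of a severe mishap.
sail = [(0.95, 0.0), (0.05, -250.0)]

for name, option in (("Delay and repair", repair), ("Sail without gear", sail)):
    print(f"{name:17s}  expected outcome {expected_outcome(option):+7.1f}"
          f"  prospect value {prospect_value(option):+7.1f}")

# Approximate output:
#   Delay and repair   expected outcome   -10.0  prospect value   -17.1
#   Sail without gear  expected outcome   -12.5  prospect value   -14.5
# The gamble is worse in expectation yet scores better subjectively:
# diminishing sensitivity makes the certain loss loom disproportionately
# large, pushing the decision maker toward risk in the loss domain.
```

The particular numbers do not matter; the shape of the value function does. Whenever a certain, immediate loss competes against a long-shot but larger loss, the model predicts a pull toward the gamble.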

Rubicon Model: The Rubicon Model explains discrepancies in risk assessment and the sources of overconfidence in decision makers. Its primary insight is that people tend to adopt different risk assessment mindsets before and after a decision is made. Prior to a decision, people adopt a “deliberative phase” mindset in which they carefully consider possible alternative options and more accurately assess their consequences. After a decision is made, the mere act of mentally “crossing the Rubicon” shifts their mindset to an “implementation phase.” In the implementation phase, people become more susceptible to tunnel vision and self-serving illusions and are prone to selectively interpret information that affirms their decision.[vi] Importantly, Rubicon effects occur regardless of whether a course of action is chosen by the individual, assigned by a senior, or perceived as inevitable due to circumstances—it is committing to a decision that counts. Rubicon effects are more pronounced in an organization like the Navy that prides itself on “getting the job done.” The Comprehensive Review specifically found that the “can-do” attitude characteristic of the implementation mindset caused an “organizational drift from the deliberate processes used to manage time, resources, rest and a commitment to safety,” resulting in excessive risk taking during relatively benign operations.[vii]

Taken together, the Prospect and Rubicon models formalize an understanding of cognitive pitfalls and tricks of the mind known intuitively to leaders from time immemorial. The point is not to paralyze leaders with constant second-guessing of their decisions. Warfare demands decisive leadership, and fortune rewards leaders who take calculated risks. Spotting and seizing the right risks always will be a matter of intuition, gut feeling, and the Clausewitzian coup d’œil. Such is the nature of military genius that makes war an art rather than a science. Nevertheless, honing this intuition starts with an understanding of self. Enhanced training will show leaders of all ranks their cognitive blind spots before they make a decision that results in catastrophe. Greater awareness will allow leaders to learn and adjust accordingly, just as pilots use trim to subtly adjust their aircraft for smooth flight.

The incidents in Seventh Fleet prompted public discussion of risk assessment in the Navy. We must apply these concepts and leverage the lessons of history to meet the VCNO’s challenge. To demonstrate the Prospect and Rubicon models applied to decision-making, we revisit an older, better-documented mishap—the deadliest in civil aviation history.

Tenerife Revisited

On 27 March 1977, two fully loaded 747 jumbo jets collided at Tenerife in the Spanish Canary Islands. KLM Flight 4805, commanded by veteran pilot Captain Jacob Van Zanten, slammed into Pan Am Flight 1736 while attempting takeoff. Disregarding warnings from the tower, Pan Am, and his own crew, Van Zanten initiated takeoff while the Pan Am flight was still on the runway. The accident killed nearly 600 people and remains the deadliest in aviation history.

Yet 40 years on from Tenerife, a central question remains unanswered. As articulated by the official 1978 Spanish government disaster report:

How is it possible that a pilot with the technical capacity and experience of CAPT Van Zanten whose state of mind during the stop over at Tenerife seemed perfectly normal and correct, was able, a few minutes later, to commit a basic error in spite of all the warning repeatedly addressed to him?[viii]

Van Zanten was ultimately found to be primarily responsible for the disaster. His decision-making is all the more baffling given his standing at KLM. Van Zanten was the airline’s most seasoned pilot, an instructor with thousands of hours of flight experience who had even appeared in its worldwide advertising campaign. In a cruel twist of irony, the airline initially attempted to contact Van Zanten to lead the Tenerife disaster investigation before realizing he too was among the fallen. Why Van Zanten made such a reckless mistake is an enduring puzzle. Nevertheless, it is a puzzle that can be better understood by applying the Prospect and Rubicon models of decision-making.

The recovered cockpit voice recorder reveals that Van Zanten made two fateful decisions that caused the accident. The Prospect Model can explain his initial decision to take off without positive confirmation that Pan Am 1736 was clear of the runway. The Rubicon Model can explain his subsequent decision to overrule his crew’s protestations and proceed with the takeoff.

Decision One—Initiating takeoff: At 1706 and 11 seconds, KLM 4805 began its takeoff maneuver on Van Zanten’s order.[ix] An agitated Van Zanten announced “let’s go . . . check thrust” and began applying takeoff power.[x] The takeoff roll was quickly halted by the KLM first officer, who urgently protested that the Pan Am flight was not clear of the runway. Relenting momentarily, Van Zanten impatiently snapped at the first officer to confirm their takeoff clearance and that the Pan Am jet was clear.

Explaining this decision requires looking at the hours preceding the crash. Neither the KLM nor the Pan Am flight should have been at Tenerife that day. The aircraft were diverted from their intended destination due to a terrorist attack, setting them both hours behind schedule. The delay was putting Van Zanten’s crew at risk of exceeding strict duty time limitations. If exceeded, a fresh crew would have had to be flown out from the Netherlands, further delaying his passengers and inconveniencing his airline. Furthermore, weather conditions were beginning to deteriorate at Tenerife. If conditions fell below KLM safety minimums, the delay waiting for weather to clear would almost certainly have caused Van Zanten to exceed duty limitations. Van Zanten felt the pressure to get airborne.

Van Zanten was in a classic situation explained by the Prospect Model. With every passing minute, a certain loss of time, money, and personal reputation loomed larger than any potential gains in safety from clarifying Pan Am’s location, waiting for clearer weather, or following proper communications procedures. Indeed, the Spanish report notes “a growing feeling of tension as the problems for the captain continued to accumulate.”[xi] Others sensed Van Zanten’s anxiety that day from his irritated radio calls. Moments before the collision, the Pan Am captain nervously joked “let’s get the **** out of here,” to which the Pan Am first officer responded “Yeah, he’s anxious isn’t he . . .”[xii] It was this perception of a certain looming loss that pushed Van Zanten to take greater and greater risks, ultimately ending in disaster.

Decision Two—Disregarding crew warnings: While Van Zanten’s initial decision to take off almost led to catastrophe, redundant safety processes worked as intended and the first officer halted the takeoff. Unfortunately, it was only a fleeting reprieve. Though the KLM first officer repeatedly attempted to confirm the position of Pan Am, confusion lingered in both cockpits and the tower—a dense fog had set in, and no one knew exactly where the other aircraft were on the field. Still without confirmation or takeoff clearance, Van Zanten again initiated takeoff.[xiii] With the plane accelerating, this time the KLM flight engineer protested. At 1706 and 32 seconds he urgently asked, “Is he not clear, then?” and repeated two seconds later, “Is he not clear, that Pan American?”[xiv] Brushing aside the concerns, Van Zanten responded at 1706 and 35 seconds with an emphatic “Oh, yes!”[xv] From that moment all further questioning ceased, for the captain had made his decision. Thirteen seconds later, the KLM flight emerged from the fog at nearly 140 knots to find the Pan Am jumbo jet still halfway down the runway. Shrieking in horror at the sight, Van Zanten jerked the nose up in a futile attempt at an early takeoff. It was too late; the fate of nearly 600 souls was sealed.

Even in the 1970s, investigators had a hunch that Van Zanten’s disregard of his crew’s warnings was evidence of a peculiar cognitive phenomenon at play. The Air Line Pilots Association Study Group on the incident developed a working definition of something they termed the “filter effect,” defined as “the peculiar manner in which an individual screens and rejects or admits to the brain incoming physical stimuli.”[xvi] The concept was rudimentary and ill-defined, but it was a valiant effort by the Study Group to get their heads around how Van Zanten could disregard such clear and acknowledged warnings from his own crew. Today, the filter effect is better expressed by the Rubicon Model, and we can use it to explain what the Study Group could not.

When Van Zanten made his initial takeoff attempt, he made up his mind and shifted into an implementation mindset. This shift had several deleterious effects on his risk assessment abilities: it decreased his receptivity to incoming information, biased his processing of whatever information did get through, made him vulnerable to self-serving evaluations that validated his decision, made him more prone to illusions of control over his situation, and made his expectations of task completion more optimistic, limiting his ability to imagine the downsides of his decision to take off.[xvii] Doubtless, these cognitive biases were only compounded by Van Zanten’s sense of self as KLM’s most distinguished pilot. The cumulative effect was an unfounded overconfidence that fueled his cavalier and blasé attitude toward the warnings of his own crew.

To understand and explain Van Zanten’s decisions neither absolves nor indicts him. Like all disasters, Tenerife was the amalgamation of numerous factors, including unfavorable weather, an inept and overwhelmed air traffic control, Dutch social mores about deference to seniority, Pan Am’s confusion, and plain old bad luck. Nevertheless, it was ultimately Van Zanten’s feet on the brakes and hands on the throttle. To explore Van Zanten’s decisions using the Prospect and Rubicon models is to hold a mirror up to our own ship bridges, cockpits, and watch floors. If it can happen to him, it can happen to us, and it is the height of folly to presume otherwise. Staying left of boom requires us to integrate these concepts and proactively improve decision-making force-wide.

Three Lines of Effort: Training, Operations, and Culture

We can sustain change and derive the most value from the Prospect and Rubicon models by integrating them across three lines of effort: training, operations, and culture.

Training: Following the Tenerife disaster, airlines and the naval aviation community institutionalized its lessons by adopting a program called Crew Resource Management (CRM) to “minimize crew preventable errors.”[xviii] CRM’s success resulted in its expansion to surface combatants with the advent of Bridge Resource Management (BRM). In both CRM and BRM, “decision-making” is one of seven critical skills taught to crews. The Prospect and Rubicon models could be integrated by adding them to this existing decision-making skillset. Folding the models into existing programs would make for quick uptake throughout the force. During annual CRM/BRM training, crews could be taught to identify and mitigate known cognitive biases in the same way they do for factors like fatigue and stress. Awareness is half the battle.

Alternatively, the Prospect and Rubicon models can be integrated into the existing Operational Risk Management (ORM) program. ORM’s guiding instruction, OpNavInst 3500.39C, already specifies known “Risk Assessment Pitfalls” to be avoided. Among these pitfalls are “over-optimism,” defined as “not being totally honest,” and “misrepresentation,” defined as an “individual perspective [that] may distort the data.”[xix] While it is valuable that they are noted, the pitfalls described in the instruction are vague and ill-defined, making them less useful to front-line operators. What does over-optimism look like? When should it be most expected? How can it be mitigated? The instruction and ORM program could refashion the “Risk Assessment Pitfalls” around the insights of the Prospect and Rubicon models to answer these questions. Rather than a vague notion that they should avoid overconfidence, crews would have a precise understanding of what causes overconfidence, how it skews risk assessment, and specific measures to mitigate it. Such precision in our concepts will be essential to sustain implementation in high operational tempo (OpTempo) environments.

Operations: Operationally, two insights follow from the Prospect and Rubicon models. First, central to the Prospect Model is the idea that loss and gain are determined relative to an arbitrary reference point. This reference point is susceptible to manipulation through selective framing, which changes how a decision maker assesses the risk associated with a choice.[xx] If decision makers are constantly thinking in terms of losses because of unnecessary pressure from senior leaders, risk tolerances and risk taking will imperceptibly increase. The “can-do” attitude exacerbates this tendency because subordinates feel pressure to downplay risk or overestimate operational capacity to satisfy requests. As observed by retired cruiser commanding officer Captain Kevin Eyer, officers too often “suffer the consequences rather than say ‘no!’” even when it imperils their crews. Leaders can encourage more candid risk assessments by removing unnecessary pressure in how they frame decisions.

Second, the Rubicon Model suggests leaders should be conscious of the shift from a deliberative mindset to an implementation mindset in their crews. Awareness, however, should not be confused with wariness and anxiety. General Patton’s aphorism that a “good plan violently executed now is better than a perfect plan executed next week” applies here. Confidence, zeal, and self-possession are the founts of the “violence” to which Patton alludes. These are desirable traits in any evolution, and they flow partly from commitment to a decision and the accompanying shift in mindset. Leaders should nevertheless remain cognizant of the changing risk profiles of themselves and their crews. To mitigate Rubicon effects, leaders could designate spoilers who deliberately separate themselves psychologically from an evolution to preserve a more deliberative mindset.

Culture: The third line of effort focuses on moving the force from a risk-averse culture to a risk-savvy culture. Even before these incidents, there was a growing chorus of criticism that the Navy’s zero-defect culture was driving leaders to be excessively risk averse.[xxi] The consequences of this risk aversion are numerous: it saps the “Damn the Torpedoes!” mentality essential to seizing the initiative in war, it incentivizes micromanagement that limits growth opportunities among junior leaders, and it neuters valuable training in in-extremis scenarios like EM-spectrum-denied environments.[xxii] Yet risk aversion among senior leaders is understandable, as no one wants to be left holding the bag when money is wasted, equipment is damaged, or lives are lost. Dangerous and excessive risk taking is not generally conducive to an “EP” on your FitReps.

Critics of the Navy’s risk-averse culture achieved the crucial first step of identifying the problem. The service now must specify the solution and articulate its desired end state—risk-savvy sailors. Risk-savvy sailors are not those who follow a formulaic ORM flow-sheet—savviness denotes acuity, wisdom, and intangible intuition. Developing savviness requires sailors who are trained to understand and conceptualize risk, attuned to their own decision-making idiosyncrasies, and confident in their abilities thanks to realistic training and the crucible of experience. In striving for risk-savvy sailors, we are paddling against the current. Wider public culture is profoundly risk illiterate, and our members are products of a public education system that rewards risk aversion.[xxiii] Yet, just as with physical fitness and character, we expect and achieve higher standards through focused training and a shared culture of excellence. The same can be true of how we engage with risk.

In the annals of our great Navy’s history, we reserve special reverence for those leaders who demonstrate a rat-catching instinct for war. Boldness and initiative, the sense for when to press your advantage and when to consolidate your gains are amongst the highest virtues in war. This instinct is fundamentally predicated on an intimate and adroit understanding of risk. Looking past these incidents toward an era of renewed great power competition, we must train leaders to know when to advance a knight, when to defend a queen, and when to overturn the whole chessboard and punch the other guy in the face.

Endnotes

[i] Davidson, “Comprehensive Review of Recent Surface Force Incidents,” 97.

[ii] Davidson, 7.

[iii] Moran, “USNI Defense Forum: Remarks by Admiral William F. Moran, USN, Vice Chief of Naval Operations.”

[iv] Kahneman, Thinking, Fast and Slow, 12.

[v] Huddy, “Psychology and Foreign Policy Decision-Making,” 15.

[vi] Huddy, 13.

[vii] Davidson, “Comprehensive Review of Recent Surface Force Incidents,” 19.

[viii] Grela, “Report on the Accident Involving BOEING 747 PH-BUF of KLM and BOEING 747 N 736 PA of PANAM,” 128.

[ix] Federal Aviation Administration, “KLM Flight 4805 Collision with Pan Am Flight 1736 at Tenerife.”

[x] Federal Aviation Administration.

[xi] Grela, “Report on the Accident Involving BOEING 747 PH-BUF of KLM and BOEING 747 N 736 PA of PANAM,” 128.

[xii] Air Line Pilots Association, “Human Factors Report on the Tenerife Accident,” 52.

[xiii] Grela, “Report on the Accident Involving BOEING 747 PH-BUF of KLM and BOEING 747 N 736 PA of PANAM,” 57.

[xiv] Federal Aviation Administration, “KLM Flight 4805 Collision with Pan Am Flight 1736 at Tenerife.”

[xv] Federal Aviation Administration.

[xvi] Air Line Pilots Association, “Human Factors Report on the Tenerife Accident,” 21.

[xvii] Johnson and Tierney, “The Rubicon Theory of War: How the Path to Conflict Reaches the Point of No Return,” 15.

[xviii] Commander, Naval Air Forces, “COMNAVAIRFORINST 1542.7: Navy and Marine Corps Crew Resource Management Program,” 2.

[xix] Chief of Naval Operations, “OPNAV INSTRUCTION 3500.39C: Operational Risk Management,” 10.

[xx] Mercer, “Prospect Theory and Political Science,” 3.

[xxi] Tanalega, “Invest in Initiative.”

[xxii] Stefanus, “Embracing the Dark Battle.”

[xxiii] Gigerenzer, Risk Savvy, 14.
