I’ve sat through hundreds of navigation briefs as various control stations explain to the Captain and crew their role in safely taking a ship to sea. Likewise, I’ve sat through almost as many replenishment-at-sea briefs. Both have a significant risk management component; so much so, in fact, that operational risk management (“ORM”) is embedded in our culture to the point that it’s become commonplace and we have become complacent.

In those navigation or replenishment briefs there is an approved and lauded solution, typically provided by the local Afloat Training Group. And it’s fine. It just doesn’t do anything more than meet the criterion that ORM has been addressed.

But if all that’s being done is a “check in the block,” is ORM really being addressed?

A new paper from SRA identifies five areas that lead to complacency: five “neglects” in risk management.

1. Probability neglect – people sometimes don’t consider the probability of an outcome occurring, focusing only on its consequences.
2. Consequence neglect – just like probability neglect, sometimes individuals neglect the magnitude of outcomes.
3. Statistical neglect – instead of subjectively assessing small probabilities and continuously updating them, people fall back on rules of thumb (heuristics), which can introduce systematic biases into their decisions.
4. Solution neglect – choosing an optimal solution is not possible when one fails to consider all of the solutions.
5. External risk neglect – in making decisions, individuals or groups often consider the cost/benefits of decisions only for themselves, without including externalities, sometimes leading to significant negative outcomes for others.

Where do these fit within the subject of navigation or replenishment briefs?

Probability neglect: Every brief speaks of grounding or collision and they do so because of the consequence, not the probability. That means that precious time is spent talking about things that are very unlikely to occur. There is an opportunity cost there.

Consequence neglect: Honestly, this is something the Navy writ large does well, to the point that we overemphasize the consequence and oversimplify the solution path.

Statistical neglect: The Surface Navy’s slavish devotion to Cold War stand-off ranges is probably the single best example of statistical neglect, even if it falls outside the normal navigation or replenishment detail. Ships can, and do, pass safely within 500 yards of each other. Why, then, do so many Commanding Officers insist on being contacted about every ship that will pass within 10,000, or in some cases 20,000, yards?

Solution neglect: This one is simple. All too often we take solutions off the table before we even begin framing the problem. This is most often found on ships that are mono-decisional – those that only say “yes” or only say “no.” Not every person needs to be aboard for every underway period. Not every person needs to be on the lines for every replenishment. But sometimes someone does need to get underway and miss something at home. And does Ops, who had CDO and spent the night dealing with some messy issue, really need to stand Officer of the Deck? Changing the watchbill might be the right thing to do – even if it is at the last minute. Routinely changing the watchbill at the last minute? That’s something else. These are simple examples, but in what other ways do we rig the ORM game by ignoring a potential solution?

External risk neglect: Again, not an easy fit to the “check in the block” navigational or replenishment detail, but a Navy issue all the same. Moving a person or a part from one ship to another for an underway period or an inspection. Forgetting to notify local officials that you are getting underway from a liberty port. Forgetting to ask the pilot what ships are coming in or out that day.

By sticking with the canned ORM we are hurting future generations of surface warfare officers, subjugating their original and creative thinking to a “just get it done” mentality. Navigators, when you plan your next brief, think about these things for ORM:

When was the last time the ship got underway? What did we do right? What did we do wrong? Have we done that wrong thing before? Why?

What’s the weather forecast, and how will it change the ORM slides? Low visibility increases the risk of grounding or collision, particularly if a ship has difficulty with electronic navigation. Bad weather can certainly slow the transit. How does that affect your knowledge of traffic in the channel?

Who’s new to the ship? What distractions are there that can get in the way of a safe and focused detail? These things are mentioned in the brief…but never seem to make it into the ORM section.

What other realistic and likely problems will ships encounter every day that can, and all too often do, lead to accidents? And why aren’t those being addressed during ORM discussions?

If ORM remains a check in the block for an inspection, we will see more, not fewer, mishaps in the coming years.

Posted by M. Ittleschmerz in Training & Education


  • Benjamin Walthrop

    Some interesting insights on ORM as a check in the box. Unfortunately, based on my experience (albeit a little dated), it seems that was the direction the canned ORM discussions, aided and abetted by PowerPoint, often went.

    Also, the biases that you identified are largely baked into the ORM system currently in use. I’m not convinced that the risk matrix is enough to actually work through the important points of consequence and probability neglect that you bring up. The very nasty habit of labelling low-probability/high-consequence situations and high-probability/low-consequence situations as green or “low” yellow fails to take into account the chaining of events, and how a high-probability, low-consequence situation can chain very quickly up the actual risk scale. Again, I mostly blame PowerPoint, and the shallow thinking it abets, for this state of the game.

    An in-depth discussion of the roles and responsibilities of the bridge team, as well as each individual’s “red lines” for action, might be a better tool for mitigating chained risks. Also, quite a bit more thought and time should be devoted to debriefs, which often get short shrift in the mad rush to liberty or the next evolution. The USN seems to have institutionalized an attitude that if the overall evolution was successful, “it’s all good.” The very human trait of learning little from successes while concentrating learning around failures carries a huge opportunity cost that should be re-examined.

    At any rate, good post and a good topic for the USNI blog. It’s not very flashy, but I think it is in keeping with the mission of USNI as outlined in the mission statement. Interestingly, it’s only generated one comment. I wonder if that is a reflection of the level of interest and professionalism across the organization?

  • JD

    Great post and a very interesting topic for me as a crisis manager in the civilian sector.


    Can you give an example, naval or otherwise, of a high probability/low consequence event chaining up? Your comment raises a significant point.

  • Dave Schwind

    I agree with Benjamin that ORM is not a sexy topic, which is why this post hasn’t garnered much attention. However, it is interesting that someone (or rather, a group of someones in this case) actually spent the time parsing out the reasons behind risk. The paper, however, focused on environmental risk and land management, and the risk involved in managing them. I opine that Navy risk management is slightly different because of the “careerism” inherent in those in command.

    I’m not saying everyone in command always puts career interests above the health and welfare of their crew, or the endangerment of their ship (or boat, aircraft, etc.). There are some excellent commanding officers out there who genuinely don’t want their people to get injured because of a lack of planning. However, when you dig down into the weeds and look at the risks considered for mitigation during, say, sea and anchor detail, it is clear that many of the risks are considered merely for the purpose of self-preservation. The greatest fear is not so much the physical damage caused if “X” goes wrong, but the end of a “promising” career. Thus, commanders are encouraged by their own interest in self-preservation to ORM any possible negative occurrence, and then avoid the risk if at all possible.

    This is the difference between ORM in the Navy (in practice, NOT in theory…as it is a very good tool, don’t get me wrong!) and the risks being taken (or not) in a government bureaucracy like the BLM, as the paper discusses. In the Navy, a single, small failure can cost an entire career, whereas in government service, it takes years and reams of counselings before someone starts to feel an adverse effect on their career (just look at some of the recent fraud, waste, and abuse committed in the GSA…and how many of them were disciplined, let alone fired, for blowing millions in taxpayer dollars? My point, exactly…)

    Personally, do I feel that ORM can be done better? Of course. When you can recite each bullet of each PowerPoint slide in the nav brief and use it as a method to fall asleep at night (let alone the audience phasing out or falling asleep during the actual brief), there’s a definite problem in application. Perhaps removing the slides covering the improbable risks and briefing those only on station might help (as implied in the “probability neglect” section)? How about concentrating on actual skill-level professional training vice concentrating on the (figurative) “fire of the day” or “social issue GMT”?

    Ships have gotten underway safely for hundreds of years without ORM. How did they do it? By having competent watchstanders who were sufficiently well trained to react properly to changing situations, rather than performing by rote to appease ATG inspectors. I could go on…there are certainly a good number of ways to make ORM more applicable…but these are two immediately applicable examples that could pay significant dividends in actually mitigating risk, rather than “checking the box” with pretty PowerPoint slides, and thus encouraging the continuing slide into “ORM apathy.”

  • BJ Armstrong

    For years as a Safety Officer on a Big Deck I tried to have a message that people would, and could, listen to. At times I may have succeeded; more often, I think, I failed. But check out this link. Good stuff from Dirty Jobs guy Mike Rowe…