The role of uncertainty in risk management processes

The Covid-19 pandemic has shown us all just how bad global society is at managing risk. How can this be? Surely every organisation, governments included, should adhere to basic international standards such as ISO 31000:2018 (Risk management) and make risk management processes a normal part of corporate practice.

Well, yes they do, but not very well, and a key part of the problem is failure to appreciate the role of uncertainty. I use the acronym TARMAC as a personal mnemonic for the risk management cycle:

  1. Threats (identify)

  2. Assets (affected)

  3. Risks (analyse)

  4. Mitigation (plans)

  5. Assurance (processes)

  6. Correction (actions)

... then go round again. You'll probably be familiar with something like this. As you may know, the tricky step is #3. I have another personal mnemonic for the steps involved in risk analysis, PICTURE:

  1. Probability of events and consequences

  2. Impact of consequences

  3. Complexity and connectivity

  4. Time-related factors and volatility

  5. Uncertainty and sensitivity

  6. Reliability of analysis

  7. Effectiveness of existing controls

Again, you'll probably be familiar with something like this, but you may not appreciate the importance of step #5, which is often omitted from risk analyses. Unless you appreciate the critical role of fat-tail events, you may end up creating a professional-looking spreadsheet or database of risks that is not only of low value but positively dangerous.
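To make this concrete, here is a minimal sketch (in Python, with illustrative field names - not a prescribed schema) of what a risk-register record might look like if each PICTURE dimension, including uncertainty, is captured explicitly rather than silently dropped:

```python
from dataclasses import dataclass, field


@dataclass
class RiskEntry:
    """One row of a risk register, with a slot for every PICTURE dimension."""
    threat: str
    probability: float            # P: estimated probability of the event (0-1)
    impact: float                 # I: estimated severity of consequences (0-1)
    connected_risks: list[str] = field(default_factory=list)  # C: connectivity
    review_by: str = ""           # T: time-related factors / volatility
    uncertainty: float = 1.0      # U: 0 = well understood, 1 = pure guesswork
    evidence: str = ""            # R: reliability - what the estimates rest on
    existing_controls: str = ""   # E: effectiveness of current mitigations

    def needs_escalation(self) -> bool:
        # A risk with high impact AND high uncertainty about its own numbers
        # is exactly the fat-tail candidate a plain expected-value ranking
        # (probability x impact) tends to hide.
        return self.impact > 0.5 and self.uncertainty > 0.5


pandemic = RiskEntry(
    threat="Pandemic disrupts operations",
    probability=0.05,   # a guess - which is precisely the point
    impact=0.9,
    uncertainty=0.8,
    evidence="No comparable historical data for this organisation",
)
print(pandemic.needs_escalation())  # flagged despite its low probability
```

The design choice worth noting is that `uncertainty` is a first-class field: a risk cannot be entered into this register without someone stating how confident they are in their own estimates.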

To see this, consider IT projects, the failure of which often leads to much wider issues for organisations, up to and including bankruptcy. For example, a study in 2011 examined 1471 IT projects of various types, in both public and private sector organisations. The average cost overrun was 27%, but this figure conceals the true picture. One in six of the projects was a black swan, with a typical cost overrun of 200% and schedule overrun of almost 70%. Failure on this scale breaks organisations. It may also affect organisations in their supply network as well as wider society.
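The arithmetic behind those figures is worth spelling out. Taking the study's numbers at face value (average overrun of 27%, one project in six a black swan at roughly 200%), a quick back-of-envelope calculation shows what the typical, non-black-swan project must look like:

```python
# Illustrative arithmetic only, using the figures from the 2011 study cited
# above: overall average overrun 27%, one project in six a "black swan"
# with a typical cost overrun of around 200%.
avg_overrun = 0.27
black_swan_share = 1 / 6
black_swan_overrun = 2.00

# The overall mean is a weighted average of the two groups:
#   avg = share * black_swan + (1 - share) * typical
# Solving for the typical (non-black-swan) project's overrun:
typical_overrun = (
    (avg_overrun - black_swan_share * black_swan_overrun)
    / (1 - black_swan_share)
)

print(f"Implied typical project overrun: {typical_overrun:.1%}")
# The implied value is slightly negative: on these figures, most projects
# roughly break even, and the 27% average is driven almost entirely by the
# one-in-six tail.
```

In other words, ranking risks by their average cost would make IT projects look almost benign, while the one-in-six disaster scenario - the one that breaks organisations - goes unexamined.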

It is nearly two decades since Nassim Nicholas Taleb published his famous book Fooled by Randomness. Black Swan theory is now widely understood. So why does risk analysis continue to focus on averages instead of the more damaging outliers? Because people without mathematical training struggle to distinguish probability from uncertainty.

A project manager may assign a probability of 20% to system testing completing within 6 months, but what is this figure based on? Until testing starts, no-one may know enough to provide a reliable estimate, since full knowledge of component quality, test coverage, user interactions, external dependencies, and other factors is not available - and it may not be possible to create it in the timescale available. This inherent lack of knowledge is uncertainty, and it must be assessed alongside probability to have any chance of devising effective risk management processes.
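One way to make that lack of knowledge visible is to treat the probability itself as uncertain. The sketch below (the Beta(2, 8) distribution is an illustrative assumption, not the only way to model this) replaces the project manager's single 20% figure with a distribution that has the same mean but a spread reflecting how little is actually known before testing starts:

```python
import random

random.seed(42)

# The PM's point estimate: a 20% chance that testing completes in 6 months.
point_estimate = 0.20

# Modelling the probability itself as uncertain: a Beta(2, 8) distribution
# has mean 0.20 but wide spread - standing in for our limited knowledge of
# component quality, test coverage, user interactions, and so on.
samples = sorted(random.betavariate(2, 8) for _ in range(10_000))

mean_p = sum(samples) / len(samples)
low = samples[len(samples) // 20]      # ~5th percentile
high = samples[-(len(samples) // 20)]  # ~95th percentile

print(f"Mean probability:      {mean_p:.2f}")
print(f"90% credible interval: {low:.2f} to {high:.2f}")
```

The point estimate of 0.20 survives, but the 90% interval is several times wider than the single number suggests - two very different risk pictures hiding behind the same "20%". Presenting both to stakeholders is one practical way of assessing uncertainty alongside probability.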

To have any chance of protecting yourself against the unknowable, you must try to assess how much you don't know. Otherwise, any attempt you make to quantify and mitigate risks will leave your stakeholders open to disaster. As anyone living in the UK now knows first hand, decisions based simply on the probability of an event such as a pandemic are likely to be poor - you must also take into account how little you know about the likelihood of it happening and about the impacts of its consequences.


©2020 Keith Harrison-Broninski