In risk management terms, forecasting the future relies on hazard identification. Is it a science or an art? Please note that we are talking about hazard identification and not risk identification, for a very specific reason: we want to avoid confusion and enable proper analysis. Others, following for example General Motors’ journey to establish a Risk Sensing function, take a shortcut and try to identify risks directly.
What can go wrong is a hazard. Hazards have consequences, and the hazard-consequence couple is a risk. If one starts lumping things together from the beginning, there is one sure result: confusion and misleading analyses.
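To keep that distinction concrete, here is a minimal sketch in Python of how the two notions could be kept apart in a hazard register. The names Hazard, Consequence and Risk are purely illustrative assumptions for this example, not part of any standard or tool.

```python
from dataclasses import dataclass

# A hazard is only "what can go wrong"; it carries no consequence by itself.
@dataclass(frozen=True)
class Hazard:
    description: str          # e.g. "unauthenticated access to a controller"

# A consequence is a possible outcome of the hazard materializing.
@dataclass(frozen=True)
class Consequence:
    description: str          # e.g. "unplanned plant shutdown"

# A risk is the hazard-consequence couple: keeping the two separate
# avoids lumping things together at the start of the analysis.
@dataclass(frozen=True)
class Risk:
    hazard: Hazard
    consequence: Consequence

risk = Risk(Hazard("unauthenticated access to a controller"),
            Consequence("unplanned plant shutdown"))
print(risk)
```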
The literature is full of examples of oracles and crystal balls used to forecast the future. Corporations do not have such old-fashioned “magic” contraptions (note, however, that some vendors would like you to believe they sell exactly that). Oracles are not fashionable anymore, and the literature shows that their results were rarely clear-cut and often misleading.
One can use the Internet of Things (IoT) and statistics to analyze the past. Some dare to use them to extend their vision toward the future. In fact, those are hazardous exercises, especially in our fast-changing geopolitical, social, technological and climatological environments.
As with weather forecasts, longer-term “predictions” come with larger uncertainties.
Unfortunately, there is no scientific methodology to apply to hazard identification. Is it a science or an art? The question remains open!
Here is an example of what you can do.
Ask within your organization and/or gather expert opinions (you can also look outside: research the literature, mine social media and customer-experience records). The more diverse the knowledge horizon and the specialties of the group, the better. However, beware: if you do not first agree on the system to be analyzed, the purpose of the analysis and the glossary, you will distill many useless conclusions from the exercise.
One very important step here is to define the level of zoom you are working at: are you looking at long-term mega-trends, or at local, short-term issues? One should not use words like operational, tactical or strategic at this level, because those adjectives are results and not a priori hazard qualifiers. That means no one can actually attach them to a hazard without having conducted a proper analysis first.
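As an illustration of how such raw inputs could be recorded consistently before any analysis, here is a hypothetical sketch; the field names and the ZoomLevel values are assumptions made for the example, not a prescribed schema.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical zoom levels: where the observer stands, not a judgement
# such as "operational" or "strategic", which can only follow analysis.
class ZoomLevel(Enum):
    LONG_TERM_MEGA_TREND = "long-term mega-trend"
    LOCAL_SHORT_TERM = "local short-term"

@dataclass
class HazardRecord:
    source: str           # who or what reported it (interview, literature, social media...)
    system_element: str   # the agreed-upon part of the system it concerns
    description: str      # the hazard, stated in the agreed glossary's terms
    zoom: ZoomLevel

record = HazardRecord(
    source="one-on-one interview, Division B maintenance",
    system_element="shared spare-parts warehouse",
    description="single supplier for a critical component",
    zoom=ZoomLevel.LOCAL_SHORT_TERM,
)
print(record)
```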
The definition of the system brings the very important aspect of interdependencies to the table. Interdependencies can be “a necessary state of nature” in an organization; for instance, Division A depends on products from Division B. They can also be “hidden” and sometimes “superfluous”, due to bad habits, a siloed culture, etc.
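One way to surface interdependencies instead of letting them hide is simply to write them down as a small directed graph while defining the system. The sketch below is only an illustration, and the division names are invented.

```python
# Interdependencies recorded explicitly as a directed graph:
# an edge A -> B means "A depends on B".
dependencies = {
    "Division A": {"Division B"},          # A uses products made by B
    "Division B": {"Shared IT platform"},
    "Division C": {"Shared IT platform", "Division B"},
}

def depends_on(element: str, seen=None) -> set:
    """Return everything an element depends on, directly or indirectly."""
    seen = set() if seen is None else seen
    for upstream in dependencies.get(element, set()):
        if upstream not in seen:
            seen.add(upstream)
            depends_on(upstream, seen)
    return seen

print(depends_on("Division A"))  # -> {'Division B', 'Shared IT platform'}
```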
Interdependencies, especially the “hidden” ones, are generally seen after the fact as “complexities”, just because no one bothered to lay down the system definition properly. Another fashionable name is blind spot. Again, there is no scientific way to eliminate all blind spots, but if one defines the system, avoids a siloed culture, and ensures through proper interviewing (one-on-one interviews allow submissive employees to talk freely) that “inconvenient truths” emerge, one is a hundred miles ahead!
If you properly define the system, it is also possible to perform a threat-to / threat-from analysis on each element. This too is a marvelous tool for avoiding blind spots.
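Below is a rough sketch of how such a pass could be organized; the elements and threats are invented examples, and the real value comes from the interviews and the agreed glossary, not from the code.

```python
# For each element of the defined system, ask two questions:
# what threatens this element ("threat-to"), and what can this
# element threaten in turn ("threat-from")?
system_elements = [
    "control network",
    "spare-parts warehouse",
    "Division B production line",
]

threat_register = {
    element: {"threats_to": [], "threats_from": []}
    for element in system_elements
}

# Entries would normally come from the interviews, one element at a time.
threat_register["control network"]["threats_to"].append(
    "unauthorised remote access")
threat_register["control network"]["threats_from"].append(
    "propagation of a faulty command to the production line")

for element, threats in threat_register.items():
    print(element, threats)
```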
Contact us to learn more!