Catastrophic failures forensic analyses

Jul 8th, 2020

Catastrophic failures forensic analyses are, most of the time, left in the hands of experienced engineers. At least, that is the case for tailings dam failures and for other catastrophes such as major infrastructure accidents, for instance in aviation.

Are we sure that is the best way?

Catastrophic failures forensic analyses

Normalization of deviance, management and decision-making are oftentimes ingredients of the catastrophic failure buildup, not only “engineering”. So, we dare say, Independent Panels should include social scientists, who seem to be the most qualified to study the social and organizational dimensions of failure causality.

The idea is not new. Indeed, in the seventies the British sociologist Barry Turner investigated a series of disasters, noting that generally no one had paid attention to warnings and prior near-misses. Had anyone, and especially the highly skilled professionals involved in those cases, picked up on those signals, the catastrophic accidents could generally have been averted.

Why do failures happen?

So, why does this happen? We published a paper in 2016 (Oboni, F., Oboni, C., A systemic look at tailings dams failure process, Tailings and Mine Waste 2016, Keystone, Colorado, USA). It showed that even a simple model looking at “engineering” hazards such as:

  • “excessive audacity”,
  • “insufficient effort” and
  • “mistakes”

was sufficient to model the rate of failure of the world-wide portfolio.
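
To give a feel for what such a simple model does, here is a minimal sketch, assuming three hazard classes with invented annual probabilities and an assumed portfolio size (none of these numbers come from the 2016 paper). Compounding the hazards per dam and scaling up to the portfolio yields an expected number of failures per year:

```python
# Purely illustrative sketch: the hazard probabilities and the portfolio
# size below are assumptions for demonstration, NOT the calibrated values
# of the 2016 paper.

ENGINEERING_HAZARDS = {            # assumed annual probability per dam
    "excessive audacity":  1.0e-4,
    "insufficient effort": 2.0e-4,
    "mistakes":            5.0e-5,
}

ACTIVE_DAMS = 3500                 # assumed size of the world-wide portfolio


def annual_probability_of_failure(hazards):
    """Compound independent hazards: P(failure) = 1 - prod(1 - p_i)."""
    p_no_failure = 1.0
    for p in hazards.values():
        p_no_failure *= (1.0 - p)
    return 1.0 - p_no_failure


p_dam = annual_probability_of_failure(ENGINEERING_HAZARDS)
expected_failures = ACTIVE_DAMS * p_dam

print(f"per-dam annual probability of failure: {p_dam:.2e}")
print(f"expected failures per year, world-wide: {expected_failures:.2f}")
```

Even such a crude compounding of a handful of hazard classes already produces an order-of-magnitude estimate of the portfolio rate of failure, which illustrates how a simple model can be confronted with the observed world-wide record.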

After the last few investigations by Independent Panels around the world, we think our 2016 model should be extended to cover “soft issues” such as:

  • “poor perception of warning and near-misses”,
  • “complacency toward normalization of deviance”,
  • “unawareness of risk” or finally
  • “misplaced risk appetite”.
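
To picture what such an extension could look like, here is a hedged sketch, assuming the soft issues act as aggravating multipliers on the engineering-only probability of the previous example. The factor values are hypothetical placeholders, not calibrated values and not the ORE2_Tailings™ formulation:

```python
# Hypothetical extension of the previous sketch: each observed "soft issue"
# is an aggravating multiplier (>1) applied to the per-dam probability of
# failure. The factor values are invented for illustration only.

SOFT_ISSUE_FACTORS = {
    "poor perception of warnings and near-misses":  2.0,
    "complacency toward normalization of deviance": 1.5,
    "unawareness of risk":                          1.8,
    "misplaced risk appetite":                      1.3,
}


def adjusted_probability(p_engineering, observed_issues):
    """Scale the engineering-only probability by the observed soft issues."""
    p = p_engineering
    for issue in observed_issues:
        p *= SOFT_ISSUE_FACTORS[issue]
    return min(p, 1.0)             # a probability cannot exceed 1


# Example: a dam where two organizational deviances have been observed.
p_adjusted = adjusted_probability(3.5e-4, [
    "poor perception of warnings and near-misses",
    "unawareness of risk",
])
print(f"adjusted annual probability of failure: {p_adjusted:.2e}")
```

The point is qualitative: a couple of organizational deviances can plausibly push a dam several-fold toward the upper end of the credible probability range, which is exactly the effect a purely “engineering” model would miss.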

Why aren’t you talking about Failure Modes?

Failure Modes explain how a failure occurs, but not why it occurs.

Look at the two prior lists: there is actually no need to discuss Failure Modes when we seek to answer why!

As Terry Eldridge (Golder) said in his Keynote Lecture at TMW2019, engineers need to focus on Failure Modes when they design. Risk analysis is another job: it uses different skills and has to focus on the “whys”. Perhaps even more importantly, risk analysis should follow a project from inception and should be able to pick up “emerging issues” that can lead to entirely different risks than the ones engineers studied when the project was on their drawing boards. I just realized this last remark dates me… yes, there was a time when there were drawing boards in engineering offices… I would be very curious to know how many of our readers remember that time.

Also, Failure Modes are part of the “siloed culture” heritage. By boxing reality into failure modes, it is easy to forget that failure arises from a multitude of unfortunate “small” choices, selections and instances of inattention; it is generally not the result of one cause alone.

Catastrophic failures forensic analyses by sociologists

A sociologist at Yale, Charles Perrow, wrote a book on “Normal Accidents”. Perrow stated precisely that apparently trivial events and non-critical incidents sometimes interact in ways that preliminary engineering analyses could not predict.

Perrow called those failures ‘Normal Accidents’.

Tailings dams generally operate with plenty of trivial events and non-critical incidents, for example:

  • Localized erosion
  • Some seepage
  • Settlements
  • Poorly maintained diversion channels and finally
  • Poorly maintained or under-designed weirs and water management ancillary structures.

We have omitted many others, such as defective monitoring devices and readings, poor understanding of the geology deriving (also) from too-shallow investigations, etc.

These are textbook cases of Normal Accidents. Engineers cannot eliminate all of these deviances, of course, especially not at inception. It is the job of the risk analysis to encompass all of these “defects” and evaluate their compound effect on the probability of failure. Again, that cannot be a “boxing” exercise based on failure modes. In addition, we know where working in “silos” leads: the rate of failure of tailings dams around the world has been known, with obvious uncertainties, since 2013.
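
As a hedged illustration of what compounding means here, assuming (for the sake of the example only) that each deviance contributes a small, independent annual probability to failure, the individual “boxes” look benign while their union does not. The deviance list and numbers are invented, and this is not the ORE2_Tailings™ procedure:

```python
# Illustrative only: each entry is an assumed annual probability that the
# listed deviance escalates into a contribution to failure. Values invented.

OBSERVED_DEVIANCES = {
    "localized erosion":                 3.0e-5,
    "uncontrolled seepage":              8.0e-5,
    "settlements":                       2.0e-5,
    "poorly maintained diversion ditch": 1.5e-4,
    "under-designed weir":               1.0e-4,
    "defective piezometer readings":     5.0e-5,
}


def compound_probability(deviances):
    """Union of independent contributors: P = 1 - prod(1 - p_i)."""
    p_none = 1.0
    for p in deviances.values():
        p_none *= (1.0 - p)
    return 1.0 - p_none


p_total = compound_probability(OBSERVED_DEVIANCES)
largest_single = max(OBSERVED_DEVIANCES.values())

print(f"compounded annual probability of failure: {p_total:.2e}")
print(f"largest single contributor:               {largest_single:.2e}")
# The compounded value is roughly three times the worst single deviance:
# no individual "box" tells the whole story.
```

The independence assumption is itself a simplification; interactions between deviances are precisely what Perrow’s Normal Accidents are about, and a fuller analysis has to look for them.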

Of course, no matter how detailed and deep a risk analysis is, it will never be able to capture all possible deviances and compound them “precisely”. However, in ORE2_Tailings™ we have spent years distilling the list of KPIs that seem to work and deliver an understanding of the causality of failures. It is certainly better than driving at night with no lights on, right?

Are tailings dam accidents true Normal Accidents?

Normal Accidents are supposed to occupy the blind spots in the way we control our systems.

That is certainly true if we keep “boxing” reality as failure modes approaches do. If we develop risk assessments that avoid “boxing”, then we have a chance of reducing the blind spots. And if the risk analyses reveal that some elements in the portfolio are strategic, i.e. require systemic change, well, we are better armed to act.

Also, Normal Accidents normally occur in complex systems. Dams are certainly complex systems insofar as they are investigated, designed, built, operated, monitored and managed by interacting teams over long time spans. Dams can also be interdependent with other dams and structures. IoT and automation will add another layer to the complexity. Risk assessments have to be on par with such complex systems to be useful. The time of risk matrices is over! There is no more room for simplistic, misleading approaches.

Normal Accidents also generally do not challenge common engineering understanding and theories, likely because the deviances that generate them are rarely intrinsically surprising. A plugged drain will not lead to a revolution in drainage design around the world.

Furthermore, because dams are “unique structures”, lessons learned are difficult to assimilate, and a dam can only fail once!

Closing remarks

Normal Accidents theory is not the only sociological theory on the causes of accidents.

Other approaches are more oriented towards catastrophes linked to “technological innovation”.

As we have shown, the characterization of dam failures as Normal Accidents is quite fitting.

However, our industry is correcting itself as we speak, and we are confident that with a few more appropriate changes we will be ready to challenge the present public perception.
