Engineering anthropogenic global change is loaded with implicit Anthropocene ethical and geoethical issues to an unprecedented level. Many claim that Anthropocene systems are difficult to forecast and that unintended, counter-intuitive system behavior is likely. In our experience, however, it is not always so: poor risk assessment tends to mislead people into believing that things are more complex than they really are.
Academic and popular literature agree that public distrust has developed over the past half century as a result of repeated failures to provide adequate and accurate risk information to the public.
In the public (health) arena the difficult task of allocating risks and benefits has hit regulators: sometimes they have missed important risks, and sometimes they have spent a lot of money and energy dealing with negligible ones. Technological hazards, and geohazards as well, certainly follow this trend, as shown by recent "mining/environmental" cases that allow us to gauge public skepticism. In fact, "the scientific majority sometimes finds itself pitted against a public opinion which simply does not accept its conclusions".
Meanwhile, over the last five decades or so, the risk management community at large has settled on misleading ways of representing risk assessment results and has maintained poor communication habits. That community includes engineers and designers who perform risk assessments on their own civil projects and designs, oftentimes in a conflict-of-interest situation.
The implications of poor risk prioritization for the world industry's balance sheet can be staggering, aside from the possible liabilities. Inaccuracies can lead to mistaken resource allocation. They create fuzziness for decision-makers (DMs) and the public, and thus offer little support for rational decision making. Their arbitrariness breeds public distrust and loss of confidence. It is no surprise, then, that contrary to what international codes like ISO 31000 propose, we see poor communication and poor risk approaches throughout the life of projects and operations.
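To make the resource-allocation point concrete, a minimal sketch of risk prioritization by expected annual loss (probability times consequence). The risk register and all numbers below are invented for illustration; real registers would carry uncertainty ranges, not point estimates.

```python
# Hypothetical risk register: (name, annual probability, consequence in $).
# All figures are invented for illustration only.
risks = [
    ("tailings dam breach", 1e-4, 5e8),
    ("pit slope failure", 1e-2, 2e6),
    ("haul road accident", 5e-2, 1e5),
]

# Rank by expected annual loss = probability x consequence.
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for name, p, c in ranked:
    print(f"{name}: expected annual loss = ${p * c:,.0f}")
```

Note how the rare, high-consequence event tops the ranking here; a small error in its probability estimate would reshuffle the list and, with it, where money and energy are spent.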
No one is surprised when experts disagree in their analyses, for example over the probability or frequency estimates for an event. However, if and when the public disagrees with an expert analysis of risk, they are dismissed as highly emotional or as lacking scientific literacy.
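How much an expert disagreement matters can be shown with a small sketch. Suppose two hypothetical experts put the annual frequency of the same event an order of magnitude apart (the figures are invented); under a simple Poisson assumption, the chance of at least one event over an operating horizon diverges dramatically:

```python
import math

# Two hypothetical expert estimates of annual event frequency (events/year).
lambda_a = 1 / 100    # expert A: one event per 100 years
lambda_b = 1 / 1000   # expert B: one event per 1,000 years

horizon = 50  # years of operation (assumed)

# Poisson model: probability of at least one event within the horizon.
p_a = 1 - math.exp(-lambda_a * horizon)
p_b = 1 - math.exp(-lambda_b * horizon)

print(f"Expert A: {p_a:.1%}  vs  Expert B: {p_b:.1%}")
```

A tenfold gap in the frequency estimate turns into the difference between a likely event and a marginal one over the project's life, which is why such disagreements are consequential rather than academic.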
This concept is important. Indeed, there is an accepted difference in scientific literacy between the public and scientific experts. There is also an assumption that the public is ignorant about risks and probabilities and that increased scientific literacy would decrease perceived risks. In fact, an increase in scientific literacy may increase perceived risks. The question remains whether the required level of scientific literacy is "so high that it is difficult to attain and difficult to motivate the public to attain it".
It is simply unrealistic to expect the average citizen to attain sufficient scientific hazard literacy to thoroughly review any risk assessment. Risk managers must therefore change their communication approach: from paternalistically doling out pieces of partisan information to partnering with the public. The goal is to demonstrate that practices meet socially acceptable standards. Partnering with the public requires effective communication but, more importantly, public consultation and participation.
Two vital components of risk communication are trust and credibility. Corporations and governments must earn them to maintain their Social License to Operate.
The price to pay if communication and partnering are not improved is never-ending crises, turmoil, boycotts and possibly revolts.