Understanding Risk Perception: A Human Factors View
Raul Sosa


A frequent criticism of current Safety Management Systems based on risk control is that the entire process of identifying hazards and risks is biased by the mental models (assumptions) and perceptions of the individuals involved in the analysis.

This kind of criticism is more an expression of a lack of understanding of the system than a valid refutation of the entire method. All management and decision making around us are human activities influenced in a similar way.

Humans have always been involved in risk appraisal, since the beginning of social and business organizations, as part of prudent management. Even a basic cost/benefit analysis involves an implicit risk appraisal.

So, rather than criticizing SMS for its inherent human bias (and since SMS is a regulatory requirement), it would be worthwhile to understand risk cognition and mitigate its weaknesses as another Human Factors problem within SMS.

In fact, individual risk cognition can be considered the basis for risk assessment at all levels. Risk scoring is, by nature, a human creation, an intellectual or rational model created by the human mind to deal with uncertainty.

Most people expect equations and graphs behind every risk assessment in aviation, providing scientific evidence for risk management, but this is not always possible.

In some cases statistical analysis is possible and we can show figures and formulas behind the risk scores, but in most cases the process is qualitative and based on convincing expert argument, as in safety cases.
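To make the point concrete, here is a minimal sketch of what a "formula behind the score" typically amounts to, assuming a generic 5x5 likelihood/severity matrix of the kind commonly used in aviation SMS; the category labels, the multiplication rule and the tolerability thresholds are illustrative assumptions, not taken from any particular standard.

```python
# Minimal sketch of a quantitative risk-scoring step (illustrative only).
# Labels, the index formula and the tolerability bands are assumptions,
# not a reproduction of any regulatory matrix.

LIKELIHOOD = {"extremely improbable": 1, "improbable": 2, "remote": 3,
              "occasional": 4, "frequent": 5}
SEVERITY = {"negligible": 1, "minor": 2, "major": 3,
            "hazardous": 4, "catastrophic": 5}

def risk_index(likelihood: str, severity: str) -> int:
    """Combine the two qualitative judgements into a single index (1-25)."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

def tolerability(index: int) -> str:
    """Map the index to an (assumed) tolerability band."""
    if index >= 15:
        return "intolerable: stop or redesign the operation"
    if index >= 6:
        return "tolerable: mitigate to ALARP and monitor"
    return "acceptable: monitor"

# Example: a remote likelihood of a hazardous outcome.
idx = risk_index("remote", "hazardous")
print(idx, "->", tolerability(idx))  # 12 -> tolerable: mitigate to ALARP and monitor
```

Even when the arithmetic looks objective, the two inputs remain expert judgements, which is precisely where the perceptions and biases discussed below enter the process.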

Hazards and risks are socially constructed, although they are grounded in, and oriented towards, material objects. A psychological analysis of risk needs to be comprehensive, considering both the individual and the organizational level. This is a challenging goal given the large number of studies and perspectives.[i]

In fact, risk cognition at group level and risk perception at individual level can be considered part of the culture and the inner context of the organization.[ii]

The cultural theory of risk states that individuals' perception of risks both reflects and reinforces their vision of society or of the organization's culture.

Nevertheless, management systems are controlled by feedback loops: just as cognitions inform decision making and shape strategy, performance analysis influences mental models and cognition on the way back. (Risk compensation and risk homeostasis theories develop this idea in different ways.)

The subjective component of risk assessment, as in any human behaviour, requires a normative (or at least explanatory) model to correct for misleading biases. This would be a kind of "meta-risk" mitigation: how to control the risk of failure in human-made risk assessment.

What are the main components of risk cognition?

There are many theories about risk perception. To simplify matters, I will take only the case of professional risk assessment within SMS processes, something that is usually done by a trained team of experts. We are not talking about how politicians or citizens perceive public health risks such as climate change, nuclear power or smoking.

The challenge of risk assessment by operational people under time pressure and task overload is another Human Factors issue that I will review in another article.

When we analyse risks in management, there are always external and internal stimuli acting on us; some of them are direct (or reported) perceptions of hazards combined with task objectives. However, risk perception is not about perceiving data or information but about interpreting their meaning and imagining the possible consequences. On this side, external inputs are integrated with our learning and memory to produce heuristic judgements.

We may call these "cold cognitions", following the cultural theory of risk.

In the centre of this array we have the formal problem-solving/decision-making process, at individual level and, where applicable, as a team. On the other side, we have a sort of "hot cognitions", represented by feelings, moods, emotions, motivations, individual risk perception biases, etc. All these elements influence the entire model and, in the end, the group's decision.

Risk as feelings represents our fast and frugal,[iii] instinctive and intuitive reaction to known dangers. Risk as rational analysis brings our knowledge, expertise and scientific tools to bear on assessing hazards and projecting risks. Both sides are also influenced by power, goals, and many organization-related biases that are normally inversely proportional to the distance from the risk.

Sometimes the two approaches clash, within our own reasoning or in teamwork, creating a dissonance that must be resolved; for that, a third way emerges: risk as an internal political trade-off.[iv]

Expertise can be paradoxical in risk assessment: after a risk scoring process, decisions that end in successful (non-event) outcomes may generate a kind of confirmation or optimism bias in subsequent judgements.

The important finding is that human inferences under conditions of uncertainty are subject to cognitive biases. The reasons for these biases are, very simplified, on the one hand the lack of computational capacity, memory or data, together with the evolutionary development of operational heuristics by experts; on the other hand, biases can serve motivational purposes ranging from self-esteem, a sense of control over challenging situations and ego, to pleasing the boss or achieving the organization's goals.

This situation clearly constitutes a new systemic hazard for SMS, one that requires mitigation.

A large body of Human Factors research has established that a defining characteristic of cognitive biases is that they manifest, automatically and unconsciously, across a wide range of human reasoning, so even people aware of the phenomenon are unable to detect, let alone mitigate, its manifestation through awareness alone.

More than 150 cognitive biases have been identified, so a detailed study of each of them can be frustrating. It is better to apply a debiasing method, that is, a way to reduce the impact of cognitive biases on risk assessment.

Among many possible options I would propose a debiasing framework based on four elements:

1. Specific Human Factors situational training on cognitive bias cases related to risk perception, for SMS teams or management teams whose decision making we would like to improve.

2. New teamwork task definitions (applied TRM), including bottom-up analyses contrasted with top-down ones and discussion of the results under a critical thinking approach.

3. Corporate culture changes to act on the underlying motivational roots of risk perception biases, making clear to every manager that "safety first" is not just a mantra but a real belief of the culture (while also recognizing that it is not safety at any price). This would be further reinforced by a practical understanding and application of the ALARP principle.

4. Introduce a "Risk Gatekeeper" into every SMS risk assessment team: a safety expert with proven qualifications who, given a certain risk evaluation by the team, takes a position that does not necessarily agree with it (or simply an alternative position to the accepted common perception), with the goal of improving debate, making biases visible or extending the analysis further.

This framework should be supported with traceable records and reviews, as part of SMS Safety Assurance, to consistently check the validity of ongoing risk control or mitigation.
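As a purely hypothetical illustration of what such traceable records might capture, the sketch below models a single risk-assessment record that keeps the Risk Gatekeeper's dissent and the biases discussed visible for later Safety Assurance review; every field name is an assumption for illustration, not a prescribed SMS data model.

```python
# Hypothetical sketch of a traceable risk-assessment record supporting the
# debiasing framework above; all field names are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class RiskAssessmentRecord:
    hazard: str                   # hazard description
    initial_score: int            # team's risk index before debate
    gatekeeper_position: str      # the Risk Gatekeeper's alternative view
    biases_discussed: list[str]   # e.g. ["optimism bias", "groupthink"]
    final_score: int              # agreed index after critical discussion
    mitigations: list[str]        # controls accepted under ALARP reasoning
    review_due: date              # next Safety Assurance review date
    rationale: str = ""           # why the final score was accepted

# Example record: the gatekeeper challenged an optimistic initial score,
# and the disagreement and its resolution remain visible for later review.
record = RiskAssessmentRecord(
    hazard="Runway incursion during low-visibility towing operations",
    initial_score=8,
    gatekeeper_position="Likelihood underestimated; recent non-events are not evidence of control",
    biases_discussed=["optimism bias", "confirmation bias"],
    final_score=12,
    mitigations=["dedicated follow-me vehicle", "revised LVP towing procedure"],
    review_due=date(2025, 6, 30),
    rationale="Gatekeeper challenge accepted; likelihood raised one category.",
)
print(record.hazard, "->", record.final_score)
```

Keeping the initial score, the gatekeeper's position and the final score side by side is what makes later reviews able to check whether debiasing actually changed any decisions, rather than being a box-ticking exercise.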

[i] Breakwell, G. M. (2007). The psychology of risk. Cambridge: Cambridge University Press.

[ii] Douglas, M., & Wildavsky, A. B. (1982). Risk and culture: An essay on the selection of technical and environmental dangers. Berkeley: University of California Press.

[iii] Gigerenzer, G. (2002). Calculated risks: How to know when numbers deceive you. New York: Simon & Schuster; Gigerenzer, G., Selten, R., & Dahlem Workshop on Bounded Rationality. (2001). Bounded rationality: The adaptive toolbox. Cambridge, Mass: MIT Press.

[iv] Slovic, P., & International Institute for Environment and Development. (2010). The feeling of risk: New perspectives on risk perception. London: Earthscan.



Claudio Pandolfi

Safety Manager FSO / # Head of the Flight Safety and Accident Investigation Group


Raul, well done, professor!

Michelle Aslanides

Safety Critical Systems Ergonomist


Dear Raul, thanks for your post... your thoughts always invite us to stop and question our ideas. Even though I think you bring some interesting ideas, as an ergonomist I would start by acting on uncertainty, which in some contexts is all too "certain" and absent from all the management indicators. Operators have to deal with it alone and make inferences that would be triggered less often if the chaos in which they work were in some way understood and anticipated, at least, if not removed. That is, complexity could be much better anticipated by operators, and risky inferences would be less necessary or frequent, if serious work were done on modelling the working context. Risk perception in real situations is, in my opinion, the result of the encounter between human cognition (smart before it is biased) and a complex, underestimated context (sometimes also forgotten or denied by the organization). I would bet on this strategy first, before trying to understand inferences and their biases in chaotic contexts that will still exist after your SMS strategy is developed. SMS should identify risky inferences related to context complexity before all the other steps are launched. This is my opinion, based not only on my experience in Latin American countries but also in Europe, and I am thinking not only about aviation but about all kinds of organizations.
