The Psychology of Disaster Risk

People’s perceptions of risk are not easily represented quantitatively through a statistical probability of coming to harm. Once we start to characterize events with the word “risk” and try to quantify the level of risk, we enter the domain of subjectivity and emotion.

Normalcy Bias

Psychological biases often prevent people from preparing properly for disasters, including cyberattacks. One of the most common is the “normalcy bias,” which explains why most people, an estimated 70%, underestimate the threat of natural disasters like hurricanes and floods, and why people are plagued by inaction during a crisis. People assume the best will happen and that situations will resolve the way they normally do because they do not want to think about a worst-case scenario. While people are “protecting” their minds from scenarios that feel unsafe, they underestimate both the likelihood and the effects of a disaster, including a cyberattack.


Another psychological state is fatalism, which occurs when dangers seem so far outside someone’s control that they believe nothing can be done about them. Fatalistic attitudes cause individuals to attribute damage to uncontrollable natural causes rather than to human actions such as cybersecurity, risk management, or disaster preparation. This belief that nothing can be done deters people from taking preventive action.

Cognitive Biases

Two Wharton professors, Howard Kunreuther and Robert Meyer, researched the psychological limitations that prevent humans from preparing better for natural disasters and lead them to underinvest in protection against low-probability, high-consequence events like cyberattacks or hurricanes. In their book “The Ostrich Paradox: Why We Underprepare for Disasters,” they outline six cognitive biases related to decision-making under uncertainty:

  • Myopia: a tendency to focus on short-term horizons.
  • Amnesia: a tendency to forget too quickly things that have happened in the past.
  • Optimism: a tendency to underestimate the likelihood that losses will occur and think “bad things won’t happen to me.”
  • Inertia: a tendency to prefer to maintain the status quo, or to “do nothing.”
  • Simplification: a tendency to selectively choose only a subset of relevant facts when making choices involving risk.
  • Herding: a tendency to base choices on the observed actions of others.

Policy makers often assume that fear is the best motivator for getting people to leave, and they design legislation and word warning messages accordingly. Yet, as Brad Swain, a behavioral researcher at the Common Cents Lab, explains, “What policy makers fail to understand is that individuals who are making these decisions are constructing their preferences in real time, and optimizing across a host of goals and motivations.”

Overcoming Biases to Increase Preparedness

Humans are hardwired to avoid threats, and we retain many aspects of the mentality of our Stone Age forebears. Understanding these individual biases is a valuable first step toward increasing disaster preparation.

Regarding cybersecurity, a lack of knowledge and the inability to see the cybercriminal hinder the will to prepare. Similarly, regarding insurance, the lack of incentive to purchase full coverage for natural disasters, combined with limited education about available insurance products and their respective coverage gaps, presents challenges to preparedness. Understanding behavioral science can help simplify the process of educating and incentivizing people to prepare for threats.

Managed security providers and insurance companies can apply the psychology of disaster risk to encourage people to prepare better for cyberthreats and disasters while still respecting their freedom of choice. Ultimately, products need to be affordable and deliver genuine customer value to convince a buyer that they are a worthwhile investment.