The normalcy bias and its impact on security

By Michael Lines on November 3, 2016



"The fault, dear Brutus, is not in our stars, but in ourselves..."
Shakespeare, Julius Caesar

From Wikipedia: "The normalcy bias, or normality bias, is a mental state people enter when facing a disaster. It causes people to underestimate both the possibility of a disaster and its possible effects. This may result in situations where people fail to adequately prepare and, on a larger scale, the failure of governments to include the populace in its disaster preparations.

The assumption that is made in the case of the normalcy bias is that since a disaster never has occurred, it never will occur. It can result in the inability of people to cope with a disaster once it occurs. People with a normalcy bias have difficulties reacting to something they have not experienced before. People also tend to interpret warnings in the most optimistic way possible, seizing on any ambiguities to infer a less serious situation."

Sound familiar? When we look at the constant deluge of breach reports, one question that comes to mind is "How can this happen?" Can't these companies see what is happening in their industry, and with their peers? Don't they read the papers? Are the management and boards of these companies out of touch with the reality of the industry? No, I would argue, they are just human.

While we pride ourselves on our rationality, the truth of the matter is that people are not rational beings; they are rationalizing beings. Our intellect has not so much allowed us to foresee and prevent disaster as it has allowed us to concoct better stories about why it has not happened, why it will not happen, and, if it does, why it was not our fault. Pick any disaster of the past century, from the Titanic to World War Two to the dot-com collapse: the signs were there for those who could read them and consider the possibilities. But when it is not in your immediate best interest to hear the truth, you neither can nor will hear it.

According to Wikipedia, the normalcy bias may be caused in part by the way the brain processes new data. Research suggests that even when the brain is calm, it takes 8-10 seconds to process new information. Stress slows the process, and when the brain cannot find an acceptable response to a situation, it fixates on a single, sometimes default, solution that may or may not be correct. An evolutionary reason for this response could be that paralysis gives an animal a better chance of surviving an attack; predators are less likely to see prey that is not moving. In the fast-moving, high-tech world, however, this default reaction can lead to disaster.

As most competent information security professionals will tell you, your best bet is not to assume that you might be breached, but rather that you already have been breached and just don't know it. This is why there needs to be continuous emphasis not only on preventative controls, but also on the technologies, processes, and people that allow you to detect possible breaches, along with regular drills on what you would do when you detect such an incident.
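To make the "assume breach" mindset a little more concrete, here is a minimal sketch of one detective control: a script that scans an authentication log for repeated failed logins and raises an alert. The log path, line format, and threshold are illustrative assumptions for the example, not a prescription; the point is simply that detection needs to be running before you believe you need it.

```python
# Illustrative sketch only: scan an auth log for repeated failed logins.
# The log path, line format, and threshold are assumptions for this example,
# not a recommended real-world configuration.
import re
from collections import Counter

LOG_PATH = "/var/log/auth.log"   # assumed location of an SSH auth log
FAILED_LOGIN = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")
THRESHOLD = 10                   # alert after this many failures per source IP

def failed_logins_by_source(path):
    """Count failed login attempts grouped by source IP address."""
    counts = Counter()
    with open(path, errors="ignore") as log:
        for line in log:
            match = FAILED_LOGIN.search(line)
            if match:
                counts[match.group(2)] += 1
    return counts

if __name__ == "__main__":
    for source_ip, attempts in failed_logins_by_source(LOG_PATH).items():
        if attempts >= THRESHOLD:
            # In practice this would feed a SIEM or paging system rather than print.
            print(f"ALERT: {attempts} failed logins from {source_ip}")
```

The specific check matters far less than the habit it represents: some signal is always being watched, even when everything looks normal.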

Information security is not a technology issue, though technology has certainly exacerbated the problem; it is ultimately a people problem. To get a handle on it, start with the assumption that just because the alarm bells are not ringing, it does not mean they shouldn't be. If you start by assuming the worst and work your way out from there, you will be far better off than the companies and organizations that evidently assumed the opposite, to their ultimate surprise and detriment.

Learn more about Online Business Systems’ Risk, Security and Privacy practice.

This piece was originally posted on LinkedIn Pulse and is reposted here with the permission of Michael Lines.

