Demonstrating to employees that security is there to make their life easier, not harder, is the first step in developing a sound security culture in a company. But before we discuss the actual steps to improve it, let’s first understand the root causes of poor security culture.
Security professionals must understand that bad habits and behaviours tend to be contagious. Malcolm Gladwell, in his book The Tipping Point,28 discusses the conditions that allow some ideas or behaviours to “spread like viruses”. He refers to the broken windows theory to illustrate the power of context. The theory was first presented by Wilson and Kelling,29 who advocated preventing serious crime by maintaining the environment and tackling minor offences. The authors claim that a broken window left unrepaired for several days in a neighbourhood triggers further vandalism: the small defect signals a lack of care and attention to the property, which in turn implies that crime will go unpunished.
Gladwell describes the efforts of George Kelling, who employed the theory to fight vandalism on the New York City subway system. He argued that cleaning up graffiti on the trains would prevent further vandalism. Gladwell concluded that this several-year-long effort resulted in a dramatically reduced crime rate.
Despite ongoing debate regarding the causes of the 1990s crime rate reduction in the US, the broken windows theory can be applied in an information security context.
Security professionals should remember that minor policy violations tend to lead to bigger ones, eroding the security culture of the firm.
We discussed the reasons for violations in Chapter 8, but there is more to it than that. The psychology of human behaviour should be considered as well. Sometimes people are not motivated to comply with a security policy because they simply don’t see the financial impact of violating it.
Dan Ariely, in his book The Honest Truth about Dishonesty,30 tries to understand why people break the rules. Among other experiments, he describes a survey conducted among golf players to determine the conditions in which they would be tempted to move the ball into a more advantageous position, and if so, which method they would choose. The golfers were offered three different options: they could use their club, use their shoe or simply pick the ball up using their hands.
Although all of these options break the rules, the study was designed this way to determine whether some methods of cheating are more psychologically acceptable than others. The results demonstrated that moving the ball with a club was the most common choice, followed by the shoe and, finally, the hand. It turned out that physical and psychological distance from the “immoral” action makes people more likely to act dishonestly.
It is important to understand that the “distance” described in this experiment is merely psychological. It doesn’t change the nature of the action.
In a security context, employees will usually be reluctant to steal confidential information outright, just as golfers refrain from picking the ball up with their hands to move it to a more favourable position: doing so involves them directly in the unethical behaviour. However, employees might download peer-to-peer file-sharing software to listen to music at work, because the impact of this action is less obvious. Yet it can lead to far greater losses, with even more confidential information being stolen from the corporate network.
Security professionals can use this finding to remind employees of the true meaning of their actions. Breaking security policy does not seem to have a direct financial impact on the company – there is usually no perceived loss – so it is easy for employees to engage in such behaviour. Making this link explicit, by demonstrating the correlation between policy violations and the ability of the business to generate revenue, could help employees understand the consequences of non-compliance.
Building upon the connection between breaking security policies and cheating, let’s look at another study conducted by Gino, Ayal and Ariely,31 where they asked participants to solve 20 simple maths problems and promised 50 cents for each correct answer.
The participants were allowed to check their own answers and then shred the answer sheet, leaving no evidence of any potential cheating. The results showed that participants reported solving, on average, five more problems than under conditions where cheating was not possible (i.e. control conditions).
The researchers then introduced David – a student tasked with raising his hand shortly after the experiment began and proclaiming that he had solved all the problems. The other participants were understandably shocked: it was clearly impossible to solve every problem in only a few minutes. The experimenter, however, didn’t question David’s integrity and simply suggested that he shred the answer sheet and take all the money from the envelope.
Interestingly, the other participants’ behaviour adapted as a result: they reported solving, on average, eight more problems than under control conditions.
Much like the broken windows theory, this demonstrates that unethical behaviour is contagious, as are acts of non-compliance. If employees in a company witness other people breaking security policies and not being punished, they are tempted to do the same. It becomes socially acceptable and normal. This is the root cause of poor security culture.
The good news is that the opposite holds true as well. That’s why security culture has to have strong senior management support. Leading by example is the key to changing the perception of security in the company: if employees see that the leadership team takes security seriously, they will follow.
Therefore security professionals should focus on how security is perceived. Brooks supports this point, outlining three basic steps in decision-making in his book The Social Animal:32
- People perceive a situation.
- People estimate whether an action is in their long-term interest.
- People use willpower to take action.
He claims that people have historically focused mostly on the last two steps of this process. We demonstrated in the previous chapter that relying on willpower alone has a limited effect: willpower can be exercised like a muscle, but it is also prone to atrophy.
In regard to the second step of the decision-making process, one might assume that reminding people of the potential negative consequences would stop them from acting. Brooks, however, points to HIV/AIDS awareness campaigns which focused solely on negative consequences and ultimately failed to change people’s behaviour.
He also suggests that most diets fail because willpower and reason are not strong enough to confront impulsive desires. “You can tell people not to eat the French fry. You can give them pamphlets about the risks of obesity … In their nonhungry state, most people will vow not to eat it. But when their hungry self rises, their well-intentioned self fades, and they eat the French fry”.
This doesn’t only apply to dieting. As demonstrated in Chapter 6, when people want to get their job done and security gets in the way, they will circumvent it, regardless of the degree of risk they might expose the company to.
That is the reason for perception being the cornerstone of the decision-making process. Employees have to be taught to see security violations in a particular way that minimises the temptation to break policies.
Timothy Wilson claims, “One of the most enduring lessons of social psychology is that behaviour change often precedes changes in attitudes and feelings”.33
Charles Duhigg, in his book The Power of Habit,34 tells the story of Paul O’Neill, the CEO of the Aluminum Company of America (Alcoa), who was determined to make his enterprise the safest in the country. At first people were confused that the newly appointed executive was not talking about profit margins or other financial metrics; they didn’t see the link between his “zero-injuries” goal and the company’s performance. Yet Alcoa’s profits reached a historic high within a year of his announcement. By the time O’Neill retired, the company’s annual income was five times greater than it had been before his arrival, and it had become one of the safest companies in the world.
Duhigg explains this phenomenon by highlighting the importance of the “keystone habit”. Alcoa’s CEO identified safety as such a habit and focused solely on it.
O’Neill had a challenging goal to transform the company, but he couldn’t just tell people to change their behaviour. He said, “that’s not how the brain works. So I decided I was going to start by focusing on one thing. If I could start disrupting the habits around one thing, it would spread throughout the entire company.”
He recalled an incident when one of his workers died trying to fix a machine despite the safety procedures and warning signs. The CEO called an emergency meeting to understand what had caused this tragic event.
He took personal responsibility for the worker’s death, identifying numerous shortcomings in safety education. For example, the training programme didn’t highlight the fact that employees wouldn’t be blamed for machinery failure or the fact that they shouldn’t commence repair work before finding a manager.
As a result, the policies were updated and the employees were encouraged to suggest safety improvements. Workers, however, went a step further and started suggesting business improvements as well. Changing their behaviour around safety led to some innovative solutions, enhanced communication and overall increased profits for the company.
Security professionals should understand the importance of group dynamics and influences to build an effective security culture.
They should also remember that just as “broken windows” encourages policy violations, changing one security habit can encourage better behaviour across the board.
Improving security culture, however, is difficult without a better understanding of the human decision-making process, so let’s look at this in more detail in the next chapter.
28 Malcolm Gladwell, The Tipping Point: How Little Things Can Make a Big Difference, Little, Brown, 2006.
29 James Q. Wilson and George L. Kelling, “Broken Windows: The Police and Neighborhood Safety”, The Atlantic Monthly, 249(3), 1982, 29–38.
30 Dan Ariely, The Honest Truth about Dishonesty, Harper, 2013.
31 Francesca Gino, Shahar Ayal and Dan Ariely, “Contagion and Differentiation in Unethical Behavior: The Effect of One Bad Apple on the Barrel”, Psychological Science, 20(3), 2009, 393–398.
32 David Brooks, The Social Animal: The Hidden Sources of Love, Character, and Achievement, Random House, 2011.
33 Timothy Wilson, Strangers to Ourselves, Harvard University Press, 2004, 212.
34 Charles Duhigg, The Power of Habit: Why We Do What We Do and How to Change, Random House, 2013.