CHAPTER 9: SECURITY AND USABILITY

In the previous chapter we mentioned that one of the main factors contributing to non-compliance is the excessive workload imposed on users by poorly designed and poorly implemented security mechanisms. In this chapter we discuss how these issues can be addressed.

Firstly, security professionals should understand that people’s cognitive resources are limited. Moreover, people struggle to make effective decisions when they are tired.

To test the validity of this argument, Shiv and Fedorikhin designed an experiment in which they divided participants into two groups: the first group was asked to memorise a two-digit number (e.g. 54) and the second group a longer seven-digit number (e.g. 4509672).19 The participants were then asked to go down the hall to another room to collect their reward for participating. This payment, however, could only be received if the number was recalled correctly.

While they were making their way down the corridor, they encountered another experimenter, who offered them either fruit or the less healthy chocolate option. They were told that they could collect their chosen option after they finished the experiment, but that they had to make the decision there and then.

The results demonstrated that people who were given the easier task of remembering a two-digit number mostly chose the healthy option, while people overburdened by the more challenging task of recalling a longer string of digits succumbed to the more gratifying chocolate.

The implications of these findings are not limited to dieting, however. A study conducted by Danziger, Levav and Avnaim-Pesso looked at the decision-making patterns of judges considering inmates for parole at different stages of the day.20

Although rejecting parole was the default decision, in the mornings and just after lunch the judges had the cognitive capacity and energy to consider the details of each case fully and make informed decisions, resulting in more frequently granted paroles. In the evenings, judges tended to reject parole far more often, which is believed to be due to the mental strain endured throughout the day: they simply ran out of energy and defaulted to the safest option.

How can this be applied to the information security context? Security professionals should bear in mind that when people are stressed at work, making difficult decisions and performing productive tasks, they get tired. This may affect their ability or willingness to maintain compliance.

In a corporate context, this cognitive depletion may result in staff defaulting to core business activities at the expense of secondary security tasks, making the scenarios described in Chapter 6 a real possibility.

Let’s look at the opportunities available to prevent such depletion.

When users perform tasks that comply with their own mental models (i.e. the natural way that they view the world and how they expect it to work), the activities present less of a cognitive challenge than those that work against said models.

If people can apply their previous knowledge and experience to a problem, less energy is required to solve it in a secure manner and they are less mentally depleted by the end of the day.

For example, a piece of research on disk sanitisation highlighted the importance of removing files securely from a hard disk.21 It is not obvious to users that emptying the “Recycle Bin” is insufficient and that deleted files can easily be recovered. However, there are software products available that exploit users’ mental models from the physical world: they employ a “shredding” analogy to indicate that files are being removed securely, echoing an activity users would perform with sensitive paper documents at work. Such interface design can help lighten the burden on users.
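To make the analogy concrete, here is a minimal sketch, in Python, of what such a “shredding” feature might do under the hood: overwrite the file’s bytes before unlinking it. The function name and pass count are invented for illustration, and on journaling file systems or SSDs with wear levelling an in-place overwrite may still leave recoverable copies behind.

    import os

    def shred(path, passes=3):
        """Illustrative sketch: overwrite a file's contents, then delete it."""
        length = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                f.write(os.urandom(length))  # replace the contents with random bytes
                f.flush()
                os.fsync(f.fileno())         # force the overwrite onto the disk
        os.remove(path)                      # finally unlink the file itself

From the user’s point of view, dragging a file onto a “shredder” icon that runs something like this matches the mental model of destroying a paper document, while the unsafe route of merely emptying the “Recycle Bin” stops looking like the obvious default.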

Security professionals should therefore pay attention to the usability of security mechanisms, aligning them with users’ existing mental models.

In The Laws of Simplicity,22 John Maeda likewise stresses the importance of relating design to existing experience in order to make it more user-friendly. He refers to the example of the desktop metaphor introduced by Xerox researchers in the 1980s. People were able to relate to the graphical computer interface in a way they could not to the command line: they could manipulate objects much as they would on a physical desk, storing and categorising files in folders, moving and renaming them, or deleting them by placing them in the recycle bin.

Building on existing mental models makes the adoption of new technologies and ways of working easier. However, such mappings must take cultural background into consideration: a metaphor will not work if it is not part of the user’s existing mental model. For instance, the original Apple Macintosh trash icon was impossible to recognise in Japan, where users were not accustomed to metallic bins of this kind.

Good interface design not only lightens the burden on users but can also complement security. Traditionally, it has been assumed that security and usability always contradict each other: security makes things more complicated, while usability aims to streamline the user experience. In reality, they can support each other by distinguishing between constructive and destructive activities. Effective design should make constructive activities simple to perform while hindering destructive ones.
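As a hypothetical illustration (the class and method names below are invented for this sketch), a document-sharing API could make the constructive action a single frictionless call, while the destructive action demands an explicitly typed confirmation phrase:

    class Document:
        """Toy model: the safe action is effortless, the risky one needs explicit intent."""

        def __init__(self, name):
            self.name = name
            self.shared_with = set()
            self.public = False

        def share_with(self, colleague):
            # Constructive activity: one simple call, no ceremony.
            self.shared_with.add(colleague)

        def make_public(self, confirm):
            # Destructive activity: requires a typed confirmation phrase,
            # so it cannot happen by accident or out of habit.
            if confirm != f"make {self.name} public":
                raise ValueError("Confirmation phrase does not match; document stays private.")
            self.public = True

    doc = Document("q3-roadmap")
    doc.share_with("alice")                            # easy, safe default
    doc.make_public(confirm="make q3-roadmap public")  # explicit and effortful

The same asymmetry of effort can be designed into entire workflows, not just individual calls.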

This can be achieved by incorporating security activities into the natural workflow of productive tasks, which requires the involvement of security professionals early in the design process. Security and usability shouldn’t be extra features introduced as an afterthought once the system has been developed, but an integral part of the design from the beginning.

Security professionals can provide input into the design process through several methods, such as iterative or participatory design.23 In the iterative method, each development cycle is followed by testing and evaluation; the participatory method ensures that key stakeholders, including security professionals, have an opportunity to be involved.

Including security requirements from the very start, however, is not necessarily enough to guarantee that a particular security-usability issue will be solved, because such problems can be categorised as wicked.

Rittel and Webber, writing in the context of social policy planning, define a wicked problem as one that is challenging, if not impossible, to solve: stakeholder requirements are missing, poorly defined or inconsistent, they may morph over time, and an optimal solution is demanding to find.24

Therefore one cannot apply traditional methods to solving a wicked problem; a creative solution must be sought instead. One of these creative solutions could be to apply design thinking techniques.

Design thinking methods include situational analysis, interviewing, creating user profiles, reviewing existing solutions, prototyping and mind mapping.

Plattner, Meinel and Leifer assert that there are four rules to design thinking, which can help security professionals better approach wicked problems:25

  1. The human rule: all design activity is ultimately social in nature.
  2. The ambiguity rule: design thinkers must preserve ambiguity.
  3. The redesign rule: all design is redesign.
  4. The tangibility rule: making ideas tangible always facilitates communication.

Security professionals should adopt these rules in order to develop secure and usable controls: engaging people, building on existing solutions and creating prototypes that allow feedback to be collected.

Although this enables the design of better security controls, the design thinking rules rarely provide insight into why an existing mechanism is failing.

When a problem occurs, we naturally tend to focus on the symptoms instead of identifying the root cause. Taiichi Ohno developed the Five Whys technique, which was used in the Toyota production system as a systematic problem-solving tool to get to the heart of the problem.

In one of his books, Ohno provides the following example of applying this technique when a machine stopped functioning:26

  1. Why did the machine stop? There was an overload and the fuse blew.
  2. Why was there an overload? The bearing was not sufficiently lubricated.
  3. Why was it not lubricated sufficiently? The lubrication pump was not pumping sufficiently.
  4. Why was it not pumping sufficiently? The shaft of the pump was worn and rattling.
  5. Why was the shaft worn out? There was no strainer attached and metal scrap got in.

Instead of focusing on the first apparent cause of the malfunction, i.e. replacing the fuse or the pump shaft, repeating “why” five times helps to uncover the underlying issue and prevent the problem from resurfacing.
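For teams that want to record such analyses consistently, the chain of questions and answers can be captured in a simple structure. The sketch below is a made-up helper rather than anything prescribed by Ohno; it simply replays the example above and returns the final answer as the root cause.

    # Each entry pairs a "why" question with its answer, following Ohno's example.
    chain = [
        ("Why did the machine stop?",
         "There was an overload and the fuse blew."),
        ("Why was there an overload?",
         "The bearing was not sufficiently lubricated."),
        ("Why was it not lubricated sufficiently?",
         "The lubrication pump was not pumping sufficiently."),
        ("Why was it not pumping sufficiently?",
         "The shaft of the pump was worn and rattling."),
        ("Why was the shaft worn out?",
         "There was no strainer attached and metal scrap got in."),
    ]

    def five_whys(steps):
        """Print each why/answer pair and return the last answer as the root cause."""
        for i, (question, answer) in enumerate(steps, start=1):
            print(f"{i}. {question} {answer}")
        return steps[-1][1]

    print("Root cause:", five_whys(chain))  # address this, not just the blown fuse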

Eric Ries, who adapted this technique to starting up a business in his book The Lean Startup,27 points out that at “the root of every seemingly technical problem is actually a human problem.”

Indeed, in Ohno’s example the root cause turned out to be human error (an employee forgetting to attach a strainer), rather than a technical fault (a blown fuse), as was initially suspected. This is typical of most problems that security professionals face, no matter which industry they are in.

These techniques can help to address the core of the issue and build systems that are both usable and secure. This is not easy to achieve due to the nature of the problem. But once implemented, such mechanisms can significantly improve the security culture in organisations.

19 Baba Shiv and Alexander Fedorikhin, “Heart and Mind in Conflict: The Interplay of Affect and Cognition in Consumer Decision Making”, Journal of Consumer Research, 26(3), 1999, 278–292.

20 Shai Danziger, Jonathan Levav and Liora Avnaim-Pesso, “Extraneous Factors in Judicial Decisions”, Proceedings of the National Academy of Sciences, 108(17), 2011, 6889–6892.

21 Simson L. Garfinkel and Abhi Shelat, “Remembrance of Data Passed: A Study of Disk Sanitization Practices”, IEEE Security & Privacy, 1(1), 2003, 17–27.

22 John Maeda, The Laws of Simplicity, MIT Press, 2006.

23 For iterative design see J. Nielsen, “Iterative User Interface Design”, IEEE Computer, 26(11), 1993, 32–41; for participatory design see D. Schuler and A. Namioka, Participatory Design: Principles and Practices, CRC Press, 1993.

24 Horst W. J. Rittel and Melvin M. Webber, “Dilemmas in a General Theory of Planning”, Policy Sciences, 4, 1973, 155–169.

25 Hasso Plattner, Christoph Meinel and Larry J. Leifer, eds., Design Thinking: Understand–Improve–Apply, Springer Science & Business Media, 2010.

26 Taiichi Ohno, Toyota Production System: Beyond Large-Scale Production, Productivity Press, 1988.

27 Eric Ries, The Lean Startup, Crown Business, 2011.