Stewart Room

Bad tech design increases the digital divide. D&I implications for cybersecurity

Touchscreen devices are great. From a design perspective, they get rid of mechanical buttons, with all their wear-and-tear downsides. And if you can get rid of mechanical buttons, you can save on production and manufacturing costs, bringing the device down in price. Everyone's a winner, right?


Well, maybe not if you're blind. BBC News published a very interesting story about this over the weekend, drawing attention to the fact that technology design choices can sometimes exclude people, adding to the digital divide. As you read through the story, you'll pick up on the fact that shop assistants sometimes ask blind customers for their PINs, so that they can key in the information on the customer's behalf. That is a shocking security risk and an invasion of privacy, piling further harm and insult onto the disability discrimination already being suffered.


I mention the report to illustrate a big issue within security design: a lack of diversity awareness, or of empathy for human factors, leads to sub-optimal security outcomes. This problem regularly undermines security awareness initiatives, for example, which end up unfit for purpose because they do not cater for the particular characteristics of their target audience.


Good security design will recognise the importance of understanding the implications and impacts of human factors such as wants and needs, psychology, personality and character traits, cultural norms and beliefs, and physical, cognitive and economic vulnerabilities. Likewise, in an organisational context, good design will take account of how the organisational culture as a whole, including the "tone from the top", interacts with the kinds of human factors I've just outlined.


Here are some examples of the types of issues that I am talking about:


If the organisation is production and task orientated, but not security orientated, people under pressure will prioritise production and tasks in their actions, not security. The problem could be exacerbated further by a person's vulnerabilities.


If an organisation requires unique, complex, long passwords that are regularly updated, cognitive vulnerabilities combined with production and task demands may make it impossible for a person to cope with the mental demands of the password policy, leading to shortcuts that undermine security, such as writing the password down or leaving machines unlocked while unattended. (I've sketched what this kind of policy looks like in code, after these examples.)


If a password manager application is the solution to the problem of cognitive load, how does this work in a domestic setting, where people share a device for economic reasons?


If the domestic setting is abusive, what does the previous scenario mean for the abuser and the abused?
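
To make the password scenario concrete, here is the sketch promised above: a minimal Python illustration of a "unique, complex, long, regularly updated" policy check. The specific thresholds (14 characters, four character classes, a 90-day expiry) are my own illustrative assumptions, not taken from the BBC report or any particular standard.

```python
import re
from datetime import date, timedelta

# Illustrative thresholds only; real policies vary.
MIN_LENGTH = 14
MAX_PASSWORD_AGE = timedelta(days=90)

def password_meets_policy(password: str, last_changed: date,
                          previous_passwords: set) -> bool:
    """Check a password against a typical 'unique, complex, long,
    regularly updated' policy of the kind described above."""
    if len(password) < MIN_LENGTH:
        return False
    # "Complex": require upper case, lower case, digits and symbols.
    for char_class in (r"[A-Z]", r"[a-z]", r"[0-9]", r"[^A-Za-z0-9]"):
        if not re.search(char_class, password):
            return False
    # "Unique": must not reuse any previous password.
    if password in previous_passwords:
        return False
    # "Regularly updated": the password expires after a fixed interval.
    if date.today() - last_changed > MAX_PASSWORD_AGE:
        return False
    return True
```

Every rule in that check is something a person must hold in their head, for every account they have. The shortcuts described in the scenario are the predictable human response to that load.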


So what we see in the above range of connected scenarios is bad security design that has not considered diversity and inclusion issues. The design solution of a password manager, while ostensibly good when viewed through limited lenses, appears bad when we deploy new lenses. And none of the lenses that I have applied to the password problem are fanciful or fringe. They relate to common vulnerabilities.


There are many other security designs that may fail to recognise the existence of physical and cognitive vulnerabilities. For example, do CAPTCHAs assume certain, uniform characteristics and ignore diversity? What about multi-factor authentication, insofar as it relies on a person having a smartphone?
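
To show where the smartphone assumption lives, here is a minimal sketch of TOTP, the time-based one-time password scheme (RFC 6238) behind most authenticator apps; the secret value below is a made-up placeholder. The point is that the whole scheme presumes the person personally holds a device that stores the shared secret.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Generate a time-based one-time password (RFC 6238, HMAC-SHA1).
    The shared secret normally lives inside an authenticator app, so
    the scheme quietly assumes the user owns and controls such a device."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period      # current 30-second time step
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Example with a placeholder secret; a real secret would be enrolled
# into a device that the user is assumed to own.
print(totp("JBSWY3DPEHPK3PXP"))
```

If a person cannot afford a smartphone, or shares one with somebody else (including an abuser), the "something you have" factor stops being personal, and the design excludes or endangers them.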


The BBC report graphically illustrates the problem of bad tech design, but it's the tip of the iceberg.




