Organisations have been working on cybersecurity for years and still not getting anywhere… “Don't tell people not to do something - provide them with clear guidance about what they should (or must) do.”
Many of us have been toiling away at the cybersecurity coal face for years, if not decades, but can't help noticing: we're not getting anywhere. We're patching systems more frequently, upgrading defensive controls to incorporate threat intelligence, and detecting ever more malware at the perimeter - yet large breaches hit the headlines every day.
A key part of the problem is the one component we can't upgrade: the Mark I human. We get our users to complete the mandatory online security awareness course each year, and yet they're still clicking on phishing links, opening attachments and installing applications from Lord-knows-where. Clearly, something's not working. The answer may lie in the phenomenon of security fatigue.
Let me give you an example from my own recent experience. I logged in, for the first time in several months, to a financial services site using the credentials stored in my password manager. But immediately after successfully logging in I was forced to change my password.
This turned out to be a painful process, thanks to complexity rules requiring upper-case and lower-case letters, a digit and a symbol, with no character repeated more than twice, and so on. It took three attempts before my password manager coughed up a password that would pass muster. This kind of thing drives customers crazy; they just want to download a statement, not spend ten minutes trying to come up with a password that is unnatural but not impossible to remember.
NIST SP 800-63B deprecated this kind of nonsense back in 2017: section 5.1.1.2 warns against composition (complexity) rules - and don't get me started on the use of a six-digit code sent by SMS (an mTAN) for post-login verification, which section 5.1.3.3 also discourages.
In its Cyber Security Awareness Month guidance, the Australian Cyber Security Centre rightly recommends multi-word passphrases rather than passwords - but I found this near-impossible because of the site's complexity rules, coupled with a maximum password length far shorter than the 64 characters NIST recommends verifiers support.
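To make the contrast concrete, here is a minimal sketch (mine, not drawn from any particular site or library) of a password check along the lines NIST SP 800-63B actually recommends: accept long passphrases, impose no composition rules, and screen candidates against a blocklist of known-compromised passwords. The blocklist filename is a hypothetical placeholder.

```python
# Minimal sketch of a NIST SP 800-63B-style password check (illustrative only).
# Assumes a local file of known-compromised passwords, one per line; in practice
# you might query a breach-checking service instead.

MIN_LENGTH = 8    # 800-63B minimum for user-chosen memorized secrets
MAX_LENGTH = 64   # verifiers should support at least 64 characters


def load_blocklist(path: str = "compromised-passwords.txt") -> set[str]:
    """Load known-compromised passwords from a (hypothetical) local file."""
    try:
        with open(path, encoding="utf-8") as f:
            return {line.strip().lower() for line in f}
    except FileNotFoundError:
        return set()


def check_password(candidate: str, blocklist: set[str]) -> tuple[bool, str]:
    """Accept anything long enough and not known-compromised.
    Note what is absent: no upper/lower/digit/symbol composition rules."""
    if len(candidate) < MIN_LENGTH:
        return False, f"Must be at least {MIN_LENGTH} characters."
    if len(candidate) > MAX_LENGTH:
        return False, f"Must be no more than {MAX_LENGTH} characters."
    if candidate.lower() in blocklist:
        return False, "That password has appeared in a breach; please choose another."
    return True, "OK"


if __name__ == "__main__":
    print(check_password("correct horse battery staple", load_blocklist()))
```

A multi-word passphrase sails through a check like this, which is exactly the behaviour the ACSC guidance is trying to encourage.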
In short, this was a tedious and time-wasting process. Cormac Herley of Microsoft Research pointed out back in 2009 that, by his calculations (which admittedly need updating), for security advice to be cost-effective, following it should cost each user no more than 0.18 seconds per day. We need to substantially reduce the burden of security compliance.
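The shape of that argument is easy to reproduce. The figures below are hypothetical placeholders rather than Herley's own inputs, but they show how a fraction-of-a-second budget falls out: divide the annual losses the advice could plausibly prevent by the value of the affected users' time, then spread the result across every user and every day.

```python
# Back-of-envelope version of the cost-effectiveness argument.
# All numbers are hypothetical placeholders, not Herley's figures.
users = 200_000_000        # assumed affected user population
prevented_losses = 60e6    # assumed annual losses the advice prevents ($)
user_hour_value = 15.0     # assumed value of one hour of user time ($)

# Total user-hours per year the advice is "worth"
break_even_hours = prevented_losses / user_hour_value

# Spread across every user, every day of the year
seconds_per_user_per_day = break_even_hours * 3600 / (users * 365)
print(f"Break-even effort: {seconds_per_user_per_day:.2f} seconds per user per day")
```

With those placeholder inputs the budget comes out at roughly a fifth of a second a day - the same order of magnitude as Herley's figure - which is why ten minutes wrestling with password rules sits so far outside any reasonable cost-benefit boundary.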
Long lists of prohibitions - Don't click on links in emails! Don't open email attachments from unknown senders! - are mostly ineffective and lead to a state of learned helplessness. Put yourself in the position of an accounts payable clerk whose job is to process invoices, many of which will arrive as email attachments, sometimes from new suppliers that they do not recognise. Asking them not to open email attachments is asking them to not do their job.
Instead, we have to provide a positive default course of action that is always safe and secure. This may be a manual process that involves running a malware scan (although this can easily be automated) followed by a phone call to confirm the invoice details and head off a business email compromise. Don't tell people not to do something - provide them with clear guidance about what they should (or must) do.
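To make that concrete, here is a rough sketch of what such a default might look like once partly automated. The scanner hook is a hypothetical stand-in for whatever your organisation actually uses, and the phone-verification step stays deliberately manual, since its whole point is an out-of-band check.

```python
# Sketch of a "safe default" workflow for incoming invoice attachments.
# scan_attachment() is a placeholder for the organisation's real malware scanner.
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Fingerprint the attachment so scans and approvals can be logged against it."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_attachment(path: Path) -> bool:
    """Hypothetical hook: return True if the organisation's scanner reports the file clean."""
    # Placeholder so the sketch runs; wire this up to your actual scanning service.
    return True


def handle_invoice(path: Path, supplier_phone_on_file: str | None) -> str:
    """Always return a clear next action - never 'just don't open it'."""
    digest = sha256_of(path)
    if not scan_attachment(path):
        return f"QUARANTINE {digest}: failed malware scan - report to the security team"
    if supplier_phone_on_file is None:
        return f"HOLD {digest}: new supplier - verify bank details by phone before paying"
    return f"PROCESS {digest}: scanned clean - confirm any changed details on {supplier_phone_on_file}"
```

The detail of the workflow matters less than the fact that every branch ends in something the clerk is allowed to do.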
We can then focus on a much shorter list of key cybersecurity behaviours and principles, using positive reinforcement - which we know works much better than negative consequences - and moving to genuine user education and awareness: for example, demonstrating the consequences of clicking on phishing links and showing the signs that should make a user suspicious.
“Simple, clear purpose and principles give rise to complex, intelligent behavior. Complex rules and regulations give rise to simple stupid behavior.”
--- Dee Hock (founder and CEO of Visa)
What we must aim to do is move from simple, rote-learning-based training to a higher level that empowers users to make some security decisions for themselves, and to cooperate with a help desk or security personnel when they are uncertain. This is a cultural change that takes security out from behind a curtain and integrates it into the entire enterprise - for example, by recruiting security champions from within the workforce who can act as a point of contact for the staff around them and help to disseminate the key awareness messages.
In the next article, I’ll lay out some of the characteristics of a strong security culture.
For more information on how ALC can help with security awareness, please see here.
This article was prepared for ALC by Les Bell. Les is a renowned Sydney-based writer, commentator, consultant and lecturer on Information Security and has been presenting cyber security training for ALC for more than 20 years.
References
Furnell, S., & Thomson, K.-L. (2009). Recognising and addressing ‘security fatigue.’ Computer Fraud & Security, 2009(11), 7–11. https://doi.org/10.1016/S1361-3723(09)70139-3.
Herley, C. (2009). So long, and no thanks for the externalities: The rational rejection of security advice by users. Proceedings of the 2009 New Security Paradigms Workshop (NSPW '09), 133–144.