Security culture is, in many ways, similar to the safety culture found in airlines and similar industries and, to a lesser extent, to the quality cultures found in manufacturing and some service industries. Like them, it is built and maintained through security education, training and awareness. Perhaps the leading researcher on safety culture has been Prof. James Reason, whose contributions to aviation safety, in particular, have saved countless lives. Some of his ideas can save us a great deal of wasted effort, not to mention help us avoid a wide range of breaches.
Reason adopts the following definition of organisational culture, which he attributes to B. Uttal, writing in Fortune magazine:
Shared values (what is important) and beliefs (how things work) that interact with an organisation's structures and control systems to produce behavioural norms (the way we do things around here).
Quoting Thomas Peters and Robert Waterman in In Search of Excellence (a pop-management hit of the 1980s), Reason points to the attributes of companies with strong cultures:
In these companies, people way down the line know what they are supposed to do in most situations because the handful of guiding values is crystal clear.
Reason identifies a number of key components of a safety culture, and these apply equally to a security culture:
Continuing respect for the many entities that can penetrate and breach the defences. In other words: not forgetting to be afraid. We may well disapprove of the ethics of threat actors, but we must always maintain a healthy respect for their skills, which they may have honed over thousands of hours.
An informed culture - making use of a security information management system that collects, analyses and disseminates information from incidents and near-misses as well as from regular proactive checks on the system's vital signs. This integrates strategic threat intelligence from external feeds with intelligence produced internally by the SOC, allowing us to direct our efforts where they will return the most benefit, i.e. where they most reduce risk.
A reporting culture - an organisational climate in which people are prepared to report their errors and near-misses. In air safety, near-misses are important as they provide free lessons, and so airlines and regulators operate confidential incident reporting systems which allow the reporters to avoid some sanctions and penalties in exchange for providing useful information.
A just culture - an atmosphere in which people are encouraged, or even rewarded, for providing essential safety (security)-related information - but in which they are also clear about where the line must be drawn between acceptable and unacceptable behaviour. In other words, being willing to report an egregious violation does not provide a get-out-of-jail-free card - but failing to report can make things much worse.
A flexible culture - evidence shows that high-reliability organisations possess the ability to reconfigure themselves in the face of high-tempo operations or certain kinds of danger. This often involves shifting from the conventional hierarchical mode to a flatter professional structure, where control passes to task experts on the spot, then reverts to the traditional bureaucratic mode once the emergency has passed. Consider this especially in the context of incident response for a material breach; although executives must be informed and consulted at times, they cannot be expected to deal with technicalities and should be prepared to trust the incident response team. Consider preparing a RACI matrix for critical decision points in incident response playbooks (a minimal sketch follows this list).
A learning culture - the willingness and the competence to draw the right conclusions from its safety (security) information system, and the will to implement major reforms when the need is indicated. Treat every reported near-miss, every incident, as a learning opportunity and consider incorporating it into updated security awareness training or even rewritten policies. Make sure to close the incident response loop with the "Lessons Learned" phase: use what you have learned to update incident response plans and playbooks (the second sketch below shows one way to track this).
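To make the RACI suggestion concrete, here is a minimal sketch in Python. The decision points, roles and assignments are purely hypothetical examples, not a recommended allocation; the point is simply that accountability for each critical decision should be written down before the emergency, not argued over during it.

```python
# Illustrative RACI matrix for incident response decision points.
# All decision points, roles and assignments below are hypothetical.
# R = Responsible, A = Accountable, C = Consulted, I = Informed.
RACI = {
    "Isolate affected systems": {
        "IR Lead": "A", "SOC Analyst": "R", "CISO": "I", "Executives": "I",
    },
    "Engage external forensics": {
        "CISO": "A", "IR Lead": "R", "Legal": "C", "Executives": "I",
    },
    "Notify regulator of a material breach": {
        "Executives": "A", "Legal": "R", "CISO": "C", "IR Lead": "C",
    },
}

def roles_with(decision: str, code: str) -> list[str]:
    """Return the roles holding a given RACI code for a decision point."""
    return [role for role, c in RACI[decision].items() if c == code]

# During an incident, quickly confirm who is accountable for a decision:
print(roles_with("Isolate affected systems", "A"))  # -> ['IR Lead']
```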
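Similarly, one way to keep the lessons-learned loop honest - sketched here with assumed field names, not a prescribed schema - is to tie every review finding to a named artefact, an owner and a due date, so that an unclosed loop shows up as an overdue item:

```python
# A minimal, hypothetical lessons-learned backlog; every finding must
# name the artefact to change, who owns the change, and when it is due.
from dataclasses import dataclass
from datetime import date

@dataclass
class Lesson:
    incident_id: str   # reference to the incident or near-miss
    finding: str       # what the post-incident review concluded
    artefact: str      # e.g. a playbook, policy or training module
    owner: str         # who must make the change
    due: date          # date by which the update should have landed
    done: bool = False

backlog = [
    Lesson("INC-0001",
           "Helpdesk had no escalation path for MFA-fatigue reports",
           "IR playbook: initial triage", "SOC Manager", date(2025, 7, 1)),
]

# Anything still open past its due date is an unclosed loop.
overdue = [l for l in backlog if not l.done and l.due < date.today()]
```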
There's a lot more to this, obviously; we have to take concrete steps to embed these characteristics throughout an organisation, such as recruiting cybersecurity champions, bringing security personnel out of the back room to talk to users and building awareness campaigns using both internal and external resources. The shift to hybrid work adds an extra layer of complexity, as it isolates staff members from the workplace culture and leaves them to make security decisions without easy access to support resources.
In addition, there are other, more specific components which can usefully be introduced into technical security operations (and which I can confidently say would have prevented some highly publicised breaches). But that will have to wait for another article.
For more information on Security Awareness Training and how ALC can help, please click here.
This article was prepared for ALC by Les Bell. Les is a renowned Sydney-based writer, commentator, consultant and lecturer on Information Security and has been presenting cyber security training for ALC for more than 20 years.
References
Reason, J., Managing the Risks of Organisational Accidents, Ashgate Publishing Limited, 1997.