In this two part series, elevenM’s Tessa Loftus looks at ways to measure, and improve the measurement of, security awareness programs. 

Part Two: trends and culture

A good workplace culture is something we all look for. It can be hard to measure, but we often know it when we see it. Whether it’s a culture of diversity, inclusion, security, or just a ‘good place to work’ vibe, some of the things we look for as indicators of a good culture are:

  • it comes from the top
  • the organisation has proactive engagement with this issue
  • people engage with the issues voluntarily.

The same indicators apply to a culture of security. While these may not be metrics in the standard sense of numerical data points, they are still areas against which we can check and report improvements in security-conscious behaviour.

The SANS Security Maturity Model recognises the importance of a culture of security as an integral part of security maturity. It tracks the development of a security awareness program as:

  1. Nonexistent: A security awareness program does not exist in any capacity.
  2. Compliance Focused: The program is designed primarily to meet specific compliance or audit requirements.
  3. Promoting Awareness & Behavior Change: The program identifies the target groups and training topics that have the greatest impact in managing human risk and ultimately supporting the organization’s mission.
  4. Long-Term Sustainment & Culture Change: The program has the processes, resources, and leadership support in place for a long-term life cycle, including (at a minimum) an annual review and update of the program.
  5. Metrics Framework: The program has a robust metrics framework aligned with the organization’s mission to track progress and measure impact.

While it may not be simple to measure the culture of security, there are ways to assess its maturity and effectiveness, and thus its growth and improvement.

‘Culture of security’ strategy

Having an active strategy for a culture of security is in itself a measure of maturity (as noted by SANS), although only if it is well considered, appropriate to the organisation’s size and risk profile, and regularly updated. Many organisations will meet all the tick-box requirements (the ‘participation’ category) and, on paper, be doing what is necessary. But an organisation whose security awareness strategy goes beyond regulatory requirements is one that is both more aware of risk and taking steps to address it.

If that security awareness strategy is part of a top-down culture of security awareness, then it will be embedded in the organisational culture. That means it requires continuous review and improvement, C-suite sponsorship, and strategic engagement. Perhaps you identify a baseline and track improvement and maturity over time. Perhaps you take the time to articulate what each level on the ‘maturing our strategy’ plan looks like. Assessing the appropriateness and maturity of your security strategy will help you measure improvement over time.

Events and self-education

Participation in optional awareness events and activities (such as internal and external conferences) indicates a high level of organisational awareness and engagement: if someone takes time out of their day to attend an event, they have engaged with the importance of that issue, at least to a degree. Voluntary participation in self-education is one area where the participation metric is telling you what you need to know. However, take care to measure actual participation: attendance, not enrolment, and attendance drop-off over time.
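As a rough sketch of the attendance-not-enrolment point, a single event can be boiled down to two numbers. The helper name and the figures below are hypothetical, not from any particular tool:

```python
# Hypothetical helper: distinguish enrolment from actual attendance for an
# optional event. The figures are invented for illustration.
def attendance_metrics(enrolled: int, attended: int) -> dict:
    """Attendance rate and drop-off for a single event or session."""
    if enrolled <= 0:
        raise ValueError("enrolled must be positive")
    rate = attended / enrolled
    return {"attendance_rate": rate, "drop_off": 1 - rate}

# 80 people enrolled in a lunchtime security talk; 52 actually attended.
print(attendance_metrics(80, 52))
```

Tracked session by session, the drop-off figure shows whether interest is holding steady or fading across a series of events.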

Awareness surveys

One method of measuring the success of a security awareness program is occasional awareness surveys. Staff can be surveyed on security knowledge, and the results fed into training or education programs. An alternative to surveys is educational games, used as part of a specific event or at a specific time (such as during or after an awareness week event).

This creates a measurable trend metric, but it can return bad data if the surveys are too frequent or too onerous: people will tune out, skip answers or do the minimum, and the information becomes less useful or simply incorrect. Think about what information you are looking for: you want to find out whether people have improved their awareness or understanding, or have changed (or will change) their behaviour.

Conclusion: Trends and culture as driving changes in behaviour

Trends in measurable fields, even simple ones, will show improvement (or the lack of it) and provide more information than a single data point. Examples of simple trends you may like to look at include:

  • consistent decrease in phishing clicks
  • consistent increase in report rate on phishing drills
  • consistent increase in report rate on suspicious emails generally
  • decrease in number of cyber security incidents caused by data misuse
  • decrease in number of cyber security incidents caused by human error.
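One simple way to turn a series like these into a trend is a least-squares slope over time. This is a hedged sketch rather than a prescription, and the monthly click rates below are invented for illustration:

```python
# Hedged sketch: least-squares slope of a monthly series against time.
# A negative slope on click rates means behaviour is improving.
def trend_slope(values: list[float]) -> float:
    """Slope of a least-squares line fitted to values vs. their index."""
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    var = sum((x - mean_x) ** 2 for x in range(n))
    return cov / var

# Six months of invented phishing-drill click rates.
clicks = [0.18, 0.16, 0.15, 0.13, 0.12, 0.10]
print(f"slope per month: {trend_slope(clicks):+.4f}")  # negative: improving
```

Fitting a line rather than comparing two end points smooths out the month-to-month noise that single data points are prone to.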

However, as with any data-based decision, the more data points you have, the more accurate a picture you will build. For a true measure of the effectiveness of your security awareness program, you should look at multiple data points taken from different areas, and over time.

What you are seeking, overall, is a way to use data to assess changes in behaviour.

For example:

Getting things done on time

While mandatory training may not be the gold standard as an awareness metric, it can be used to assess more than completions. Is there a team with a consistently low on-time completion rate for mandatory training? Could this indicate an issue with the top-down culture of security in that team? Do you see an improvement in on-time completion following events, surveys or a change in leadership? What does this trend indicate about the behaviour of this team or person over time?

Training for behaviours

Ultimately, an awareness program is all about changing behaviours. So take the time to really assess everything that may be influencing them. Don’t think only about the direct A-to-B, action-to-behaviour link (for example, reporting on phishing drill results leading to better results next time); think about all the actions that may influence behaviour. Have you implemented or changed your training, and then seen a change in behaviour across the organisation? Perhaps you’ve updated the section on phishing in your mandatory training, then seen an increase in people reporting suspicious emails in your next phishing drill. Perhaps you held an awareness campaign event, then saw an increase in on-time completion of training. Maybe your new CTO comes from a security background, and suddenly people are participating more readily in security training and events. Taking the time to look at how actions or changes may have affected behaviours will tell you a lot about what influences behaviour in your organisation.

Cross-topic trends

Simple trend data will tell you one thing, but looking at multiple trends together will give a more complex and useful picture. A decrease in phishing drill click rate and an improvement in suspicious email report rate, at the same time as an increase (or lack of improvement) in human-error data breaches, might tell you that your phishing and social engineering training is changing behaviour, but that you now need to improve your training on privacy, personal information and data handling.
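To sketch the cross-topic idea, you can compare the direction of change across several series side by side. Everything here is illustrative: the metric names, the figures, and the choice of which metrics improve by going down.

```python
# Illustrative sketch: compare direction of change across several metrics.
def change(series):
    """Net change over the window (end minus start)."""
    return series[-1] - series[0]

# Invented quarterly figures for three metrics.
metrics = {
    "phishing_click_rate": [0.18, 0.15, 0.12, 0.10],
    "suspicious_email_report_rate": [0.20, 0.25, 0.31, 0.36],
    "human_error_breaches": [4, 4, 5, 5],
}
# For these two metrics, a fall is the good outcome.
lower_is_better = {"phishing_click_rate", "human_error_breaches"}

for name, series in metrics.items():
    delta = change(series)
    improving = delta < 0 if name in lower_is_better else delta > 0
    print(f"{name}: {'improving' if improving else 'needs attention'}")
```

In this invented example the phishing metrics improve while human-error breaches do not, which is exactly the pattern described above: the social engineering training is working, but data handling needs attention.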

Demographics

Many organisations will have teams or individuals with a higher risk profile. This information can also be used to improve the quality of your trend data: are the teams or people with high click rates or late training those who handle sensitive information? This will help you improve, or track improvement in, your awareness program. It may identify a need for more targeted training for high-risk groups, or for better entry-level training for those with a less technical background.
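One illustrative way to use risk-profile information is to aggregate drill results by segment and compare. The team names, risk labels and figures below are all hypothetical:

```python
# Illustrative sketch: aggregate phishing-drill results by risk segment.
from collections import defaultdict

# Invented drill results per team.
results = [
    {"team": "Finance", "risk": "high", "clicked": 6, "targeted": 40},
    {"team": "HR", "risk": "high", "clicked": 5, "targeted": 30},
    {"team": "Engineering", "risk": "low", "clicked": 2, "targeted": 50},
    {"team": "Marketing", "risk": "low", "clicked": 4, "targeted": 40},
]

# Sum clicks and targets per risk segment.
totals = defaultdict(lambda: [0, 0])
for row in results:
    totals[row["risk"]][0] += row["clicked"]
    totals[row["risk"]][1] += row["targeted"]

for risk, (clicked, targeted) in totals.items():
    print(f"{risk}-risk teams click rate: {clicked / targeted:.0%}")
```

If the high-risk segment clicks more often than the low-risk one, as in this invented data, that is a direct case for targeting extra training where the exposure is greatest.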

Culture of security

Don’t forget to take into account your culture of security. A top-down culture of security, while less measurable, is itself a very significant measure of impact and organisational behaviours, just as any culture issue is. An organisation with an Executive that values and prioritises a specific culture will have better compliance and engagement with that culture’s expected behaviours than one without that leadership. Do you have security champions? Are people in the Executive driving participation in awareness events? Has this engagement improved or declined over time?

The least measurable metric of all

The ‘Chat in the kitchen’ metric: if I walked into the kitchen and people were chatting about how hard (or easy) the latest phishing drill was, or talking about a privacy or security event they attended, I would feel positive about the level of security awareness and engagement in my organisation.

Further reading

If you’re interested in reading further (and there is certainly a lot more to say on this topic), these are a few places to start: