Alright, buckle up, buttercups, because we’re about to dive into the ever-so-thrilling world of cybersecurity awareness programs. You know, those things your company makes you do once a year that magically transform everyone into impenetrable human firewalls? Yeah, those. Spoiler: they’re often a load of crap, but not always for the reasons you think. It turns out, the fancy tech is only half the story; the real mess is, as always, us – the people.
First off, let’s talk about “culture.” It’s this nebulous blob of shared values, beliefs, and how things are “done around here.” And boy, does it throw a wrench in the cybersecurity works. The document I was forced to read (let’s call it the “source material” for this little rant) bangs on about how organizational culture is foundational. If your company is all about “move fast and break things,” they might be a bit too quick to adopt shiny new tech without thinking about, oh, I don’t know, securing it. On the flip side, if they’re so risk-averse they still communicate via carrier pigeon, they might be safe, but they’re also probably not growing. It’s a delicate bloody balance.
And it’s not just me saying this. Some academics, in what I can only assume was a moment of profound boredom, decided to empirically analyze this stuff. Their paper, something along the lines of ‘Individual and Contextual Variables of Cyber Security Behaviour’, pointed out that the existing literature is annoyingly fixated on the technical bits of cybersecurity, while the messy human factors need way more research. They found that national culture, the type of industry, and the company’s own security vibe are pretty big deals when it comes to whether people behave securely or, well, don’t.
The source material also whines about how tough it is to actually measure this security culture. It’s subjective, people see things differently – shocker. You need a mix of the qualitative stuff (interviews, focus-group chats) and actual numbers (metrics) to get a decent picture. And you can’t just bolt on a “security culture” initiative; it has to actually fit with the overall company culture, or it’s doomed. This is a sentiment echoed by other research I stumbled upon, like a piece called ‘Developing a cyber security culture: Current practices and future needs’, which also highlighted that organizational culture is a big player in crafting these models. They also pointed out that things like getting the big bosses on board and having clear policies are, you know, critical. Who knew?
Then there’s training. God, the training. It needs to be inclusive and not treat everyone like they’re either a tech wizard or a complete Luddite. Tailor it, make it relevant – basic adult learning stuff that somehow still gets missed.
It’s Not Just Culture, It’s the Whole Damn Organization
Beyond the fluffy culture stuff, there are good old organizational factors that make or break these awareness efforts. The source material points out that your job title apparently dictates how much you know, or care, about cybersecurity. Non-IT execs? Often blissfully unaware of critical stuff compared to their techy counterparts. It’s almost like people in different roles need different information. Mind. Blown.
And leadership commitment? Oh boy. If the CEO thinks cybersecurity is just “that IT thing,” then good luck getting anyone else to take it seriously. The source material, and pretty much common sense, suggests that when leaders champion security, it actually makes a difference. Some folks who wrote ‘Investigating the Role of Socio-organizational Factors in the Information Security Compliance in Organizations’ surveyed a bunch of employees and, wouldn’t you know it, found that management commitment, along with training and accountability, really matters. It’s about making security a company-wide concern, not just something for the nerds in the basement. Another paper I saw, delightfully titled ‘“What Keeps People Secure is That They Met The Security Team”’, noted that security awareness is often weirdly split between actually telling people how to be secure and just, like, making employees feel connected to the security team. Billions are spent on this, and the whole practice still feels a bit… vague.
Communication is another one of those “no shit, Sherlock” factors. If people are scared to report a potential oopsie, then you’re flying blind. Open communication, where people aren’t terrified of getting blamed, means you might actually catch things before they go completely sideways.
And the training programs themselves need to be backed by clear policies. It’s not just a one-off “don’t click weird links” email. It’s ongoing education, keeping people in the loop about the latest ways they’re going to get phished. This isn’t just about ticking a compliance box; it’s about trying to build a real security culture.
Do These Awareness Programs Even Work? Or Are We Just Shouting into the Void?
This is the million-dollar question, isn’t it? How do you actually tell if these programs are effective? The source material suggests a mix of things: track phishing simulation fails (always a fun office league table), incident response times, and, dare I say it, actually talking to employees via surveys and focus groups. Some other clever people, in a paper specifically on the ‘Evaluation of Security Training and Awareness Programs’, also highlighted how critical it is to measure this stuff to minimize human security risk. They even tried to devise some guidelines, bless their hearts.
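If you’re wondering what “track phishing simulation fails” actually looks like in code, here’s a rough sketch. To be clear: the record format and field names below are my own invention for illustration, not anything prescribed by the source material or any particular phishing platform – adapt to whatever your tooling exports.

```python
# Sketch: turn raw phishing-simulation records into per-department
# click rates (the dreaded office league table).
# The (department, clicked) record shape is hypothetical.
from collections import defaultdict

def click_rates(results):
    """results: iterable of (department, clicked) pairs, clicked a bool.
    Returns {department: fraction_of_recipients_who_clicked}."""
    clicks = defaultdict(int)
    totals = defaultdict(int)
    for department, clicked in results:
        totals[department] += 1
        if clicked:
            clicks[department] += 1
    return {d: clicks[d] / totals[d] for d in totals}

# Example campaign results (made up):
results = [
    ("finance", True), ("finance", False), ("finance", False),
    ("engineering", False), ("engineering", False),
    ("sales", True), ("sales", True), ("sales", False), ("sales", False),
]
rates = click_rates(results)
# finance ≈ 0.33, engineering 0.0, sales 0.5
```

The point isn’t the arithmetic; it’s that you get a number you can compare across campaigns and teams, instead of just a completion checkbox.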
A big part of the problem seems to be a shift that’s needed, as pointed out in a study called ‘From Compliance to Impact: Tracing the Transformation of an Organizational Security Awareness Program’. We need to move away from just measuring if people completed the damn training to seeing if their behavior actually changed. Apparently, not many studies have dug into how security teams are supposed to make this magical transformation happen.
And for the love of all that is holy, cybersecurity awareness can’t just be a once-a-year thing, or a special “month.” It needs to be a constant, nagging presence, like that one relative you can’t avoid at family gatherings. Continuous messaging, people!
Why It All Goes Pear-Shaped So Often
So, why do so many of these well-intentioned (or sometimes, not-so-well-intentioned) programs fail spectacularly? Well, as mentioned, measuring genuine effectiveness is a nightmare. Employees have different levels of awareness, the threats are always changing, and organizational dynamics are a messy beast.
Often, these campaigns just plain fail to change behavior. I recall skimming a paper, rather pessimistically titled ‘Cyber Security Awareness Campaigns: Why do they fail to change behaviour?’, which talked about how just scaring people (the old “fear appeals”) often doesn’t cut it. There are psychological models of behavior change, apparently, and just shouting “BE AFRAID” isn’t one of the effective ones.
And sometimes, the whole thing is just a compliance exercise, which, as the source material and common sense dictate, is not the path to actual security.
Fine, What’s the Magical Fix Then? (Spoiler: There Isn’t One)
The source material offers some recommendations, which, to be honest, sound suspiciously like common sense that’s been dressed up in corporate jargon.
- Tailor your training programs. Groundbreaking. Make materials for different roles and tech skills. Use different formats. Don’t use impenetrable jargon.
- Foster open communication. Create channels where people feel safe reporting stuff. Be transparent. Again, not exactly rocket science.
- Leadership commitment. Yes, we’ve covered this. Leaders need to actually lead by example and prioritize this stuff. C-suite involvement is crucial, as some folks at MIT Sloan also pointed out in their work on building an organizational cybersecurity culture model.
- Encourage employee empowerment and responsibility. Make people feel like they have a stake in this. Maybe even designate “Security Champions” – because nothing says “empowerment” like a fancy new title with no extra pay.
- Conduct regular assessments and feedback. Keep checking if what you’re doing is working. Surveys, focus groups, analyze those phishing stats.
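To make that last bullet – the “keep checking if it’s working” loop – concrete, here’s a minimal sketch of comparing a lower-is-better metric (say, a phishing click rate) between two assessment rounds. The 10% threshold and the department numbers are arbitrary examples I made up, not recommendations from the source material.

```python
# Sketch: did behavior actually change between assessment periods?
# For a lower-is-better metric like a phishing click rate.
# The 0.10 regression threshold is an arbitrary illustrative choice.

def assess_trend(previous, current, threshold=0.10):
    """Return 'improved', 'regressed', or 'flat' for a lower-is-better metric."""
    delta = current - previous
    if delta <= -threshold:
        return "improved"
    if delta >= threshold:
        return "regressed"
    return "flat"

# Hypothetical quarterly click rates per department:
previous = {"finance": 0.33, "engineering": 0.05, "sales": 0.50}
current = {"finance": 0.15, "engineering": 0.06, "sales": 0.62}
verdicts = {d: assess_trend(previous[d], current[d]) for d in previous}
# finance improved (0.33 -> 0.15), engineering flat, sales regressed
```

Which is exactly the “compliance to impact” shift from earlier: measuring whether the needle moved, not whether the training module got clicked through.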
So, there you have it. Cultural and organizational factors are a massive, complicated, and often frustrating part of making cybersecurity awareness anything more than a corporate tick-box exercise. It’s a never-ending slog of trying to get humans to behave in ways that don’t compromise the entire damn company. Good luck with that. You’re gonna need it.
Citations:
- https://ppl-ai-file-upload.s3.amazonaws.com/web/direct-files/attachments/698559/f768ed98-d05c-43ee-b4c5-d52bdeb5a717/How-do-cultural-and-organizational-factors-influence-the-effectiveness-of-cybersecurity-awareness-programs-across-different-sectors.pdf
- http://arxiv.org/pdf/2405.16215.pdf
- https://arxiv.org/pdf/2112.06356.pdf
- https://arxiv.org/pdf/2106.14701.pdf
- https://arxiv.org/ftp/arxiv/papers/1901/1901.02672.pdf
- https://arxiv.org/pdf/2309.07724.pdf
- http://arxiv.org/pdf/2404.18365.pdf
- https://arxiv.org/pdf/1606.00875.pdf
- https://arxiv.org/html/2504.02109v1
- https://arxiv.org/pdf/1906.09584.pdf
- https://su.diva-portal.org/smash/get/diva2:1955667/FULLTEXT01.pdf
- https://iacis.org/iis/2023/4_iis_2023_51-65.pdf
- https://www.sciencedirect.com/science/article/abs/pii/S0167404820302765
- https://www.upguard.com/blog/developing-a-culture-of-cybersecurity
- https://www.emerald.com/insight/content/doi/10.1108/ics-11-2023-0209/full/html
- https://arxiv.org/pdf/2312.12073.pdf