(This post is going to sound exceedingly pessimistic about us humans. It’s purposely slanted to make some points, and also to let me rant a bit.)
I just got done reading a rather large post elsewhere about information security training. It was long, detailed, and probably more thorough than what anyone actually does anywhere without multiple full-time staff dedicated just to training.
Which brought me to the question: why do I take a slightly more pessimistic view of security awareness training? I like awareness training, but I put more emphasis on actual technology controls than on trusting people to do the right thing. I’ll trust, but I’ll verify. I’ll say security awareness training is necessary, but I won’t call it one of the key tenets I lean on to provide security, or one of the most important things a business can do to improve it.
To me, training has a few achievable goals (this isn’t an exhaustive list, just a quick one):
1. checkbox – Let’s face it, requirements are a driver.
2. education on process – Make sure everyone knows how to deal with incidents or questions. Know to dial 911.
3. education on best practices – Enough knowledge to have a chance to make the correct decisions.
4. education for bottom-line performers – Reach the people who truly didn’t know these things.
5. education about controls – What they are, why they’re in place, how they help. How to work with them instead of against them.
6. education about things too nuanced for actual controls (lots of social engineering falls here, and this is the elephant in this post).
That makes it sound like I want to deliver lowest-common-denominator training, but that’s not true. I actually think training should challenge the audience a little and improve their knowledge, rather than just baseline it. I prefer training that adds value, even a little bit, rather than going over the same ol’ bullet points yet again. I want people to learn something and not feel talked down to. One of the main problems is that such learning can get into the technical weeds pretty quickly. Questions like, “Well, why is this password weak?” or “What do you suggest to be more secure at home?” get deep very quickly if you’re not careful and empathetic to the audience. Also, random attendance can mean you get non-technical folks in with the developers, and those developers love to ask questions about password complexity, because it’s arguable and there’s no clearly right answer, which muddies the experience.
But why do I get pessimistic about awareness training? For the same reasons I think people suck at making risk decisions while driving. Unless there are radar detectors or tickets waiting around a corner, many drivers will drive at a speed that matches their own desires and risk tolerance, which often seems to be 5-15 mph over the posted speed limit, but sometimes more. Let’s just say 30% push this boundary on any given road.
The same people on the road are the same people in the business. And in the business, they have their own goals and things to get done for their job, boss, and customers. In fact, I would guess that 30% of employees will do whatever they need to do to get their jobs done efficiently, even if that runs contrary to security policies, as long as they’re not outright prevented. Need to trade a document with a client, but the client balked at the clunky “email encryption” solution you use? It’ll be ok to use Dropbox this one time. Email is too clunky? It’ll be ok to use Messenger on my phone. Need to work on a highly confidential document at home this weekend and don’t want to bother VPNing in? It’s ok this one time to put it on a personal USB stick.
People will do what they can get away with if it is in their best interests. People are innovative, creative, selfish, and usually pretty passionate and determined. None of that implies malice, but there are malicious actors lurking as well.
This means you need to pair education with technological controls. Actually stop the unwanted behavior as much as possible, or detect and alert on it and provide feedback when it occurs. And educate about those controls and why they are in place. It also means that breaking security policies should cost users more than they gain, so it is actually in their best interest to follow them.
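As an illustration of the detect-and-provide-feedback half of that pairing, here’s a minimal sketch: a script that scans a web proxy log for uploads to file-sharing services the business hasn’t sanctioned, so someone can follow up with the user. The log format, column names, file name, and domain list are all assumptions made up for the example, not any particular product’s output.

```python
# Minimal sketch (hypothetical log format and domain list): flag proxy log
# entries where users POST data to unsanctioned file-sharing services, so
# security can follow up with feedback and education rather than silence.
import csv
from collections import defaultdict

# Assumed: domains the business has not sanctioned for exchanging documents.
UNSANCTIONED_DOMAINS = {"dropbox.com", "wetransfer.com", "mega.nz"}

def flag_unsanctioned_uploads(proxy_log_path):
    """Return {user: [(domain, bytes_sent), ...]} for upload-like requests."""
    findings = defaultdict(list)
    with open(proxy_log_path, newline="") as f:
        # Assumed CSV columns: user, method, domain, bytes_sent
        for row in csv.DictReader(f):
            domain = row["domain"].lower()
            if row["method"] == "POST" and any(
                domain == d or domain.endswith("." + d)
                for d in UNSANCTIONED_DOMAINS
            ):
                findings[row["user"]].append((domain, int(row["bytes_sent"])))
    return findings

if __name__ == "__main__":
    for user, uploads in flag_unsanctioned_uploads("proxy.csv").items():
        total = sum(size for _, size in uploads)
        print(f"{user}: {len(uploads)} upload(s) to unsanctioned services, {total} bytes")
```

The specific tooling doesn’t matter; the point is that the control produces a signal you can pair with the education above, instead of leaving policy compliance purely to goodwill.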
Education only goes so far. You can post signs about children at play, school zones, speed zones, and even radar enforcement. But you have to have controls in place that detect, prevent, and penalize unwanted activity if you truly want to reduce and change behavior.
I do think people generally want to do the right thing, but that often gets pushed aside when someone needs to get something done.
If a control impedes the business or seems to stifle innovation or “getting the job done,” then it needs to be discussed, along with the reason such a control is needed. That way alternative solutions can be identified and tried, rather than users crying about security and security crying about users. Both sides need to know the lines, the controls, and where the business itself wants to draw them.