Objective
The principal goal of risk analysis and risk mitigation is to protect your organization’s ability to perform its mission, not just to protect information assets and data. Because risk cannot be eliminated entirely, risk analysis must be presented in a manner that allows decision-makers to consider the operational and financial cost of protective measures weighed against the need to enable the business.
In other words, it’s not about you.
That probably sounded harsher than it needed to, but I want to hammer this in. Our goal as security professionals should be to provide analyses with as few biases as possible while not watering down the message. Decision-makers need us to provide the best information we can so they can make better choices.
We All Have Biases
Well, not me. I’m not biased. It’s not my fault that my gut instincts are always correct. OK, fine. You caught me.
One of the biggest lies we can tell ourselves is that we aren’t biased. Everyone sees the world, new ideas, interactions, and threats through the lens of their own personal experiences, knowledge, traumas, concerns, and hopes. You’re no different and neither am I.
That doesn’t mean that we can’t do anything about it.
The 5 Steps
Don’t Start with a Goal in Mind
I know this sounds contrary to everything we’re taught from an early age. “You need to set goals or you’ll never achieve anything meaningful.” “If you don’t know where you’re going, how will you get there?” This sort of advice is fine in certain circumstances, but if you’re attempting to reduce bias in your analysis, you need to throw it out the window. Fast.
Starting with a goal in mind can easily lead to confirmation bias. Confirmation bias is when we look for evidence that confirms our pre-existing ideas and ignore everything that indicates we could be wrong.
Let me tell you a story about a totally fictional leader of a major superpower. This leader was certain that a far-away country was producing weapons that posed an imminent threat, and was capable of doing so on a massive scale. He asked his intelligence agency to find evidence of these weapons. Because they went into it with the goal of finding evidence that confirmed pre-existing beliefs, their analysis was tainted before they even started. They saw what they wanted to see and ignored evidence that pointed in a different direction. Whether they were right or wrong, the choices made based on their analysis were not the best they could be.
This leads us nicely to our next step.
Don’t Try to Appease Anyone
In our 100% fictional account that was totally not based on real life, we saw that starting with a particular goal can lead to faulty or at least untrustworthy conclusions. Likewise, an intentional or even unconscious desire to please someone else can lead us to modify our conclusions, change our methods, or just plain cook the books (or logs in our case).
Be aware of any outside pressure to reach a certain conclusion. This sort of pressure to appease, especially from someone in power, can manifest in many ways.
- The leader of the web development team wants you to downplay the severity of a vulnerability or penetration test finding because otherwise she’ll miss her go-live deadline.
- The engineer who you’ve been friends with for years wants you to exaggerate how vulnerable the server platform is so his proposal to change operating systems is accepted.
- The CISO wants you to keep certain findings out of a report to make the latest changes to the security program look more effective.
Not all outside pressure is as blatant as these examples. In fact, most of it isn’t, which makes it harder to recognize. Stick to your methods and procedures. Report your findings reliably so decision-makers can do their jobs effectively. Trust in your work product is your currency. It’s difficult to earn and easy to lose.
Be Aware of Your Own Assumptions and Reasoning
We all make assumptions all day, every day. That’s not a bad thing. We need to make assumptions to get through our lives. The problem isn’t the assumptions themselves; it’s the reasoning behind them.
When I turn my front door handle, I assume that the latch will open. This is a reasonable assumption based on my knowledge of how door handles work, my experience using them daily for the last four decades, my knowledge of the state of repair of my home, and the fact that it worked this morning. All of these things are directly or indirectly related to the likelihood of the expected result.
However, if I assumed that because I ate oatmeal (with banana, blueberries, and protein powder, of course) for breakfast the door handle would fail, most people would agree that is just silly.
Take the time to think about what assumptions you are making in your analysis and why. If they turn out to be unreasonable, re-assess and correct any errors before finalizing and communicating your analysis.
Consider Alternate Perspectives
This one is difficult for many people, but critical. You need to either play devil’s advocate or find someone who can. Actively seek differing opinions and alternate views on the same information.
This is where a diverse security team thrives. Different technical experience and specialties, different cultural backgrounds, different outlooks, different viewpoints, different risk tolerance. Cultivate a culture of difference where it’s safe to be the one dissenting voice and you’ll be stronger as a team.
But, don’t stop there. Seek non-technical opinions. Talk to the business people and actually listen. You may find out that things aren’t exactly what they seem.
Additionally, as we discussed earlier, confirmation bias is real and prevalent. To combat it, you must actively look for evidence that refutes your claims. Let’s play a little game to find out how important this is.
Let’s say I lay out four cards on the table: a King face up, a three face up, one card showing only its blue back, and one showing only its red back. I tell you that every face card in this set has a blue back. Which cards do you need to turn over to test this rule?
The first card is obvious. You need to turn over the King to see if it has a blue back. Let’s say it does. Many people then choose to turn over the blue-backed card. Let’s say it’s a Queen. What have we learned? We now have two pieces of evidence that confirm the rule. But remember, we’re talking about seeking disconfirming evidence.
If you stopped there, you’re missing an important piece of information. You need to turn over the red card. If it’s a face card, you’ve just disconfirmed the rule.
In fact, we don’t actually need to turn over the three or the blue card. Our rule says nothing about their expected states. I didn’t say that all blue cards are face cards or that number cards can’t have blue backs.
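If it helps to see that reasoning spelled out, here’s a minimal sketch in Python. The four cards are the ones described above; how they’re modeled (a visible side and a hidden side) is my own illustration, not part of the original puzzle. The check flags only the cards whose hidden side could falsify the rule.

```python
# Rule under test: "every face card has a blue back."
# Each card is modeled by the side you can see; the other side is hidden.
# The modeling is illustrative; only the four cards come from the example above.

FACE_RANKS = {"Jack", "Queen", "King"}

CARDS = [
    {"name": "King (face up)",  "visible": ("rank", "King")},
    {"name": "Three (face up)", "visible": ("rank", "3")},
    {"name": "Blue back",       "visible": ("back", "blue")},
    {"name": "Red back",        "visible": ("back", "red")},
]

def worth_flipping(card) -> bool:
    """A card is worth flipping only if its hidden side could reveal a
    counterexample: a face card paired with a non-blue back."""
    side, value = card["visible"]
    if side == "rank":
        # A visible face card might be hiding a non-blue back.
        return value in FACE_RANKS
    # A visible non-blue back might be hiding a face card.
    return value != "blue"

for card in CARDS:
    verdict = "flip it" if worth_flipping(card) else "leave it"
    print(f"{card['name']}: {verdict}")
```

Running it prints “flip it” for exactly two cards, the King and the red-backed card, which is the disconfirmation test walked through above.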
Fine. What does this have to do with security analysis? The lesson here is that it’s easy to look for things that confirm our assumptions or hypotheses, but you need to pause, think about what would disconfirm them, and then perform that test.
Employ Bias-Reducing Techniques
Let’s face it. There’s a reason that medical trials use double-blinding in their protocols. Ensuring that neither the subject nor the researcher knows what the outcome “should” be reduces the impact that both unintentional and intentional bias have on the results. Because you’re not working with a controlled trial where you can administer a placebo, this is more difficult, but you can still use specific techniques to reduce bias.
Let’s say you need to communicate the risk of a botnet performing a successful credential stuffing attack against your website. You’re trying to gain buy-in and funding to deploy anti-botnet technology. It’s clear to you that if you’re currently experiencing botnet attacks, the likelihood of one being successful is higher. Which of these requests for a log file analysis do you think will produce a less biased result?
“Hey, Alice. Can you look at this log file and tell me if you see anything of interest?”
In this scenario, it’s pretty obvious that if Alice comes back stating that she’s fairly certain the logs show an active botnet attack, you can trust that analysis. You gave her no priming about what she might find, so her analysis is properly independent.
“Hey, Alice. Can you look at this log file? I need to see if we have any botnets currently attacking our login page. I’m concerned that a credential stuffing attack could be successful and we need to get some funding for that super cool anti-bot tech we looked at last week.”
Even if Alice is trying to be objective, the likelihood of a biased result from the second request is high. No matter what the result of the analysis is, you won’t be able to trust it.
Maybe Alice thinks the anti-bot tech you looked at last week is a waste of money so she downplays the seriousness of current events. Maybe Alice wants to please you and unintentionally skews the analysis to support the narrative you provided. Maybe Alice, now primed to see botnets, sees patterns in the logs that simply aren’t real. We are pattern-seeking animals, after all.
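One practical way to keep that hand-off neutral is to agree on a hypothesis-free summary that gets produced the same way every time, regardless of what anyone expects to find, and let the analyst interpret the numbers. Here’s a minimal sketch along those lines; the combined-style access log format, the /login path, and treating 4xx responses as failed attempts are all assumptions for illustration, not details from the scenario above.

```python
# Produce a neutral, repeatable summary of login activity from a web access
# log. The report's shape is fixed in advance, so no hypothesis about botnets
# or credential stuffing is baked into the request. Log format, login path,
# and the 4xx-means-failure convention are illustrative assumptions.

import re
from collections import Counter

LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+'
)

def summarize_logins(log_path: str, login_path: str = "/login") -> None:
    attempts = Counter()   # requests to the login path, per client IP
    failures = Counter()   # 4xx responses to the login path, per client IP

    with open(log_path) as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if not m or m.group("path") != login_path:
                continue
            ip = m.group("ip")
            attempts[ip] += 1
            if m.group("status").startswith("4"):
                failures[ip] += 1

    print(f"{'client IP':<18}{'attempts':>10}{'failures':>10}{'failure rate':>14}")
    for ip, count in attempts.most_common(20):
        fail = failures[ip]
        print(f"{ip:<18}{count:>10}{fail:>10}{fail / count:>14.0%}")

if __name__ == "__main__":
    summarize_logins("access.log")
```

The specific fields matter less than the fact that the report was defined before anyone knew what it would show, so neither your expectations nor Alice’s end up baked into the analysis.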
Remember the 5 Steps
- Goals: Don’t start with a goal in mind. Communicate in a manner that is not influenced by any preference for a particular decision.
- Appeasement: Don’t try to appease anyone. Communicate in a manner that is not crafted to appease any individual, group, or specific outcome.
- Assumptions: Be aware of your own assumptions and reasoning.
- Perspectives: Consider alternate perspectives and contrary information. Look for evidence that refutes your perspective.
- Techniques: Employ bias-reducing techniques.