We see what we expect – and miss what matters

Behavioral science · Sarah Aalborg · 02/07-2025 · 3 minutes

Forensics sounds like something that should be clean, technical, and data-driven. But if you've ever sat in a post-mortem meeting or led an investigation, you know that what actually happens is... messier.

Because even when we're dealing with facts, our brains are doing something else entirely.

They're telling stories.

We see a clue. We build a theory. And then, without noticing, we start looking for more things that fit the story we've already made up.

This isn’t a character flaw. It’s just how the brain works. It’s constantly trying to save energy by shortcutting the thinking process. So instead of treating every new piece of information equally, it quietly asks, “Does this fit what I already believe?”
If yes, it lets it through. If not, it dismisses it, reframes it, or ignores it completely.

This is what confirmation bias looks like. And it doesn’t go away just because you’re smart or experienced.

In fact, the more cases you’ve seen, the easier it is for your brain to say: “I’ve seen this before. I know what’s going on.”
Even when you don’t.

Confirmation bias

You start with a theory. Maybe the attack came through email?
And from that point forward, every log, alert, and timestamp gets filtered through that lens.
You focus on phishing indicators, spam filters, click-logs. You don’t check physical access or internal movement. Because that’s not the story you’re expecting.

Anchoring bias

The first clue you got – a flagged IP address or a bad login – becomes your anchor.
Even if it turns out to be irrelevant, your brain keeps adjusting everything else around it.

A real-world example

A study highlighted by The Guardian asked 53 digital forensics experts to examine the same hard drive. Half were told the suspect had confessed. The other half weren’t.
And the result? Those told about the confession found more evidence of guilt. The others didn’t.
Same drive. Same data. Different conclusions. Just because of a single line of context.

Bias doesn’t just affect users. It affects experts. Especially when they think it doesn’t.

Read the study here
Or The Guardian’s summary

So what can you do?

  1. Say it out loud
    “We’re at risk of confirmation bias and anchoring.” Naming it makes it easier to spot.

  2. Run a pre-mortem
    Ask: “What if we’re wrong?” before you dig too deep. Make space for more than one hypothesis.

  3. Bring in someone who wasn’t there at the start
    Someone who hasn’t been anchored yet. They’ll see things the others can’t.

  4. Track alternative narratives
    Keep two or three possible explanations alive. Use a matrix – see the sketch after this list. Force it if you have to.

  5. Debrief and reflect
    Ask: “Where did we assume too early?” It’ll help next time – and make this one better too.
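
To make point 4 concrete, here is a minimal sketch in Python of what such a matrix can look like. The hypotheses, evidence items, and scores are invented purely for illustration – the point is the structure: every new finding gets scored against every hypothesis that is still alive, not just the favorite one.

```python
# A minimal hypothesis-evidence matrix.
# All hypotheses, evidence items, and scores below are invented examples.

HYPOTHESES = ["Phishing email", "Compromised VPN account", "Malicious insider"]

# +1 = supports the hypothesis, -1 = contradicts it, 0 = neutral / unknown
evidence_scores = {
    "Flagged IP in firewall logs": {
        "Phishing email": 0, "Compromised VPN account": 1, "Malicious insider": 0,
    },
    "User clicked link in mail": {
        "Phishing email": 1, "Compromised VPN account": 0, "Malicious insider": 0,
    },
    "Badge access outside office hours": {
        "Phishing email": -1, "Compromised VPN account": -1, "Malicious insider": 1,
    },
}

def summarize(scores):
    """Show how each hypothesis holds up across all evidence so far."""
    for hypothesis in HYPOTHESES:
        total = sum(row[hypothesis] for row in scores.values())
        contradictions = sum(1 for row in scores.values() if row[hypothesis] < 0)
        print(f"{hypothesis:<26} score: {total:+d}  contradictions: {contradictions}")

if __name__ == "__main__":
    summarize(evidence_scores)
```

The tool doesn't matter – a spreadsheet does the same job. What matters is that contradictions get written down and counted, instead of quietly reframed to fit the story you already believe.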

Why it matters

Forensics isn’t just about figuring out what happened. It’s about learning something useful from it.

But if the investigation only confirms what we already believed, we don’t learn much.
We just feel more confident about being wrong.

And in cybersecurity, that’s not a great outcome.

So no... you probably don’t need more tools or more data.
You need a way to catch your own brain doing what it always does.
Quietly filling in the blanks, skipping the hard questions, and pretending to be sure.

Call it out. Design around it.
Because the most dangerous blind spot is the one you don’t know you have.