
Your brain is sabotaging your cybersecurity tools

Sarah Aalborg · 29/09-2025 · 3 minutes

We like to think of cybersecurity as a technical discipline.

We buy tools, configure alerts, and trust our processes to keep the organization safe. But the biggest challenge isn’t always in the code or the configurations... It’s in our own brains.

When I talk to security teams, I often see a familiar pattern. They have a SIEM full of alerts, incident response plans ready to go, and a team that works hard to keep up. Yet in some cases incidents slip through, false positives pile up and eat into the team's time, and investigations take longer than anyone would like.

It’s easy to blame the tools. But what if the real culprit is our human brain quietly sabotaging our efforts?

Three areas where bias creeps in

In my latest talk, I explore three common areas where our brains interfere with our best intentions:

  • SIEM and configuration – how we set up alerts and choose what to monitor

  • Incident management – how we respond when something happens

  • Forensics and investigations – how we make sense of what went wrong

Each of these areas seems straightforward in theory. But in practice, our cognitive biases can lead us astray.

The hidden forces behind our decisions

Some of the key biases that show up in cybersecurity work include:

  • Availability bias and complexity bias, which shape what we notice and prioritize

  • Action bias and alert fatigue, which push us toward reacting fast rather than reacting well

  • WYSIATI (“what you see is all there is”), confirmation bias, and the famous invisible gorilla experiment, which make it easy to miss the obvious

These mental shortcuts evolved to help us survive. But in security work and technical configuration, they can cause us to misconfigure tools, chase the wrong alerts, or overlook critical evidence.

So what can we do about it?

The good news is that once we understand how these biases work, we can design processes and tools that help counteract them.

It’s not about becoming perfectly rational. It’s about building systems that take our messy, human brains into account.
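
To make that concrete, here is a minimal sketch of one such guardrail: a triage process that randomly re-samples dismissed alerts for a second, independent look, so confirmation bias never gets to choose which dismissals are questioned. The `Alert` shape, the function name, and the 5% rate are all hypothetical illustrations, not taken from any particular tool:

```python
import random
from dataclasses import dataclass


@dataclass
class Alert:
    id: str
    verdict: str  # "dismissed" or "escalated", set during initial triage


def sample_for_second_look(closed_alerts: list[Alert],
                           rate: float = 0.05,
                           seed: int | None = None) -> list[Alert]:
    """Pick a random fraction of dismissed alerts for independent re-review.

    Random selection, rather than analyst choice, is the point: it keeps
    confirmation bias from deciding which dismissals ever get questioned.
    """
    rng = random.Random(seed)  # seeded only to make audits reproducible
    dismissed = [a for a in closed_alerts if a.verdict == "dismissed"]
    k = max(1, round(len(dismissed) * rate)) if dismissed else 0
    return rng.sample(dismissed, k)


if __name__ == "__main__":
    # Hypothetical backlog: 100 alerts, all dismissed during first-pass triage.
    backlog = [Alert(id=f"A-{i:04d}", verdict="dismissed") for i in range(100)]
    for alert in sample_for_second_look(backlog, rate=0.05, seed=42):
        print(f"Re-review {alert.id} with an analyst who did not close it")
```

The part worth copying is not the code but the randomness: a rule the team cannot talk itself out of on a busy day.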

This post is the start of a new series where I’ll explore each bias in detail and share practical ways to reduce its impact on your security program. Because when we understand how our brains really work, we can turn our biggest weakness into one of our greatest strengths.

If you're Danish and found this interesting, you’ll love my newsletter. 
It’s where I share practical tips on how to work with – not against – human nature in cybersecurity.

Sign up here and get the next post straight to your inbox.