The Hackable Human – 6 Psychological Biases that Make Us Vulnerable
Technology is only half of the story. Explore the human weaknesses that make us targets for cyber criminals
There’s a red thread that you can follow in each story about cyber attacks. If you pay attention, you’ll see how human nature is deeply rooted in the mechanics of successful cyber compromise.
Technology is only half of the story. When cyber crooks launch their assault on your devices and data, they don’t target just the security holes on your system. They also aim to prey on your weaknesses.
But how do attackers know which buttons to push to make users click on infected links, even when all the signs spell “danger”?
Today’s article focuses on just that: some of the cognitive traits that make us humans hackable (myself included, of course) and how to fight them.
Social engineering and its many tentacles
When you think about cyber criminals, you might be tempted to reduce them to the “hoodie-clad, lone wolf who does nothing but code” stereotype.
However, nowadays, cyber crooks are highly skilled in the art of digital illusion. They have a strong portfolio of tactics and knowledge, including:
- what Internet users like to do online and which brands they trust
- which wants and desires drive these users to act
- which technology products have the most vulnerabilities that can be exploited
- where they can purchase malware that can get them what they want (money, data or both)
- how they can build a business by recruiting more cyber criminals to spread their malicious software.
When all the elements I’ve just mentioned come together, you get a rough definition of what social engineering is. Its mission is clear: to persuade the victim to give up confidential information or perform actions that cause a security breach.
Anything you can think of, cyber criminals use on a daily basis: instilling fear, creating confusion, impersonating trusted people or entities, sabotage and a plethora of other mind games.
To bring down the bigger targets, social engineers spend time thoroughly documenting their attacks. They have to make sure that their plan can be executed to perfection. If you’ve watched Mr. Robot, you know how it works. (If you haven’t watched it, please do.)
The further you move from clear thinking and rational decision-making, the stronger the grip that cyber criminals have on you.
Our imperfect human nature turns us into liabilities for our own online safety. Add carelessness and distractions to the equation and you have the perfect scenario for an attacker to take advantage of.
The sooner we accept our faults, the faster we can learn to become stronger when confronted with cyber threats.
6 Psychological biases that favor bad decisions
Certain thinking patterns breed poor decision-making. Just like hanging out with the “cool” gang in high school gets many teenagers to start smoking.
The 6 preconceptions below are traps we set for ourselves and that Internet crooks exploit. It’s time to be honest with ourselves and admit that we can do better.
1. Anchoring bias
When you first bought a computer, you were probably told or found out that you need antivirus. Ten or twenty years later, you probably still believe that antivirus is the only solution you need to keep your computer safe.
This is the anchoring bias in action! Relying too much on the first piece of information you received (the “anchor”) will affect how you act going forward.
If your job and your personal life have changed in the past 10 years, then so has Internet security. It’s time to let go of the past and make decisions based on what’s going on at the moment.
2. Availability heuristic
“I don’t need antivirus or other security products. My brother doesn’t have antivirus and he never got hacked!”
The availability heuristic makes people overestimate how important the information that’s available to them really is.
Knowing someone who somehow got by without AV doesn’t mean that roaming around the web without any kind of protection guarantees your safety. That person may have a ton of malware on their PC without even knowing it.
So remember: the handful of cases you know about personally are not the industry average. A tiny bit of research using trustworthy sources will give you a better impression of what’s objectively recommended.
3. Information bias
More information isn’t always better. This is what the information bias is all about.
You’ll find this to be especially true in cyber security. It’s easy to get caught up in all kinds of details, but you don’t need all those details to strengthen your online safety. You just need the right ones.
That’s why you may find it difficult to make a decision after reading tens of articles on the subject. The deeper you dig, the more complex it becomes.
I’m not saying you should fall into the anchoring bias I mentioned earlier. But you should choose the details that suit your purpose and act on them.
Internet security advice is abundant, but applying it is what makes a real impact.
4. Ostrich effect
“Look at all this news about cyber hacks! There’s nothing I can do about it, so I’ll just ignore it.”
As you can imagine, this bias comes in when we stick our heads in the “sand” and decide to just ignore negative information.
But we both know that ignoring an issue doesn’t make it go away. As humans, we may be hardwired to avoid psychological discomfort, but facing it head-on is when change happens.
If you’re uncomfortable with negative cyber security news (which is torrential nowadays), it’s because you know that even you could become a victim. But sitting idly by is not going to stop that.
5. Placebo effect
You already know this one and you probably stumble upon it more often than you realize.
“I don’t go on any strange website, so there’s no chance I’ll get infected.”
Or: “Antivirus is all I need to keep my data and devices safe.”
The placebo effect might make you feel safe, but it doesn’t mean that you are safe. Cyber criminals don’t get scared because you strongly believe in your cyber security habits.
So don’t mistake your perspective for reality. They rarely overlap in Internet security matters.
6. Overconfidence bias
“If I got infected with malware, I would know.”
This well-established bias is all about people who are too confident in their abilities. It can happen to anyone, and overconfidence can trick you into making bad decisions.
Remember that this is a subjective perspective, so you should check the facts to make sure you’re not building a false sense of security.
Oh, and if you did get infected with malware, you most likely won’t notice. Second-generation malware, which roams the Internet today, is incredibly stealthy and damaging. It can infect your computer in a matter of seconds and trigger the attack at specific moments (for example, when you do online banking transactions).
It’s important that you train yourself to spot threats and avoid them, but your intuition, skills and experience can’t replace cyber security technology.
Developing cognitive humility
These 6 cognitive biases are a gold mine for cyber crooks of all stripes. They know that people tend to neglect cyber security because of these preconceptions or because they lack the time or skills to do better.
By becoming aware of and accepting our limitations and weaknesses, we can develop better strategies to protect ourselves from ourselves. Not just in cyber security, but in life as well. This is what it takes to build cognitive humility.
So try to take a few minutes now to go over the biases listed above and see if any of them have gotten in your way lately. Making a conscious effort to “override your default settings” can help you gain clarity and make better choices for your cyber safety.
The one key habit to cultivate your Internet safety
How you perceive things, your outlook basically, determines your actions. A perspective distorted by biases cannot lead to sound decision-making.
In the malicious hacker’s playbook, mental weakness = vulnerability. Attackers don’t exploit this with technology, but, as you now know, social engineering comes with a large toolkit.
Counteracting inevitable missteps is certainly possible. All it takes is sticking to one key habit that has helped me a lot. But before I share it, let me ask you:
Have you noticed how we think more clearly after something bad has already happened?
In hindsight, we make better decisions because we’re no longer limited by fear of the unknown. At that stage, we’re not overwhelmed by emotion. Instead, we rely on logic and see things for what they are.
In real life, however, I’ve noticed that we’re more inclined to learn from our own mistakes than from others’. It’s natural, and I’ve done the same many times over. But in cyber security (and some other fields), personal mistakes are usually costly experiences.
So the right moment to decide which cyber security products you should use and which advice is worth applying is now! Not tomorrow, not next weekend.
“Now” is a great time. A time that’s not troubled, when your computer is malware-free and there are no constraints to rush you into poor decisions.
Now, when your story is not one of these.