This article was first published (in German only) as part of our #Security column. The column appears six times a year. Our experts independently express their opinions on topics relating to politics, technology and awareness of IT security.
The human factor – of powerlessness and insanity
The human factor is one of the greatest risks in information security. However, the established solutions in this area are surprisingly uncreative and ineffective. For years, organisations have relied on e-learning courses and phishing simulations, and they still wonder why employees behave unsafely.
Paralysing powerlessness…
Humans are the number one gateway for cybercriminals. Social engineering is a successful approach that has been enjoying increasing popularity for years.
Various studies and surveys show that the human factor is involved in most investigated incidents. We cannot avoid the fact that we – the people, employees and users – represent one of the biggest risks, if not the biggest, to information security.
The security community – otherwise so flexible and innovative – shies away from this insight and seems gripped by a paralysing powerlessness in the face of the ‘layer 8’ issue: we provide information, even in plain language. What else can we do?!
Too often, in panels and podium discussions, I have seen this powerlessness in dealing with the human factor crystallise as the lowest common denominator of all the participating specialists. At the end of every discussion, however heated, everybody confirms that the problem is actually the users and their insistence on making silly mistakes. Et voilà – with this common concept of the enemy, everyone can part in peace with the feeling that their hands are tied.
… and insanity
When I look at the prevailing security awareness measures, an Albert Einstein quote often comes to mind: ‘Insanity is doing the same thing over and over again and expecting different results.’ For years, employees have been expected to make lasting changes to their firmly held convictions and behavioural routines after going through an e-learning course and a phishing simulation. And for years, the security community has been wondering why these expectations are not being met.
Are we all insane? I often see job advertisements for security awareness positions that are looking for people with IT training. Listed under Duties are items such as ‘Develop campaign’, ‘Engage target groups’ and – my absolute favourite – ‘Promote a security culture’. The required skills include IT studies, enterprise security architecture and incident handling. This raises the following questions: what does an incident handling specialist know about developing communication campaigns? How is that supposed to work? Wouldn’t it make more sense to leave such tasks to communication professionals?
Beautiful dashboards
What happens in the vast majority of cases is as follows. Experts in the field of technology solve problems with technology – this is the usual approach. There is a software solution for everything, so why not also for unsafe employee behaviour? Without much thought, they buy the tried-and-tested e-learning solution including phishing simulation and enjoy the beautiful dashboards. This keeps them on familiar terrain. But will it bring about a lasting change in behaviour? No chance. This cannot be achieved through e-learning alone. And not even with gamification.
Taking the human factor into account means
- getting people interested in the topic of security and demonstrating its relevance,
- equipping and training users with relevant knowledge and skills in a manner appropriate to the target group,
- and enabling the desired safe behaviour through adapted processes and supporting tools.
A degree in computer science does not provide the necessary skills and abilities for these three areas. And it shouldn’t have to.
Delegate, collaborate, hand over
Instead of going round in circles to the point of insanity, the security community should learn to delegate, collaborate and hand over. With the human factor, various disciplines are finding their way into the field of security. It is no longer enough to make software more secure; we must also identify and eliminate weaknesses in processes and behaviours. Members of the security community can no longer afford to just stay amongst themselves; they must be open to dialogue and collaboration with other specialist areas. We need the knowledge of professionals in the fields of communication, psychology, sociology and usability. Some of the most successful security awareness programmes I know of are run by professionals outside of IT security. Without their expertise, most attempts at dealing with the human factor end up in yet another boring online training session.
A clear trend
At our competence centre for security awareness, we're pleased to see a clear trend towards interdisciplinary collaboration. In recent years, the security community has received more and more support from other specialist areas. There are initiatives that combine practice and research from various fields, as well as dedicated research groups at the Ruhr University Bochum and at ETH Zurich. As digitalisation advances, security is also finding its way into social research.
At the same time, we are seeing companies increasingly dedicating resources to specifically address the human factor. The number of full-time security awareness professionals is increasing – slowly, but it is increasing. Events dedicated to security awareness continue to grow.
It is now up to the security community to continue this trend and not to close itself off. We don’t have to fall powerlessly into insanity, but can delegate, collaborate and hand over. Let’s think outside the box, look for advice in other fields and make job descriptions interesting to non-IT professionals.