Everyone knows that security starts with awareness. That's why information campaigns within professional organizations are often an integral part of security policies. But do these campaigns have an effect? How long does that effect last and how do you measure it?

How to measure the effect of awareness campaigns was the subject of a meeting at De Nederlandsche Bank in Amsterdam, organized by Security Awareness NL. It is an important topic: if employees do not continuously keep the importance of security in mind, even the most advanced measures no longer offer protection.
Wrong things
Muel Kaptein, professor of business ethics and integrity management at Erasmus University, addressed the question of why good people sometimes do the wrong things. He outlined his explanation using a scientific model with eight culture dimensions, with which culture and behavior can be measured in the most diverse organizations. "Measuring is knowing, if you know how to measure," Kaptein said cryptically. "Because how do you know how to measure so that what comes out is applicable?" Awareness is not always black or white, the professor stated. "The question is always whether you are 'in control' as an organization and have done enough to prevent people from causing a serious incident." He advised first making clear, with ground rules and job descriptions, what is expected of employees. "Otherwise, you simply fall short as an organization," he said.
Norms and values
Kaptein also pointed out the effect of exemplary behavior. People readily regard the behavior of others as the norm, even if the rule book says otherwise. "Also create commitment if you want people to do what they are asked to do!" Fraud, according to the professor, manifests itself mainly when things are going very well for the organization ('they can spare it') or very badly ('save what can be saved'). Constantly putting people under high pressure also encourages undesirable behavior: 'I knew it wasn't allowed, but I wouldn't have managed otherwise.' A high chance of being caught does deter, according to Kaptein, but not if people do not know they are doing something wrong. "So make behavior discussable and keep in mind that there are many shades of gray between black and white." According to him, it is disastrous if fraudulent executives receive a fat bonus upon their dismissal.
Security Walks
The top of an organization largely determines its security culture, Kaptein continued. "It's about how people perceive their organization: what they think of the top and the extent to which things are discussable. For example, how are people motivated to comply with security rules? If the corporate culture is good, there is also awareness; after all, it is nurtured by the organization. But how do you know if the culture is good? How do you measure that?" Security walks are a good method, according to the professor: as management, simply keep your eyes and ears open. "If you leave that to a 'tool,' you waste a lot of your own skills. Just listen in the elevator to what kind of confidential information is revealed." Another method is to survey employees to find out how they perceive clarity and discussability within their organization. "By doing this periodically, you can see the effect of changes," he said. Furthermore, you can take pictures of unsafe situations, such as confidential documents on desks or in the trash. "Cleaners often know better than the employees what is going on in a company. So they too are a good instrument for gauging corporate culture."
Social engineering
The carelessness of employees is exploited by "social engineers": criminals who use social media and other means to locate weak links in organizations in order to circumvent security measures. Kaptein cited an example of friendly girls in front of a company building advertising a new restaurant nearby. Employees were invited to visit the website and come up with a slogan; the best slogan would be rewarded with a free dinner. The girls had in fact been hired by the employer, who could thus see how many employees were susceptible to a common trick used by cybercriminals: luring employees to a website with malicious software. "Yet then you still know little about the culture. Because why are people so careless? Why do they do what they do? And how does that relate to the organization's standards framework?" the professor asked rhetorically. "In measuring that, objectivity is important. So don't do it yourself, because that only produces socially desirable answers. What you conclude must also be unequivocal and not intended to lecture people; a wrong culture is not the fault of individuals. Also feed the measurement results back, otherwise you won't change anything. Embed what you measure in the security culture of the organization. And just tell people openly and honestly what the shortcomings of security measures are and what their role is in compensating for those shortcomings."
Psychology as security
Psychologist Drs. Maarten Timmerman of Awareways, in collaboration with Utrecht University, developed a research method to map the degree of information-security awareness in organizations. This research has now been conducted within a large number of companies and provides concrete insight into behavior and culture regarding information security. "Security technology is now so effective that criminals will use psychology to make their move," he warned. "The question then is how to secure against that with psychology."
According to Timmerman, people often don't know how much value business information has. "Just think of a patient record. In healthcare, the primary concern is the health of the patient; protecting information is considered of secondary importance. You can point out the importance of a strong password, but highly skilled people almost take that as an insult. So it's important to know how 'aware' people are and whether the investment in an awareness campaign pays off. But how do you put that on the measuring stick?"
Gap between intention and behavior
The psychologist dove into the literature with a student and came across the Theory of Planned Behavior by Icek Ajzen. "It established that while people can think something is very important, this doesn't mean they will then do anything about it. So how do you close the gap between intention and actual behavior?" Timmerman also found that vulnerability to phishing mail, for example, is related to cognitive availability. "Under great stress, people become less careful. Then there are other priorities than information security." Also important, according to the psychologist, is the emotion a strong password evokes in people: "A sense of security, or irritation?"
Like Kaptein, Timmerman pointed out the importance of management as a role model. In some of his surveys, half of the employees say they don't notice a good example. Management may be doing a good job, but employees see none of it, so they can't follow that example either. "It's low-hanging fruit. So make security discussable! After all, people often think their risky behavior won't do any harm, certainly not if colleagues and managers are also guilty of it. So make sure that at least 30 percent of the people set a good example; then the rest will follow naturally. And provide enforcement. For many people, that's necessary before they take something seriously."
A lot of influence
"Behavior has a lot of influence on safety within the organization," Timmerman emphasized. "So make it clear to people how much damage they can do and what the consequences are. Don't counter undesirable behavior with measures that are too complicated for many people and do continue to give a certain amount of trust and responsibility. Specify the standard you want to achieve and make it a kind of internal competition. "Who has the most tidy desk? But always start with management. If they don't support it, it will never work."
The psychologist recommended supporting every new security measure with an awareness campaign. "Show that it's important for everyone! Then you'll see a need for knowledge emerge, such as how to recognize a phishing email. You can use all kinds of tools from psychology to steer behavior, such as rewarding good behavior. But keep repeating; that's important to make it stick for the longer term. You can't change a culture, but you can change a security culture. By measuring, you can see where you've made gains and what challenges remain. Security awareness is like a vaccination: you have to keep repeating it to keep it effective." Contrary to the proverb, people do not become wise through harm and shame, Timmerman concluded. "Research by Alert Online showed that 53 percent of victims do not take extra measures after a cyber-attack. Much more effective is to regularly show which measures work: for example, 20 percent improvement, and a month later 30 percent improvement. Then you show that it is useful, and that is the best motivator!"
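The kind of feedback loop Timmerman describes — repeated phishing simulations whose improvement figures are reported back to employees — can be sketched in a few lines. This is a minimal illustration with invented counts (the round names and numbers are hypothetical, not from the article):

```python
# Sketch: track phishing-simulation click rates across campaign rounds and
# report the relative improvement versus the baseline round, the kind of
# "20 percent better, a month later 30 percent better" figure Timmerman
# recommends feeding back to employees. All data below is hypothetical.

def click_rate(clicked, total):
    """Fraction of recipients who clicked the simulated phishing link."""
    return clicked / total

def improvement(baseline_rate, current_rate):
    """Relative improvement of the current round versus the baseline."""
    return (baseline_rate - current_rate) / baseline_rate

rounds = [
    ("baseline", 120, 400),  # before the awareness campaign
    ("month 1", 96, 400),    # after the first campaign wave
    ("month 2", 84, 400),    # after repetition
]

base = click_rate(rounds[0][1], rounds[0][2])
for label, clicked, total in rounds[1:]:
    rate = click_rate(clicked, total)
    print(f"{label}: click rate {rate:.0%}, improvement {improvement(base, rate):.0%}")
```

Reporting the relative improvement rather than raw click counts matches the article's point: showing people that the measures work is the motivator.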
What determines digital behavior
The last speaker was also a psychologist by profession. Dr. Sophie van der Zee stated that knowledge about risks does not always lead to safe behavior, something to consider when putting together security training and campaigns. Measuring security awareness can be done both scientifically and in practice, according to Van der Zee, and her ambition is to narrow the gap between the two. She is also researching factors that determine people's digital behavior, such as the cybersecurity paradox: warning people about things that then turn out never to happen. Is it then still something to worry about?
Smartest kid in the class
According to Van der Zee, many companies believe that people will behave more safely if they know how to do so. "We investigated scientifically whether that belief is justified. People first filled out a questionnaire about security. Then they received training to increase awareness, followed by another questionnaire to show whether awareness had actually improved. That turned out to be the case. But had behavior changed as a result? People could now recognize a phishing email, but did that mean they would no longer click on dangerous links? Unfortunately not. When people know they are being tested, they are the smartest kid in the class for a while. But that can quickly change once no one is looking over their shoulder."
Predicting daily behavior
Two weeks after completing the study, 175 subjects received an email that had many characteristics of a phishing email. Yet nearly half clicked on the link in it. "So what we want is to be able to predict everyday behavior. That's a challenge. You can count concrete behaviors, but not the risks people take. Such a phishing email, followed by a warning, is then a good method. The content of that warning also matters. 'This website contains malware. Continuing may harm your computer. Please return to safety!' does not work. Better is: 'This website contains malware. 90% of the people receiving this warning decided to return to safety.' By measuring behavior you do not yet know what causes it, but you can determine the effect of measures. Also effective is training groups differently to see what works best." Van der Zee wants to set up a website with the results of field experiments, so that it can be seen which factors do and do not matter. Those results will then also be used for upcoming Alert Online campaigns.
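The field experiment Van der Zee describes — comparing how many people heed each wording of a warning — amounts to measuring compliance rates per variant. A minimal sketch, with invented counts (the numbers are hypothetical, not results from her study):

```python
# Sketch of a warning-wording comparison as in Van der Zee's field
# experiments: show each variant to a group and measure how many heed it.
# The counts below are invented for illustration only.
from collections import namedtuple

Variant = namedtuple("Variant", ["text", "shown", "returned_to_safety"])

variants = [
    Variant("Continuing may harm your computer. Please return to safety!",
            shown=200, returned_to_safety=96),
    Variant("90% of the people receiving this warning decided to return to safety.",
            shown=200, returned_to_safety=150),
]

def compliance(v):
    """Fraction of people shown this warning who returned to safety."""
    return v.returned_to_safety / v.shown

for v in variants:
    print(f"{compliance(v):.0%} heeded: {v.text!r}")
```

With large enough groups, the difference between the two compliance rates tells you which wording works better — exactly the "measure the effect of measures" approach the article advocates, without yet explaining why it works.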
Shame
Moderated by Erik Jan Koedijk, chairman of Alert Online, the evening concluded with a panel discussion. One of the questions was whether research should always be done by an external party. Kaptein advised doing so, because a butcher should not inspect his own meat, and employees behave differently towards the 'boss' than towards an external researcher. A questionnaire is not enough, according to Timmerman: it tells you that people think security is important, but not whether they will act accordingly. One of the participants pointed out that victims of traditional crime adjust their behavior; for example, they no longer open their doors at night. With cybercrime, unfortunately, the situation is different, and shame is an important cause: "How could I have been so stupid?" By making clear that it can happen to anyone, you make it discussable and we can learn from each other. As a victim, you should not be punished by your social environment as well. "Never waste a good crisis," Koedijk advised. "Build a good case from each incident; that will be useful to you. However, it requires good leadership, with the understanding that enforcement is not only meant as punishment, but also as a reward for those who do well."
Source: securitynews.com
