Phishing

Why employees keep falling for phishing (and the science to help them)

July 20, 2021 by Susan Morrow

I once received an email from a concerned individual who had contacted me through the email system of a forum for company directors. The emailer told me they had found a fake Facebook page showing me in a less-than-professional light. The fake Facebook page existed (I checked) and it was awful and damning. The person wanted to engage with me to help me “take the page down” (for a fee).

This elaborate phishing scam was put together to play on potential concerns I might have over embarrassing and unflattering portrayals of my personal life. The scammer exploited typical human behavior triggered by shame and embarrassment. They contacted me via a professional body and used my trust in that body to cement their claim. The scam didn't work in my case, but fraudsters very often turn to human psychology to perpetrate a crime.

For far too long, the human in cybersecurity was forgotten. Whenever cybersecurity was discussed, it was about how hackers break into systems using their technical expertise. While this is true to an extent, the underlying pulleys and levers behind cybersecurity incidents, like phishing, ultimately come down to the human in the machine.

What is it about humans that makes us the perfect vector for cybercrime? Is it our gung-ho attitude to clicking or our “link blindness?” Here are a few ideas about why employees keep falling for the phishing ruse and how cybersecurity training can benefit from the same principles.

It's all the fault of HCI (human-computer interaction)

Fraudsters the world over and across time have used human behavior as the basis for scams. In the offline world, fraud has always relied on manipulating a human being at some point. However, when computers entered the frame, the scam potential took off. For computers to be usable by human beings en masse, the discipline of human-computer interaction (HCI) was invented. Companies such as SRI International, which invented the computer mouse, opened up ways of interacting with computers that tied human behavior closely to computer operations. HCI research merged computer science with cognitive science and human factors (user experience engineering).

HCI was vital in the humanization of computing, as it gave the human operator a more natural, more seamless experience with a computer. Since then, the close-knit connection between human operators and the computer has become the open doorway to cybercrime. As HCI improved, and as UI engineers took user experience to a level where interaction with a computer interface became almost automatic, nefarious manipulation of the user only became easier.

Human behavior: What makes us click?

Human beings can learn behaviors; for example, parents teach kids to always wash their hands after going to the bathroom. Using computers means that certain behaviors are learned too. Building better user experiences is an area of computing that relies on designing systems that draw on natural behaviors or avoid interactions that force users to learn new ones. The use of computers depends on certain learned behaviors, such as clicking on a link to open a web page.

User experience (UX) designers base user journeys and UI design on this kind of behavioral conditioning to make the use of technology easier and more intuitive. UX designers use Pavlovian-style conditioning exercises to establish patterns of behavior that train users. An example is the use of feedback mechanisms, such as changing the color of a button, as positive reinforcement.
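
To make this concrete, here is a minimal browser-side sketch of that feedback pattern, written in TypeScript against the standard DOM API; the element ID, colors and wording are illustrative assumptions rather than anything from a particular product:

```typescript
// A minimal sketch of positive-feedback conditioning in a UI: the button
// rewards the click with an immediate, visible change of state.
// The element ID, colors and label text below are hypothetical examples.
const button = document.getElementById("subscribe-button") as HTMLButtonElement;

button.addEventListener("click", () => {
  // Instant visual "reward" that reinforces the clicking habit.
  button.style.backgroundColor = "#2e7d32"; // switch to a success green
  button.textContent = "Done ✓";
  button.disabled = true; // prevent repeat submissions
});
```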

This is important because many computer tasks are laborious and repetitive. The ingrained nature of the click is behind both the intuitive use of computers and many scams, so it is not surprising that UI design must be as seamless as possible. In 2020, 306.4 billion emails were sent and received daily. With so many emails popping into our inboxes, automated click behavior is common, making it hard to police every email; employees under time constraints may open an email as a knee-jerk response to a routine task.

The result of all this careful attention to psychology within UX/UI design is that we have become conditioned when using a computer to the point where we automatically click links or open websites without thinking. This automation of conditioned behavior is a cybercriminal’s dream. Cybercriminals make use of the same psychological conditioning to get that automated click on a link in a phishing email.

By understanding human behavior and what makes us click, fraudsters have taken cyberattacks to new levels of success. But cybersecurity can also use the science of psychology to protect against cyberattacks.

Behavioral science and psychology

In the paper “Leveraging behavioral science to mitigate cybersecurity risk,” the researchers focused on two behavioral elements: cognitive load and bias. The paper, published in 2012, was designed to encourage a process of building up knowledge at the intersection of behavioral science and cybersecurity. The researchers stressed that the technologists behind secure systems must “understand the behavioral sciences as they design, develop and use technology.” Since the paper was published, the idea that human factors are central to cybersecurity mitigation has been cemented by the massive increase in cybercriminals’ use of social engineering.

Cognitive load and bias have also been explored in the paper “Encouraging Employee Engagement With Cybersecurity: How to Tackle Cyber Fatigue.” The paper looked at how employees are affected by overexposure to cybersecurity-related work demands and training. The researchers explored possible explanations, such as overconfidence and complacency resulting from security training, for poorer training outcomes. The study teased out two dimensions of cyber fatigue:

  • Fatigue source (action or advice)
  • Fatigue type (attitudinal and cognitive)

The research created a four-component model allowing an organization to identify the type of cybersecurity fatigue employees experienced and how work processes could be modified to improve cybersecurity behavior.
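
Reading the two dimensions above as a grid gives one way to picture the four-component model: each component is a combination of a fatigue source and a fatigue type. The sketch below (TypeScript, with type and field names of my own invention rather than the paper's) shows how an organization might tally observations against those four combinations:

```typescript
// A rough sketch of the four-component view implied by the two dimensions
// above: fatigue source (action or advice) crossed with fatigue type
// (attitudinal or cognitive). All names here are illustrative, not the
// paper's own terminology.
type FatigueSource = "action" | "advice";
type FatigueType = "attitudinal" | "cognitive";

interface FatigueObservation {
  employeeId: string;
  source: FatigueSource; // what the fatigue stems from
  type: FatigueType;     // how the fatigue shows up
}

// Tally observations into the four source/type components so the dominant
// combination can be identified and work processes adjusted accordingly.
function summarizeFatigue(observations: FatigueObservation[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const obs of observations) {
    const component = `${obs.source}/${obs.type}`;
    counts.set(component, (counts.get(component) ?? 0) + 1);
  }
  return counts;
}
```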

Behavioral cybersecurity and human-centric cybersecurity

Scams increasingly include a large element of social engineering. The 30,000 percent increase in Covid-19-related phishing campaigns during 2020 is a case in point. Many of these scams played on concerns over the virus and used trusted entities such as the U.S. Centers for Disease Control and Prevention and the World Health Organization (WHO) to manipulate the behavior of users, building trust and using fear as a trigger. Human behavior manipulation in all its forms works, as the success of phishing and other cyberthreats that rely on it attests.

Psychology and behavioral science are now firmly recognized as offering scientific principles that can be applied to cybersecurity awareness programs as well as to the design and development of measures that mitigate cyberattacks. The merger of these disciplines has produced a new field: behavioral cybersecurity. Behavioral science facilitates the mapping of risk to behavior and an understanding of how different types of employees perceive risk. The result is a human-centric view of cybersecurity.

See our previous article “Can your personality indicate how you’ll react to a cyberthreat” for more details on personality types and cybersecurity behavior.

Behavioral cybersecurity and cybersecurity awareness programs

By applying the tenets of psychology to cybersecurity awareness programs, an organization can improve the effectiveness of its training:

Social proof

The principle of social proof in psychology is often used by UI engineers to improve user experience. It is defined as “the influence of other people that leads us to conform to be liked and accepted by them.” This builds on the idea of trust propagating across networks of trusted people (or things). Carnegie Mellon’s Human-Computer Interaction Institute conducts research into Social Cybersecurity, looking at applications of social psychology in the adoption of cybersecurity practices and using social proof to “boost awareness, knowledge and motivation to adopt secure behaviors online.”
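
As a small illustration of how social proof could be folded into awareness messaging, a nudge might tell employees how many of their colleagues already follow a secure practice. The sketch below is hypothetical (the function name, wording and 50 percent threshold are my own choices), not taken from the Carnegie Mellon work:

```typescript
// Hypothetical social-proof nudge: show employees how many of their
// colleagues already follow a secure practice, nudging conformity toward
// the secure behavior. Numbers, wording and names are illustrative only.
function socialProofNudge(adopters: number, teamSize: number, behavior: string): string {
  const share = Math.round((adopters / teamSize) * 100);
  // Only lead with the statistic when adoption is already the norm;
  // a low number could instead signal that ignoring the advice is normal.
  if (share >= 50) {
    return `${share}% of your team already ${behavior}. Join them today.`;
  }
  return `Colleagues across the company are starting to ${behavior}. Be an early adopter.`;
}

// Example output: "82% of your team already report suspicious emails. Join them today."
console.log(socialProofNudge(41, 50, "report suspicious emails"));
```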

Cybersecurity fatigue

If trainees become fatigued or uninterested, cybersecurity training will fail. Understanding how different people react to security awareness training can help personalize training programs and make them more effective. The research paper “Encouraging Employee Engagement With Cybersecurity” has some ideas for building more tailored and effective cybersecurity training programs based on human behavior.

Design for good security behavior

Design your security training programs around the known trigger behaviors exploited by cybercriminals. Trust, conditioned behaviors and social influence are all used by cybercriminals to entrap users and manipulate behavior. These same levers can be used as the basis for training users in good cybersecurity behavior.

Empowerment through security education

The scientific evidence is stacking up in favor of using psychology to thwart cyberattacks. An article from the World Economic Forum sums the situation up: “By understanding what drives people’s behavior, we can come up with ideas for how to change it.” When choosing or designing a security awareness training program, look at how it uses behavior and psychology to create more impactful training and nudge cybersecurity behavior in the right direction.

Susan Morrow

Susan Morrow is a cybersecurity and digital identity expert with over 20 years of experience. Before moving into the tech sector, she was an analytical chemist working in environmental and pharmaceutical analysis. Currently, Susan is Head of R&D at UK-based Avoco Secure.

Susan’s expertise includes usability, accessibility and data privacy within a consumer digital transaction context. She was named one of the Most Influential Women in UK Tech 2020 by Computer Weekly and shortlisted by WeAreTechWomen as one of its Top 100 Women in Tech. Susan is on the advisory board of Surfshark and Think Digital Partners, and regularly writes on identity and security for CSO Online and Infosec Resources. Her mantra is to ensure human beings control technology, not the other way around.