
REVIEW
published: 30 September 2020
doi: 10.3389/fpsyg.2020.01755

Human Cognition Through the Lens of Social Engineering Cyberattacks

Rosana Montañez 1, Edward Golob 2 and Shouhuai Xu 1*

1 Department of Computer Science, University of Texas at San Antonio, San Antonio, TX, United States
2 Department of Psychology, University of Texas at San Antonio, San Antonio, TX, United States

Social engineering cyberattacks are a major threat because they often serve as a prelude to sophisticated and devastating cyberattacks. Social engineering cyberattacks are a kind of psychological attack that exploits weaknesses in human cognitive functions. Adequate defense against social engineering cyberattacks requires a deeper understanding of what aspects of human cognition are exploited by these cyberattacks, why humans are susceptible to these cyberattacks, and how we can minimize or at least mitigate their damage. These questions have received some amount of attention, but the state-of-the-art understanding is superficial and scattered in the literature. In this paper, we review human cognition through the lens of social engineering cyberattacks. Then, we propose an extended framework of human cognitive functions to accommodate social engineering cyberattacks. We cast existing studies on various aspects of social engineering cyberattacks into the extended framework, while drawing a number of insights that represent the current understanding and shed light on future research directions. The extended framework might inspire future research endeavors toward a new sub-field that can be called Cybersecurity Cognitive Psychology, which tailors or adapts principles of Cognitive Psychology to the cybersecurity domain while embracing new notions and concepts that are unique to the cybersecurity domain.

Keywords: social engineering cyberattacks, cyberattacks, cyberdefenses, human cognition, cognitive psychology, cybersecurity

Edited by: Paul Watters, Independent Researcher, Melbourne, Australia
Reviewed by: Alex Ng, La Trobe University, Australia; Luca Allodi, Eindhoven University of Technology, Netherlands
*Correspondence: Shouhuai Xu, [email protected]
Specialty section: This article was submitted to Cognition, a section of the journal Frontiers in Psychology
Received: 19 January 2020; Accepted: 25 June 2020; Published: 30 September 2020
Citation: Montañez R, Golob E and Xu S (2020) Human Cognition Through the Lens of Social Engineering Cyberattacks. Front. Psychol. 11:1755. doi: 10.3389/fpsyg.2020.01755

1. INTRODUCTION

Social engineering cyberattacks are a kind of psychological attack that attempts to persuade an individual (i.e., victim) to act as intended by an attacker (Mitnick and Simon, 2003; Anderson, 2008). These attacks exploit weaknesses in human interactions and behavioral/cultural constructs (Indrajit, 2017) and occur in many forms, including phishing, scams, fraud, spam, spear phishing, and social media sock puppets (Stajano and Wilson, 2009; Linvill et al., 2019). For example, in the 2016 U.S. election, attackers used so-called social media sock puppets (also known as Russian Trolls), or fictional identities, to influence others' opinions (Linvill et al., 2019). The effectiveness of current security technologies has made social engineering attacks the gateway to exploiting cyber systems. The most sophisticated and devastating cyberattacks often start with social engineering cyberattacks, such as spear phishing, through which the attacker gains access to an enterprise network (Hutchins et al., 2011). Indeed, Mitnick and Simon (2003) describe many ways to gain access to secure systems using social engineering cyberattacks. Research in social engineering has mostly focused on understanding and/or detecting the attacks from a technological perspective (e.g., detecting phishing emails by analyzing email contents). However, there is no systematic

Frontiers in Psychology | www.frontiersin.org 1 September 2020 | Volume 11 | Article 1755


Montañez et al. Human Cognition and Social Engineering

understanding of the psychological components of these attacks, which perhaps explains why these attacks are so prevalent and successful. This motivates the present study, which aims to systematize human cognition through the lens of social engineering cyberattacks. To the best of our knowledge, this study is the first of its kind in filling this void.

1.1. Our Contributions

In this paper, we make the following contributions. First, we advocate treating social engineering cyberattacks as a particular kind of psychological attack. This new perspective may be of independent value, even from a psychological point of view, because it lays a foundation for a field that may be called Cybersecurity Cognitive Psychology, which extends and adapts principles of cognitive psychology to satisfy cybersecurity's needs while embracing new notions and concepts that may be unique to the cybersecurity domain. This approach would pave the way for designing effective defenses against social engineering cyberattacks and for assuring that they are built on psychologically valid assumptions. For example, it may be convenient to assume that individuals are willing to participate in defenses against social engineering cyberattacks or that victims are simply reckless. However, these assumptions are questionable, because most social engineering cyberattacks are crafted to trigger subconscious, automatic responses from victims while disguising these attacks as legitimate requests.

Second, as a first step toward the ultimate Cybersecurity Cognitive Psychology, we propose extending the standard framework of human cognition to accommodate social engineering cyberattacks. This framework can accommodate the literature studying various aspects of social engineering cyberattacks. In particular, the framework leads to a quantitative representation for mathematically characterizing persuasion, which is a core concept in the emerging Cybersecurity Cognitive Psychology and is key for understanding behavior in the traditional framework of human cognition. Some of our findings are highlighted as follows: (i) a high cognitive workload, a high degree of stress, a low degree of attentional vigilance, a lack of domain knowledge, and/or a lack of past experience makes one more susceptible to social engineering cyberattacks; (ii) awareness or gender alone does not necessarily reduce one's susceptibility to social engineering cyberattacks; (iii) cultural background does affect one's susceptibility to social engineering cyberattacks; (iv) the more infrequent the social engineering cyberattacks, the higher the susceptibility to these attacks; (v) for training to be effective, it should capitalize on high-capacity unconscious processing with the goal of creating a warning system that operates in parallel with the user's conscious focus of attention; (vi) it is currently not clear how personality affects one's susceptibility to social engineering cyberattacks; and (vii) more studies, especially quantitative studies, need to be conducted to draw better and more consistent results. In addition to these findings, we propose a range of future research directions, with emphasis on quantifying the effect of model parameters (i.e., the victim's short-term cognition factors, long-term cognition factors, long-term memory, and attacker effort) on the amount of persuasion experienced by the human target.

1.2. Related Work

To the best of our knowledge, we are the first to systematically explore the psychological foundation of social engineering cyberattacks. As discussed in the main body of the present paper, most prior studies focus on social engineering cyberattack or cyberdefense techniques. For example, Gupta et al. (2016) investigate defenses against phishing attacks; Abass (2018) discusses social engineering cyberattacks and non-technical defenses against them. Few prior studies have an aim similar to ours. Salahdine and Kaabouch (2019) review social engineering cyberattacks and mitigation strategies, but they do not discuss factors such as human cognition. Darwish et al. (2012) discuss at a high level the relationship between human factors, social engineering cyberattacks, and cyberdefenses, but they neither examine what makes an individual susceptible to social engineering cyberattacks nor discuss the effect of a victim's psychological and situational conditions (e.g., culture and short-term factors) on the outcome. Pfleeger and Caputo (2012) take a multidisciplinary approach to examining cybersecurity from a Behavioral Science perspective, but they do not offer any systematic framework for looking at human cognition in the context of social engineering cyberattacks.

The scarcity of studies on social engineering cyberattacks may stem from the fact that such studies involve human subjects. In an academic setting, approval for deceptive studies on human subjects requires consent from all entities involved, including the ethics board and the IT department (Vishwanath et al., 2011). The nature of the topic might also raise sensitivities among those involved (Jagatic et al., 2007), which can lengthen the process. This can be discouraging for most researchers.

1.3. Paper Outline

Section 2 reviews a basic framework of human cognition (without the presence of social engineering cyberattacks). Section 3 extends the basic framework to accommodate social engineering cyberattacks and systematizes the victim's cognition through the lens of social engineering cyberattacks, along with future research directions. Section 4 concludes the present paper.

2. OVERVIEW OF HUMAN COGNITION

In this section, we review human cognitive functions prior to the presence of social engineering cyberattacks. This framework of human cognition serves as a basis for exploring how victims' cognitive functions are exploited to wage social engineering cyberattacks.

2.1. Human Cognitive Functions

The term "cognition" can have radically different meanings in different contexts. Here, we use the term "cognition" in the broadest sense, as a descriptive term for the software counterpart to the brain as hardware. That is, cognition is the abstract information processing that is implemented by neurons in the brain (Pinker, 2009). From this perspective, cognition can also include information processing that computes emotions, as well as the vast majority of neural information processing that is not reflected in our conscious awareness (Baars, 1997).


FIGURE 1 | A basic, selective schema of human cognition, where the blue background within the oval indicates long-term memory that is accessible by the four
components of human cognition functions.

Cognitive psychologists often consider information processing to be the basic function of the brain, in the same way that the liver functions as a complex filter and arteries and veins are essentially pipes. Correlates of information processing in the brain can be directly observed using various methods to record electrical and chemical activity (Kandel et al., 2000). Information processing is evident at multiple spatial scales, from compartments within individual neurons to tightly organized networks having members in different parts of the brain. These concrete, physically measurable, neurophysiological activities are analogous to the hardware of a computer. Indeed, neurons have been profitably studied in terms of functioning as Boolean logic gates, and action potentials, perhaps the most characteristic property of neurons, convey all-or-none binary states (Shepherd, 2004).

Figure 1 presents a very basic, and selective, schematic of human cognitive functions, which are centered on four information processing components analogous to software components in an information processing system. These four components, called perception, working memory, decision making, and action, are elaborated as follows. Perception converts information in the world, sampled from the senses, into neural codes that can be used for intelligent behavior and conscious experience (Mesulam, 1998). Working memory consists of attention and short-term memory, and coordinates information processing by prioritizing certain information for short periods of time, often to accomplish a goal (Miyake and Shah, 1999). Decision making further prioritizes information from working memory and other unconscious sources and is a gateway to behavior (Kahneman, 2011). Action is the implementation of computations from decision making, as well as other influences, and also organizes the physical activity of muscles and glands that is measurable as behavior (Franklin and Wolpert, 2011). Perception, working memory, decision making, and action are often considered to be roughly sequential, as when trying to hit a baseball, but can mutually influence each other in many ways. All of these cognitive processes operate on a foundation of accumulated knowledge in memory, which informs these processes, such as when perceiving a familiar face.

Memory is intrinsic to cognition, because information processing occurs over time and thus requires some information to be retained over time. The basic processes of perception, working memory, decision making, and action that are engaged "in the moment" use information that is preserved from earlier moments in time. Memory consists of distinct systems (Tulving and Craik, 2000), in the same way that our domain of "perception" includes the visual, auditory, somatosensory, olfactory, gustatory, and vestibular systems. One important distinction among systems is whether the information is retained over short periods of time, typically seconds to minutes, or over longer periods of time. In our overview shown in Figure 1, short-term memory is a component of working memory. Long-term memory contributes to cognition in general, and for this reason, we have situated all four domains supporting cognition in the moment within long-term memory (indicated by the blue background). As with the other cognitive domains, memory systems can work in parallel. For example, the memory of the previous sentence is supported by short-term memory, yet the memory for what each word means resides in long-term memory.

Above, we presented several basic types of information processing that together generate behavior. We now consider how these basic cognitive processes can be influenced, for better or worse, by a few important factors that are demonstrably relevant to cybersecurity. The "short-term" factors, reflecting the immediate situation, and the "long-term" factors are ultimately coded in some form by the brain and exert an influence on the basic cognitive processes that drive behavior. The short-term and long-term factors are elaborated in the next two subsections.

2.2. Short-Term Cognitive Factors

We focus on three short-term factors: workload, stress, and vigilance. These factors operate on relatively short timescales (minutes to hours) and have been intensively studied because they impair human performance. We will consider how these factors may relate to social engineering and point out the extant literature and promising future directions.

2.2.1. Workload

Human cognition is affected by cognitive workload, which depends on task demand and the operator in question. Depending on the details, two tasks can be done at the same time with little or no performance costs (a manageable workload) or be


nearly impossible to do well together (a very high workload). A nice example comes from Shaffer (1975), who found that typists could very accurately read and type while they also verbally repeated a spoken message. Performance, however, plummeted on both tasks if they tried to take dictation (typing a spoken message) while also trying to read a written message out loud. The differences are thought to reflect the use of phonological (sound-based) and orthographic (visual letter-based) cognitive codes. In the first example, one code is used per task (phonological: listen to speech and talk; orthographic: read and type), while in the second, each code is used for both tasks (listen and type; read and talk). To account for these complexities, psychologists have developed theories that consider different types of cognitive codes (Navon and Gopher, 1979), such as auditory or visual sensory input, higher-level verbal or spatial codes, and output codes for driving speech or manual behaviors (Wickens, 2008). Measures have also been developed to quantify the subjective sense of how much "cognitive work" is being done in a given task. Perhaps the most common instrument for measuring subjective workload is the NASA-TLX, which has six dimensions that are clearly explained to the subject, such as "mental demand" or "temporal demand" (time pressure), each rated on a scale from low to high. Lastly, neurophysiological measures are often used to provide objective, convergent measures of workload as well as to suggest potential neural mechanisms. Commonly used neurophysiological measures include transcranial Doppler measures of blood flow velocity in the brain, EEG measures of brain electrical potentials, autonomic nervous system activity such as skin conductance and heart rate and its variability, and functional magnetic resonance imaging (fMRI), which quantifies changes in blood flow that are secondary to neural activity (Parasuraman and Rizzo, 2008).

2.2.2. Stress

Acute stress may also influence cognition and behaviors that are relevant to cybersecurity. We distinguish acute from chronic stress, with chronic stress beginning after a duration on the order of months; their impacts on cognition can differ, and chronic stress is better classified here as a long-term factor. The neurobiological and hormonal responses to a stressful event have been well-studied, as has their impact on behavior (Lupien et al., 2009). Acute stress can influence attention, a vital component of working memory, in ways that are beneficial as well as detrimental (Al'Absi et al., 2002). Attentional tunneling is one such effect of acute stress, where attention is hyper-focused on aspects relevant to the cause of the stress but is less sensitive to other information. The term tunneling derives from the use of spatial attention tasks, where arousal due to stress leads subjects to ignore things that are more distant from the focus of attention (Mather and Sutherland, 2011). In the realm of cybersecurity, attentional tunneling from an emotion-charged phishing message could lead one to hyper-focus on the email text but ignore a suspicious address or warnings at the periphery. Working memory is also vulnerable to acute stress (Schwabe and Wolf, 2013), particularly by way of interference with prefrontal cortex function (Elzinga and Roelofs, 2005; Arnsten, 2009).

Decision making can be driven in two fundamentally different ways (Evans, 2008). The first is by relatively automatic processes that are fast but may not yield the optimal choice in some instances (termed "heuristics" and "biases") (Tversky and Kahneman, 1974; Gigerenzer, 2008). The second is by conscious, controlled reasoning, which is slower but can be more sensitive to the particulars of a given situation. Acute stress has a variety of effects on decision making, with many subtleties (Starcke and Brand, 2012), but it can, in general, impair rational decision making; one way is by reducing the likelihood of controlled decision making and increasing the use of automatic processing.

2.2.3. Vigilance

Vigilance and sustained attention are two closely related, sometimes synonymous, terms for the concept that cognitive performance systematically changes the longer one performs a given task. Here we will use the term vigilance, which in the laboratory is studied in sessions that typically last 30–60 min. In a classic work by Mackworth (1948), subjects watched an analog clock and responded to intermittent jumps of the clock hand. Much work since then has shown that performance in a wide range of tasks declines substantially over these relatively short periods of time (termed the "vigilance decrement") (Parasuraman and Rizzo, 2008). In our view, the potential impact of the vigilance decrement on behavior is an important factor to explore, because the probability of user error may covary with time on task. For example, the likelihood of downloading malware may increase as users work through their email inbox, particularly if they have limited time. Lastly, we note that although the situational categories of workload, stress, and vigilance are individually important to examine in the realm of cybersecurity, they are also known to interact with each other. For example, a high workload and prolonged vigilance are stressful (Parasuraman and Rizzo, 2008). Another distinction to keep in mind is that many laboratory vigilance tasks are boring and have a low workload. The extent to which the vigilance literature generalizes to other settings, such as an office where workers may have high workloads and stress from complex job demands, is an empirical question worth considering in future cybersecurity studies.

2.3. Long-Term Cognitive Factors

In contrast to short-term factors, which reflect the current situation and can change rapidly, our second grouping of "long-term factors" covers more stable attributes of a person and their experiences that change only gradually. We consider the factors of personality, expertise, age and gender, and culture. We include personality as a long-term factor, even though it can be situation dependent as well (as with short-term factors) (Kenrick and Funder, 1988). These factors offer some predictability of individual behavior in a given situation. In the context of cybersecurity, long-term psychological factors can impact how an individual responds to social engineering attacks.

2.3.1. Personality

To psychologists, "personality" is a technical term that differs somewhat from ordinary usage. It refers to individual differences in thoughts, feelings, and behaviors that are relatively consistent


over time and situations. We say "relatively" because, as noted above, thoughts, feelings, and behaviors are highly dependent on the situation, and lifespan approaches have documented notable changes in personality with age (Donnellan and Robins, 2009). Personality research is dominated by the Big 5 framework of personality domains, which was developed over much of the twentieth century in various forms (Digman, 1997). The Big 5 framework is based on statistical methods (factor analysis) that identify abstract dimensions that can economically account for much of the variance in personality measures. The factors are labeled conscientiousness, agreeableness, neuroticism, openness to experience, and extraversion. For present purposes, the labels of the factors are adequate descriptions of the underlying constructs. Many studies on the relationship between social engineering and personality focus on openness, conscientiousness, and neuroticism, which are thought to have the most impact on susceptibility to social engineering. The factors that comprise the Big 5 framework are the following:

1. Openness: the willingness to experience new things.
2. Conscientiousness: favoring norms, and exhibiting self-control, self-discipline, and competence.
3. Extraversion: being more friendly, outgoing, and interactive with more people.
4. Agreeableness: being cooperative, eager to help others, and believing in reciprocity.
5. Neuroticism: the tendency to experience negative feelings, such as guilt, disgust, anger, fear, and sadness.

2.3.2. Expertise

Expertise is typically limited to a relatively narrow domain and does not transfer to other areas as much as we tend to believe (termed the "transfer problem") (Kimball and Holyoak, 2000). Limited transfer of expertise can be compounded by cognitive illusions such as the Dunning-Kruger effect, which empirically shows that individuals often overestimate their competence relative to their objective performance (Kruger and Dunning, 1999). Similarly, the "illusion of knowledge" shows that people generally know far less about a topic than they believe, as revealed by questioning (Keil, 2003). In the realm of cybersecurity, these and other empirical phenomena underpin user overconfidence. As will be detailed below, narrow expertise about cybersecurity can be beneficial, but computer expertise more generally may not confer security benefits.

2.3.3. Individual Differences

There are many kinds of individual differences; we focus on two: age and sex/gender (others would include role in a company and seniority). In terms of age, there are dramatic changes in the cognitive functions and behavioral capacities of children as they develop (Damon et al., 2006). Considering how youths can safely use computers is a major parenting, education, public policy, and law enforcement challenge. Social engineering attacks can readily take advantage of the cognitive and emotional vulnerabilities of children, and countermeasures are often quite different than with adults (see below). Cognition changes throughout the adult lifespan at a less frenetic pace than in children, but longer-term changes are similarly dramatic (Park and Reuter-Lorenz, 2009; Salthouse, 2012). Declines in fluid intelligence, essentially one's ability to "think on your feet," are particularly dramatic and have wide implications for everyday life (Horn and Cattell, 1967). Overall, there are many changes, some declining with age (fluid intelligence) and others not (Schaie, 2005). Another angle is that age is positively associated with the risk for many neurological disorders that can impair cognition, such as stroke and Alzheimer's disease (Hof and Mobbs, 2001). Age-related neurological disorders are not considered "normal aging," but the potential vulnerability of many elders due to brain disease has long been well-known to criminals. As expected, social engineering attacks are a major problem for this vulnerable population.

Psychology has a long history of studying sex differences, defined by biology (i.e., the presence of two X or one X and one Y chromosome), and gender, which is a social, rather than biological, construct. In terms of basic cognitive functions such as working memory and decision making, which are typically studied in a neutral laboratory context (such as remembering strings of letters, judging categories of pictures, etc.), there are generally little or no differences between sexes and genders. There are a few well-documented exceptions, such as males having an advantage for mental spatial rotations (Voyer et al., 1995). The situation is quite different when examining cognition in the context of social and emotional factors (Cahill, 2006). For our purposes, sex and gender are basic considerations for social engineering attacks, particularly spear phishing, which is tailored to an individual. Our list could include many other types of individual differences that are useful for social engineering attacks, such as socio-economic class, education, personal interests, and job position. We chose to focus on age and sex/gender because they are prominent topics in the cognition literature and important considerations for cybersecurity challenges such as spear phishing.

2.3.4. Culture

In mainstream cognitive psychology, culture is not a prominent variable, as much of the basic literature studies participants in countries that have predominantly western cultures (Arnett, 2008). Nonetheless, a wide variety of studies have shown that cultural differences are evident in many aspects of cognition, such as basic perception, language and thought, attention, and reasoning (Grandstrand, 2013). Culture is an important variable to consider for any social engineering attack. A phishing email, for example, is unlikely to be effective if the message violates norms of the target's culture. We also consider the more specific case of organizational culture in the workplace, because it is highly relevant to employee behavior as it applies to cybersecurity (Bullee et al., 2017). As with all of the other short- and long-term variables that we consider, culture is assumed to interact with the other variables, with particularly large interactions with age, gender, and perhaps personality.
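For readers who wish to operationalize this taxonomy in empirical work, the short-term factors of Section 2.2 (workload, stress, vigilance) and the long-term factors of Section 2.3 (personality, expertise, age and sex/gender, culture) can be collected into a simple subject profile. The sketch below is purely illustrative: the field names, [0, 1] scales, and defaults are our assumptions, not constructs prescribed by the literature reviewed above.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class ShortTermFactors:
    """Situational factors (Section 2.2); the [0, 1] scales are hypothetical."""
    workload: float = 0.0   # e.g., a NASA-TLX composite rescaled to [0, 1]
    stress: float = 0.0     # acute (not chronic) stress level
    vigilance: float = 1.0  # expected to decay with time on task

@dataclass
class LongTermFactors:
    """Stable attributes (Section 2.3)."""
    big5: Dict[str, float] = field(default_factory=dict)  # {"openness": 0.7, ...}
    expertise: float = 0.0  # domain-specific, per the "transfer problem"
    age: int = 0
    gender: str = ""
    culture: str = ""       # national or organizational culture tag

@dataclass
class SubjectProfile:
    """One study participant, combining both factor groups."""
    short_term: ShortTermFactors = field(default_factory=ShortTermFactors)
    long_term: LongTermFactors = field(default_factory=LongTermFactors)
```

Making the factors explicit fields turns the interactions noted above (e.g., workload and prolonged vigilance jointly raising stress) into measurable quantities rather than implicit assumptions.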


FIGURE 2 | Extending the basic schema of human cognition presented in Figure 1 to accommodate social engineering cyberattacks. The extension incorporates an attacker that wages a social engineering cyberattack against a victim's human cognitive functions (i.e., the oval). The resulting behavior is associated with persuasion (i.e., an attack succeeds when a victim is persuaded to act as intended by the attacker).
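The view depicted in Figure 2, where attacker effort acts on a victim's short-term and long-term cognitive factors to produce a degree of persuasion, can be given an illustrative quantitative form. The logistic link, factor weights, and function below are our assumptions for exposition only; the framework itself does not prescribe a specific functional form.

```python
import math

def persuasion(attacker_effort, short_term, long_term):
    """Illustrative (hypothetical) persuasion score in (0, 1).

    attacker_effort: non-negative float (message tailoring, reconnaissance).
    short_term / long_term: dicts of factors on [0, 1] scales.
    Risk factors (workload, stress) carry positive weights; protective
    factors (vigilance, expertise) carry negative weights, matching the
    qualitative direction of the findings summarized in Section 1.1.
    """
    weights = {"workload": 1.0, "stress": 0.8,       # raise susceptibility
               "vigilance": -1.2, "expertise": -1.5}  # lower susceptibility
    exposure = sum(weights.get(k, 0.0) * v
                   for k, v in {**short_term, **long_term}.items())
    # Attacker effort scales how strongly the victim's factor profile
    # converts into persuasion (an assumed, not established, form).
    return 1.0 / (1.0 + math.exp(-attacker_effort * exposure))
```

Under this toy model, a distracted, stressed victim facing a determined attacker yields a score near 1, while high vigilance and expertise pull the score toward 0.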

3. VICTIM COGNITION THROUGH THE LENS OF SOCIAL ENGINEERING CYBERATTACKS

Social engineering cyberattacks are a type of psychological attack that exploits human cognition functions to persuade an individual (i.e., victim) to comply with an attacker's request (Anderson, 2008). These attacks are centered around a social engineering message crafted by an attacker with the intent of persuading a victim to act as desired by the attacker. These attacks often leverage behavioral and cultural constructs to manipulate a victim into making a decision based on satisfaction (gratification) rather than on the best result (optimization) (Kahneman, 2011; Indrajit, 2017). For example, one behavioral construct is that most individuals will trade privacy for convenience, or bargain the release of information for a reward (Acquisti and Grossklags, 2005).

To establish a systematic understanding of the victim's cognition through the lens of social engineering cyberattacks, we propose extending the framework presented in Figure 1 to accommodate social engineering cyberattacks against human victims' cognition functions, leading to the framework highlighted in Figure 2. This implies that the resulting behavior of a victim will also depend on the attacker's effort. In what follows, we cast the social engineering cyberattacks literature into this framework, first discussing the literature related to short-term and long-term cognition factors, and then the literature related to cognition functions.

3.1. Short-Term Cognition Factors Through the Lens of Social Engineering Cyberattacks

3.1.1. Workload
In computer-mediated communications, cognitive workload can affect an individual's ability to process socially engineered messages. Pfleeger and Caputo (2012) observe that cognitive workload can make individuals overlook elements that are not associated with the primary task. This effect, called inattentional blindness, impairs an individual's ability to notice unexpected events when focusing on the primary task (Simons, 2000). In most cases, security is a secondary task. For example, when an employee attempts to manage several tasks simultaneously (e.g., replying to hundreds of emails in the inbox while answering calls and an occasional request from the boss), the employee is more likely to overlook cues in phishing messages that might indicate deception. A study that examined actual phishing behavior by sending employees an innocuous phishing email found that self-perceived work overload was positively associated with the likelihood of clicking on the phishing link (Jalali et al., 2020). Vishwanath et al. (2011) investigate the effect of information processing on users' vulnerability to phishing. Leveraging two phishing attacks that targeted a university, they surveyed undergraduate students on their recollection of, and response to, the phishing emails. They find that in the presence of a perceived relevant email, individuals focus more on urgency cues while overlooking deception cues in the message, such as the sender's email address or the email's grammar/spelling. They also find that individuals who regularly manage large volumes of email are highly inattentive when evaluating emails, making them more vulnerable to phishing attacks, and that a high email load triggers an automatic response, meaning that workload significantly increases a victim's vulnerability to phishing attacks. Summarizing the preceding discussion, we draw:

INSIGHT 1. Cognitive workload, via mechanisms such as inattentional blindness, can increase vulnerability to social engineering cyberattacks.

3.1.2. Stress
The particular kind of stress, namely the acute stress mentioned above, has only been indirectly investigated in the context of social engineering cyberattacks. Stajano and Wilson (2009) examine how the principles of scams apply to systems security. Scams are a form of social engineering cyberattack that usually involves a physical interaction between attacker and victim. One scamming technique is the principle of distraction, by which the attacker takes advantage of a victim who is in a state of mind that prevents them from evaluating deceptive cues. For example, when an unemployed individual pays a job recruiting company
for job hunting assistance, the individual does not realize that it is a scam. Catphishing is a social engineering cyberattack by which the attacker creates a fictional online persona to lure a victim into a romantic relationship for financial gain. In this case, an individual who is searching for a romantic partner and is experiencing some personal stress might find a catphishing message appealing and, therefore, may be unable to detect the deception cues in the catphishing messages. In summary, we draw the following:

INSIGHT 2. Stress may reduce one's ability to detect deception cues in social engineering cyberattack messages, but the direct effects of acute stress on cybersecurity social engineering have not been examined.

3.1.3. Vigilance
Purkait et al. (2014) conduct a study to examine the cognitive and behavioral factors that affect users' capabilities in detecting phishing messages. They study attentional vigilance and short-term memory by surveying 621 participants on their ability to identify phishing sites, their Internet skills, usage and safe practices, and demographics. The measure of "vigilance" was a brief visual search task in six photographs, which did not evaluate vigilance as we conventionally defined it above. Individual differences in these visual search scores were significant predictors of performance in distinguishing spam from phishing websites, which likely reflects the ability to detect the visual cues on a website that distinguish spam sites from phishing sites.

INSIGHT 3. Attentional vigilance, particularly the vigilance decrement, may be an important influence on susceptibility to social engineering attacks, but more research is needed.

3.2. Long-Term Cognition Factors Through the Lens of Social Engineering Cyberattacks

3.2.1. Personality
Personality has been extensively studied in the context of phishing. Studies show that the Big 5 personality traits are related to individuals' susceptibility to social engineering cyberattacks. Pattinson et al. (2012) study how personality traits and impulsiveness affect behavioral responses to phishing messages. They find that individuals who score high on extraversion and openness manage phishing emails better than individuals with other personality types. Halevi et al. (2013) find that high neuroticism increases responses to prize phishing messages and that individuals with high openness have low security settings on their social media accounts, increasing their exposure to privacy attacks. Halevi et al. (2016) find that personality traits affect security attitudes and behaviors as follows: high conscientiousness is associated with highly secure behaviors but does not affect self-efficacy (i.e., one's ability to independently resolve computer security issues); high openness is associated with high self-efficacy; high neuroticism is associated with low self-efficacy; and high emotional stability (the inverse of neuroticism) is associated with high self-efficacy. Cho et al. (2016) contradict some of the findings presented in Halevi et al. (2013), finding that high neuroticism decreases trust and increases risk perception, which makes one more likely to misclassify benign emails as phishing ones. They also find that higher agreeableness increases trust and lowers risk perception (i.e., makes one more likely to classify phishing messages as benign). Conscientiousness is commonly associated with self-control, which diminishes impulsive behavior (Cho et al., 2016). Pattinson et al. (2012) find that less impulsive individuals manage phishing messages better. Halevi et al. (2015) show that individuals with high conscientiousness and lower risk perception are more likely to fall victim to social engineering cyberattack messages. Lawson et al. (2018) find that extroversion decreases phishing detection accuracy while high conscientiousness increases detection accuracy, and that openness is associated with higher accuracy in detecting legitimate messages. Darwish et al. (2012) find that individuals high in extraversion and agreeableness pose a higher security risk. McBride et al. (2012) find that conscientiousness is associated with low self-efficacy and threat severity. Workman (2008) and Lawson et al. (2018) find that personality traits are related to the degree of persuasion by social engineering cyberattacks. Summarizing the preceding discussion, we draw the following:

INSIGHT 4. Literature results are not conclusive on how personality may influence one's susceptibility to social engineering cyberattacks.

3.2.2. Expertise
Related to expertise, the impacts of domain knowledge, awareness, and experience on reducing one's susceptibility to social engineering cyberattacks have been studied in the literature.

Impact of domain knowledge. An individual's knowledge related to cyberattacks increases their capability to resist social engineering cyberattacks. For example, the knowledge can be about web browsers, including how to view site information and evaluate certificates. Kumaraguru et al. (2006) find that (i) non-expert individuals consider fewer security indicators (e.g., meaningful signals) than experts; (ii) non-expert individuals use simple rules to determine the legitimacy of a request, while experts also consider other useful information (e.g., context) that may reveal security concerns with the request; (iii) non-expert individuals make decisions based on their emotions, while experts make their decisions based on reasoning; and (iv) non-expert individuals rely more on (spoofable) visual elements to make decisions because they lack the knowledge that security indicators can be compromised, while experts are more efficient at identifying suspicious elements in a message. For example, corresponding to (iii), they observe that a non-expert individual might decide to download a software program based on how much they want it and whether the downloading website is recognizable, whereas an expert might consider how much they need it and whether the downloading website is a reputable source. These findings resonate with what is found by Klein and Calderwood (1991), namely that experts make decisions based on pattern recognition rather than purely analyzing the available options. Byrne et al. (2016) find that risk perception for non-expert individuals is influenced by the benefit that can be obtained from an activity, meaning that actions an individual considers beneficial are performed more often and are perceived as less risky.
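The contrast that Kumaraguru et al. (2006) draw between non-experts' simple surface rules and experts' context-aware evaluation can be caricatured in code. The sketch below is our illustration only; the cue names, parameters, and decision logic are hypothetical and are not taken from the cited study. The "non-expert" rule trusts familiarity and desire, while the "expert" rule weighs need, source reputation, verifiable indicators, and suspicious elements.

```python
# Illustrative contrast between a non-expert's surface rule and an
# expert's context-aware evaluation of a software-download request.
# All cue names and the decision logic are hypothetical examples.

def non_expert_decides(site_is_recognizable: bool, wants_it: bool) -> bool:
    # Simple rule: familiar-looking site plus desire for the software
    # is enough to download (finding (ii) above).
    return site_is_recognizable and wants_it

def expert_decides(site_is_reputable_source: bool,
                   certificate_valid: bool,
                   needs_it: bool,
                   suspicious_elements: int) -> bool:
    # Experts weigh need rather than want, verify indicators that can
    # actually be checked (e.g., certificates), and treat any suspicious
    # element as a veto (findings (iii) and (iv) above).
    return (needs_it and site_is_reputable_source
            and certificate_valid and suspicious_elements == 0)

# A spoofed-but-recognizable site fools the surface rule, not the expert check:
fooled = non_expert_decides(site_is_recognizable=True, wants_it=True)
caught = expert_decides(site_is_reputable_source=False, certificate_valid=False,
                        needs_it=True, suspicious_elements=2)
```

The point of the sketch is structural: the non-expert rule consults two spoofable cues, whereas the expert rule consults a larger, partly verifiable indicator set, mirroring findings (i)-(iv).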

INSIGHT 5. Domain knowledge helps reduce vulnerability to social engineering cyberattacks.

Impact of awareness. As a rule of thumb, training for non-expert individuals often emphasizes awareness. In a study of victims of frauds involving phishing and malware incidents, Jansen and Leukfeldt (2016) find that most participants express that they have knowledge of cybersecurity, but it turns out only a few of them indeed have the claimed knowledge. Downs et al. (2006) find that awareness of security cues in phishing messages does not translate into secure behaviors because most participants are unable to tie their actions to detrimental consequences. On the other hand, it may seem intuitive that individuals who have received formal computer education would be less vulnerable to social engineering cyberattacks. To the contrary, Ovelgönne et al. (2017) find that software developers are involved in more cyberattack incidents when compared to others. Purkait et al. (2014) find that there is no relationship between one's ability to detect phishing sites and one's education and technical background, Internet skills, and hours spent online. Halevi et al. (2013), Junger et al. (2017), and Sheng et al. (2010) find that knowledge acquired through priming and warning does not affect one's susceptibility to phishing attacks.

INSIGHT 6. Awareness and general technical knowledge do not necessarily reduce one's susceptibility to social engineering cyberattacks, perhaps because human cognition functions have not been taken into consideration.

Impact of experience. Harrison et al. (2016) find that knowledge about phishing attacks increases one's attention and elaboration when combined with subjective knowledge and experience, and therefore lowers one's susceptibility to falling victim to social engineering cyberattack messages. Wang et al. (2012) find that knowledge about phishing attacks increases one's attention to deception indicators. Pattinson et al. (2012) find that the higher the familiarity with computers, the higher the capability of coping with phishing messages. Wright and Marett (2010) find that (i) a combination of knowledge and training is effective against phishing attacks; (ii) individuals with lower self-efficacy (i.e., one's ability to manage unexpected events) and less web experience are more likely to fall victim to social engineering cyberattacks; and (iii) individuals with high self-efficacy are less likely to comply with information requests presented in phishing attacks. Halevi et al. (2016) find that high self-efficacy correlates with a better capability to respond to security incidents. Arachchilage and Love (2014) find that self-efficacy, when combined with knowledge about phishing attacks, can lead to effective strategies for coping with phishing attacks. Wright and Marett (2010) find that experiential factors (e.g., self-efficacy, knowledge, and web experience) have a bigger effect on individuals' responses to phishing attacks than dispositional factors (e.g., the disposition to trust and risk perception). Van Schaik et al. (2017) find that a higher risk perception of online threats is associated with exposure to knowledge that is specific to the threat. Downs et al. (2006) find that users can detect social engineering cyberattacks that are similar to the ones they have been exposed to. Redmiles et al. (2018) find that the more time an individual spends online, the more skilled they are at identifying spam, and the less likely they are to click on the links in spam messages. Gavett et al. (2017) find that education and previous experience with phishing attacks increase suspicion of phishing sites. Cain et al. (2018) find that past security incidents do not significantly affect secure behaviors. Abbasi et al. (2016) find that (i) older, educated females and males who fell victim to phishing attacks in the past are less likely to fall victim to phishing attacks again; (ii) young females with low phishing awareness and previous experience of suffering small losses caused by phishing attacks do not necessarily have a lower susceptibility to phishing attacks in the future; and (iii) young males with high self-efficacy, phishing awareness, and previous experiences of phishing attacks also do not necessarily have a lower susceptibility to phishing attacks in the future.

INSIGHT 7. Self-efficacy, knowledge, and previous encounters with social engineering cyberattacks collectively reduce one's susceptibility to social engineering cyberattacks. In particular, costly phishing experiences greatly reduce one's susceptibility to social engineering cyberattacks, while non-costly experiences do not.

3.2.3. Individual Differences
Two kinds of individual differences have been investigated in the context of social engineering cyberattacks: gender and age.

Impact of gender. Initial studies suggest a relationship between gender and phishing susceptibility. Hong et al. (2013) find that individual differences (e.g., dispositional trust, personality, and gender) are associated with the ability to detect phishing emails. Halevi et al. (2015) find that for women, there is a positive correlation between conscientiousness and clicking on links and downloading files associated with phishing attacks. Halevi et al. (2013) find that women exhibit a strong correlation between neurotic personality traits and susceptibility to phishing attacks, but no correlation to any personality trait is found for men. Halevi et al. (2016) report that women exhibit lower self-efficacy than men. Sheng et al. (2010) find that women are more likely to fall victim to phishing attacks, particularly women with low technical knowledge.

However, later studies provide a different view. Sawyer and Hancock (2018) find that there is no relationship between gender and phishing detection accuracy. Similarly, Purkait et al. (2014) find that there is no relationship between gender and the ability to detect phishing sites. Byrne et al. (2016) find that there is no relationship between gender and risk perception. Rocha Flores et al. (2014) find that there is no significant correlation between phishing resiliency and gender. Bullee et al. (2017) find that gender does not contribute to phishing message responses. Abbasi et al. (2016) find that (i) women with high self-efficacy have a low susceptibility to social engineering cyberattacks, while women without awareness of the social engineering cyberattack threat have a high susceptibility to these attacks; and (ii) men with previous costly experiences of phishing attacks have a low susceptibility to these attacks, while overconfidence increases the susceptibility to these attacks. Cain
et al. (2018) find that although men may have more knowledge about cybersecurity than women, there is no difference in insecure behaviors by gender. Redmiles et al. (2018) show that, in the context of social media spam, gender affects message appeal but not susceptibility to social engineering cyberattacks, and that women are more likely to click on sales-oriented spam while men are more likely to click on media spam that features pornography and violence. Goel et al. (2017) find that women open more messages about prize rewards and course registration than men. Rocha Flores et al. (2014) find that gender affects the type of phishing message an individual will respond to and that women are less susceptible than men to generic phishing messages.

INSIGHT 8. Gender does not have a big impact on the susceptibility to social engineering cyberattacks.

Impact of age. Most studies focus on the age groups of young people (18-24) and old people (45+). In general, youth is related to inexperience, high emotional volatility (Zhou et al., 2016), less education, less exposure to information on social engineering, and fewer years of experience with the Internet. These factors are often accompanied by a low aversion to risk and therefore can increase the chances of falling victim to social engineering cyberattacks (Sheng et al., 2010). In an experiment involving 53 undergraduate students in the age group of 18-27, Hong et al. (2013) find that the students' confidence in their ability to detect phishing messages does not correlate with their detection rate. Sheng et al. (2010) investigate the relationship between demographics and susceptibility to phishing attacks and find that individuals in the age group of 18-25 are more susceptible to phishing attacks than groups older than 25. Lin et al. (2019) report a similar result but for an old group. Howe et al. (2012) find that age also affects risk perception: individuals in the age groups of 18-24 and 50-64 perceive themselves to be at lower security risk compared to other groups and therefore are more susceptible to social engineering cyberattacks. Purkait et al. (2014) find that the detection of phishing messages decreases with age and with the frequency of online transactions. Bullee et al. (2017) find that age has no effect on spear-phishing responses and that Years of Service (YoS) is a better indicator of victimization (i.e., a greater YoS means one is less likely to be susceptible to social engineering cyberattacks). Gavett et al. (2017) examine the effect of age on phishing susceptibility and show that processing speed and planning executive functions affect phishing suspicion, hinting at a relationship between phishing susceptibility and cognitive degradation from aging.

INSIGHT 9. Older people with higher education, higher awareness, and higher exposure to social engineering cyberattacks are less susceptible to these attacks.

3.2.4. Culture
Culture affects individuals' online activities (Sample et al., 2018), decision-making processes and uncertainty assessment (Chu et al., 1999), development of biases and risk perception (Pfleeger and Caputo, 2012; da Veiga and Martins, 2017), reactions to events (Hofstede et al., 2010; Rocha Flores et al., 2014), and self-efficacy (Sheng et al., 2010; Halevi et al., 2016). Redmiles et al. (2018) suggest that country/communal norms might affect spam consumption as follows: in countries where spam is more prevalent, users are 59% less likely to click on spam when compared to countries where spam is less prevalent. Halevi et al. (2016) find that individuals with higher risk perception have stronger privacy attitudes, which reduce the susceptibility to social engineering cyberattacks. Al-Hamar et al. (2010) perform experimental spear-phishing attacks against two groups from Qatar, where one group consists of 129 employees of a company (dubbed employees) and the other of 30 personal acquaintances (dubbed friends); they find that 44% of the individuals in the employees group are successfully phished, while 57% of the friends group are. Tembe et al. (2014) report that participants from India exhibit a higher susceptibility to phishing attacks when compared with participants from the USA and China. Bullee et al. (2017) report that participants from China and India might not be aware of the harms and consequences of phishing attacks, while participants from the USA exhibit more awareness of privacy and online security features (i.e., SSL padlocks) and are more active in safeguarding their personal information. Halevi et al. (2016) find that although culture is a significant predictor of privacy attitude, it does not predict security behavior and self-efficacy. Bohm (2011) finds that culturally sound messages do not raise suspicion. Farhat (2017) and Hofstede et al. (2010) show that scams with a culture-specific shame appeal are more likely to be effective in a certain culture. Bullee et al. (2017) find that participants from countries with a higher Power Distance Index (PDI), meaning that individuals are more likely to comply with hierarchy, are more vulnerable to phishing than individuals from countries with a lower PDI. Sharevski et al. (2019) show how to leverage cultural factors to tailor message appeal.

INSIGHT 10. Culture affects privacy and trust attitudes, which indirectly affect one's susceptibility to social engineering cyberattacks.

3.3. Victim Cognition Functions Through the Lens of Social Engineering Cyberattacks

3.3.1. Long-Term Memory
As reviewed in section 2.1, long-term memory is a very broad field, of which the following aspects have been studied through the lens of social engineering cyberattacks. The first aspect is the frequency of attacks. The environment in which a victim operates provides a context that may be exploited by an attacker. For example, the attacker may leverage an ongoing societal incident or personal information to craft messages that make the victim trust these messages. The attacker attempts to build trust with a victim, while noting that suspicion thwarts the attack (Vishwanath et al., 2018). Both trust and suspicion are affected by the environment, such as the frequency of the social engineering events exploited by the socially engineered messages. For example, in a situation where social engineering cyberattacks are expected, the attacker is at a disadvantage (Redmiles et al., 2018); in a
situation where social engineering cyberattacks are infrequent, the attacker has the advantage. Sawyer and Hancock (2018) investigate how the infrequent occurrence of phishing events (i.e., the prevalence of phishing) affects individuals' abilities to detect cyberattacks delivered over email. In their experiment, they ask three groups to identify malicious and legitimate email messages, where the three groups, respectively, receive 20, 5, and 1% malicious emails. They find that the accuracy of detecting malicious emails is lowest for the group dealing with emails of which 1% are malicious. Similarly, Kaivanto (2014) shows that a lower probability of phishing occurrence increases a victim's susceptibility to phishing cyberattacks.

INSIGHT 11. The success of social engineering cyberattacks is inversely related to their prevalence.

Insight 11 causes a dilemma: when automated defenses are effective at detecting and filtering most social engineering cyberattacks, the remaining attacks that make it through to users are more likely to succeed. One approach to dealing with this dilemma is to resort to principles in Cognitive Psychology. It is known that most of the brain's information processing is sealed off from conscious awareness (Nisbett and Wilson, 1977), some permanently, while other information could be consciously appreciated but may not be conscious at a given moment. Our visual system, for example, computes 3-D depth from 2-D retinal inputs (DeValois and DeValois, 1990). We do not consciously experience the calculations needed to transform the 2-D input into a 3-D percept. Instead, we are aware of the product (i.e., seeing a 3-D world) but not the process that led to the product. The influences of subconscious processing are well-known to impact behavior (Kahneman, 2011; Nosek et al., 2011). This fact leads to the following insight:

INSIGHT 12. Training methods that ask people to consciously think about social engineering cyberattacks are unlikely to be very successful unless the learning reaches the point where it is a habit that, largely unconsciously, guides safer computer use behavior.

Insight 12 would avoid the dilemma mentioned above because when the training/learning effort reaches the point that users can deal with social engineering cyberattacks subconsciously, users can effectively defend against these attacks. This coincides with the findings of Halevi et al. (2015), Rocha Flores et al. (2014), Halevi et al. (2016), Howe et al. (2012), and Sheng et al. (2010), namely that users with higher risk perception can reduce the chance that they fall victim to social engineering cyberattacks.

3.3.2. Victim Cognition Functions: A Preliminary Mathematical Representation
The framework described in Figure 2 formulates a way of thinking in modeling how the behavior of a victim is influenced by the victim's short-term cognition factors (or short_term_factors), long-term cognition factors (or long_term_factors), and long-memory (or long_memory), as well as the attacker's effort (or attacker_effort). This formulation is applicable to phishing, spear phishing, whaling, water holing, scams, angler phishing, and other kinds of social engineering attacks, where the resulting behavior is whether a victim is persuaded by the attacker to act as intended. For example, spear-phishing is a special case of the model because the attacker often makes a big effort at enhancing message appeal by exploiting personalization; a scam is another special case of the model because the attacker often makes a big effort at enhancing message appeal by exploiting the situational setting and possibly stress. In principle, the behavior (behavior) of social engineering cyberattacks can be described by some mathematical function f (mathematically speaking, it will more likely be a family of functions):

behavior = f(short_term_factors, long_term_factors, long_memory, attacker_effort).   (1)

Note that f mathematically abstracts and represents the interactions between the four cognitive domains operating on long-memory (i.e., perception, working memory, decision making, and action), while also taking short-term and long-term factors into account. Moreover, f accommodates the attacker's effort as input. It is an outstanding research problem to identify the appropriate abstractions for representing these model parameters and what kinds of mathematical functions f are appropriate to what kinds of social engineering attacks. These questions need to be answered using experimental studies. Note that the framework can be expanded to include measures of brain activity, either direct measures such as electroencephalography, or indirect peripheral measures such as eye tracking and autonomic nervous system activity (Valecha et al., 2020).

Specific to the cybersecurity domain, we propose considering persuasion-related behavior, as shown in Figure 2 and the corresponding mathematical representation of Equation (1), meaning that the outcome in Equation (1) can be replaced by the probability that a user is persuaded by the attacker to act as the attacker intended. Intuitively, persuasion is the act of causing someone to change their attitudes, beliefs, or values based on reasoning or argument. Wright et al. (2014) define Cialdini's Principles of Persuasion, which have been extensively used to study the response to social engineering messages (but not social engineering cyberattacks). Table 1 presents a brief summary of Cialdini's Principles of Persuasion.

Intuitively, the mathematical function f in Equation (1) should accommodate or reflect Cialdini's Principles of Persuasion. Although the state-of-the-art does not allow us to draw insights into how these Principles would quantitatively affect the form of f, we can still draw some insights from existing studies, as discussed below.

van der Heijden and Allodi (2019) study the relation between phishing attack success and Cialdini's Principles of Persuasion using enterprise emails from a financial institution. They find that the phishing emails that received the most responses ("clicks") are those that use the consistency and scarcity principles mentioned above. They also find that emails with more cognitive elements (e.g., proper grammar, personalization, and persuasion elements) receive the most responses. In a related study, Lin et al. (2019) find that younger individuals are more susceptible to phishing messages that use the scarcity and authority principles, while
older individuals are more susceptible to phishing emails that use the reciprocity and liking principles. Rajivan and Gonzalez (2018) find that the most successful phishing message strategies are notification messages, authoritative messages, friend request messages, shared-interest messages, and offers of assistance with a failure. These strategies map to Cialdini's Principles of liking, authority, and unity. Lawson et al. (2018) find that socially engineered messages that use the authority and scarcity principles are considered more suspicious than those that use the liking principle.

There have been proposals to augment Cialdini's Principles to better represent the psychological vulnerabilities that have been exploited by social engineering cyberattacks. Ferreira and colleagues (Ferreira and Lenzini, 2015; Ferreira et al., 2015) present five Principles of Persuasion by combining (i) Cialdini Principles of Persuasion; (ii) Stajano's study on

INSIGHT 13. The representation of the mathematical function f in Equation (1) should adequately reflect the Principles of Persuasion.

Since quantitatively describing the mathematical function f, as demanded in Insight 13, is beyond the scope of the state-of-the-art, in what follows we explore some qualitative properties of these mathematical functions, showing how an increase (decrease) in a model parameter would affect the outcome behavior (more precisely, persuasion) of a victim. These qualitative observations also need to be quantitatively verified by future experimental studies.

3.3.3. Impact of Attacker Effort on Victim Behavior
As shown in Equation (1), the attacker can affect victim behavior through the attacker-effort variable, which can be reflected by the attacker's message quality and message appeal with respect to the victim in question. In terms of the impact of message quality, Downs et al. (2006) find that most individuals rely on superficial elements when determining whether a message is legitimate, without knowing that most of those elements can be spoofed. Jansen and Leukfeldt (2016) find that in online banking frauds involving phishing or malware, most victims report that the fraudulent stories in phishing emails or phone calls appeared to be trustworthy. Message quality appears to increase social engineering cyberattack

TABLE 1 | Summary of Cialdini's Principles of Persuasion.

Liking: The act of saying yes to something you know and like; for example, a social engineer presenting himself as helpful and empathetic toward the victim in a password reset process.
Reciprocity: Repaying an earlier action in kind; for example, conveying to a victim that they have detected suspicious activities in the victim's credit card account while encouraging the victim to reset the password with their assistance.
Social Proof: The use of endorsement; for example, stating that, due to recent suspicious activities, new security requirements have been issued and must be complied with by all account holders.
Consistency: Leveraging the desire of individuals to be consistent with their words, beliefs, and actions; for example, reminding users that they have to comply with a password reset policy as they have previously done.
Authority: Responding to others with more experience, knowledge, or power; for example, an email signed by a Senior Vice President of a bank requesting customers to reset their account passwords.
Scarcity: Something being valuable when it is perceived to be rare or available for a limited time; for example, giving a user 24-h notice before deactivating the user's account.
Unity: Shared identity between the influencer and the influenced.

The principle of Unity was introduced in Cialdini (2016) but has not been studied in social engineering research; it is presented here for the purpose of completeness.

TABLE 2 | Summary of the five newly proposed principles of persuasion that would better represent the psychological vulnerabilities exploited by social engineering cyberattacks (Ferreira et al., 2015).

Authority: Obeying the pretense of authority or performing a favor for an authority.
Social Proof: Mimicking the behavior of the majority of people.
Liking, Similarity, and Deception (LSD): Obeying someone a victim knows/likes, someone similar to the victim, or someone the victim finds attractive.
Commitment, Reciprocity, and Consistency (CRC): Making a victim act as committed, assuring consistency between saying and doing, or returning a favor.
Distraction: Focusing on what a victim can gain, need, or lose/miss out on.
scams and how distraction, social compliance, herd, dishonesty, success. Wang et al. (2012) find that visceral triggers and
kindness, need and greed, and time affect the persuasive deception indicators affect phishing responses. Visceral triggers
power of scam messages (Stajano and Wilson, 2009); and (iii) increases phishing responses, whereas deception indicators
Gragg’s psychological triggers on how strong affect or emotion, have the opposite effect, reducing phishing response. Similarly,
overloading, reciprocation, deceptive relationships, diffusion of Vishwanath et al. (2011) find that individuals use superficial
responsibility and moral duty, authority, and integrity and message cues to determine their response to phishing messages.
consistency can influence an individual’s response to social The study reports that urgency cues make it less likely for
engineered messages (Gragg, 2003). Table 2 presents a summary an individual to detect deception cues. One common method
of these newly proposed five principles. of trustworthiness is through the use of visual deception,
Guided by the newly proposed principles, Ferreira and Lenzini which is effective because most individuals associate professional
(2015) conduct experiments, using phishing emails, to show that appearance and logos with a website or message been legitimate.
distraction (e.g., fear of missing out, scarcity, strong affection, Visual deception involves the use of high-quality superficial
overloading, and time) is the most prevalent phishing tactic, attributes (e.g., legitimate logos, professional design, and name
followed by authority and LSD. spoofing). Hirsh et al. (2012) find that phishing messages that use
Summarizing the preceding discussion, we draw: visual deception have a higher victim response rate. Jakobsson

Frontiers in Psychology | www.frontiersin.org 11 September 2020 | Volume 11 | Article 1755


Montañez et al. Human Cognition and Social Engineering

(2007) observes that for most individuals, the decision to trust a website or not is based on site content and not on the status of the site's security indicators (e.g., the use of security certificates, HTTPS). He also notes that most users could not detect subtle changes in URLs (e.g., a malicious www.IUCU.com vs. a benign www.IUCU.org). Dhamija et al. (2006) conduct a study on malicious website identification. Using malicious and legitimate websites with professional appearance, with the difference that malicious sites display security indicators (e.g., a missing padlock icon on the browser, a warning on the site's digital certificate), they find that 23% of their participants fail to identify fraudulent sites 40% of the time. This group of participants are asked to assess a website's legitimacy based on its appearance (e.g., website design and logos), and 90.9% of participants fail to identify a high-quality malicious website that uses visual deception (i.e., URL spoofing replacing letters in a legitimate URL, for example, "W" for "vv").

On the other hand, message appeal is associated with the benefit an individual derives from complying with a request. Halevi et al. (2013, 2015) find that many individuals that fall for social engineering cyberattacks ignore the risk of their actions because they focus on the potential benefit that the phishing email offers. Message appeal has the most weight on social engineering susceptibility. An example of this is the Nigerian scam, also known as the "419" scam. Herley (2012) notes that although the scam is well-known and information on the scam is readily available online, individuals still fall victim to it because the message is designed to appeal to the most gullible. Two techniques that are commonly used to increase message appeal are contextualization, also known as pretexting, and personalization.

• Contextualization is a variation of message framing where the sender provides details or discusses topics relevant to the group to vouch for the victim's membership in the group. Luo et al. (2013) conduct a study on phishing victimization with contextualization in the message. In the experiment, they use work benefits and compensation as a pretext in an email to University staff. They find that individuals interpret high-quality content and argument messages (e.g., well-written, persuasive messages) as originating from a credible sender. Basing the message argument on a topic that is common within a community (i.e., contextualization) gives the message the appearance of originating from a known person within a group. Using this technique, they are able to achieve 15.24% victimization in 22 min by combining pretexting and message quality. Similarly, Goel et al. (2017) examine the effect of message contextualization (i.e., pretexting) on phishing message opening and compliance rates. They find that highly contextualized messages that target issues relevant to the victim are more successful. They also find that messages with higher perceived loss have higher success rates than those with high perceived gains. Rajivan and Gonzalez (2018) also find that phishing messages on work-related or social communication topics (e.g., friend requests) have a higher success rate than messages requesting a password change or offering a deal.
• Personalization is another framing technique in which a message is tailored to the preferences of the victim (Hirsh et al., 2012), such as friendship appeals or expressing similar interests. In a phishing experiment, Jagatic et al. (2007) find that adding personal data found in social networks to phishing emails increased the response rate from 16 to 72%. Rocha Flores et al. (2014) find that targeted, personalized phishing messages are more effective than generic messages. They find that the phishing emails that receive the most responses are those perceived to come from a known source. Bullee et al. (2017) also find that emails using a personalized greeting line were 1.7 times more likely to be responded to when compared with emails with generic greeting lines.

Summarizing the preceding discussion, we draw:

INSIGHT 14. Message quality and message appeal, which reflect attacker effort (e.g., using contextualization and personalization), have a significant impact on the attacker's success.

3.3.4. Countermeasures Against Social Engineering Cyberattacks
There have been some studies on defending against social engineering cyberattacks. First, it is intuitive that effective training should heighten a victim's sense of threat, because individuals are then more cautious and sensitive to detecting elements that might indicate deception. Along this line, Wright and Marett (2010) find that suspicion of humanity was a dispositional factor that increases the detection of deception in phishing messages, more so than risk beliefs and trust. Pattinson et al. (2012) also find that when individuals participating in experiments are aware that a phishing attack is involved, they perform better at detecting phishing emails. Second, Tembe et al. (2014) find that individuals from the U.S. have higher suspicion and caution attitudes toward online communications when compared to individuals from China and India. Third, Vishwanath et al. (2018) find that habitual patterns of email use and deficient self-regulation reduce viewers' suspicion. Moreover, detecting deception cues decreases social engineering susceptibility. Kirmani and Zhu (2007) find that detecting persuasion cues in a message activates suspicion and generates a negative response to the message. Fourth, Canfield et al. (2016) find that individuals' inability to detect phishing messages increases their susceptibility, regardless of their cautionary behavior. For suspicion to be effective in reducing social engineering susceptibility, the risk must outweigh the benefit of complying with the message (e.g., message appeal) without affecting decision performance (i.e., accuracy, precision, and negative prediction) (Cho et al., 2016). Goel et al. (2017) find that suspicion alone does not prevent phishing victimization, because they report that individuals that are suspicious about email messages can still fall victim to social engineering cyberattacks.

Summarizing the preceding discussion, we draw the following:

INSIGHT 15. An individual's capabilities in detection of social engineering cyberattacks are affected by their awareness of the


threats, their cultural backgrounds, and their individual differences in trust/suspicion.

Insight 15 highlights that there is no silver-bullet solution to countering social engineering cyberattacks. On the contrary, effective defense must take into consideration the differences between individuals, because they are susceptible to social engineering cyberattacks to different degrees.

For more effective defenses against social engineering cyberattacks, the following aspects need to be systematically investigated in the future.

• Achieving effective human-machine interactions in defending against social engineering cyberattacks. Effective defense would require (i) detecting message elements that attempt to increase recipients' trust and (ii) increasing recipients' suspicion of messages. In either approach, we would need human-machine teaming in detecting social engineering cyberattacks, highlighting the importance of effective defenses against social engineering cyberattacks.
• Improving users' immunity to social engineering cyberattacks. To improve users' immunity to social engineering cyberattacks, we first need to investigate how to quantify their immunity. In order to improve users' immunity to social engineering cyberattacks, we need to enhance users' protection motivation and capabilities in detecting deceptive cues. One approach is to enhance the user interface design to highlight the security alerts/indicators in email systems and web browsers, because their current designs are not effective (Downs et al., 2006; Kumaraguru et al., 2006; Schechter et al., 2007; Abbasi et al., 2016). Along this direction, the user interface must highlight security alerts/indicators with specific and quantified severity of threats to the user. The current user-interface design appears to mainly focus on usability and user experience while assuming away the presence of (social engineering) cyberattacks by default. This design premise needs to be changed to treating the presence of (social engineering) cyberattacks as the default. In order to enhance users' capabilities in detecting deceptive cues, we need to design new techniques to enhance our capabilities in automatically detecting or assisting users to detect social engineering cyberattacks. Automatic detection has been pursued by previous studies, which, however, only focus on examining certain message elements that are known to be associated with previous social engineering cyberattacks (i.e., signature-based detectors); these signatures can be easily evaded by attackers. In order to possibly detect new social engineering cyberattacks, future detectors should incorporate cognitive psychology elements, such as the quantification of messages' persuasiveness and deceptiveness. In order to design automated techniques to assist users in detecting social engineering cyberattacks, human-machine interaction is an important issue.
• Achieving human-centric systems design with quantifiable cybersecurity gain. Modern systems design, including security systems design, often focuses on optimizing performance without considering how humans would introduce vulnerabilities while interacting with the system (i.e., assuming away that humans are often the weakest link). One approach to addressing this problem is to change the designers' mindset to treat users of these systems as the weakest link. This can be achieved by, for example, using security designs that are simpler and less error-prone, while considering the worst-case scenario that the users may have a high cognitive load when using these systems. How to quantify users' vulnerability to social engineering attacks is an outstanding problem, because it paves the way to quantifying the cybersecurity gain of a better design when compared with a worse design.
• Designing effective training to enhance users' self-efficacy. Training is an important mechanism for defending against social engineering cyberattacks. However, it is a challenging task to design effective training. One approach to addressing the problem is to routinely expose users to specific social engineering cyberattacks (e.g., messages that have been used by attackers). Another approach is to insert sanitized social engineering attacks (e.g., phishing emails without malicious payload) into users' routine activities to trigger users' responses to social engineering cyberattacks (e.g., the feedback will point out to the user in question whether the user correctly processed the message). This also effectively increases users' perception of social engineering cyberattacks, making them appear more frequent than they are.
• Understanding and quantifying the impact of short-term factors in social engineering cyberattacks. We observe that, as discussed above, few studies have examined the effect of short-term factors in social engineering cyberattacks. Short-term factors are known to affect cognition and behavior profoundly in other contexts. As highlighted in Equation (1) and discussed above, we stress the importance of defining and quantifying social engineering cyberattack metrics, which are largely missing and will become an indispensable component of the broader family of cybersecurity metrics as discussed in Pendleton et al. (2016), Cho et al. (2019), and Xu (2019).

TABLE 3 | Relationships between future research directions and insights.
Human-machine interactions: Reduce trust (Insights 1, 2, 10, 14); Increase suspicion (Insights 3, 5, 7, 9)
Immunity to social engineering cyberattacks: Identify and quantify underlying causes of immunity (Insights 5, 7, 9); Security and UI design (Insights 1, 2, 3); Improve message detection (Insights 10, 13, 14)
Human-centric system design: Incorporate psychological state of computer user during system design (Insights 1, 2, 3, 11)
Designing effective training: Training based on susceptibility elements (Insights 4, 6, 8, 10, 11, 12, 14, 15)
Understanding and quantifying the impact of short-term factors: Increase research focus on short-term factors' impact in security (Insights 1, 2, 3)

Table 3 highlights the five future research directions and their relationships to the insights.
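To make the idea of quantifying a message's persuasiveness concrete, the following minimal sketch is our own toy illustration of the kind of persuasion-cue scoring a future detector might perform; it is not an implementation from any cited study, and the cue lexicon, principle labels, and threshold are invented assumptions for illustration only.

```python
# Illustrative sketch only: a toy persuasion-cue scorer of the kind the
# text envisions for future detectors. The cue lexicon and the threshold
# are invented assumptions, not validated phishing signatures.
CUE_LEXICON = {
    "authority": ["senior vice president", "compliance", "official notice"],
    "scarcity": ["24 hours", "expires", "limited time"],
    "consistency": ["as you have previously", "per our policy"],
    "social proof": ["all account holders", "most users"],
    "distraction/urgency": ["urgent", "act now", "don't miss out"],
}

def persuasion_profile(message: str) -> dict:
    """Count how many cues associated with each principle appear."""
    text = message.lower()
    return {principle: sum(cue in text for cue in cues)
            for principle, cues in CUE_LEXICON.items()}

def flag_message(message: str, threshold: int = 3) -> bool:
    """Flag the message if its total cue count reaches the threshold."""
    return sum(persuasion_profile(message).values()) >= threshold

phish = ("URGENT: per our policy, all account holders must reset their "
         "password within 24 hours. Act now. -- Senior Vice President")
print(persuasion_profile(phish))  # per-principle cue counts
print(flag_message(phish))        # True for this cue-dense message
```

A real detector would of course need a validated lexicon (or a learned model) and calibrated thresholds; the sketch only shows how per-principle cue counts could turn the qualitative principles into a quantitative signal.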


FIGURE 3 | Our speculation of the impact of various factors on one's susceptibility to social engineering cyberattacks: a factor on the left-hand side means that substantially increasing the factor (e.g., expertise) will decrease one's susceptibility, with further left indicating a larger decrease in susceptibility; a factor on the right-hand side means that substantially increasing the factor (e.g., workload) will increase one's susceptibility, with further right indicating a larger increase in susceptibility; gender has little or no effect on one's susceptibility.
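The qualitative ordering depicted in Figure 3 can be sketched as a toy instance of the mathematical function envisioned in Equation (1). The logistic form and every weight value below are our illustrative assumptions that merely encode the figure's left-to-right ordering; they are not empirically estimated parameters.

```python
# Toy sketch encoding Figure 3's ordering as a logistic susceptibility
# model. All weights are illustrative assumptions: positive weights
# increase susceptibility, negative weights decrease it, and magnitudes
# mirror the speculated ordering in Figure 3.
import math

WEIGHTS = {
    "workload": 1.5,          # speculated largest increase
    "attack_effort": 1.2,
    "stress": 0.8,
    "gender": 0.0,            # little or no effect (Insight 8)
    "awareness": -0.8,
    "vigilance": -1.0,        # treated as equal to domain knowledge
    "domain_knowledge": -1.0,
    "expertise": -1.5,        # speculated largest decrease
}

def susceptibility(factors: dict) -> float:
    """Map standardized factor levels (higher = more of the factor)
    to a susceptibility value in (0, 1)."""
    z = sum(WEIGHTS[name] * level for name, level in factors.items())
    return 1.0 / (1.0 + math.exp(-z))

baseline = {name: 0.0 for name in WEIGHTS}
stressed_novice = dict(baseline, workload=1.0, stress=1.0, attack_effort=1.0)
trained_expert = dict(stressed_novice, expertise=1.0, vigilance=1.0,
                      domain_knowledge=1.0, awareness=1.0)
print(f"stressed novice: {susceptibility(stressed_novice):.2f}")
print(f"trained expert:  {susceptibility(trained_expert):.2f}")
```

Such a model would need both its functional form and its weights to be estimated from experiments; the sketch only shows how the figure's speculation could be turned into a testable, quantitative hypothesis.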

3.4. Further Discussion
First, we observe that the 15 insights mentioned above are all qualitative rather than quantitative. Moreover, the factors are typically investigated standalone. Furthermore, even the qualitative effects are discussed in specific scenarios, meaning that they may not be universally true. Summarizing most of the insights mentioned above, the state-of-the-art understanding is the following: (i) cognitive workload, stress, and attack effort increase one's vulnerability to social engineering cyberattacks; (ii) the effect of vigilance, personality, awareness, and culture remains to be investigated to be conclusive; (iii) domain knowledge, (certain kinds of) experience, and age (together with certain other factors) reduce one's vulnerability to social engineering cyberattacks; and (iv) gender may not have a significant effect on one's vulnerability to social engineering cyberattacks.

Figure 3 depicts our speculation of the impact of these factors on one's susceptibility to social engineering cyberattacks. Specifically, we suspect that expertise can decrease one's susceptibility to the largest extent among the factors, because expertise equips one with the capability to detect the deceptive cues that are used by social engineering cyberattacks. We suspect that vigilance and domain knowledge have a significant, but smaller, impact on reducing one's susceptibility, because it is perhaps harder for an expert to fall victim to social engineering cyberattacks. Since there is no evidence to show which one of these two factors would have a bigger impact than the other, we subjectively treat them as if they have the same impact. We suspect awareness would have a significant impact on reducing one's susceptibility, despite the fact that the literature does not provide any evidence. According to Insight 8 (which is drawn from a body of literature reviewed above), gender has little or no impact. We suspect that stress would decrease one's capability in detecting deception cues, but attack effort would have an even more significant impact on increasing one's susceptibility. We suspect that workload may have the biggest impact on one's susceptibility, because it would substantially reduce one's ability in detecting deception cues. For other factors like age and culture, we suspect that their impacts might have to be considered together with other factors, explaining why we do not include them in Figure 3. In summary, our understanding of the factors that have impacts on humans' susceptibility to social engineering cyberattacks is superficial. This was indeed one of our motivations for proposing the mathematical framework outlined in Equation (1).

Second, it is a fascinating research problem to fulfill the quantitative framework envisioned in this paper, because its fulfillment will permit us to identify cost-effective, if not optimal, defense strategies against social engineering cyberattacks. This will also help identify the most important factors. However, we suspect that the optimal defense strategies will vary with, for example, different combinations of short-term factors and long-term factors. For example, we suspect that the importance of factors may be specific to attack scenarios. This is supported by two very recent studies: van der Heijden and Allodi (2019) observed that certain short-term and long-term factors (e.g., workload) may be exploited to wage phishing attacks because malicious emails can coincide with high email volume; and Jalali et al. (2020) showed that certain short-term and long-term factors (e.g., high workload and lack of expertise) are two important factors against medical workers. For example, Insight 6 says that awareness and general technical knowledge do not necessarily reduce one's susceptibility to social engineering cyberattacks; however, this may not hold when taking awareness and human cognition functions into consideration together. In other words, we can speculate that the effect of considering one factor alone and the effect of considering multiple interacting factors together may be different. This phenomenon is also manifested by Insight 7, showing that self-efficacy, knowledge, and previous encounters of social engineering cyberattacks collectively reduce one's susceptibility to social engineering cyberattacks. In particular, costly phishing experiences would greatly reduce one's susceptibility to social engineering cyberattacks, while non-costly experiences do not. Put another way, a certain previous encounter may or may not have a big effect when considered together with other factors.

Third, Insight 12 says that effective training should not ask people to consciously think about social engineering cyberattacks, but should make people formulate an unconscious habit in coping with these attacks. This points out an important research direction on designing future training systems.

Fourth, studies in the context of social engineering cyberattacks inevitably involve human subjects, meaning that ethical aspects of these studies must be taken into adequate


consideration when designing such experiments, and an IRB approval must be sought before conducting any such experiment. For ethical considerations in phishing experiments, we refer to Finn and Jakobsson (2007) for a thorough treatment.

4. CONCLUSION

We have presented a framework for systematizing human cognition through the lens of social engineering cyberattacks, which exploit weaknesses in humans' cognition functions. The framework is extended from standard cognitive psychology to accommodate components that emerge from the cybersecurity context. In particular, the framework leads to a representation of a victim's behavior, or more precisely, the degree to which a victim is persuaded by an attacker to act as the attacker intended, as some mathematical function(s) of many aspects, including the victim's cognition functions and the attacker's effort. We articulate a number of directions for future research. We hope that this mathematical representation will guide future research endeavors toward a systematic and quantitative theory of Cybersecurity Cognitive Psychology.

AUTHOR CONTRIBUTIONS

All authors contributed to the article and approved the submitted version.

FUNDING

This paper was supported in part by the Mitre Basic Education Assistance Program (BEAP), ARO Grant #W911NF-17-1-0566, and NSF Grants #1814825 and #1736209. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. The opinions expressed in the paper are those of the authors and do not reflect the funding agencies' policies in any sense.

ACKNOWLEDGMENTS

We thank the reviewers for their constructive comments, which have guided us in improving the paper. In particular, the entire section 3.4 is inspired by the reviewers' comments.

REFERENCES

Abass, I. A. M. (2018). Social engineering threat and defense: a literature survey. J. Inform. Secur. 9:257. doi: 10.4236/jis.2018.94018
Abbasi, A., Zahedi, F. M., and Chen, Y. (2016). "Phishing susceptibility: the good, the bad, and the ugly," in 2016 IEEE Conference on Intelligence and Security Informatics (ISI) (Tucson: IEEE), 169–174. doi: 10.1109/ISI.2016.7745462
Acquisti, A., and Grossklags, J. (2005). Privacy and rationality in individual decision making. IEEE Secur. Privacy 3, 26–33. doi: 10.1109/MSP.2005.22
Al'Absi, M., Hugdahl, K., and Lovallo, W. R. (2002). Adrenocortical stress responses and altered working memory performance. Psychophysiology 39, 95–99. doi: 10.1111/1469-8986.3910095
Al-Hamar, M., Dawson, R., and Guan, L. (2010). "A culture of trust threatens security and privacy in Qatar," in 2010 10th IEEE International Conference on Computer and Information Technology (Bradford: IEEE), 991–995. doi: 10.1109/CIT.2010.182
Anderson, R. J. (2008). Security Engineering: A Guide to Building Dependable Distributed Systems, 2nd Edn. New York, NY: Wiley Publishing.
Arachchilage, N. A. G., and Love, S. (2014). Security awareness of computer users: a phishing threat avoidance perspective. Comput. Hum. Behav. 38, 304–312. doi: 10.1016/j.chb.2014.05.046
Arnett, J. J. (2008). The neglected 95%: why American psychology needs to become less American. Am. Psychol. 63:602. doi: 10.1037/0003-066X.63.7.602
Arnsten, A. F. (2009). Stress signalling pathways that impair prefrontal cortex structure and function. Nat. Rev. Neurosci. 10:410. doi: 10.1038/nrn2648
Baars, B. J. (1997). In the theatre of consciousness. Global workspace theory, a rigorous scientific theory of consciousness. J. Conscious. Stud. 4, 292–309. doi: 10.1093/acprof:oso/9780195102659.001.1
Bohm, M. (2011). Why Russians Don't Smile. Available online at: https://themoscowtimes.com/articles/why-russians-dont-smile-6672
Bullee, J.-W., Montoya, L., Junger, M., and Hartel, P. (2017). Spear phishing in organisations explained. Inform. Comput. Secur. 25, 593–613. doi: 10.1108/ICS-03-2017-0009
Byrne, Z. S., Dvorak, K. J., Peters, J. M., Ray, I., Howe, A., and Sanchez, D. (2016). From the user's perspective: perceptions of risk relative to benefit associated with using the internet. Comput. Hum. Behav. 59, 456–468. doi: 10.1016/j.chb.2016.02.024
Cahill, L. (2006). Why sex matters for neuroscience. Nat. Rev. Neurosci. 7:477. doi: 10.1038/nrn1909
Cain, A. A., Edwards, M. E., and Still, J. D. (2018). An exploratory study of cyber hygiene behaviors and knowledge. J. Inform. Secur. Appl. 42, 36–45. doi: 10.1016/j.jisa.2018.08.002
Canfield, C. I., Fischhoff, B., and Davis, A. (2016). Quantifying phishing susceptibility for detection and behavior decisions. Hum. Factors 58, 1158–1172. doi: 10.1177/0018720816665025
Cho, J.-H., Cam, H., and Oltramari, A. (2016). "Effect of personality traits on trust and risk to phishing vulnerability: modeling and analysis," in 2016 IEEE International Multi-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA) (San Diego, CA: IEEE), 7–13.
Cho, J.-H., Xu, S., Hurley, P. M., Mackay, M., Benjamin, T., and Beaumont, M. (2019). STRAM: measuring the trustworthiness of computer-based systems. ACM Comput. Surv. (New York, NY), 51. doi: 10.1145/3277666
Chu, P. C., Spires, E. E., and Sueyoshi, T. (1999). Cross-cultural differences in choice behavior and use of decision aids: a comparison of Japan and the United States. Organ. Behav. Hum. Decis. Process. 77, 147–170. doi: 10.1006/obhd.1998.2817
Cialdini, R. (2016). Pre-suasion: A Revolutionary Way to Influence and Persuade. New York, NY: Simon and Schuster.
da Veiga, A., and Martins, N. (2017). Defining and identifying dominant information security cultures and subcultures. Comput. Secur. 70, 72–94. doi: 10.1016/j.cose.2017.05.002
Damon, W., Lerner, R. M., Kuhn, D., and Siegler, R. S. (2006). Handbook of Child Psychology, Cognition, Perception, and Language, Vol. 2. Hoboken, NJ: John Wiley & Sons.
Darwish, A., El Zarka, A., and Aloul, F. (2012). "Towards understanding phishing victims' profile," in 2012 International Conference on Computer Systems and Industrial Informatics (Sharjah: IEEE), 1–5. doi: 10.1109/ICCSII.2012.6454454
DeValois, R. L., and DeValois, K. K. (1990). Spatial Vision, Vol. 14. Oxford University Press. doi: 10.1093/acprof:oso/9780195066579.001.0001
Dhamija, R., Tygar, J. D., and Hearst, M. (2006). "Why phishing works," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (New York, NY: ACM), 581–590. doi: 10.1145/1124772.1124861
Digman, J. M. (1997). Higher-order factors of the big five. J. Pers. Soc. Psychol. 73:1246. doi: 10.1037/0022-3514.73.6.1246
Donnellan, M. B., and Robins, R. W. (2009). "The development of personality across the lifespan," in The Cambridge Handbook of Personality Psychology, eds P. J. Corr and G. Matthews (Cambridge: Cambridge University Press), 191. doi: 10.1017/CBO9780511596544.015
Downs, J. S., Holbrook, M. B., and Cranor, L. F. (2006). "Decision strategies and susceptibility to phishing," in Proceedings of the Second Symposium on Usable Privacy and Security (Pittsburgh, PA: ACM), 79–90. doi: 10.1145/1143120.1143131
Elzinga, B. M., and Roelofs, K. (2005). Cortisol-induced impairments of working memory require acute sympathetic activation. Behav. Neurosci. 119:98. doi: 10.1037/0735-7044.119.1.98
Evans, J. S. B. (2008). Dual-processing accounts of reasoning, judgment, and social cognition. Annu. Rev. Psychol. 59, 255–278. doi: 10.1146/annurev.psych.59.103006.093629
Farhat, N. F. N. (2017). Scam Alert - Blackmail Email. Available online at: https://www.linkedin.com/pulse/scam-alert-blackmail-email-ned-farhat
Ferreira, A., Coventry, L., and Lenzini, G. (2015). "Principles of persuasion in social engineering and their use in phishing," in International Conference on Human Aspects of Information Security, Privacy, and Trust (Los Angeles, CA: Springer), 36–47. doi: 10.1007/978-3-319-20376-8_4
Ferreira, A., and Lenzini, G. (2015). "An analysis of social engineering principles in effective phishing," in 2015 Workshop on Socio-Technical Aspects in Security and Trust (Verona: IEEE), 9–16. doi: 10.1109/STAST.2015.10
Finn, P., and Jakobsson, M. (2007). Designing ethical phishing experiments. IEEE Technol. Soc. Mag. 26, 46–58. doi: 10.1109/MTAS.2007.335565
Franklin, D. W., and Wolpert, D. M. (2011). Computational mechanisms of sensorimotor control. Neuron 72, 425–442. doi: 10.1016/j.neuron.2011.10.006
Gavett, B. E., Zhao, R., John, S. E., Bussell, C. A., Roberts, J. R., and Yue, C. (2017). Phishing suspiciousness in older and younger adults: the role of executive functioning. PLoS ONE 12:e0171620. doi: 10.1371/journal.pone.0171620
Gigerenzer, G. (2008). Why heuristics work. Perspect. Psychol. Sci. 3, 20–29. doi: 10.1111/j.1745-6916.2008.00058.x
Goel, S., Williams, K., and Dincelli, E. (2017). Got phished? Internet security and human vulnerability. J. Assoc. Inform. Syst. 18:2. doi: 10.17705/1jais.00447
Gragg, D. (2003). A multi-level defense against social engineering. SANS Reading Room 13, 1–21. doi: 10.1093/acprof:oso/9780199253890.003.0002
Grandstrand, O. (2013). "Cultural differences and their mechanisms," in The
Society Annual Meeting (Los Angeles, CA: SAGE Publications), 1012–1016. doi: 10.1177/1541931213571226
Horn, J. L., and Cattell, R. B. (1967). Age differences in fluid and crystallized intelligence. Acta Psychol. 26, 107–129. doi: 10.1016/0001-6918(67)90011-X
Howe, A. E., Ray, I., Roberts, M., Urbanska, M., and Byrne, Z. (2012). "The psychology of security for the home computer user," in Proceedings of the 2012 IEEE Symposium on Security and Privacy, SP '12 (Washington, DC: IEEE Computer Society), 209–223. doi: 10.1109/SP.2012.23
Hutchins, E. M., Cloppert, M. J., and Amin, R. M. (2011). Intelligence-driven computer network defense informed by analysis of adversary campaigns and intrusion kill chains. Lead. Issues Inform. Warfare Secur. Res. 1:80.
Indrajit, R. E. (2017). Social engineering framework: understanding the deception approach to human element of security. Int. J. Comput. Sci. Issues 14, 8–16. doi: 10.20943/01201702.816
Jagatic, T. N., Johnson, N. A., Jakobsson, M., and Menczer, F. (2007). Social phishing. Commun. ACM 50, 94–100. doi: 10.1145/1290958.1290968
Jakobsson, M. (2007). The human factor in phishing. Privacy Secur. Cons. Inform. 7, 1–19.
Jalali, M. S., Bruckes, M., Westmattelmann, D., and Schewe, G. (2020). Why employees (still) click on phishing links: investigation in hospitals. J. Med. Internet Res. 22:e16775. doi: 10.2196/16775
Jansen, J., and Leukfeldt, R. (2016). Phishing and malware attacks on online banking customers in the Netherlands: a qualitative analysis of factors leading to victimization. Int. J. Cyber Criminol. 10:79. doi: 10.5281/zenodo.58523
Junger, M., Montoya, L., and Overink, F.-J. (2017). Priming and warnings are not effective to prevent social engineering attacks. Comput. Hum. Behav. 66(Suppl. C), 75–87. doi: 10.1016/j.chb.2016.09.012
Kahneman, D. (2011). Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux.
Kaivanto, K. (2014). The effect of decentralized behavioral decision making on system-level risk. Risk Anal. 34, 2121–2142. doi: 10.1111/risa.12219
Kandel, E. R., Schwartz, J. H., and Jessell, T. M. (2000). Principles of Neural Science, Vol. 4. New York, NY: McGraw-Hill.
Keil, F. C. (2003). Folkscience: coarse interpretations of a complex reality. Trends
Oxford Handbook of Cognitive Psychology, ed D. Reisberg (Oxford: Oxford Cogn. Sci. 7, 368–373. doi: 10.1016/S1364-6613(03)00158-X
University Press), 970–985. Kenrick, D. T., and Funder, D. C. (1988). Profiting from controversy:
Gupta, S., Singhal, A., and Kapoor, A. (2016). “A literature survey on social lessons from the person-situation debate. Am. Psychol. 43:23.
engineering attacks: phishing attack,” in 2016 International Conference on doi: 10.1037/0003-066X.43.1.23
Computing, Communication and Automation (ICCCA) (Greater Noida: IEEE), Kimball, D. R., and Holyoak, K. J. (2000). “Transfer and expertise,” in The Oxford
537–540. doi: 10.1109/CCAA.2016.7813778 Handbook of Memory, eds E. Tulving, and F. I. M. Craik (New York, NY:
Halevi, T., Lewis, J., and Memon, N. (2013). “A pilot study of cyber security Oxford University Press), 109–122.
and privacy related behavior and personality traits,” in Proceedings of the 22nd Kirmani, A., and Zhu, R. (2007). Vigilant against manipulation: the effect of
International Conference on World Wide Web (Singapore: ACM), 737–744. regulatory focus on the use of persuasion knowledge. J. Market. Res. 44,
doi: 10.1145/2487788.2488034 688–701. doi: 10.1509/jmkr.44.4.688
Halevi, T., Memon, N., Lewis, J., Kumaraguru, P., Arora, S., Dagar, N., et al. Klein, G. A., and Calderwood, R. (1991). Decision models: some lessons from the
(2016). “Cultural and psychological factors in cyber-security,” in Proceedings field. IEEE Trans. Syst. Man Cybernet. 21, 1018–1026. doi: 10.1109/21.120054
of the 18th International Conference on Information Integration and Web- Kruger, J., and Dunning, D. (1999). Unskilled and unaware of it: how difficulties
based Applications and Services, iiWAS ’16 (New York, NY: ACM), 318–324. in recognizing one’s own incompetence lead to inflated self-assessments. J. Pers.
doi: 10.1145/3011141.3011165 Soc. Psychol. 77:1121. doi: 10.1037/0022-3514.77.6.1121
Halevi, T., Memon, N., and Nov, O. (2015). Spear-phishing in the wild: a real-world Kumaraguru, P., Acquisti, A., and Cranor, L. F. (2006). “Trust modelling
study of personality, phishing self-efficacy and vulnerability to spear-phishing for online transactions: a phishing scenario,” in Proceedings of the 2006
attacks. SSRN Electron. J. doi: 10.2139/ssrn.2544742. International Conference on Privacy, Security and Trust: Bridge the Gap
Harrison, B., Svetieva, E., and Vishwanath, A. (2016). Individual processing of Between PST Technologies and Business Services (Markham, ON: ACM), 11.
phishing emails: how attention and elaboration protect against phishing. Online doi: 10.1145/1501434.1501448
Inform. Rev. 40, 265–281. doi: 10.1108/OIR-04-2015-0106 Lawson, P. A., Crowson, A. D., and Mayhorn, C. B. (2018). “Baiting the hook:
Herley, C. (2012). “Why do Nigerian scammers say they are from Nigeria?” in exploring the interaction of personality and persuasion tactics in email phishing
WEIS (Berlin). attacks,” in Congress of the International Ergonomics Association (Florence:
Hirsh, J., Kang, S., and Bodenhausen, G. (2012). Personalized persuasion: tailoring Springer), 401–406. doi: 10.1007/978-3-319-96077-7_42
persuasive appeals to recipients’ personality traits. Psychol. Sci. 23, 578–581. Lin, T., Capecci, D. E., Ellis, D. M., Rocha, H. A., Dommaraju, S., Oliveira, D. S.,
doi: 10.1177/0956797611436349 et al. (2019). Susceptibility to spear-phishing emails: Effects of internet user
Hof, P. R., and Mobbs, C. V. (2001). Functional Neurobiology of Aging. Amsterdam: demographics and email content. ACM Trans. Comput. Hum. Interact. 26:32.
Elsevier. doi: 10.1145/3336141
Hofstede, G. H., Hofstede, G. J., and Minkov, M. (2010). Cultures and Linvill, D. L., Boatwright, B. C., Grant, W. J., and Warren, P. L. (2019).
Organizations: Software of the Mind, 3rd Edn. McGraw-Hill. “The Russians are hacking my brain!” investigating Russia’s internet
Hong, K. W., Kelley, C. M., Tembe, R., Murphy-Hill, E., and Mayhorn, C. research agency twitter tactics during the 2016 United States presidential
B. (2013). “Keeping up with the joneses: assessing phishing susceptibility campaign. Comput. Hum. Behav. 99, 292–300. doi: 10.1016/j.chb.2019.
in an email task,” in Proceedings of the Human Factors and Ergonomics 05.027

Frontiers in Psychology | www.frontiersin.org 16 September 2020 | Volume 11 | Article 1755


Montañez et al. Human Cognition and Social Engineering

examination of vulnerability and resistance. Inform. Syst. Res. 25, 385–400. Conflict of Interest: The authors declare that the research was conducted in the
doi: 10.1287/isre.2014.0522 absence of any commercial or financial relationships that could be construed as a
Wright, R. T., and Marett, K. (2010). The influence of experiential and potential conflict of interest.
dispositional factors in phishing: an empirical investigation of the
deceived. J. Manage. Inform. Syst. 27, 273–303. doi: 10.2753/MIS0742-12222 Copyright © 2020 Montañez, Golob and Xu. This is an open-access article distributed
70111 under the terms of the Creative Commons Attribution License (CC BY). The use,
Xu, S. (2019). “Cybersecurity dynamics: a foundation for the science of distribution or reproduction in other forums is permitted, provided the original
cybersecurity,” in Proactive and Dynamic Network Defense, Vol. 74, eds author(s) and the copyright owner(s) are credited and that the original publication
Z. Lu and C. Wang (Cham: Springer International Publishing), 1–31. in this journal is cited, in accordance with accepted academic practice. No use,
doi: 10.1007/978-3-030-10597-6_1 distribution or reproduction is permitted which does not comply with these terms.

