By Mark Stamford, Founder and CEO at OccamSec
Aug. 9, 2023 7:43AM GMT-7 Updated Aug. 10, 2023 1:48PM GMT-7
As a cybersecurity professional who has conducted numerous risk assessments and penetration tests, I have seen firsthand how vulnerable organizations and individuals remain despite investments in security awareness training. During social engineering assessments, employees at all levels routinely provide account access, sensitive data, and system credentials that enable attackers to bypass controls. These real-world experiences, combined with an understanding of human psychology and the fallibility of technology, have led me to conclude that traditional security awareness training provides little more than a false sense of security for most organizations.
For example, in 2019, sophisticated phishing emails were used to steal over $4 million from a manufacturing company, bypassing its annual security awareness training. Municipal governments, hospitals, and schools have also been victims of phishing that led to ransomware infections and theft of personally identifiable information, with awareness programs failing to prevent these incidents. A well-known anecdote from Kevin Mitnick's book "The Art of Deception" describes how he obtained passwords and access to an organization during a social engineering audit despite that organization's prior awareness efforts.
Time and again in assessments, we have found that identifying people's basic needs and wants enables targeted attacks. That might mean being friendly with folks in a coffee shop (and ultimately being invited to the in-office happy hour), or using the picture of an attractive woman on a LinkedIn profile, which invariably draws many connection requests, some from people only too eager to provide sensitive security information. Wearing the right outfit to blend in with the smoking crowd at a warehouse has been surprisingly easy and successful. And then there is the old classic: set up a fake email account that matches someone's boss, watch social media to learn when the boss is on vacation, and then send the employee an urgent message saying you need the company's source code sent to that account because your work email is not accessible.
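The fake-boss tactic works partly because a lookalike address is hard to spot at a glance. As a rough illustration of one defensive countermeasure (the domain names below are hypothetical, and this is only a sketch of the kind of check a mail gateway might perform, not a production filter):

```python
# Sketch: flag sender domains that closely resemble a trusted domain.
# All domain names here are hypothetical examples.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
    # after each row, roll the buffer forward
        prev = cur
    return prev[-1]

def is_lookalike(sender: str, trusted_domains: list[str]) -> bool:
    """True if the sender's domain is a near-miss for a trusted domain."""
    domain = sender.rsplit("@", 1)[-1].lower()
    for trusted in trusted_domains:
        d = edit_distance(domain, trusted)
        if 0 < d <= 2:  # close but not identical: suspicious
            return True
    return False

trusted = ["example-corp.com"]
print(is_lookalike("ceo@examp1e-corp.com", trusted))  # True (letter l swapped for digit 1)
print(is_lookalike("ceo@example-corp.com", trusted))  # False (exact match)
```

Checks like this are exactly the sort of layered technical control that catches what a distracted employee will not.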
The problem lies in both human nature and the technical limitations of awareness training. No matter how much people learn about phishing, malware, and cyber risks, human instinct will always override that knowledge during moments of perceived opportunity or threat. When an urgent-seeming email appears or an attachment promises information or access, the "reptilian brain" takes over, causing people to click, download, and disclose sensitive data without considering the hypothetical warnings and examples from their awareness training. The lessons fade from memory, and the cycle repeats.
Most phishing simulation tools and vendor services take a simplistic approach, sending benign templates of fake Amazon receipts or UPS shipping notifications that look obviously suspicious. But real malicious phishing is customized, targeted, and designed to seem authentic, using spoofed sender information and references to the recipient's genuine personal information. Simulated phishing is staged; it is not a realistic representation of how criminals actively manipulate people through carefully crafted deception. Awareness of these rudimentary templates does not actually help identify or prevent sophisticated phishing. People learn to spot the simulations but remain vulnerable to criminal hacking.
Context also matters more than decontextualized awareness training can provide. Hypothetical scenarios and examples do not match the sophistication of customized phishing emails and social engineering tactics. What seems to be an "obvious" threat in training may appear entirely normal at the time of encounter. Only experience provides the instincts to detect subtle signs of malicious manipulation, but regular experience with real-world criminal hacking techniques would be unethical and dangerous. So people remain perpetually vulnerable no matter how much awareness they have.
In addition, the resources consumed to raise employee awareness are substantial but often underestimated. Developing and managing high-quality, dynamic training programs requires enormous investments of time, money, and staffing that detract from other priorities. Awareness training gives the illusion of mitigating risk, but the opportunity cost ultimately weakens an organization's actual controls and defenses by neglecting multi-layered solutions in favor of unrealistic reliance on individuals alone.
Finally, an overemphasis on threats during frequent mandatory training cultivates unhealthy fear and paranoia. Continually warning people about cyber risks, scams, and vulnerabilities trains them to become overly suspicious and pessimistic, leading some to avoid certain digital activities altogether due to a distorted view of the probability and severity of potential harm. Prudent caution is needed, but awareness training often crosses the line into fearmongering, which is counterproductive.
In some cases, organizations invest heavily in awareness training to satisfy compliance requirements, but this is often misguided. Regulations and standards frequently call for "regular user training" as a control, but the motivation behind such guidance is merely to "check the box," not to meaningfully improve security. Like overly complex password policies, awareness training is a relic of a simpler time that continues as a matter of tradition, not evidence-based decision making. What matters most is whether a particular solution works, not just that it exists.
Awareness training cannot work as a primary defense or compensate fully for human fallibility, but compliance pressures can drive overspending and overconfidence in unrealistic program expectations. The result is wasted resources, ingrained vulnerability, and unrecognized risk that inevitably comes to light during a successful social engineering attack, data breach, or other incident. It is disingenuous and potentially legally questionable to represent that compliance with simplistic "regular user training" guidance adequately protects organizational assets or personal data. More focus should be placed on threat analysis, risk measurement, and multi-layered defenses suited to the scale of the problems faced.
In my experience, the shortcomings of awareness training are evident when social engineers can still readily manipulate people and infiltrate systems. Investing heavily in training but neglecting other safeguards merely provides an illusion of safety. For organizations to meaningfully improve their risk management posture, awareness must be balanced and not prioritized above critical controls like software patches, encryption, monitoring, and access control.
High-profile cases make clear that social engineering succeeds when people are vulnerable regardless of preparation or vigilance. No amount of investment in awareness training alone matches the persistence and ingenuity of human threats combined with technical exploits. Individual and organizational security depends upon a combination of controls, realistic assessments of human fallibility, and balanced skepticism about overreliance on any one approach. While not entirely without merit if judiciously applied, awareness training should not be trusted as a primary defense, given that it cannot overcome human nature or eliminate risk. False confidence in awareness as a standalone solution puts many at a disadvantage that they tragically do not recognize until their defenses have already been breached.
Overall, the real-world examples and practical limitations above show that awareness training alone provides a false sense of security. It must be combined with other robust controls to mitigate the risk of social engineering and cyber threats more broadly. By recognizing what people and technology cannot realistically achieve, organizations can make strategic investments in security that rely less on unrealistic expectations for awareness and more on multi-layered solutions suited to the scale and sophistication of the problems they face. Simulated phishing and decontextualized advice do not adequately prepare people for the psychology of real criminal manipulation or the persistence of actors motivated by profit and access. While education has a place, its value depends entirely on how well it is supported by technical controls and on realistic expectations about human weakness.
There are no shortcuts to managing risks that depend so completely on continual adaptation to human nature. Organizations should invest in what works over what might seem sufficient on the surface or easy to justify for compliance reasons alone. When awareness training is judiciously applied and balanced as one piece of a strategic control set, it may still have value. But on its own, it is mostly useless against threats that arise from human vulnerability and criminal ingenuity. By accepting this reality, we can work to implement solutions on a scale that matches our challenges.
About the author
Mark Stamford is a cybersecurity veteran with over 20 years of professional experience in the field, having started his journey in security at the young age of 11. As the founder and CEO of OccamSec - the developers of the market-defining Continuous Threat Exposure Management (CTEM) platform, Incenter - he leads efforts to help organizations worldwide proactively address and minimize exposure to cyber threats. Mark's expertise spans various domains, from hacking networks and applications to conducting intelligence work, making him a leading figure in the ever-evolving realm of cybersecurity.