The People Aspects of Social Engineering Audits

Lax employee security habits are a major source of breaches and are causing more companies to consider social engineering audits as part of due diligence.

Mary E. Shacklett, President of Transworld Data

March 7, 2024


In 2022, 74% of security breaches involved the human element, according to Verizon. This pattern has inspired more companies to perform social engineering audits that check for security vulnerabilities in user areas. But what happens when you find the holes and must go back to the user departments to take corrective steps?

No one likes telling users there are flaws in their business operations, because it’s natural for people to become defensive and even hostile. This unenviable job is further complicated because many IT staffers, including CIOs, still view social engineering audits as unnecessary expenditures of time and money. After all, aren’t IT security and network penetration and vulnerability audits performed every year? Aren’t they enough?

The short answer is no. General IT security audits, network perimeter checks, and penetration and vulnerability tests are routinely performed every year. Together, they provide something close to a 360-degree view of how your IT security is holding up. What they miss is the human element, which has little to do with how robust the technology itself is.

Security expert and former cyber hacker Kevin Mitnick, who died last year, once explained it well:

“Instead of employing brute force to attack cybersecurity barriers, social engineers are masters of the art of deception. These cunning engineers use the principles of human psychology to build trust with a user -- often someone directly associated with their targeted organization -- knowing that the person may be their ‘in’,” Mitnick wrote.


Phishing attacks, which number in the hundreds of millions annually, can take the form of an email that appears to come from your boss or a co-worker. The employee sees an attachment and opens it, and the bogus attachment unleashes malware into the network that spreads from workstation to workstation and is difficult to stop.

Phishing isn’t the only human activity that can lead to a security breach. There is willful theft of intellectual property and information, such as a customer list that an employee walks out with. There are also simple bad security habits, such as sharing a user ID or password or leaving a workstation unattended for long periods, even overnight. HR and IT try to combat these lax security habits by training new employees and providing refresher security training for existing employees, but there are still security holes that must be plugged.

Finding the Holes

From IT’s standpoint, security might seem fine on the technology side, but there can still be holes in human behavior.


KPMG, one provider of social engineering audits, says these audits are for the purpose of “assessing the level of security awareness of employees.” The social engineering audit consists of information gathering; testing whether activities on the enterprise campus can be carried out without authorization; impersonating parties on telephone calls; inspecting waste bins; running phishing checkups; bringing in infected files to test defenses; and inspecting office data carriers.

It should be noted that many of these audit activities involve physical premises inspections and do not directly involve IT systems. However, it is usually IT that contracts for the audits and is seen as the audit “lead.”
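To make one of those items, the phishing checkup, a little more concrete, below is a minimal, hypothetical sketch in Python of how results from a simulated phishing campaign might be tallied by department. The employees, departments, and fields are invented for illustration; a real audit would pull these records from whatever phishing-simulation platform the auditor uses.

```python
# Hypothetical sketch: tally simulated-phishing results by department so the
# audit report can show where the "holes" are. All data here is invented.
from collections import defaultdict

# Each record: (employee, department, clicked_link, reported_to_security)
campaign_results = [
    ("a.jones", "Finance",    True,  False),
    ("b.smith", "Finance",    False, True),
    ("c.lee",   "Facilities", True,  False),
    ("d.patel", "HR",         False, True),
]

by_dept = defaultdict(lambda: {"sent": 0, "clicked": 0, "reported": 0})
for _, dept, clicked, reported in campaign_results:
    by_dept[dept]["sent"] += 1
    by_dept[dept]["clicked"] += int(clicked)
    by_dept[dept]["reported"] += int(reported)

for dept, stats in sorted(by_dept.items()):
    click_rate = stats["clicked"] / stats["sent"]
    print(f"{dept}: {stats['sent']} test emails, {click_rate:.0%} clicked, "
          f"{stats['reported']} reported to security")
```

A per-department summary like this is also what ends up in the findings discussed below.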

Plugging the Holes

When social engineering security holes are detected, the findings are documented and the departments where the holes were found are listed.

No one likes to be on these lists, but for departments that are, corrective action must be taken, and someone must deliver the findings and work with them. This is where resentment, defensiveness, and hostility can set in. It’s important for IT and others with lead responsibility for administering these audits to be mindful of this.


What steps can you take?

First, social engineering audits are not solely IT’s responsibility. Findings from these audits tend to be weighted almost equally between IT issues and facilities and/or human resources issues. From a project coordination perspective, this means that facilities, IT, HR, and possibly other areas of the company, such as a separate audit or regulatory group, should collaborate on these audits. As a collaborative team, they should also meet with users in any department that is cited for lax security practices. If the company decides to form a steering committee for social engineering audits, it is also smart to reserve a couple of committee seats for end-user managers, who rotate through those seats annually until every user department in the company has had an opportunity to participate.

Second, social engineering audits should be woven into the fabric of routine company safety. With user habits a major source of company security breaches, it is hard to argue against making social engineering audits mandatory and performing them on a biennial basis. If social engineering audits are mandatory, they begin to fit into the general company safety strategy alongside other activities, such as routine fire and earthquake drills.

Of course, no one likes to participate in any of these drills or activities because they disrupt daily operations. Nevertheless, if these activities are integral parts of the company safety strategy, it’s accepted that they are necessary. In this vein, a social engineering audit should be viewed as a required “security health checkup” that is regularly performed and is earnestly supported by the CEO and others at the C-level.

Third, emphasis should be placed on security problems, not on individuals. Even with inter-departmental collaboration and C-suite support, many user managers and users can’t help but take it personally when they are cited for a security hole. Consequently, it’s important to focus on the issues that need to be resolved, and not on the people who might have caused them.

If an office data carrier or waste bin is found to contain sensitive document drafts that anyone can take, a shredder can be installed to fix the problem. If a workstation is left unattended and “open” all night, IT can deploy a procedure that automatically locks or signs off the workstation after 15 minutes of idle time.

Both are methods of issue resolution that avoid slapping the hands of individual users, while still signaling to users where the security holes were.
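As one illustration of how lightweight the second fix can be, here is a minimal sketch, assuming Windows workstations, of a script that enforces a password-protected screen lock after 15 minutes of idle time using the standard Windows screen-saver registry values. In practice, IT would more likely push these settings through Group Policy or an endpoint-management tool than run a script on each machine.

```python
# Minimal sketch (assumes a Windows workstation): enforce a password-protected
# screen lock after 15 minutes of idle time by setting the standard screen-saver
# values in the current user's registry hive. A real deployment would normally
# push these settings through Group Policy or endpoint management instead.
import winreg

IDLE_SECONDS = "900"  # 15 minutes, stored as a string (REG_SZ)

with winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                    r"Control Panel\Desktop",
                    0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "ScreenSaveActive", 0, winreg.REG_SZ, "1")
    winreg.SetValueEx(key, "ScreenSaveTimeOut", 0, winreg.REG_SZ, IDLE_SECONDS)
    winreg.SetValueEx(key, "ScreenSaverIsSecure", 0, winreg.REG_SZ, "1")

print(f"Screen will lock after {int(IDLE_SECONDS) // 60} minutes of idle time.")
```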

Fourth, use as many non-invasive security fixes as you can. User cooperation improves whenever you can minimize disruption to daily operations and business processes. In many cases, this can be done non-invasively.

For example, if a server cage in a manufacturing area is unsecured, badge entry, a locking system, and a camera can be installed for tighter security and monitoring. If workstations are left unattended for long periods of time, IT can automate a process that monitors them and shuts them down after 15 minutes of idle time. If too many users have access to shared network files containing potentially sensitive information, zero-trust security boundaries can be drawn around those files so that only authorized users can reach them.
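To illustrate the shared-file example, here is a minimal, hypothetical sketch of the kind of least-privilege review that would precede drawing those boundaries. The share paths and group names are invented; a real review would export current permissions from the file server or identity provider rather than hard-code them.

```python
# Hypothetical sketch: flag sensitive shares where access has drifted beyond
# the authorized list. Share paths and group names are invented for illustration.

# Who *should* be able to reach each sensitive share.
authorized = {
    r"\\fileserver\finance\payroll": {"payroll-team"},
    r"\\fileserver\sales\customers": {"sales-managers", "crm-admins"},
}

# Who *currently* has access, e.g., as exported from the file server.
current = {
    r"\\fileserver\finance\payroll": {"payroll-team", "all-staff"},
    r"\\fileserver\sales\customers": {"sales-managers", "crm-admins"},
}

for share, allowed in authorized.items():
    excess = current.get(share, set()) - allowed
    if excess:
        print(f"{share}: remove access for {', '.join(sorted(excess))}")
    else:
        print(f"{share}: access matches the authorized list")
```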

In all cases, the goal should be to keep actions as non-invasive to user business operations as possible.

Final Thoughts

Boards, CEOs, corporate stakeholders, business partners and customers all care about corporate responsibility. A critical prong of this responsibility is safeguarding company information and physical assets.

This is where the social engineering audit fits, with its focus on human responsibility for secure business practices. Although most industry regulators do not yet mandate social engineering audits, they may well do so in the future.

It’s not too early for IT to consider how these audits can best be administered, and which approaches work best for fixing the human vulnerabilities and security holes they uncover.

About the Author(s)

Mary E. Shacklett

President of Transworld Data

Mary E. Shacklett is an internationally recognized technology commentator and President of Transworld Data, a marketing and technology services firm. Prior to founding her own company, she was Vice President of Product Research and Software Development for Summit Information Systems, a computer software company; and Vice President of Strategic Planning and Technology at FSI International, a multinational manufacturer in the semiconductor industry.

Mary has business experience in Europe, Japan, and the Pacific Rim. She has a BS degree from the University of Wisconsin and an MA from the University of Southern California, where she taught for several years. She is listed in Who's Who Worldwide and in Who's Who in the Computer Industry.
