
Model behavior: User education in the workplace


Informing employees about best practices is still crucial to an enterprise security plan, reports Alan Earls.

Security company Mandiant recently outed the sophisticated hackers it identified as affiliated with the Chinese government, blaming them for the theft of hundreds of terabytes of information from more than 140 organizations over a period of years. Many of these so-called advanced persistent threats (APTs) are based on software that has quietly found a home within unsuspecting enterprises. Fundamentally, such APTs represent a technical quandary. However, training corporate insiders to be more resistant to security threats is also a continuing challenge for organizations of all sizes. 

According to security pros, organizations are and should be concerned about the way APTs and less sophisticated attacks often leverage human vulnerability – particularly a staff's susceptibility to phishing attacks and the gullibility that leads individuals to open malware files or visit malicious sites. And, while education is an imperfect solution – some, like security veteran Bruce Schneier, argue resources are better spent elsewhere, such as building more secure systems – many believe there is still no substitute for the continued education of end-users to instill awareness and, therefore, make them more resistant to threats.

Insiders have more access to vital assets than an outsider trying to break in, says Sam Erdheim, senior security strategist at AlgoSec, a Boston-based company focused on network security policy management. Education is important, he says, because it can greatly reduce risks related to social engineering, an age-old tactic employed to manipulate people into divulging proprietary information. Erdheim recalls a recent incident when his aunt received an email that purported to be from him. “However, she was able to recognize that something was wrong and actually contacted me to tell me my account had been hacked,” he says.

Beyond social engineering, there are other serious risks involving employees, such as the loss of storage and mobile devices. “Employees who aren't alert to the danger might find a ‘lost' memory device and decide to use it, not realizing that it could contain malware,” Erdheim says.

Although education is important, one-off reminders or burying policies and training in an HR manual isn't very useful, he says. Instead, the lessons need to be continually reinforced. For example, he says, companies have used posters, quizzes and games to get people's attention and make the training stick. “No one likes to be lectured to, so it needs to be a little more exciting,” Erdheim says. “Doing it in a way that will grab people's attention helps, as does having a security team that is a little more proactive – putting out alerts about known and emerging threats, for example.”

Erdheim says there is no magic number for how often to conduct training activities – it depends on the organization and its culture. “I think at least once a quarter would be a minimum, and maybe doing it in different ways so each time you communicate, it is fresh, and not just the same old lecture,” he says.

Furthermore, he says, there should be some component that addresses potential malicious activity by insiders, too. Highlighting HR agreements and potential enforcement actions is key. Management also needs to ensure that employees who separate from the company immediately have their credentials revoked and surrender any confidential material.

Some experts are consistently amazed at how much organizations are willing to spend on hardening their networks, while spending so little on “the most critical component, the user,” says David Amsler, president and CIO at Foreground Security, a security services, training and solutions company based in Lake Mary, Fla.

“For most companies, security awareness is a check box that is covered by an annual exposure to a PowerPoint presentation,” he says. But, in fact, most users are hungry for more information – especially since they often face many of the same vulnerabilities on their home computers or personal devices. He recommends providing periodic, 30-minute training sessions and also implementing metrics to evaluate how successful the training is. Assessments can be in the form of a game that simulates social engineering attacks and hones user sensitivities to potential threats. “If you repeat those tests over several months, you can get a sense of whether users are actually learning,” Amsler says.
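As a rough illustration of the kind of metrics Amsler recommends, the minimal sketch below uses made-up campaign numbers and a hypothetical Campaign record – not data from any particular simulation product – to compare click and report rates across repeated phishing tests and show whether awareness appears to be improving.

```python
# Minimal sketch (hypothetical data): tracking whether repeated phishing
# simulations show improvement. The counts below are invented; a real
# program would pull them from the simulation tool's own reports.
from dataclasses import dataclass

@dataclass
class Campaign:
    month: str        # when the simulated phishing emails went out
    emails_sent: int  # number of employees targeted
    clicks: int       # how many clicked the simulated malicious link
    reports: int      # how many reported the email to security

    @property
    def click_rate(self) -> float:
        return self.clicks / self.emails_sent

    @property
    def report_rate(self) -> float:
        return self.reports / self.emails_sent

campaigns = [
    Campaign("January", 500, 110, 40),
    Campaign("March", 500, 80, 95),
    Campaign("May", 500, 55, 150),
]

print(f"{'Month':<10}{'Click rate':>12}{'Report rate':>13}")
for c in campaigns:
    print(f"{c.month:<10}{c.click_rate:>12.0%}{c.report_rate:>13.0%}")

# A falling click rate and a rising report rate over successive campaigns
# is the signal Amsler describes: evidence that users are actually learning.
first, last = campaigns[0], campaigns[-1]
if last.click_rate < first.click_rate and last.report_rate > first.report_rate:
    print("Trend: awareness appears to be improving.")
else:
    print("Trend: no clear improvement; revisit the training approach.")
```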

But getting training to really stick will remain a constant challenge when it is not hands-on, which is the thinking behind another approach championed by Chad Caison, project task lead at KEYW Corp., a Hanover, Md.-based security solutions firm. Caison's approach is to get users to think like the adversary. “By showing them how to gain access to a network, instead of just lecturing them from a PowerPoint, they get a real clear sense of their vulnerabilities,” he says. “When people see how this works, it stops being a mystery and becomes just another part of computer science.”

However, Caison admits that training at this level is expensive – requiring two weeks or more of an individual's time – and may not be appropriate for all users. Still, he says, it does provide a perspective on cyber dangers that can make users dramatically more capable of responding to attacks.

On the other hand, training and practice must not discourage people from doing their jobs. The “dumb” things that users often do aren't so dumb in the context of business, says Amichai Shulman, CTO of Imperva, a data security company based in Redwood Shores, Calif. “We pay people to open invoices and respond to inquiries,” he says. “People want and need to be in an environment where things behave as expected.” That's what makes exploits so fiendish: They prey on sensible, learned business behaviors.

The cure, says Shulman, is to make sure people are sensitive to “red flags” as much as possible and then focus on protecting the enterprise's most critical assets through technological means. “Stop chasing the mice and start protecting the cheese,” he says.

The biggest training mistake, however, is to humiliate users, says Randy Abrams, research director of NSS Labs, an Austin, Texas-based information research and advisory company. “Training should help them to succeed,” he says. “If you test them with phishing exploits, make it an opportunity to learn more, not a chance to criticize them.”

Further, Abrams says companies shouldn't be averse to developing their own training methods to meet their specific needs. “Sometimes, outsiders can provide additional expertise or personnel to help, but often companies can succeed on their own.”

And, be patient, he adds. “Cars have been around for 100 years, so society has finally made safety almost second nature. It will take time for IT users to get to that same level.” 

“Change management, for example, is not handled well at many organizations, which could increase potential vulnerabilities,” he says. That's because the people who have the power to change a process may take shortcuts, and those shortcuts can have security consequences that training needs to highlight.

“Poor process and lack of visibility into what is going on can have a big impact, especially for highly trusted people with system access,” he adds.
