The promise of User Behavior Analytics is that it can go beyond simply detecting insider threats to predicting them. Some experts say that creates a significant privacy problem.
The recent arrest by the FBI of a former employee of JP Morgan Chase for allegedly trying to sell bank account data, including PINs, ended well for the bank.
According to the FBI, the former employee, Peter Persaud, was caught in a sting operation when he attempted to sell the data to informants and federal agents.
And such incidents are giving organizations increasing incentive to use technology to counter the insider threat.
The threat of employees going rogue – wittingly or not – is significant enough that some organizations are turning to behavior analytics that, according to its advocates, are able not only to detect insider security threats as they happen, but even predict them.
Such protection would likely be welcomed by most organizations, but it comes with an obvious cost: worker privacy. Predicting security threats calls up images of “Minority Report,” the 2002 movie starring Tom Cruise, in which police arrested people before they committed crimes.
In that sci-fi world, it was “precogs” – psychics – who predicted the impending crimes. The IT version is User Behavior Analytics (UBA).
According to Gartner, “UBA is transforming security and fraud management practices because it makes it much easier for enterprises to gain visibility into user behavior patterns to find offending actors and intruders.”
Saryu Nayyar, CEO of Gurucul Solutions, in a recent statement, said her firm’s technology, “continuously monitors hundreds of (employee behavior) attributes to detect and rank the risk associated with anomalous behaviors.
“It identifies and scores anomalous activity across users, accounts, applications and devices to predict risks associated with insider threats.”
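Gurucul has not published its scoring model, but the general idea of ranking risk across anomalous behaviors can be sketched as a weighted combination of per-attribute anomaly scores. The attribute names, weights, and 0-100 scale below are invented for illustration; real products use hundreds of attributes and machine-learned models:

```python
# Illustrative risk-scoring sketch. Attribute names and weights are
# invented; a real UBA product learns these from behavioral baselines.

WEIGHTS = {
    "new_geolocation": 0.30,
    "privilege_elevation": 0.40,
    "unknown_outbound_ip": 0.20,
    "unknown_software_install": 0.10,
}

def risk_score(anomalies: dict) -> float:
    """Combine per-attribute anomaly scores (each 0..1) into a
    0..100 risk score; unknown attributes are ignored."""
    total = sum(WEIGHTS[attr] * score
                for attr, score in anomalies.items() if attr in WEIGHTS)
    return round(100 * min(total, 1.0), 1)

# A new location plus a partial privilege elevation yields a mid-range score.
print(risk_score({"new_geolocation": 1.0, "privilege_elevation": 0.5}))  # 50.0
```

The point of scoring rather than alerting on each anomaly individually is that any one signal (say, a login from a new city) is usually benign; it is the combination that raises risk.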
This should not be a surprise. Data analytics are being applied to just about every challenge in the workplace, from marketing to efficiency. So it was inevitable that they would be used to counter what has always been the weakest link in the security chain – the human.
Americans have also been told for years that personal privacy is essentially dead. Still, some of them may not appreciate just how dead it is, or soon will be, in the workplace.
But Nayyar and others note that there should be no expectation of privacy in the workplace when it comes to corporate data.
“This technology is simply monitoring activity within a company’s IT systems,” she said. “It does not read emails or personal communications.”
She added that monitoring of employee behaviors by IT has been going on for a long time. “This is nothing new,” she said. “What’s different today is the use of big data analytics, machine learning algorithms and risk scoring being applied to these logs.”
Michael Overly, technology partner at Foley & Lardner LLP, said companies should notify their employees that, “business systems should not be used for personal or private communications and other activities, and that the systems and data can and likely will be reviewed, including through automated means.”
But he agreed with Nayyar that privacy is necessarily limited in the workplace. “Employees must understand that if they want privacy with regard to their online activities, they need to use a means other than their employer’s computers, like a smartphone or a home computer,” he said.
That is also the view of Troy Moreland, chief technology officer at Identity Automation. “In general, if employees are using employer-provided equipment, they have no right to privacy as long as it’s clearly expressed,” he said.
But Joseph Loomis, founder and CEO of CyberSponse, said such policies, if they are too heavy handed, can cause morale problems. “I believe it’s justified,” he said, “it’s just that there are various opinions on what type of privacy someone is entitled to or not.”
He said it would likely take significant “training, education and explaining” to eliminate the feeling of a “Big Brother” atmosphere in the workplace.
Gabriel Gumbs, vice president of product strategy at Identity Finder, said he believes the potential for morale problems is real. “At the core of UBA is an unspoken distrust of everyone, not just the rogue employees,” he said.
Matthew Prewitt, partner at Schiff Hardin and chairman of its cybersecurity and data privacy practice, said one problem with predicting misconduct is that it can become self-fulfilling. “An employee who is viewed with mistrust and suspicion is more likely to become a rogue employee,” he said.
He agrees that there is a limited expectation of privacy in the workplace, especially on the corporate network. But he said a “creative advocate” for an employee could argue that, “UBA is so different from other types of monitoring that some sort of express reference to UBA needs to be provided in the notice.”
Loomis added that in states not governed by “right-to-work” laws, UBA, “will cause legal issues if one terminates without cause other than predictive intelligence.”
And Gumbs said U.S. courts have ruled that workers have a reasonable expectation of privacy in the workplace. “I could not envision a scenario where behavioral prediction would not cross this line,” he said. “Only matters of national security could plausibly supersede such rulings.”
Advocates of UBA emphasize that it is not aimed just at tracking those with criminal intent. While malicious rogue employees can cause the most damage and tend to get the most headlines, they are relatively rare.
The much larger problem, they say, is from unintentional rogues – those with too many access privileges, who use “shadow” IT and/or who are simply lazy or careless.
“In our experience over-privileged scenarios account for approximately 65% of insider threat incidents, shadow IT 20% and carelessness 15%,” Nayyar said.
Moreland has a list of labels for such employees, including “access hoarders” who “gobble up as much access as they possibly can and refuse to relinquish any of it, even when it’s no longer needed.”
Others, whom he calls “innovators,” are well intentioned – they are trying to be more productive – but one of the ways they do so is by circumventing IT policies.
Gumbs noted that the Verizon Data Breach Investigations Report found that, “privilege abuse is the most damaging of insider threats.”
But he added that not all abuse of access privileges is innocent, and does not necessarily mean an employee is over-privileged. “In the majority of cases, users had the proper level of privilege for their roles, they simply abused those privileges for personal or financial gain,” he said.
In those cases, he and other experts say identity and access management can reduce the security risks significantly.
“Over-privilege is a substantial concern,” Overly said. “In general, the majority of users in businesses today are over-privileged. The concept of least privilege is seldom implemented properly and even more seldom addressed as personnel duties change and evolve over time.”
Dennis Devlin, cofounder, CISO and senior vice president of privacy practice at SAVANTURE, said he sees the same thing. “In my experience most individuals who have been with an organization for a long time are over-privileged,” he said. “Access privileges are accretive and tend to grow over time. The law of least privileges exists not just to prevent malicious access, but also to prevent accidental or inadvertent disclosure.”
He said better access management could reduce the need for intrusive monitoring. “Appropriate privileges keep individuals in their respective ‘swim lanes,’ reduce the need for excessive monitoring and make SIEM analysis much more effective,” he said.
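One common form of the access review Devlin and Overly describe is comparing the entitlements a user has been granted against those actually exercised in recent activity logs, and flagging the unused ones for revocation. The data structures below are illustrative; real IAM tools pull this from directories and SIEM logs:

```python
# Minimal least-privilege audit sketch: flag entitlements that were
# granted but never used. All users and permission names are invented.

granted = {
    "alice": {"crm_read", "crm_write", "payroll_read"},
    "bob":   {"crm_read"},
}

# Privileges actually exercised, e.g. extracted from access logs.
used = {
    "alice": {"crm_read"},
    "bob":   {"crm_read"},
}

def unused_privileges(granted, used):
    """Return, per user, the privileges granted but not exercised --
    candidates for revocation under the principle of least privilege."""
    return {user: perms - used.get(user, set())
            for user, perms in granted.items()
            if perms - used.get(user, set())}

print(unused_privileges(granted, used))
```

Run periodically, a check like this keeps privileges from accreting over time, which is exactly the “swim lanes” effect Devlin describes.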
Beyond the legal and morale questions, however, the jury is still out on how well UBA works.
Overly said in his experience, “it has a long way to go with regard to accuracy. All too often, the volume of false alarms causes the results to be disregarded when an actual threat is identified.”
Nayyar said it does work, through analysis of unusual or “anomalous” behaviors in things like geolocation, elevated permissions, connecting to an unknown IP or installing unknown software for backdoor access to sensitive data (see sidebar).
What does UBA track?
According to Saryu Nayyar, CEO of Gurucul Solutions, User Behavior Analytics can detect behavioral anomalies by monitoring activities including:
- Geo-location: Access to resources from different geographies, locations not seen before, or from unauthorized locations, etc. This is a very simple use-case.
- Elevated Permissions: Employee elevating their access privileges to perform a task they or their peers have never performed in the past.
- Device Outbound Access: Certain high-value assets might be connecting to an unknown IP/geo-location they shouldn’t connect to. This behavior could be an anomaly when compared to past behavior or peer group behavior.
- Unknown software: Employees accessing resources and installing unknown software for backdoor access to sensitive data that can be transmitted outside the network.
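The geo-location item above is, as Nayyar notes, the simplest use-case: flag access from a location the user has never been seen at before. A minimal sketch, with an invented event format and an in-memory baseline standing in for a real behavioral-profile store:

```python
# Sketch of the simple geo-location use-case: flag access events from
# locations a user has never been seen at. In practice the baseline is
# built during a training period rather than starting empty.

from collections import defaultdict

baseline = defaultdict(set)  # user -> locations previously seen

def check_event(user: str, location: str) -> bool:
    """Return True if the event is anomalous (a new location for this
    user), then record the location in the user's baseline."""
    anomalous = location not in baseline[user]
    baseline[user].add(location)
    return anomalous

check_event("pat", "New York")                   # first sighting is flagged
assert check_event("pat", "New York") is False   # now part of the baseline
assert check_event("pat", "Minsk") is True       # never seen before
```

The same pattern generalizes to the other list items by swapping the tracked attribute: outbound IPs for devices, installed software names, or permission levels.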
She provided an example of flagging rogue behavior: a software engineer who had resigned from a company and was leaving in a month exhibited behavior never seen before.
While on vacation, the employee, “logged in from a previously unseen IP address, accessed source code repository and downloaded sensitive files from a project he wasn’t assigned to,” she said.
“Two days later, the engineer accessed multiple servers and moved the downloaded files to an NFS (Network File System) location, which he made mountable and attempted to sync the files to a prohibited consumer cloud storage service.”
She said the user was flagged as soon as he created the NFS mount point, “based on predictive modeling, and his VPN connection was terminated.”
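Gurucul has not described its predictive model, but the shape of that decision can be sketched as accumulating rule scores over the user's event stream and cutting the session once a threshold is crossed. The rule names, weights, and threshold below are all invented for illustration:

```python
# Illustrative replay of the incident described above: score each event
# against simple rules and terminate the session once a threshold is
# crossed. Rules, weights, and threshold are invented for illustration.

RULES = {
    "unseen_ip": 25,
    "unassigned_repo_access": 30,
    "bulk_download": 20,
    "new_nfs_mount": 35,
}
THRESHOLD = 80

def replay(events):
    """Accumulate rule scores over an event stream; return the index of
    the event that triggers termination, or None if none does."""
    score = 0
    for i, event in enumerate(events):
        score += RULES.get(event, 0)
        if score >= THRESHOLD:
            return i  # terminate the VPN session at this event
    return None

timeline = ["unseen_ip", "unassigned_repo_access",
            "bulk_download", "new_nfs_mount"]
print(replay(timeline))  # 3: triggers at the NFS mount, as in the example
```

Note that in this toy version no single event crosses the threshold; it is the accumulation of anomalies that does, which mirrors the account of the engineer being flagged only at the mount-point step.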
But as effective as that sounds, even advocates of UBA warn that, like any security tool, it is a “layer” of protection, not a guarantee.
“Perfection cannot be achieved,” Overly said. “If an insider is intent on causing harm to the business, it may be impossible to prevent it.”