The FBI’s recent arrest of a former JPMorgan Chase employee for allegedly trying to sell bank account data, including PINs, ended well for the bank.
According to the FBI, the former employee, Peter Persaud, was caught in a sting operation when he attempted to sell the data to informants and federal agents.
But such things don’t always end so well for the intended victims. The arrest was yet another example of the so-called “rogue insider” threat to organizations, and such incidents are giving organizations a growing incentive to counter that threat with technology.
The threat of employees going rogue – wittingly or not – is significant enough that some organizations are turning to behavior analytics that, according to their advocates, can not only detect insider security threats as they happen but even predict them.
Such protection would likely be welcomed by most organizations, but it comes at an obvious cost: worker privacy. Predicting security threats calls up images of “Minority Report,” the 2002 movie starring Tom Cruise, in which police arrested people before they committed crimes.
In that sci-fi world, it was “precogs” – psychics – who predicted the impending crimes. The IT version is User Behavior Analytics (UBA).
According to Gartner, “UBA is transforming security and fraud management practices because it makes it much easier for enterprises to gain visibility into user behavior patterns to find offending actors and intruders.”
Saryu Nayyar, CEO of Gurucul Solutions, in a recent statement, said her firm’s technology, “continuously monitors hundreds of (employee behavior) attributes to detect and rank the risk associated with anomalous behaviors.
“It identifies and scores anomalous activity across users, accounts, applications and devices to predict risks associated with insider threats.”
This should not be a surprise. Data analytics are being applied to just about every challenge in the workplace, from marketing to efficiency, so it was inevitable that they would be turned on what has always been the weakest link in the security chain – the human.
Americans have also been told for years that personal privacy is essentially dead. Still, some of them may not appreciate just how dead it is, or soon will be, in the workplace.
But Nayyar and others note that there should be no expectation of privacy in the workplace when it comes to corporate data.
“This technology is simply monitoring activity within a company’s IT systems,” she said. “It does not read emails or personal communications.”
She added that monitoring of employee behaviors by IT has been going on for a long time. “This is nothing new,” she said. “What’s different today is the use of big data analytics, machine learning algorithms and risk scoring being applied to these logs.”
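To make the idea concrete, here is a minimal sketch of the kind of risk scoring Nayyar describes – not Gurucul’s actual implementation, and with entirely synthetic log data – in which a user’s activity today is compared against that user’s own historical baseline and flagged when it deviates sharply:

```python
# Hypothetical sketch of log-based risk scoring (synthetic data, simple
# z-score): flag users whose daily event counts deviate sharply from
# their own historical baseline. Real UBA products weigh hundreds of
# attributes; this illustrates the principle with one.
from statistics import mean, stdev

# Synthetic example: daily counts of file-access events per user.
history = {
    "alice": [12, 15, 11, 14, 13, 12, 16],
    "bob":   [40, 38, 42, 41, 39, 40, 43],
}
today = {"alice": 14, "bob": 210}  # bob's activity spikes sharply

def risk_score(baseline, observed):
    """Z-score of today's count against the user's own history."""
    mu, sigma = mean(baseline), stdev(baseline)
    return (observed - mu) / sigma if sigma else 0.0

scores = {user: risk_score(history[user], today[user]) for user in today}
flagged = [user for user, s in scores.items() if s > 3.0]  # simple threshold
print(flagged)  # only the anomalous user is flagged
```

In this toy example, the normal user scores well under the threshold while the spike in activity produces a score orders of magnitude above it – the “anomalous behavior” a real system would rank and surface to analysts.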
Michael Overly, technology partner at Foley & Lardner LLP, said companies should notify their employees that, “business systems should not be used for personal or private communications and other activities, and that the systems and data can and likely will be reviewed, including through automated means.”
But he agreed with Nayyar that privacy is necessarily limited in the workplace. “Employees must understand that if they want privacy with regard to their online activities, they need to use a means other than their employer’s computers, like a smartphone or a home computer,” he said.
That is also the view of Troy Moreland, chief technology officer at Identity Automation. “In general, if employees are using employer-provided equipment, they have no right to privacy as long as it’s clearly expressed,” he said.
But Joseph Loomis, founder and CEO of CyberSponse, said such policies, if they are too heavy handed, can cause morale problems. “I believe it’s justified,” he said, “it’s just that there are various opinions on what type of privacy someone is entitled to or not.”
He said it would likely take significant “training, education and explaining” to eliminate the feeling of a “Big Brother” atmosphere in the workplace.
Gabriel Gumbs, vice president of product strategy at Identity Finder, said he believes the potential for morale problems is real. “At the core of UBA is an unspoken distrust of everyone, not just the rogue employees,” he said.
Matthew Prewitt, partner at Schiff Hardin and chairman of its cybersecurity and data privacy practice, said one problem with predicting misconduct is that it can become self-fulfilling. “An employee who is viewed with mistrust and suspicion is more likely to become a rogue employee,” he said.