I don’t have to belabor the point that information security is a thankless job. Do it well and no one notices. One thing goes wrong, everyone notices, and your job is at risk.
I couldn’t help but think of the parallels between this week’s Germanwings crash in the French Alps and the burdens our heroic information security professionals carry each day. Initial reports about the cause of the crash were all over the map: terrorists, inexperienced pilots, a vulnerable aircraft with 60,000+ flying hours, cockpit fires, catastrophic electrical failure, drug overdose, food poisoning… the list of conjecture was endless.
Then the data came in. And guess what? All the standard protocols for protecting the passengers and crew had been performed flawlessly. The rules were followed, yet there was no explanation for why the airliner flew into the side of a mountain. Since data rarely lies, the investigation had to go well beyond the flight recorder information and the telemetry from radar systems. The data said the systems were fine; the investigation proved that people were the ultimate vulnerability. It turns out Andreas Lubitz hacked the system, using secrecy and covert methods to shield his true medical condition from his employer.
And make no mistake about it: the threat indicators were there for years. Doctors, relatives, and friends all knew that Andreas Lubitz, the co-pilot of Germanwings Flight 9525, was struggling with mental illness and clinical depression in his quiet suburban neighborhood. Yet despite the medical leave notes and a long medical history, not a single mental health professional, friend, or family member thought he would be a risk to others.
So the human behavioral data ends up explaining what the flight and telemetry data could not. This is precisely what our information security professionals face each day. The data looks just fine: security indicators are green, risk levels acceptable. Until they are not. Then the human behavioral aspect of security takes center stage, bringing to light the real risks and the real events. In reality, the most important security control we can have in place is tracking human behavior.
So perhaps the silver lining of the Germanwings Flight 9525 disaster is that we learn to truly pay attention to how others behave, and to share that information in a timely manner before disaster strikes. Having systems in place that can detect irregular, irrational, or outlier behavior suddenly seems much more relevant when it comes to preventing the risks we would all rather turn away from.