Data Insecurity: Why We Fail to Protect Our Data

It seems that every week a new company, organization, or government agency becomes the poster child for what not to do when protecting valuable data. This week alone, the U.S. Government announced that one of its biggest defense contractors lost 24,000 files in an attack by a foreign intelligence service. The defense company wasn’t named, nor was the foreign intelligence service, but we do know that Lockheed Martin was compromised in June. “It was 24,000 files, which is a lot,” Deputy Secretary of Defense William Lynn said. “But I don’t think it’s the largest we’ve seen.” When asked if he knew who was responsible for the attack, Lynn responded, “We have a pretty good idea,” and some pundits are pointing the finger at China as the villain in this cyber drama.

In another example, more than 80,000 residents of the Durham Region of Ontario, Canada are suing the Region in a $40 million class action that accuses the Region Health Authority of losing a USB key that contained personal information for people vaccinated against the H1N1 flu virus. In that case, a public health nurse lost the key in a parking lot. Also on the healthcare front, a former patient of a cancer treatment center in St. Louis, Missouri is suing the hospital for the loss of her confidential information after a laptop “stuffed” with patient information was stolen. The problem? The information on the laptop was unencrypted.

One more example: unless you’ve been vacationing on Mars for the past few months, you’ve probably heard a lot about a little matter known as the Sony PSN breach. The highly publicized outage of the PlayStation Network became a bit of a joke, especially since it seems that much of the compromised data was unencrypted. Sony was quick to counter that the credit card information was secure, but it was also quick to insist (it wasn’t optional) that all users change their passwords once the network was brought back up. CBC News quoted a security expert as saying that “any website worth its salt these days should be built to withstand such attacks.”

The Human Factor

See a pattern here? If not, let’s spell it out: Mr. Lynn of the Department of Defense states, “I don’t think it’s the largest we’ve seen”; the public health nurse from Durham Region lost a USB key in a parking lot; the stolen laptop in St. Louis contained confidential information that wasn’t encrypted; and data on more than 100 million Sony PSN users was unencrypted.

There are two parallel issues here. The first one is easy: a lack of proactive planning. The security expert quoted in the CBC article is correct. How could a defense contractor that builds weapons systems and other military hardware for the United States allow itself to be breached, especially when the Defense Department admits it has happened before? How could Sony compromise the data of 100 million users and lose hundreds of millions of dollars in the ensuing cleanup? The answer isn’t complicated: people didn’t do their jobs. Now, it might be tempting to argue that a group of hackers aged 15 to 28 knows far more, and has far more in the way of resources, than the largest military power in the world and one of the globe’s leading technology firms. In case you missed it, that was sarcasm.

It’s the human factor. Look no further than the second parallel issue: a nurse who dropped a USB key, and a misplaced laptop loaded with unencrypted information on cancer patients. No matter how you look at these stories, the dominant factor is basic human error.

Planning, Training and Vigilance

Information is the lifeblood of any organization, but people are the body that makes the blood flow. Take spam, for example. Spam is dangerous, but not always for the reasons you think. Any IT technician is smart enough to detect spam and give it what it deserves – an unceremonious trip to the trash can. In fact, most educated people, IT professionals or not, can recognize spam for what it is: ridiculous, ill-conceived and, at times, mind-numbingly stupid. Yet while organizations spend tremendous amounts of money on technology, it’s distressing how little they spend educating the people who use that technology.

A few years back, I worked for a government agency that employed thousands of people. Every day I received hundreds of emails, and a substantial number of them were ‘social spam’ – messages sent by coworkers peddling a funny joke, an interesting video, or a pithy piece of pseudo-wisdom. Cleaning up that social junk often ate up a chunk of my time, detracting from what I was there to do – what I was paid to do. A week didn’t go by in which I didn’t pull the IS manager aside and suggest that she convene a training session to educate employees on the dangers of social spam. Those requests were met with an agency-wide email and nothing more.

Most organizations have the planning part down, but they don’t seem able to educate the people within the organizational structure. They don’t teach vigilance – some call it paranoia – the way IT people know vigilance, and that’s why data protection is so tenuous.

The fear is constant: the people who engage in social spam – you know the type, because they adopt similar practices on Facebook and Twitter – are the ones who will click an errant link, succumb to a phishing scam, lose a USB key, leave a laptop with patient data lying around, and yes, even fail to protect U.S. military documents from foreign countries. So before you go to sleep tonight, ask yourself this: can you sleep with confidence, knowing that every person in your organization – every person who has access to a PC – has your back? Ask yourself if they know enough to recognize a phishing site or a spam email when they see it.

And then strenuously lobby your senior management for rigorous training policies.
