Tuesday 6 October 2009

Social hacks and fire drills

Bruce Schneier often points out that the biggest gap in any company's security is the employees themselves, who'll happily give away the farm to phishers emailing to ask for company passwords.

We have fire drills here at Canterbury once per semester to make sure that folks know what to do when the buzzers ring. Very annoying, and it's pretty unclear to me that they do much to improve preparedness (what's so hard about walking down the stairs anyway?).

But it makes me wonder whether company IT departments, including our University's, oughtn't run phishing drills. The IT department could send a phishing message (from an external server, obviously) to all staff, making sure it gets through the spam filters straight to the inbox. Standard drill phish requesting user details. Then watch to see which staff respond, and go and fix those staff. Do it a few times a year. It imposes zero additional cost on sensible users, who'll just delete it with the 5 other spam messages that make it through the filters each morning. But it'll help to identify the geniuses who'd give the phishers a way into our intranet.
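For what it's worth, the mechanics aren't hard. Here's a rough sketch in Python of what the drill mailer might look like: each staff member gets a uniquely tagged link, and anyone whose tag later turns up at the (fake) credential page gets flagged for a follow-up chat. The hostnames, addresses, and relay below are all invented for illustration; a real drill would route through an external server that's let past the spam filter, as above.

```python
# Rough sketch of a phishing-drill mailer. All hostnames, addresses, and the
# staff list are hypothetical; this just shows the per-recipient tagging idea.
import smtplib
import uuid
from email.mime.text import MIMEText

STAFF = ["alice@canterbury.example", "bob@canterbury.example"]  # hypothetical staff list
SMTP_HOST = "drill-relay.example.com"                           # hypothetical external relay


def send_drill(recipient):
    """Send one drill phish carrying a unique token so responses can be traced."""
    token = uuid.uuid4().hex
    body = (
        "Your mailbox is over quota. Confirm your username and password at\n"
        "http://webmail-support.example.com/verify?id=" + token + "\n"
    )
    msg = MIMEText(body)
    msg["Subject"] = "Action required: mailbox quota exceeded"
    msg["From"] = "IT Helpdesk <helpdesk@webmail-support.example.com>"
    msg["To"] = recipient
    with smtplib.SMTP(SMTP_HOST) as server:
        server.send_message(msg)
    return token


if __name__ == "__main__":
    # Remember which token went to which address; anyone whose token later shows
    # up at the fake "verify" page goes on the list for follow-up training.
    sent = {send_drill(addr): addr for addr in STAFF}
    print(sent)
```

The point of the token is that IT never needs to see anyone's password: just knowing whose token arrived at the fake page is enough to know who bit.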

As best I'm aware, we're not doing this. Is anybody? Why not? About twice a year we get emails from IT warning about a phishing scam that's making the rounds, so they must think, and are probably right, that some folks are ripe for pwning. Best to identify them quickly and get 'em sorted, no?

2 comments:

  1. Reminds me of a story I read once on dailywtf.com: a company hired a guy to try hacking through their security. He paid the janitor $10 to pull the plug powering the server.

  2. I like it!

    This sort of policy would help to address the big problem with trying to train or educate people to follow security guidelines: there are usually no penalties imposed if they don't follow them.

    In addition, for a lot of people, the risks that the guidelines are meant to prevent are perceived as low, because people haven't yet encountered any attacks themselves. So it's not really surprising that people routinely ignore security issues.

    Imposing an external 'test' with a penalty for failing ought to fix it. Maybe just the threat of being publicly listed as having failed a drill would be enough to change behavior, and encourage people to think more about security.
