Social savvy, yes – tech savvy, not so much

It is easy to look at ‘the younger generation’ and think, “Wow, these kids really know computers and networking.” I used to think along the same lines. I mean, how could they use such cool tools and not want to know how they work, not take the time to figure out what makes them tick?

But when you talk to these kids, you quickly realize that most of them don’t have a clue about how it all works. I had this epiphany a few years ago when I was talking to a couple of teenagers about some piece of tech, probably video game tech, and realized that if something went wrong they would be out of luck.

Disc drive not working? No sound? Internet connection is down? Bummer, dude.

Bruce Schneier puts it very well in a recent discussion about privacy and the individual (which I found thanks to @jmcgee at McGee’s Musings):

The younger generation is very fluent about how to use the internet, but completely clueless about how it works technically. Socially very savvy, technically very unsavvy.

Of course, this isn’t really anything new and I probably shouldn’t have been so surprised. Just looking back to my own teen years gives me all the insight I really need. Back then the main way to communicate was the telephone. I can count on one hand the other kids my age who knew – or cared – how the telephones worked; you should have seen the looks on their faces when I tried to engage them in discussions about DTMF replacing pulse dialing.

The reason the iPhone is so popular, and services like Facebook and Twitter have so many users, is that you don’t really have to understand how they work in order to use them. And that was fine when all you had was the telephone. But, as Schneier points out, today’s online social exchanges are different. All of our interactions on these digital services create an incredible amount of data; data about us, about our interests, and about our connections.

Understanding how these systems work – the systems that own, and yes, they do own, this data about you – is critical. And not just for the kids, either.

The biggest security threat in the digital age is…

… not Microsoft, not social media tools, but: PEOPLE.

A recent blog post by Dave Snowden and some commentary by Luis Suarez have reminded me of something Bruce Schneier said a while back (in 2004, actually):

Since the beginning of time, people have always been the biggest security threat. That hasn’t changed because of computers. People are why firewalls are invariably misconfigured. They’re why social engineering works. They’re why good security products are rarely deployed properly. Securing the computer and network is hard, but it’s much easier than securing the person sitting on the chair in front of the monitor. (emphasis is mine)

In his commentary, Luis makes an interesting point that social networking – not the tools, but the activity – may be in part responsible for these types of lapses in security and uses it as a teaching point.

And, for once, social networking didn’t have anything to do with it. Oh, did it? Well, perhaps it has got plenty to do with it!; after all, don’t social software tools encourage us all to listen to what’s happening out there? Maybe they will also help us understand how we can mitigate those perceived risks by having each and everyone of us walking the talk, i.e. behaving responsively with the information and knowledge that we are exposed to, and share across accordingly, day in day out, for that matter… You wouldn’t want a total stranger to know, coming out right out of your mouth!, your full credit card number, your date of birth and any other kind of identification material, right? (emphasis his)

In the military this is called OPSEC, or Operational Security, and it is drilled into soldiers’ heads almost daily. It is, in other words, a way of life.

On the other hand, there is a fine line between appropriate security and paranoia. With an understanding of what you really need to protect – and what is not so vital – plus a bit of thought, you should be able to find that line.

And it is a line that you need to find.

A War on the Unexpected

In November 2007, security consultant Bruce Schneier wrote an article for Wired.com entitled The War on the Unexpected, which he opened with the following paragraph:

We’ve opened up a new front on the war on terror. It’s an attack on the unique, the unorthodox, the unexpected; it’s a war on different. If you act different, you might find yourself investigated, questioned, and even arrested — even if you did nothing wrong, and had no intention of doing anything wrong. The problem is a combination of citizen informants and a CYA attitude among police that results in a knee-jerk escalation of reported threats.

As the parent of a soon-to-be-adult son with autism, the words I’ve highlighted in Schneier’s quote above jumped out at me. All of them apply to my son, and I’m sure to many – if not all – autistic children and adults. This article came back to mind as I read Kristina’s post Arrested: The Charge? Bad Behavior, in which she describes the arrest of a 13-year-old autistic boy and a 19-year-old man with fetal alcohol syndrome. This is, of course, not the first such incident, only the most recent that I’ve become aware of.

There is a legitimate issue concerning what consideration, if any, should be given to a person’s autism diagnosis with respect to criminal activity.  (See, for example, the case of Gary McKinnon.)  But all too often people with autism are approached, and often apprehended, by law enforcement personnel simply because they are “acting weird” and making bystanders “uncomfortable”.

In his article, Schneier has two recommendations to stop this war on the unexpected.

We need to do two things. The first is to stop urging people to report their fears. People have always come forward to tell the police when they see something genuinely suspicious, and should continue to do so. But encouraging people to raise an alarm every time they’re spooked only squanders our security resources and makes no one safer.

Equally important, politicians need to stop praising and promoting the officers who get it wrong. And everyone needs to stop castigating, and prosecuting, the victims just because they embarrassed the police by their innocence.

More awareness of autism and autistics by the public at large, and by law enforcement specifically, is key to at least removing autism and autistics from the category of the “unexpected”.

Schneier on Security: Communications During Terrorist Attacks are Not Bad

In Communications During Terrorist Attacks are Not Bad, Bruce Schneier calls Twitter a “vital source of information” during the recent attacks in Mumbai.  But not everyone agrees, as there were reports that Indian authorities were trying to get people to stop posting information, apparently fearing that the terrorists would be able to use this information.  To that, Bruce says: 

This fear is exactly backwards. During a terrorist attack — during any crisis situation, actually — the one thing people can do is exchange information. It helps people, calms people, and actually reduces the thing the terrorists are trying to achieve: terror. Yes, there are specific movie-plot scenarios where certain public pronouncements might help the terrorists, but those are rare. I would much rather err on the side of more information, more openness, and more communication.

Schneier also includes a quote from David Stephenson in his post US officials must monitor, learn from use of Web 2.0 in Mumbai:

I can’t stress enough: people can and will use these devices and apps in a terrorist attack, so it is imperative that officials start telling us what kind of information would be relevant from Twitter, Flickr, etc. (and, BTW, what shouldn’t be spread: one Twitter user in Mumbai tweeted me that people were sending the exact location of people still in the hotels, and could tip off the terrorists) and that they begin to monitor these networks in disasters, terrorist attacks, etc.

The challenge, of course, is for authorities to monitor these tools in the event of a disaster (man-made or natural) while resisting the urge to turn that monitoring into yet another wholesale surveillance program.

Protecting important files with TrueCrypt

In Information wants to be free, but you still have to protect it, I talked about Bruce Schneier’s recommendation to encrypt an entire disk instead of just your key files. But sometimes you want to protect individual key files as well, whether on your system drive, an external hard drive, or a USB thumb drive.

TrueCrypt is one option for this and, as Lifehacker tells us today, it now supports Mac OS in addition to Windows and Linux.  For more on how to install and use TrueCrypt on Windows, check out this Lifehacker article.
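TrueCrypt itself is point-and-click, but the underlying idea – protecting an individual file with a key derived from a passphrase – can be sketched with tools you may already have. Here is an illustrative round trip using OpenSSL’s `enc` command (this is not TrueCrypt, and the file names and passphrase are made up for the example):

```shell
# Create a file worth protecting
printf 'my sensitive notes\n' > notes.txt

# Encrypt it with AES-256; -pbkdf2 strengthens the passphrase-derived key
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in notes.txt -out notes.txt.enc -pass pass:correct-horse-battery

# Decrypt with the same passphrase
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in notes.txt.enc -out notes.decrypted.txt -pass pass:correct-horse-battery

# Verify the round trip was lossless
cmp notes.txt notes.decrypted.txt && echo "round trip OK"
```

Note one difference from TrueCrypt’s mounted volumes: this encrypts a single file at a time, and the plaintext copy still sits on disk until you securely delete it.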

A cow’s eye view of airport security

If you travel frequently by air, I think you’ll understand where I’m coming from. Originally posted 15 May 2003.

= = == === =====

Moo, moo…..

That’s how I felt earlier this week going through security at Newark airport. I was recently re-reading parts of Thinking in Pictures: And Other Reports from My Life with Autism by Temple Grandin for some posts on my autism blog, 29 marbles. In the book she talks about her job designing cattle chutes for slaughterhouses (she’s world renowned for this, despite [because of?] being autistic). Ever on the lookout for connections between apparently unrelated things, my brain presented me with the following thought: “I wonder if Temple Grandin could come up with a better design for airport security queues?”

Maybe not, but this got me thinking about cross-functional lessons learned. Too often, in my experience at least, lessons learned and best practices are explored only from the perspective of a specific functional area. There is a lot to be learned from looking at stories from similar, but completely different, functions.

Using the case of the airport security queue as an example:

  • Many people going through an airport security checkpoint have never done so before (like most [all!] cows at the slaughterhouse)
  • For all practical purposes, the way through the process is to simply follow the person in front of you
  • Occasionally, you will get redirected by a security person to a different line, told to stop, etc., with little or no explanation (as if you don’t deserve one or won’t understand it anyway)
  • etc.

The situation of people in a strange (as in unknown) queue system with no obvious explanation is, in some ways, not really much different from that of a cow going through cattle chutes. What lessons can we take from Temple Grandin’s success in designing cattle chutes that result in smoother operation, and apply them to the security line problem?

My real point here is that sometimes you can take insights learned from one thing and apply them to something completely different with great success.

Note: Temple Grandin’s personal choice of a title for Thinking in Pictures was Cow’s Eye View, a reference to how she comes up with her designs. Maybe that’s the simple lesson to be learned here: look at the problem from the point of view of the one going through the process.

How paranoid, er, security conscious, are you?

I originally posted the following in October 2005 and thought it would be a nice follow-up to my recent post Information wants to be free, but you still need to protect it.

= = == === =====

Just as there is a fine line between genius and madness, there is a fine line between appropriate security and paranoia. On which side of that line are you?

Shred your sensitive personal documents before throwing them away? Appropriate security. Spread the shreds in the garden as mulch? Paranoia.

Passwords on your home network? Appropriate security. Issuing smart cards to your wife and kids? What do you think?

For a quick peek into a paranoid security expert’s approach to security, check out Security for the paranoid, which I found via Schneier on Security (one of the few things I make myself check every day).

I have to admit I don’t know if the author is serious or not, mainly because I don’t know him. My first thought when I read it was that he was serious, and seriously paranoid. I know people who think, and act, like this. And, in fact, some of the things he says make sense. For instance:

I frequently see people posting PGP signed e-mails to security mailing lists. It’s not that these people are afraid of someone actually spoofing fake comments from them on the latest CGI flaw; they just make it a practice to sign every e-mail, no matter how trivial it might be. Sure, these people are signing e-mails when it’s really not important, but I doubt they get caught not signing when it is important.

Or

I also delete unused services on my servers. I block unused ports.

But a few things make me think it is just a bit over the top, including:

  • I keep my PC’s turned around so I can tell if anyone has installed a hardware keylogger.
  • I never check in luggage when I fly.
  • It takes five passwords to boot up my laptop and check my e-mail. One of those passwords is over 50 characters long.

One of the keys to establishing good, and appropriate, security is an analysis of the risk or threat, the consequences of becoming a victim, and the cost of the security measure weighed against the cost of what you are protecting. This is what the author of this piece misses, as evidenced by comments such as:

  • Sure, the threat might not be real. No one may ever actually want what you have on your PC. But does that really matter? Does the threat have to be real to warrant strong security?
  • There’s no need to analyze the threat of every situation. Just practice strong security always and you should be okay.
  • I don’t do it because I think someone is going to go through my trash to reassemble bits of my research notes. I do it because it’s good security.

I’ve been giving some thought lately to the challenges of enterprise solutions to problems and my belief that “one size can’t fit all”. Though there are some security best practices (for lack of a better phrase) that can be applied in many situations, blind application of these practices to unique situations will likely result in more harm (less security) than it does good.
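The risk/cost weighing described above is often formalized as annualized loss expectancy (ALE): the expected yearly loss from a threat, compared with what a countermeasure costs per year. A minimal sketch, with entirely hypothetical numbers:

```python
# Back-of-the-envelope risk/cost analysis.
# ALE = single loss expectancy (SLE) x annual rate of occurrence (ARO).
# All figures below are hypothetical, for illustration only.

def annualized_loss_expectancy(single_loss_expectancy, annual_rate_of_occurrence):
    """Expected yearly loss from a given threat."""
    return single_loss_expectancy * annual_rate_of_occurrence

def countermeasure_worthwhile(ale_before, ale_after, annual_cost):
    """A control makes sense when the risk it removes exceeds what it costs."""
    return (ale_before - ale_after) > annual_cost

# Example: a stolen laptop holding sensitive files
ale_before = annualized_loss_expectancy(5000, 0.10)  # $5,000 loss, 10%/yr -> $500
ale_after = annualized_loss_expectancy(5000, 0.01)   # encryption cuts odds to 1% -> $50

print(countermeasure_worthwhile(ale_before, ale_after, annual_cost=100))   # True: $450 saved > $100
print(countermeasure_worthwhile(ale_before, ale_after, annual_cost=1000))  # False: not worth $1,000
```

The same arithmetic explains why “just practice strong security always” fails as advice: a measure that costs more than the loss it prevents is paranoia, not security.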