Encryption - so necessary but so dangerous?
The public are an alarmist lot. Despite the fact that most people spend their time online sending pointless messages to each other on Facebook or reading rude jokes, as soon as there is the possibility that GCHQ or the NSA can read your data, everyone is up in arms. How dare they read about my visit to a Birmingham shopping centre or my latest status update involving a large glass of beer.
So we end up with quite a wide deployment of SSL/TLS. Not such a bad idea. For most of us, especially those in business, the cost of an SSL cert, although much greater than it needs to be, is fairly cheap in the scheme of things. The server overhead is minimal and everyone's happy, right?
No. Of course not.
Enter the Systems Engineers.
You see, TLS is all well and good, but what happens after it is terminated at some server somewhere? How is the data transmitted around the data centre or stored on disk? Your password is usually sent to the server when you log in, so even though it isn't stored in a readable format, it could still be intercepted reasonably easily while you are in the process of logging in. Even if you have it all set up wonderfully well, I'm pretty sure the NSA can sign their own certificates and could presumably man-in-the-middle most sites without most people spotting it, so what can we do?
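To make that storage point concrete, here is a minimal sketch of server-side password handling, assuming a Python backend and using only the standard library (the function names are mine, for illustration, not any particular site's code). The plaintext still has to reach the server at login; only the stored form is hashed.

    import hashlib
    import hmac
    import os

    def hash_password(password: str) -> tuple[bytes, bytes]:
        # Derive a salted, slow hash for storage; the plaintext is discarded.
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt, digest

    def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
        # Verification needs the plaintext again, re-derived and compared in
        # constant time - that brief window is exactly where interception happens.
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, digest)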
Apparently we should encrypt everything at rest and encrypt our comms end-to-end from the browser itself right through to some trusted other end.
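What does "at rest" actually look like? A minimal sketch, assuming the third-party Python cryptography package rather than whatever any particular vendor really uses:

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # keep this safe - and remember where you kept it
    f = Fernet(key)

    token = f.encrypt(b"work code and anything else worth hiding")
    # Only a holder of `key` can ever turn the token back into plaintext:
    plaintext = f.decrypt(token)

Lose the key and the token is just noise. Hold that thought.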
So now we're good?
No. You see, theoretically, if someone had a computer the size of the moon and 50 years to spare, they could potentially crack your Facebook traffic because it uses TLS 1.0. TLS 1.0? You still use that ancient, weak, crackable scheme? You might as well just send stuff in plain text. Or so some of the Systems Engineers would have us believe. The reality, of course, is that most of these theoretical weaknesses are so hard to exploit that they are only of interest to people who have things of real value to crack. No-one is going to spend a year trying to access my Facebook account - although they will spend that time trying to access Lockheed Martin or Boeing.
We are running to stand still, the perfectionists are dictating good practice, and we all get sucked up into it. Including me.
I needed to reinstall OS X on my MacBook (although according to one blogger, I should never have to do that, because it has never been necessary for him!) and I was REALLY careful to back up everything and to double-check that the backups at least appeared to be on the backup disk, some Western Digital NAS box. It all seemed good and, after taking a deep breath, I took the plunge and reinstalled Yosemite.
And then I went to access the Time Machine backup. Which I had encrypted, of course. I'm not a privacy junkie, but I have work code on my home laptop, and if the NAS were ever stolen I would be more comfortable knowing that it's encrypted.
But I couldn't remember the password. I tried all the usual ones and got nothing. I can't even get the "hint" because that's stored in the Keychain, and the MacBook has just been reinstalled. Apparently there is no recovery process, because these systems are designed to be perfect.
And that brings me to my point. Why do we insist on perfect security? Our houses are not perfectly secure by a long shot, and they are much more likely to be attacked than any of my software systems. My front door key can be easily copied, the lock could be bumped or snapped fairly easily, the windows could be shattered, and an intruder could get in with little effort, yet somehow I live with that risk. In the computer world, though, we are told that this risk is not acceptable. We are not taught, certainly not by the vocal engineers, to risk-assess what we are doing. Things like encryption key rotation are all very well, but are they really necessary, or do they just increase the chance that we end up stuck with a lost key and an inaccessible system?
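For what it's worth, rotation itself is not hard; here is a hedged sketch, again assuming the Python cryptography package, whose MultiFernet decrypts with any listed key and re-encrypts with the first:

    from cryptography.fernet import Fernet, MultiFernet

    old_key, new_key = Fernet.generate_key(), Fernet.generate_key()
    rotator = MultiFernet([Fernet(new_key), Fernet(old_key)])

    token = Fernet(old_key).encrypt(b"data written under the old key")
    rotated = rotator.rotate(token)  # same plaintext, now under new_key

Simple enough on paper, but every rotation is one more key that must not be lost halfway through - which is precisely my point.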