So this week, we have learned that Apple have refused to comply with a US court order obtained by the FBI compelling them to assist in hacking the iPhone of Syed Farook, the man who shot and killed a group of people in San Bernardino before being killed himself in a shootout with police.

As with many stories like this, we understand why the FBI would want access to his phone. There is clearly a chance that other people were involved who might be identified from the contents of the phone (though that is not guaranteed). Because he is dead, they cannot coerce the PIN from him, and since iOS has an automated kill mechanism, too many wrong guesses and the phone wipes itself for good. The line taken by law enforcement is basically, "If you don't help, you are helping the terrorists".

The question, however, is not as simple as whether Apple support terrorists or not; clearly they do not. It is a deeper question about privacy, about where the line is drawn, and about a country that is famously liberal with its application of the law when it wants to obtain something (which is presumably why it has the "fruit of the poisonous tree" doctrine). The government, or at least the executive branch, has an unprecedented level of power, as in many countries, to play fast and loose with the law, to mislead judges in order to obtain warrants, or simply to do what it wants under the cover of the secret services, on the basis that most of it never comes to light. When it does, as with Edward Snowden's disclosures, it is justified after the fact: some judge rules that it was OK, and vast resources are used to fight off any court challenges. In the UK, even the oversight committees are so secretive that you still cannot really tell whether things are being covered up "for national security", since that defence can be applied to almost anything the police or secret services do.

Anyway....

What the FBI are actually asking Apple for is a way to bypass the lockout mechanism on the iPhone by modifying or replacing the software on the device, ideally so that they can electronically brute-force the PIN; once they have it, of course, they have access and can carry on with their job.

Apple have refused, citing privacy concerns and the need not only to protect their customers, some of whom operate in very dangerous parts of the world, but also to make it publicly clear that Apple are serious about privacy - which is all very commendable.

However...

There is something the tech community have started to smell, and it is an out-of-band attack that would potentially undermine everyone's iPhone. But let's go back a step.

Ideal encryption is not only about an algorithm that makes the encrypted data basically unusable without brute-forcing a key (we'll ignore subtle weaknesses like padding attacks and such); it has to take in everything in the entire encryption system. For instance, using something solid like AES-256 but then leaving the encryption key stored on a flash disk would undermine the encrypted data completely. We would call this a side-channel or out-of-band attack - not needing to attack the main thing at all because you can attack something weaker that gives you the same result, a bit like Luke Skywalker shooting into the Death Star's ventilation shaft.
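
To make the key-handling point concrete, here is a minimal sketch in Python (using the third-party cryptography library). The file names and the scenario are invented for illustration, and nothing here reflects how Apple or iOS actually store keys; it just shows an application doing the "right" cryptography with the wrong key handling.

```python
# Hypothetical sketch: AES-256-GCM is only as strong as the key handling
# around it. Here the key is written to the same disk as the ciphertext,
# so an attacker who images the disk decrypts the data without ever
# attacking AES itself.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # solid algorithm, 256-bit key
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"secret message", None)

# The mistake: persisting the key alongside the data it protects.
with open("data.enc", "wb") as f:
    f.write(nonce + ciphertext)
with open("data.key", "wb") as f:           # anyone with the disk now has this
    f.write(key)

# The "attack": no cryptanalysis required, just read the key back.
stolen_key = open("data.key", "rb").read()
blob = open("data.enc", "rb").read()
print(AESGCM(stolen_key).decrypt(blob[:12], blob[12:], None))
```

No amount of algorithmic strength helps at that point; the attacker never touches the cipher at all.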

On the surface, the iPhone is secure: even though it only uses a PIN (10,000 combinations for 4 digits), by allowing only a small number of wrong guesses before the wipe kicks in, it restricts the abuse potential - all good so far.

But because the FBI are asking for a bypass, and because Apple have not said that it is impossible, the implication is that somebody with the right source code or tools could carry out the same attack: bypass the lock mechanism, brute-force the 10,000 PIN combinations (which, let's be honest, is not hard) and get into an iPhone, even if it uses the most wonderful and secure encryption algorithm known to mankind.
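
Just to show how little work those 10,000 combinations represent once the guess limit is out of the way, here is a hypothetical sketch. It assumes, purely for illustration, that the data key is derived from the PIN with PBKDF2 and a made-up per-device salt; the real iPhone ties the passcode to a key buried in the hardware, which is why the search would have to run on the device itself, but the size of the search is the same.

```python
# Hypothetical sketch: brute-forcing a 4-digit PIN once nothing limits the
# number of guesses. PBKDF2 and the salt are stand-ins, not Apple's scheme.
import hashlib
import time

SALT = b"per-device-salt"          # invented for the example
ITERATIONS = 1_000                 # a deliberately slow key derivation

def key_from_pin(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, ITERATIONS)

target_key = key_from_pin("7294")  # pretend this is the key on the device

start = time.time()
for guess in range(10_000):
    pin = f"{guess:04d}"
    if key_from_pin(pin) == target_key:
        print(f"PIN {pin} found after {guess + 1} guesses "
              f"in {time.time() - start:.1f} seconds")
        break
```

Even with a deliberately slowed-down key derivation, the whole key space falls in a matter of seconds on an ordinary laptop; the only thing standing between an attacker and the data is the guess limit the FBI want removed.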

So my own opinion is that Apple are not just worried about privacy from a libertarian point of view, but also that producing or exposing software containing this backdoor would quickly render all iPhones worldwide vulnerable to the same attack, and their much-touted security credentials would go down the pan.

To be fair, there aren't many other options for Apple that couldn't be bypassed by their own software team, but presumably it would be possible to engineer something, in software or in hardware, that would make it impossible to crack a phone at a later date without the data being wiped. They could perhaps also create a big kill switch, which any would-be terrorist would simply press before committing their crimes so as to leave no trace. I would be interested to know whether any useful data is ever found on these phones; I would have thought a terrorist would be smart enough to use a prepaid mobile that is only ever used to call other prepaid ones...