Crypto Wars 2.0: Apple vs FBI & 18th century law

A blog post from Ben Creet, Senior Issues Advisor at InternetNZ
18 February 2016

You may have seen some ‘brouhaha’ (as the media like to call it) online about Apple: that it’s giving the FBI and the US courts the finger; that it has ‘sided with the terrorists’; or that it’s standing up for Freedom, Justice and all that good stuff.

This is a great example of how the right (or wrong) case can lead to some rather interesting tensions.

So here’s the basic rundown of the situation:

  1. The San Bernardino shooter (Syed Rizwan Farook) had an iPhone 5C and the FBI want to be able to access it, but they don’t know the PIN. 
  2. Knowing what systems he used, who he was in contact with and so on are all legitimate, useful things to know in a suspected terrorism / national security investigation.
  3. If you put a PIN or passphrase on your iPhone it encrypts the content, which means that someone who can’t unlock the phone can’t read the contents (the whole point of having a PIN, right?).
  4. The FBI are concerned that if they guess wrong too many times, the iPhone’s security features will wipe the phone’s data. This is a fair concern, as auto-erase after ten failed attempts is a pretty standard security feature designed to stop someone ‘brute forcing’ the PIN (i.e. trying PINs until one works).
  5. A US court has ordered Apple to build a custom version of iOS (the operating system of iPhones), just for use in this case, that disables the auto-erase functionality and enables the FBI to run brute-forcing software that submits PINs electronically (so they don’t have to manually enter the PIN 10,000,000 times).
  6. Best of all, the legal authority under which this order has been made is the All Writs Act of 1789. Yes, you read that right. 1789 - as in the 18th Century.
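To see why those security features matter so much, here’s a back-of-the-envelope sketch in Python. The attempt rate is an assumption (roughly the often-quoted ~80 ms per passcode try enforced by the 5C’s key derivation), not a measured figure:

```python
# Rough illustration, not Apple's actual mechanism: once auto-erase
# and the escalating delays between guesses are removed, exhausting
# every possible PIN is quick. Attempt rate below is an assumption.

def time_to_exhaust(pin_digits: int, attempts_per_second: float) -> float:
    """Seconds needed to try every PIN of the given length."""
    keyspace = 10 ** pin_digits          # e.g. 4 digits -> 10,000 PINs
    return keyspace / attempts_per_second

# ~12.5 attempts/sec (assumed ~80 ms per hardware-bound key derivation)
print(f"4 digits: {time_to_exhaust(4, 12.5) / 60:.0f} minutes")
print(f"6 digits: {time_to_exhaust(6, 12.5) / 3600:.1f} hours")
```

In other words, without the auto-erase limit a four-digit PIN falls in minutes, which is exactly why the FBI wants it switched off.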

Apple is resisting this order, saying that it creates a dangerous precedent that other countries could use, and that creating a custom OS for a single device is, in effect, asking them to create a backdoor for all iOS devices running a compatible version. If the US can get that, why wouldn’t Iran, China, Russia or Burma demand a backdoor too?

Now, Apple HAS helped the FBI, as is right and their duty under law. As they say themselves:

When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

When I read about these things I think about the following:

  • Why is Apple in this position?
  • Is Apple right?
  • Could this happen here in NZ?

Why is Apple where it is?

The real driver here is that, since the Snowden leaks (yes, sorry for invoking HIS name), Apple and other tech companies have been upping their technical security. They’ve also realised that they themselves are valid targets for key theft. So they’ve removed themselves from the equation. Apple has designed newer versions of iOS so that your device is encrypted and they don’t hold a key; only you do. That movement has effectively been provoked by public fallout from people learning what the NSA and other intelligence agencies have been doing.
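To make that concrete, here’s a heavily simplified sketch of passcode-derived encryption. This is not Apple’s actual implementation (the real thing also entangles a key fused into the device’s hardware); the names and parameters are illustrative. The point is that the key only exists when the user types the passcode, so the vendor has nothing to hand over:

```python
# Simplified sketch: the decryption key is derived from the user's
# passcode plus a per-device salt. The vendor never stores the key.
import hashlib
import os

def derive_key(passcode: str, device_salt: bytes) -> bytes:
    # Deliberately slow key derivation, so each guess costs real time.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 200_000)

salt = os.urandom(16)            # unique per device
key = derive_key("1234", salt)   # only someone who knows the PIN gets this back

assert derive_key("1234", salt) == key   # right passcode re-derives the key
assert derive_key("9999", salt) != key   # wrong passcode yields a different key
```

Because the key is never stored anywhere, a warrant served on the vendor can’t produce it; the only route in is guessing the passcode, which is exactly what the disputed court order is designed to enable.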

If Apple can’t access your content, then that’s one less weakness in their encryption and one less thing for them to worry about. Also, if you stuff it up and lose your ability to get into the phone, you’ll either use an Apple-licensed store to reset it or buy a new iPhone (Apple’s VP of Sales’ preferred option, I’m sure).

Apple don’t do backdoors (now) and they won’t put one in. They’ve argued against it and they appear to be playing hardball with the US courts on this matter too.

Is Apple right?

Apple will certainly be hoping that they are on the right side of history and that their customers will agree with them. But is requiring Apple to build a new version of iOS for a single criminal investigation really just ‘assistance’? Or is it asking too much of a private company? The US Government’s lawyers certainly argued that doing this would not be ‘overly burdensome’, as Apple does software stuff all the time.

So here’s my take (and remember, I am not a lawyer). 

I think this is a burdensome order. Asking a company to stop or slow their own work to dedicate developers to fork iOS, remove functionality, and make sure the result is stable and can be loaded onto an existing, locked iOS device is not a trivial five-minute job. IF that software already existed, then yeah, sure: that’s not terribly burdensome. But asking someone to ‘assist’ you by developing brand new software, for free, is a bit of a stretch of what constitutes ‘assistance’. Asking Blackberry to decrypt content with their master key? That’s assisting.

It could also be considered a burden on the company as a whole if a large proportion of their customers chose to leave based on these actions: if the iPhone becomes the least private device on the Internet, people will go to Android or Windows Phone (just jokes, no-one uses WinPhone).

But this is where the current angst is. Apple and other tech companies are writing themselves out of their systems, so they can’t be served with warrants to decrypt phones and content. If they don’t have a decryption key, then they can’t actually comply with a court order to decrypt. What the US is asking Apple to do is, in essence, not simply to help (a fair request as part of a court order), but to dramatically change their business model, back to the way things were in the ‘good old days’.

Could this happen in New Zealand?

This has been the subject of some discussion today on Twitter. Paul Brislen has covered this angle over at the IITP’s TechBlog like so:

In New Zealand, the Crimes Act requires individuals to provide decryption keys should law enforcement officials require them. Failure to do so may result in fines or a prison sentence.

The relevant parts of the Crimes Act were probably designed to deal with people who know PINs, passwords and the like and don’t want to give them to law enforcement agencies.

In NZ law, we are typically not held accountable for things we don’t have, own or know. You can’t be held liable for information you don’t have. Police can’t arrest you for possession of drugs that you don’t actually have. You can’t demand a public servant write a briefing on a topic in order to respond to an OIA request. Therefore, in the case of decryption keys: surely, if you don’t have any, you can’t be expected to give law enforcement agencies something you don’t have.

However, there is a slightly different expectation on some NZ organisations: our network providers. There is a part of TICSA (section 10(3)) which requires network operators (and maybe service providers?) to provide a decryption capability for any encryption they themselves provide.

So, yes, from my reading of that, if NZ Police asked Spark to decrypt something in one of their products, they’d be obliged to do it IF they were the ones that provided the encryption. The flipside is that a network provider couldn’t be asked to decrypt Signal content, iMessages or iPhones, because they don’t provide the encryption in those apps and devices.

In summary

  • Apple have a fair point; the FBI is being ‘cute’ with the law, and the potential scope for future use and misuse is large.
  • Software is hard, encryption is harder, but this isn't about breaking encryption - it's about a backdoor-by-any-other-name (a side-facing ranch-slider perhaps?).
  • If disabling iOS security features to enable brute-forcing is something the US government wants and thinks it’s not that hard to do:  
    Maaaate - build it yourself...

Would you like to know more?

Fill your boots:


Let's suppose that Apple caved in. At best, this one iPhone would be broken into and useful information obtained. That this happened would become known to the "bad guys", who would subsequently use their own tools to encrypt critical data. The benefits of future use of such a tool are debatable.

In the worst case the tool would be leaked or reverse engineered and become available to the "bad guys". They would be safe, but all our iPhones would be hackable.