UPDATED APRIL 16, 2016: Although the FBI backed out of the hearing with Apple the day before court, the saga continues – more cases are happening and more will happen (and both sides have been asked to testify before Congress next week). I wanted to update this post on the difference between security in the digital realm and security in the physical realm with a recent video by CGP Grey on this very topic: Should all locks have keys? Phones, Castles, Encryption, and You.
I’ve been reading everything I have time to read on the Apple-FBI case, in which the FBI has publicly asked Apple to build a new version of its operating system with a “backdoor” to install on an iPhone used by dead terrorists, so the agency has a better chance at breaking into it. The FBI has warrants and permission from the legal owner of the device, and Apple has helped in every way possible since the start of the investigation into the terrorists. But Apple is now being asked (being given an Order) to help in a way that is currently not possible – that is, to build a new way of breaking in. Apple hasn’t stated whether it can actually build such a thing; it simply refused, on principle and very publicly.
This post isn’t a summary of the situation – you can read up on it here. (I may write a guide to it with the help of our resident lawyer Paul Creech, but we’re betting others will do a better job that we can link to soon… maybe.)
Instead, this post is a commentary on Orin Kerr’s latest post in a wonderful series he’s been writing on the FBI-Apple security stand-off from the legal perspective, attempting to clear away the PR nomenclature and get to the facts that will matter in court and in Congress. His latest post, “Preliminary thoughts on the Apple iPhone order in the San Bernardino case: Part 3, the policy question,” deals with policy and his idea of Physical Box Security.
I think he brings up a hugely important element in the case, but I also think he’s missing an important part of understanding Physical Box Security. No matter how you slice it, this case will set a precedent, though some argue over which one it would set.
He asks what may be the most important question:
The question is, what is the optimal amount of physical box security?
He defines Physical Box Security:
I use that term to mean the degree of control that a person exercises over others opening his movable property — such as a box, suitcase, computer, knapsack, package or cellphone — when another person has physical access to the property.
He runs through the relatively low level of security we’ve had in our physical world up to this point in history, and characterizes the relative security of the iPhone as on a whole other tier, unseen until now.
Until computers, levels of physical box security have generally stayed around a 1 or 2 or 3, with an occasional rare 4 or 5. Computers have changed that incredibly quickly. The iPhone is the most obvious implementation of the shift. Suddenly a large proportion of the population is walking around with physical boxes in their pockets that might have had a security score of (say) 7 in 2014 and (say) 8 today. Looking ahead, a 9 in 2017 seems possible, with 9.5 or 9.7 looking possible for a few years beyond that. It’s an incredible change.
He then restates the impending question:
The big policy question in the Apple case is this: To the extent governments can control it, what is the optimal amount of physical box security that people should generally have in their phones? Is a 4 the best? Maybe 6? Is 10 the ideal? And is it practically possible to have different levels of physical box security in different contexts?
He points out that, at least in the past, the US government has been very much against the idea of any technology that would be perfectly secure – a “10” on his scale. And since the government has rejected the idea of total encryption, the question becomes how much security it will allow, especially in light of the new frontier of security an iPhone can reach and that, he argues, things like little diary locks can’t.
In a sense, he affirms the idea that the case is about encryption, but notes that, for all intents and purposes, the government has already taken a stand on encryption in a “total” sense – it’s not what they want – as seen in laws like the CALEA Act of 1994, where Congress made sure communications services could still be tapped because it didn’t want them completely secure at the “10” level.
His whole point is to get us to think about the issue as a complex and hard one:
My bottom line is that I don’t know the answer. I want to convince you, however, that it’s a really hard problem. The answer hinges on unknown answers to empirical questions and predictions about the future. I don’t know what answer is best because I don’t know how the empirical questions shake out and I don’t have a crystal ball. My goal in this post won’t be to provide answers, but instead to suggest a way to think about the questions.
So I don’t take issue with his answers as much as with one of the key ways he wants us to think about the question of physical box security.
In my reading, he’s missing the key element of access in Physical Box Security: besides the unprecedented ability to have “high physical box security,” there is also unprecedented ease of access through internet connections, which colors the question quite a bit.
I see Orin as focusing too much on the locking mechanism itself. A diary might have a little “1” level lock, but its total Physical Box Security goes up meaningfully in context, because you can put it in a safe, in a house, or hide it (maybe buried with treasure), increasing its security through physical limits – especially distance. The Chinese government would need to do far more work to get at a diary in Oklahoma than at one sitting on a government desk in Beijing. All of that makes the total physical box security of the little diary much higher in a real-world, non-vacuum scenario. iPhones, by contrast, can have much better “base” physical box security – a “7,” perhaps – but because an iPhone in its most essential use lives connected to networks, and to various ones, there is unprecedented ease of access to it that a diary is safe from. It’s far easier for China to get into an Oklahoma iPhone than into an Oklahoma diary, and in that context the iPhone’s total physical box security is dramatically lowered while the diary’s is raised. Physical limits help a diary be more secure, but they mean very little to connected devices.
In fact, I think the whole point of the extreme security on an iPhone has to do with that ease of access through the networks. It doesn’t even require human effort. You could (and people do) write scripts that do the work for you. Bot-nets are created in the same or an analogous fashion: not by people individually lifting a finger to infect each PC, but by code turned loose to do it automatically.
Since people don’t have to lift a finger to infect your computer (they can write a chunk of code and turn it loose), the physical box security level for connected devices needs to be exponentially greater – otherwise we couldn’t trust even a casual iMessage from another person or do banking online. We simply couldn’t be connected with any real level of safety that makes connecting useful.
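To make that concrete, here’s a toy sketch of the point above: once an attacker writes one loop, it can “try the lock” on every box it can reach, with no human effort per box. Everything here is invented for illustration (the device names, the passwords, the whole “network” is just a dictionary) – it simulates probing rather than touching any real network.

```python
# Toy illustration: an attack as a loop, not a person.
# The "network" is a made-up dictionary of device -> current password.

def has_default_password(host, password_db):
    """Simulate checking whether one device is still on factory credentials."""
    return password_db.get(host) == "admin"  # "admin" stands in for a default

# Three simulated devices; one never changed its password.
simulated_network = {
    "device-a": "s3cure!pass",
    "device-b": "admin",          # vulnerable: still on the default
    "device-c": "correct-horse",
}

# The "attacker" is this one line. The same line works unchanged
# whether the dictionary holds three hosts or three million.
vulnerable = [h for h in simulated_network
              if has_default_password(h, simulated_network)]

print(vulnerable)  # → ['device-b']
```

The physical-world equivalent – walking to every house in Oklahoma and trying its doorknob – is absurd; the connected-world version is three lines of code, which is exactly why connected boxes need far stronger locks than diaries do.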
Therefore, truly comparing the iPhone’s security level with the diary’s means considering the diary’s monumental security advantage – despite its little lock.
Is the iPhone really some lock box with unheard-of levels of security, or a vulnerable connected device? With the level of hacking happening in the world, it’s obvious how quickly that question becomes rhetorical – and that the iPhone needs higher physical box security levels to combat a total security level that is tremendously lowered by virtue of its connected use and ease of access.
Two notes: First, I’m not a lawyer, and I’m not implying that the “access” element that so obviously impacts how we think of Physical Box Security has any bearing in the US courts – though I’d argue it should have an impact on policy and lawmaking. Second, I’m not a security expert, and I try to speak plainly, so my dad can understand; any nitpicking over my use of words like “access” instead of the actual security/encryption lingo will fall on deaf ears, as I avoided such words on purpose.