Saturday, February 20, 2016

Apple vs. The FBI - Is Tim Cook Objectively Pro-Terrorist?

Over/Under: 45 Million Attempts
On December 2, 2015, Syed Farook had three mobile phones. Two were his own devices, while the other, an iPhone 5c, was issued by his employer, the County Department of Public Health. In the aftermath of the shootings, investigators determined that he had destroyed his private devices, but not his company iPhone. The Department of Justice obtained a warrant for his iCloud backup, which Apple dutifully turned over to the FBI through its compliance office. The FBI was frustrated to discover that, for reasons not clearly understood, the last iCloud backup was from October 19th. That meant that any relevant information the device might contain would still be on the phone itself.

The good news is that, the device being an older iPhone, the passcode is a simple numeric string. The bad news - at least from the government's standpoint - is that iOS, like virtually all operating systems, contains a lockout function. After ten unsuccessful login attempts, the phone wipes its data. This is one of the oldest, most basic OS security functions, a straightforward, common-sense protection of private information in case the device itself is stolen or compromised.
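The lockout logic is simple enough to sketch in a few lines. This is purely illustrative - the `Device` class, the passcode, and the counter are my own invention, not Apple's implementation, which lives deep in iOS and (on later hardware) the Secure Enclave - but it captures the idea: a failure counter that triggers an irreversible wipe.

```python
# Illustrative sketch of a ten-strikes lockout, NOT Apple's actual code.
MAX_ATTEMPTS = 10  # after ten failures, the device erases itself

class Device:
    def __init__(self, passcode):
        self._passcode = passcode
        self._failed = 0
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False  # data is gone; no guess can succeed now
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            self.wiped = True  # irreversible: the stored data is destroyed
        return False
```

The key property is that the wipe is one-way: once `wiped` flips to `True`, even the correct passcode no longer helps - which is exactly what makes brute-forcing a live device so dangerous for investigators.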

Now, the FBI forensic team is pretty certain they can crack the passcode on Farook's iPhone using brute force - just by connecting it to a program designed to try millions of numeric combinations until, ultimately and by complete accident, it hits on the right one. But before they can do that, they need the core OS lockout functionality disabled. Of course, there IS no version of iOS without that function - it's been a common security feature in all operating systems for decades. Enter your friendly authoritarian surveillance state.
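For a sense of what "brute force" means here, consider a sketch like the following - the `brute_force` function and the four-digit passcode are hypothetical stand-ins, but the approach is the real one: enumerate every possible code until one works. A four-digit numeric code has only 10,000 possibilities, which a machine exhausts in moments - as long as nothing wipes the device after attempt number ten.

```python
# Illustrative only: exhaustive search over a numeric passcode space.
# Against a real, unmodified iPhone, the lockout would destroy the data
# after ten tries - which is exactly why the FBI wants it disabled first.
def brute_force(check, digits=4):
    """Try every numeric code of the given length; return the first match."""
    for n in range(10 ** digits):
        guess = str(n).zfill(digits)  # e.g. 42 -> "0042"
        if check(guess):
            return guess
    return None  # exhausted the space without a hit

secret = "7291"  # a made-up passcode for demonstration
found = brute_force(lambda g: g == secret)
```

Even a six-digit code is only a million combinations; the real barrier was never the math, it was the ten-attempt limit.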

The DoJ simply obtained another court order, this one requiring Apple to provide a version of iOS compatible with the device in question that did not include the lockout function. Only Apple can do this, because any operating system upgrade or change has to be 'signed' with Apple's own digital signature - an application of public-key cryptography used for authentication - or the phone itself will refuse to install it. Apple has so far refused to comply, and is challenging the legality of the court order.
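To see why only Apple can produce an installable image, here is a toy demonstration of digital signatures using textbook RSA with deliberately tiny numbers. This is emphatically NOT Apple's scheme - their code signing is far more elaborate and rooted in hardware - and the keypair below is a made-up classroom example. It just shows the asymmetry: the phone ships with the public key and can verify anything, but only the holder of the private key can produce a signature that verifies.

```python
# Toy textbook-RSA signature, for illustration only. Keypair: p=61, q=53,
# so N = 3233, with public exponent E = 17 and private exponent D = 2753.
import hashlib

N, E = 3233, 17   # public key: baked into every phone
D = 2753          # private key: held only by the "vendor"

def digest(image: bytes) -> int:
    # Hash the update image and reduce it below N (toy-sized on purpose)
    return int.from_bytes(hashlib.sha256(image).digest(), "big") % N

def sign(image: bytes) -> int:
    # Only the private-key holder can compute this
    return pow(digest(image), D, N)

def verify(image: bytes, signature: int) -> bool:
    # Anyone with the public key can check it
    return pow(signature, E, N) == digest(image)

update = b"iOS build without lockout"
sig = sign(update)
# The phone installs the update only if verify(update, sig) is True;
# a tampered or third-party image fails the check.
```

The FBI could write its own lockout-free iOS tomorrow; without Apple's private key, no iPhone would ever accept it. That is the entire leverage behind the court order.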

There are many, many reasons why Apple should not be required to provide the requested code, very few of which have anything to do with Syed Farook or the tragic events in San Bernardino. After all, if he had any real concerns about the information on the iPhone, it seems as if he would have destroyed it too. The fact that he chose to destroy his privately owned devices would indicate that there isn't much, if anything, of interest on the iPhone. But even so, let's think about what the government is demanding here. A warrant requires a person or organization to produce something in their possession - an item - a gun, a computer, a safe, specific documents - or knowledge, such as GPS data or billing records. In this case, the government isn't asking Apple for the passcode - they know Apple couldn't possibly know what it was, and therefore could not provide it if they wanted to - instead the government is asking Apple to create a security-crippled version of their operating system and hand it over to the FBI for use in cracking the Farook iPhone.

But think about it - this 'back doored' version of iOS will be nothing but a digital file. Copies will end up on hard drives simply in the course of putting it to its intended use. Even if we accept that the FBI will follow the court's rules about how and when they can use it - hah! - does anyone truly doubt that within a week it will be in the possession of the NSA? And once a piece of malware is in the wild, it shows up in all sorts of places - it's just too easy to copy and share.

Further, the government's legal premise here is a piece of the Judiciary Act of 1789 called the "All Writs Act". The AWA is quite brief and tremendously broad, and essentially says that in the absence of alternative remedies, United States Federal Courts may "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law." If the courts begin to determine that there is no limit to the kinds of requirements they can place on people and companies to access private information, it's very hard to see where any of this ever ends. Compromised encryption, real-time data feeds, targeted malware - what limits would exist?

And, of course, looking at it from the other side, the government is treading awfully close to 'be careful what you wish for' territory here. If a global community, such as the old Open Software Foundation or the GNU project or the Linux community, came together to develop a set of free, open-source encryption tools hardened against any kind of compromise, there would be no owner or organization against whom the writs could be issued. The government could try to crack these tools, but since they can't even crack PGP - 20-year-old technology - one might think they'd want to be a little more cautious in their demands.
