Hello? Hello? Yeah, how’re you? Sure, fine, fine. Hey, the FBI wants Apple to unlock some terrorist’s iPhone. What’s wrong with that? I don’t even live in the U.S. So should I care about this thing between Apple and the FBI?
Yep, you should care. Very much. Hang up the phone and listen.
What This is All About
The FBI wants Apple to help unlock an iPhone used by one of the attackers who killed 14 people in the December San Bernardino shooting. Specifically, the Bureau wants Apple to create new software that would override a security system on the phone designed to erase its contents after ten unsuccessful password tries. The new software would also eliminate the built-in pause required between tries.
After ten failed tries, the software on the San Bernardino shooter's phone will automatically destroy the data on it as a security measure. The FBI wants that ten-try limit, plus the required pauses between tries, taken away so that it can run a "brute force" attack against the password. A brute force attack runs passwords (a1, a2, a3… aa1, aa2, aa3…) at high speed against the system until one works.
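A minimal sketch in Python of what a brute force attack looks like once the ten-try limit and forced pauses are gone. The passcode, function names, and setup here are all invented for illustration; a real attack would run against the phone's actual passcode check, not a stand-in.

```python
import itertools
import string

SECRET = "0429"  # the passcode we pretend not to know (invented for this sketch)

def try_passcode(guess):
    """Stand-in for the phone's passcode check."""
    return guess == SECRET

def brute_force(max_len=4, alphabet=string.digits):
    # The a1, a2, a3... style enumeration from above: every possible
    # combination, shortest first, at machine speed, with no lockout.
    for length in range(1, max_len + 1):
        for combo in itertools.product(alphabet, repeat=length):
            guess = "".join(combo)
            if try_passcode(guess):
                return guess
    return None

print(brute_force())  # recovers "0429" after at most 11,110 guesses
```

With only 10,000 possible four-digit passcodes, the attack finishes in a fraction of a second; the erase-after-ten limit, not the passcode itself, is what makes the phone hard to crack.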
Court documents show that Apple has been a reliable ally of the government since the iPhone debuted in 2007, unlocking phones used by criminal suspects at least 70 times between then and last fall. For unclear reasons, this time Apple said no. The FBI took Apple to court, where it successfully argued that a 1789 law compelling cooperation with court orders applies to Apple's encryption in 2016. Apple is appealing.
What This is Really All About
This is really all about encryption, and whether the U.S. government can force companies to bypass their own security systems on demand. It is about whether a tech company’s primary obligation is to provide secure products that protect the privacy of its customers (good and bad people), or to act as a tool of American law enforcement to strip away that privacy as the government requires.
The battle is actually even more significant. Since the Ed Snowden revelations exposed the NSA spying on persons worldwide, including inside the United States, the Federal government has been demanding a “back door” into commercial encryption systems.
Some simplified tech talk: encryption turns data from something that can be read into 23hd892k*&^43s. Two "keys" are needed: one to turn the data into unreadable text, and one to reverse the process. In the case of the iPhone, Apple holds the encrypting key, and the user the decryption key, her password. A backdoor is a bit of computer code that would allow law enforcement to bypass that second key and read anyone's data. That's what the Feds want because, per Snowden, some current, commercially available encryption may still be beyond the NSA's ability to break, and some other encryption can only be broken slowly, with expensive computers.
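A toy Python sketch of the readable-to-gibberish-and-back idea. This is deliberately simplified: a real iPhone uses AES with a key derived from the user's passcode, not this XOR toy, and in this symmetric sketch the "two keys" happen to be the same bytes. Every name and value here is invented for illustration.

```python
def xor_crypt(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the data with the key, repeating the key as needed.
    # XOR is its own inverse, so running the same key over the output
    # restores the original data.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"hunter2"                       # stand-in for a password-derived key
scrambled = xor_crypt(b"meet at noon", key)
print(scrambled)                       # unreadable without the key
print(xor_crypt(scrambled, key))       # the original message again
```

A backdoor, in these terms, would be a second path to the plaintext that skips the key entirely; that is exactly the bypass the Feds are asking for.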
What This is Really, Really All About
The fight isn’t over whether Apple can comply with the government’s request; technically it can. It’s whether it should.
Efforts to force companies to create that desired back door have proven unsuccessful. Many tech companies resent that the NSA hacked into their systems whenever possible up until the Snowden revelations, and others fear a consumer backlash if they cooperate too broadly. Congress so far has been unable to pass laws compelling the creation of back doors. The FBI is so desperate that it even deleted "safety" advice it once issued recommending people encrypt their phones.
The San Bernardino shooter’s iPhone is seen by many as a test case.
The request is technologically doable; the shooter is dead, has no privacy interest left, and cannot countersue; a search warrant for the phone exists; the phone is physically in the FBI's possession on U.S. soil; and the circumstances are very PR-friendly: the guy was a terrorist, and who knows, maybe the phone holds clues to prevent some future attack. You really can't do better than that.
Some 40% of Americans agree that Apple should unlock the phone. And just in case you still don’t get it, remember the government took the provocative step of asking the court to unseal the case, which would normally be secret by default.
Apple is pushing back.
The company filed a motion to vacate the court order, claiming it violates the First and Fifth Amendments and exceeds the powers granted to the government in the All Writs Act, that 1789 law. Facebook, Microsoft, Twitter and Google plan to file briefs supporting Apple's position. Meanwhile, both the FBI and Apple want Congress to weigh in, and indeed the House Judiciary Committee will hold a hearing on encryption issues.
It is very likely the case will reach the Supreme Court.
The Broader Implications
The case the Supreme Court will almost certainly hear is not about a single phone. It is about creating a legal precedent for the United States government to demand whatever cooperation it needs from private companies, companies with obligations to their stockholders, to bypass security and encryption as it wishes. FBI Director Comey stated the case will "be instructive for other courts" when interpreting how far third parties have to go in helping the government hack their products.
In an op-ed, the New York Police Department Commissioner and his intelligence and counterterrorism chief admitted that what Apple has been asked to do will drive how the government demands tech companies provide access to secured devices in the future.
Why You Should Care
If Apple fails, the U.S. government will be able to read the contents of any electronic device in the U.S., regardless of encryption. The legal precedent will absolutely spill out past the iPhone to all other devices. For anyone who lives, travels or passes through America, this will touch you. In addition, phone, email and social media data passes through the U.S. from many parts of the world even if the users on both ends are outside the country.
In addition, what would Apple’s (Google’s, et al) response be to a request from your favorite bad government? What if China were to require it hold a backdoor key as a condition for sales in the Mainland? What if your favorite bad government overtly decided to use that backdoor to “legally” gather proprietary data from your company, against journalists and dissidents, or to amass blackmail information on a colleague?
A win for the government in the Apple case would also further stretch the applicability of the All Writs Act to ever more information inside the U.S., or held by companies with ties to the U.S. — medical records, for example.
For investors, will knowing the U.S. and your favorite bad government now have access to a device help or hinder sales (Apple has already claimed compliance will “tarnish the Apple brand”)?
And of course once backdoors exist, who, in the age of leaks (Snowden hacked the NSA itself), can assure that the knowledge will not end up in your favorite set of wrong hands, say perhaps those Russian gangsters who are always sending you spam emails?
Bottom Line: everyone has something they wish to keep to themselves. The Apple case will significantly affect how possible that will be going forward.