iPhone Security: Apple Refuses FBI's Demands to Create iOS Backdoor
In a letter dated February 16th, Apple CEO Tim Cook responded to the FBI's demand that Apple create a "backdoor" to bypass the encryption on an iPhone used by one of the perpetrators of last year's terrorist attack in San Bernardino, CA.
Cook began the letter by stating that Apple has "no sympathy for terrorists" and has cooperated in giving the FBI all of the information that it has available.
However, Cook says that Apple itself does not have technology that could bypass iOS encryption, and that creating such a backdoor has the potential to compromise the security of all iOS users, and is not a step the company is willing to take.
The U.S. government claims that it's only interested in the bypass for this specific phone—a work phone used by Syed Rizwan Farook—and not any others, which, uh... well, you all know what laughter sounds like.
Even if the government has no intention of using an iOS backdoor for nefarious purposes, it's pretty hard to believe that such sensitive material wouldn't wind up in the wrong hands, especially considering its history of keeping track of laptops with classified information, firearms, and nuclear weapons.
You also have to wonder whether there's even anything important on Farook's work iPhone. Farook and his wife/accomplice, Tashfeen Malik, destroyed their personal phones so thoroughly that any data was unrecoverable, and a hard drive from the couple's computer is still missing. Sure, it's possible that there's evidence on the work iPhone, but the odds seem low considering the lengths these two went to in destroying other evidence.
What the FBI is demanding is basically an override of iOS's passcode protections. Currently, iOS can be set to wipe all data off the phone after 10 consecutive incorrect passcode entries. The FBI wants this fail-safe removed so that it can enter passcodes electronically and at high speed, to "make it easier to unlock an iPhone by 'brute force,' trying thousands or millions of combinations with the speed of a modern computer," said Cook.
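To see why removing the 10-try wipe matters, here's a rough back-of-envelope sketch of how quickly every numeric passcode could be tried. The ~80 ms per guess figure is Apple's published estimate for the hardware-enforced key-derivation time on the device; both that rate and the passcode lengths are assumptions for illustration, not claims about what the FBI would actually achieve.

```python
# Illustrative only: worst-case time to brute-force a numeric iOS
# passcode if the 10-attempt wipe and software delays were removed.
# Assumes ~80 ms per guess (Apple's stated key-derivation time).

SECONDS_PER_GUESS = 0.08  # ~80 ms per attempt, enforced in hardware

def worst_case_seconds(digits: int) -> float:
    """Seconds needed to try every numeric passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_GUESS

for digits in (4, 6):
    hours = worst_case_seconds(digits) / 3600
    print(f"{digits}-digit passcode: up to {hours:.1f} hours")
```

A 4-digit code falls in well under an hour at that rate, which is why the on-device wipe and escalating delays, not the passcode itself, are the real barrier the FBI wanted removed.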
Once the encryption is breached, the FBI could access an iPhone, and the "government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge," Cook writes. Given its track record, there's basically no way to believe that the United States government wouldn't meddle with such powerful technology.
On the other hand, you can argue that having remote access to smartphones would come in handy when trying to snuff out potential terror threats. But considering the attention this is receiving, you'd be stupid to think that terrorists, or anyone else with malicious intentions, would continue using smartphones at all, knowing that law enforcement could be watching them at any or every moment.
You'll probably hear some noise about how Tim Cook and Apple are enabling terrorists by refusing the FBI's demands, but that's going too far. If you want to get cynical about it, Cook's statement is as much about principle as it is about retaining Apple's customer base (and maybe even using this protest to win new customers).
But Cook's outlook is correct. The government's argument is paradoxical: it demands that we relinquish our security so that we may be more secure. Hopefully the other tech giants (I'm looking at you, Google) join in the fight to protect our data.
Sundar Pichai, CEO of Alphabet's Google, has chimed in (though we were hoping for more).
But I suppose another post from another big tech CEO isn't going to move the narrative along all that much.
Let us know what you think... Is the government extending its reach too far? Is Apple in the wrong for not complying? Sound off below.