Mimsy Were the Borogoves

Mimsy Were the Technocrats: As long as we keep talking about it, it’s technology.

How does Apple’s supposed anti-conservative bias matter?

Jerry Stratton, February 24, 2016

Door handle with key

Since you insist, we’re going to leave this key in this back door, but we promise never to enter this way without a warrant. Never, ever, ever. (Belchers Albert, CC-BY-SA 3.0)

The software that the FBI wants Apple to write and install on an iPhone is commonly described as a “backdoor”, but it’s really more of a sliding window. A backdoor is a system already in place that allows someone with knowledge of it to open it. What the FBI is complaining about is that Apple hasn’t built a backdoor into the iPhones they sell. Apple has also made it very difficult to guess an unknown password: after ten wrong guesses the phone can erase all of its data, and even if that erase option is turned off, each subsequent guess takes more and more time. And because the data is encrypted using the password, it cannot be extracted any other way than by knowing or guessing that password.

The FBI wants Apple to build a window that they can slide into place, allowing the FBI to keep trying passwords until they guess the right one, without slowing down and with no fear of erasing the data.1

If the iPhone has a 4-digit password, they’ll be able to guess the password in several minutes to several hours, depending on how quickly the sliding window lets them try new ones. If it has a 6-digit password, it might take several days.2
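The arithmetic behind those estimates can be sketched as follows. This is a back-of-the-envelope calculation, assuming (as in footnote 2) roughly a half-second to a second per attempt; the real rate would depend on the tool the FBI is asking Apple to build.

```python
# Rough worst-case brute-force time for an all-numeric passcode,
# assuming 0.5-1 second per attempt (an assumption, not a figure from Apple).

def worst_case_seconds(digits, seconds_per_try):
    """Worst case: every combination of an all-numeric passcode is tried."""
    combinations = 10 ** digits
    return combinations * seconds_per_try

for digits in (4, 6):
    fast = worst_case_seconds(digits, 0.5)
    slow = worst_case_seconds(digits, 1.0)
    print(f"{digits}-digit passcode: up to {fast / 3600:.1f}-{slow / 3600:.1f} hours "
          f"({slow / 86400:.1f} days at the slower rate)")
```

A 4-digit passcode has only 10,000 combinations, so even one guess per second exhausts them in under three hours; a 6-digit passcode has a million, pushing the worst case out to several days.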

I see a lot of commenters on conservative blogs saying that if this were a Christian baker or a tea party member, Apple would turn over the iPhone’s key without even requiring a warrant.

This is an important point. It isn’t just that we don’t trust the government or Apple to keep the sliding window safe. It is that we don’t trust the motives of future governments or future Apple employees.

Currently, those who think Apple would “turn over the key” if it were a conservative are wrong. Apple can’t do it, because Apple designed their phones so that even Apple cannot hack them. They don’t have the capability, because they have never built it, not even the ability to brute-force the password by trying every possible one.

But if they build the tool the FBI wants them to build, they will have the capability.

This is true of any backdoor that anyone puts into their system. It’s why some NSA employees spy on their lovers. It’s why your CIO asks members of your company’s IT department to read an employee’s private email when they check their mail on company time.

Because they can. Because they’ve built the mechanisms that make it possible, it becomes inevitable that those mechanisms will be used in ways they shouldn’t be. Neither Apple nor the government is a single individual. Each is made up of many individuals, and any one of those individuals can be flawed.

Maybe Tim Cook would use this backdoor against conservatives; maybe he wouldn’t3. But there’s no guarantee that a successor wouldn’t, once the backdoor has been created.

Further, someone trying to social-engineer an Apple employee into opening someone else’s phone can, today, legitimately be told: “it doesn’t matter how sad your story is, we literally cannot do this. If you don’t know your password, there’s no way to find it.”

After this backdoor is built, the answer changes to, “we refuse to try running our tool on your phone.” It’s not hard to imagine a harasser successfully concocting a sad enough story to trick an Apple employee into opening up their ex-wife’s phone, or opening up a celebrity’s phone, or opening up a witness’s phone.

Even if you can trust Apple’s security to keep the backdoor safe, and the government’s word that they will not require Apple to hand the backdoor over once Apple creates it—very dangerous assumptions—there are still very good reasons for not wanting Apple to build it in the first place. This is especially true if you think they have or could have a bias against some groups of people.

  1. There’s another issue, too: the phone was not the terrorist’s main phone, it was his employer’s phone. His employer was the San Bernardino County Department of Public Health. If they had been using Apple’s Enterprise Deployment program, they could have co-secured the phone, allowing them to reset the phone’s password at any time. For a government agency this is extremely important, because it is necessary for adhering to various public records laws. But as far as I can tell they didn’t, and no source I’ve found explains why.

  2. I’m assuming about a half-second to a second per try. Also, if it’s an alphanumeric password, even this tool probably won’t guess the password unless it’s very short; but chances are it isn’t an alphanumeric password, because few people want to type a long password on a cramped, phone-sized keyboard.

  3. I think he wouldn’t, or I wouldn’t be a conservative Christian holding stock in the company.
