Wanting It Bad Enough Won't Make It Work: Why Adding Backdoors and Weakening Encryption Threatens the Internet
Wait a minute… If it’s technically possible for Apple to program the code that will serve as a backdoor, can’t anyone theoretically program it? It’s just code.
I understand that this would require access to highly confidential information regarding the fundamental ways in which iOS is put together… but what prevents this information from being leaked?
It really gets me that we have people in positions of real power and influence, both in the government and the media, who don’t understand that math and physics don’t care what you want. I really think that anyone who would make pronouncements like the ones we’re hearing from our security apparat without checking with an expert first is not qualified for any position with more responsibility than night shift at a convenience store.
No, I’ve worked night shift at a convenience store. It requires some comprehension of the fact that actions have consequences.
So have I, and I agree, but most of the actions and consequences involved in that are pretty easy for most people to grasp, and the worst case scenario when someone doesn’t understand that isn’t even likely to put the store out of business, much less wreck the internet or the ecosphere, so it’s a position you can take a chance on (and really, you have to take a chance on, because dealing with all the detritus of humanity you run across working graveyard at a convenience store is enough to make just about anyone who can get another job do so.)
I’m a little pissed that this wedge issue is causing people to conflate an OS update with back-dooring encryption. Don’t get me wrong, I think Apple is taking a stance on something important here, but it’s muddying the waters as far as understanding the issues goes.
The court order is asking for a custom version of the OS / firmware so they can access this one phone without it auto erasing as would happen if they tried to brute force the password. The phone won’t accept an OS update unless it is signed cryptographically with Apple’s key. That’s where the crypto argument should stop.
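To make the signing point concrete, here's a toy sketch of why a trojan firmware image can't just be cooked up by anyone: the device checks a signature before installing an update. (This uses an HMAC as a stand-in for Apple's real scheme, which is asymmetric signing against keys anchored in the boot chain; all names here are made up.)

```python
import hmac
import hashlib

# Hypothetical stand-in for Apple's signing secret. The real scheme is
# asymmetric (the phone holds only a public key), but the effect is the same:
# no key, no valid signature, no accepted update.
SIGNING_KEY = b"held-only-by-apple"

def sign_firmware(image: bytes) -> bytes:
    # What Apple does at release time.
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    # What the phone does before installing an update.
    expected = hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"stock iOS firmware image"
trojan = b"custom firmware with the auto-erase check removed"

sig = sign_firmware(official)
print(device_accepts(official, sig))  # True
print(device_accepts(trojan, sig))    # False: signature doesn't match this image
```

The point: anyone can write the trojan image, but without Apple's key the phone refuses to install it, which is exactly why the order targets Apple specifically.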
Apple is making a stand and stating that they should not be compelled to break their own software. I’m in agreement there. The problem is, by framing this as a “crypto” backdoor issue, if the FBI wins, then there is legal precedent to go after back-dooring cryptographic protocols. Basically the FBI is trying to erode legal precedent set since the early 1990s, and Apple is calling BS and going “all in”.
Yes, creating a specific opportunity to brute force a phone already in Gov’t possession is far different from backdooring the phone of someone walking down the street. This would do nothing to you unless they seize your phone, in which case you have a lot of other problems too. This is not “breaking crypto”, as I understand it.
It still opens a can of sandworms. There are a lot of things to criticize Apple for, but not this one. This is a huge, wonderful FU to our burgeoning surveillance state, and I give Apple a standing ovation.
I may send Tim Cook a Christmas card this year.
It might not be as cut and dried as you think. This forensic fone fellow discusses the legal requirements of implementing the order…the short version is that plenty of other people will hold Apple’s code, and whatever “instrument” is delivered will quickly have many new homes.
except that this is part of a larger pattern of trying to weaken security, which has previously (and notably) taken the form of trying to weaken encryption by asking for golden keys/front doors.
just because people are using the term backdoor doesn’t mean it’s a crypto backdoor. backdoors can refer to all sorts of things, and in this case the trojan firmware image they’re asking for is a backdoor that facilitates bypassing the passcode authentication mechanism by allowing them to be entered programmatically from another computer without any of the delay or destruction mechanisms being triggered…
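As a rough sketch of what that bypass amounts to (a toy model with made-up names; real iOS also inserts escalating delays between attempts, which this omits):

```python
class Phone:
    """Toy model of the passcode guard: wipes after 10 wrong guesses."""
    def __init__(self, pin: str, wipe_enabled: bool = True):
        self._pin = pin
        self.wipe_enabled = wipe_enabled  # the trojan firmware would flip this off
        self.failures = 0
        self.wiped = False

    def try_pin(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._pin:
            return True
        self.failures += 1
        if self.wipe_enabled and self.failures >= 10:
            self.wiped = True  # auto-erase: the data keys are destroyed
        return False

def brute_force(phone: Phone):
    for n in range(10_000):  # every 4-digit passcode
        guess = f"{n:04d}"
        if phone.try_pin(guess):
            return guess
    return None

print(brute_force(Phone("7391")))                      # stock firmware: None (wiped)
print(brute_force(Phone("7391", wipe_enabled=False)))  # trojan firmware: finds 7391
```

With the guard in place, brute force destroys the evidence after ten tries; with it patched out, every 4-digit passcode falls in at most 10,000 attempts.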
Thank you for the explanation. I wasn’t aware of the Apple signature element of the problem. But this prompts another question - is the update process itself protected against brute-force attacks?
It’s going to use a long key. 1024 bits would be short, but much longer than the PIN. You would have to build a distribution for every key value, then test it against a phone. In theory you could buy a million iPhones and test them in parallel, but in practice it’s going to take a lot longer than trying permutations of the day, month and year of the owner’s birthday, etc.
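Back-of-the-envelope numbers for why attacking the signing key is hopeless compared to attacking the PIN (illustrative figures, not measured ones):

```python
pin_space = 10 ** 4    # every 4-digit passcode
key_space = 2 ** 1024  # a 1024-bit key; real signing keys are longer still

rate = 1_000_000                  # an absurdly generous guesses-per-second rate
seconds_per_year = 365 * 24 * 3600

print(pin_space / rate)                     # 0.01 s to exhaust the PIN space
print(key_space / rate / seconds_per_year)  # ~5.7e294 years for the key space
```

Even granting a million guesses a second, the key space is out of reach by hundreds of orders of magnitude, which is why everyone attacks the PIN instead.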
That is a question of how many attempts per second/attempts in total does the system permit. If the update process is protected e.g. with delay, this would make the brute-force approach impractical. If it isn’t, forcing an update first and subsequently bruting the PIN could be an avenue. In addition, if the Apple key itself is leaked, that also results in this vulnerability (although I assume the key is periodically changed as a part of the normal updating process…?)
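The arithmetic on attempt rates shows why those delays matter so much. (The ~80 ms figure below is roughly the per-guess cost of the hardware key derivation that Apple has described; treat all numbers here as illustrative.)

```python
def exhaust_seconds(space: int, per_attempt_s: float) -> float:
    """Worst-case time to try every candidate at a fixed per-attempt cost."""
    return space * per_attempt_s

# 4-digit PIN at ~80 ms per guess (key-derivation cost alone, no other limits):
print(exhaust_seconds(10**4, 0.080) / 60)   # ~13 minutes
# Same PIN with an enforced 5-second delay between attempts:
print(exhaust_seconds(10**4, 5.0) / 3600)   # ~14 hours
# 6-digit PIN with the 5-second delay:
print(exhaust_seconds(10**6, 5.0) / 86400)  # ~58 days
```

A fixed delay alone turns minutes into hours or days; escalating lockouts and the ten-strike wipe push it from impractical to impossible, which is exactly what the requested firmware would strip away.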
Apple is required to create a custom firmware update tool that, because it was required by a court order, has to be independently tested for reproducibility, meaning that it would have to leave Apple’s campus/personnel/security, potentially making many stops along the way. An even more important question is whether a version of the software that does not include Apple’s key signature for firmware updates would be sufficient to pass the tests; almost certainly not. Meaning that in order to do a real-world test for reproducibility, the testing party would almost certainly need Apple’s key. The only other way to do it would be to create a one-off version of the iPhone hardware that accepts a one-off key, and there’s no reason to believe the court would find that a suitable substitution.
Bottom line: this order could easily take Apple’s cryptographic key out of Apple’s hands. Given its value, it’s hard to imagine that it would survive long in the hands of multiple parties. I think that keeping key-signing secure is doable for Apple only because they don’t have to coordinate it with other organizations, which would add exponentially more opportunity for error, both in terms of technical logistics and in terms of overall process management. I don’t have any reason to believe the Justice Department is prepared to pull it off under threat from major state-sponsored attackers.
I understand the long game on this. The conflation of terms just makes it hard to explain to the uninitiated. I think the legal definition of “backdoor” may wind up rather blurry as well.
I’m curious about the public grandstanding surrounding this. Surely there was some Silicon-Valley-to-government intermediary who could have walked this phone into Apple’s inner sanctum and had it quietly unlocked? I’m no Neo, but isn’t it likely the firmware security feature is already overridden in some internal test build for high-speed automated regression testing?
To what purpose all this brouhaha?
Everyone is bent out of shape over the wrong thing. People should realize that closed ecosystems are evil. Apple shouldn’t be ABLE to give up the encryption key, because you should be able to use a variety of encryption systems on a general-purpose computing device. Apple shouldn’t be able to control or vet which company or coder YOU want writing your encryption system. They should have no control whatsoever over the software on your device.
A closed ecosystem has pluses and minuses. For enough people, those pluses matter more. I don’t want to be on the phone explaining to my mom why she has to choose between Rijndael or Twofish from a set of packages to set up her phone, and what that means. The chances are that if we left the public to decide, most of them would make terrible choices and generally skip encryption altogether, since it’s a major hassle for normal people to understand. Also, on the iPhone AES-256/Rijndael is implemented in hardware, which is a bonus from a performance perspective.