Error 53: Apple remotely bricks phones to punish customers for getting independent repairs

The real risk is that a malicious sensor would have access to the iPhone decryption key stored in the Secure Enclave when the device is powered on and could pass that along (or use it maliciously).

Getting your fingerprints isn’t much of a risk, since it doesn’t grant malicious actors access to anything by itself. Even when TouchID is used to unlock your iOS device, the print isn’t the decryption method; it’s a gatekeeper to the real key stored securely in the Secure Enclave (that’s why you have to enter your passcode on reboot before TouchID works at all).
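To make the "gatekeeper" model concrete, here is a minimal sketch in Python. All the names and the flow are illustrative assumptions, not Apple's actual implementation: the point is only that a fingerprint match never produces key material itself, and that after a reboot the key stays sealed until the passcode is entered once.

```python
# Hypothetical sketch of the gatekeeper model: a fingerprint match only
# authorizes the Secure Enclave to release a key it already holds.
class SecureEnclave:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._master_key = "device-master-key"  # never leaves the enclave
        self._key_unlocked = False              # cleared on every reboot

    def reboot(self):
        # After a reboot the key is sealed again; fingerprints alone won't work.
        self._key_unlocked = False

    def enter_passcode(self, attempt: str) -> bool:
        if attempt == self._passcode:
            self._key_unlocked = True
        return self._key_unlocked

    def unlock_with_fingerprint(self, match: bool):
        # The sensor only reports match/no-match; it never carries the key.
        # A match is useless until the passcode has unsealed the key this boot.
        if match and self._key_unlocked:
            return self._master_key
        return None
```

In this toy model, calling `unlock_with_fingerprint(True)` right after `reboot()` returns nothing, which mirrors the passcode-on-reboot behavior described above.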

There are counterfeit TouchID sensors out there that approve every touch as well. Shoddy repair centers use these.

The real fix here would be to verify the integrity of the TouchID sensor on every boot, not just when the cryptographic keys need to be updated (which happens on iOS update and device restore). The check has also been there for a while; it’s not new to iOS 9.
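The proposed fix can be sketched as a challenge–response check at boot. This is a hedged illustration only: the pairing-secret name and the HMAC scheme are my assumptions, not Apple's actual protocol, and the key design point is the graceful failure mode (disable TouchID, fall back to passcode) instead of bricking.

```python
# Illustrative boot-time sensor check, assuming a shared pairing secret
# provisioned when the sensor was paired. Not Apple's real protocol.
import hashlib
import hmac

PAIRING_SECRET = b"factory-pairing-secret"  # hypothetical provisioned secret

def sensor_response(secret: bytes, challenge: bytes) -> bytes:
    # A genuinely paired sensor can answer a challenge with the shared secret.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def verify_sensor_on_boot(challenge: bytes, response: bytes) -> str:
    expected = sensor_response(PAIRING_SECRET, challenge)
    if hmac.compare_digest(expected, response):
        return "boot: sensor trusted, TouchID enabled"
    # A mismatched sensor could simply be distrusted rather than brick the phone.
    return "boot: sensor untrusted, TouchID disabled (passcode only)"
```

Running the check every boot means a swapped sensor is caught immediately, rather than months later at the next iOS update.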

4 Likes

On devices that don’t share memory or a processor, all you can do is intercept all communication to and from the device, plus the device location, and in most cases the mic. Some clever folks have figured out how to use the radio receiver to determine what is being typed on a capacitive screen, but I’m not sure if that has ever made it to any of the exploits in active use or was just a security research project. They’ve done the same with the accelerometer, btw, and that has been used in active exploits.

nope.

What encryption scheme do your text and messaging apps use? What encryption do your phone calls use? Sure, your web page content might not be detectable, but where the traffic is going is.

If you have all these questions I suggest doing some security research. You can read up specifically on various exploits and what they can and can not do for yourself!!! The more you know™!

Have you not been following the news about this? IF they detain you, they use YOUR finger. It is happening all the time, it’s approved in many states, and there have been multiple BoingBoing articles about it.

LOL, I see you are a time traveler from before 2011 and aren’t aware of all the phone-cloning download devices made and sold to police and border officers. They do not need a warrant to search your phone if it is on your person at the time of your arrest, similar to searching your pockets; the arrest itself is now considered reasonable suspicion for the search.

You are saying stuff that makes it seem like you are thinking about these issues, which is great, but it doesn’t seem like you are actually reading about them or looking up any of the actual exploits or laws.

AND yes, sighs, I already mentioned that you can increase your security by using non-standard prints and print locations, if you read my reply to you. Did you also catch the part where almost no one does? Less than 1% of the population uses anything but the thumb or index finger.

That wouldn’t work, lol. It isn’t an optical sensor.

SOURCE PLEASE??? The touch sensor does not do the authentication.
There are sensor-less home button replacements, but they don’t authenticate.

1 Like

yes, and we agree this is good.

How flexible is it about the matching cryptographic hash? Because it sounds pretty serious about that part. It is irrelevant to me what input the sensor is getting. I am looking at the system (the part I am concerned with being compromised, as a user).

I’m a little concerned about something collecting a fingerprint to be sent elsewhere I suppose, but not really.

I am very concerned (though the phone design seems to cover this) that a false sensor could be installed while the phone is being repaired, and that ANYTIME after that the phone might be compromised (in person OR remotely), quite without the use of any printed luncheon meats.

It only bricks phones (with error 53) that you’ve had repaired by people outside of the walled garden, yeah. That’s good as far as I am concerned.

I hear you, but I don’t see that as a strong comparison at all. The fingerprint sensor is not a consumable, AND I can choose not to use it at all.

Back in the ’70s, I think it was, the US got a new embassy in Moscow, built by the Russians. Amazingly, their contractors, unvetted by the USA, installed all manner of bugs and spy gear into the bones of the building.

I’d rather pay Apple to put in a 100% reliable sensor, warranty their work, and have all the components work right for the security they assure me I can have with the correct sensor/hardware matching. Call me crazy, but…

If I didn’t want to do that, I am sure the best course of action for me would be to buy Apple Inc outright and then dictate this change I need to see in them, to them, or buy a different brand of phone, or build a better one myself.

Thank you for the detailed info!

2 Likes

On my android tablet, I can use a fingerswipe that is self-erasing. It’s not hard to figure one out, but it takes discipline to train yourself not to shortcut the swipe.

Right, there’s a qualitative difference between real security, like a steel bank vault door, and the kind of locks that keep honest people out, like the Kwikset KW-1 lock that most people in the USA have on their front door. The Apple fingerprint reader is in the KW-1 category, which means anyone who has a real motive will simply google how to defeat it and practice how to do so - but it’s good enough to keep out small children and honest people. And you’re absolutely right that this is a totally appropriate level of security for a mass-market consumer phone. It’s a little weird when Apple people convince themselves that they’ve got serious security, though. The fingerprint reader is a very nice feature that doesn’t need to be oversold.

Exactly! There’s an unpluggable security hole anyway, so don’t kid yourself that you have bulletproof protection and proceed accordingly.

Presumably off-by-one for an invitation to self-SWATing…

Well, sure, but according to the story it does not brick upon being stolen, nor does it brick upon being tampered with. It bricks days or months AFTER your GPG store has already been compromised and all your passwords stolen, used, and changed by the criminals, and only in cases where the criminals are clever enough to hack your phone’s hardware but then suffer sudden, irreversible brain damage that makes them dumb enough to allow an OS update on the hacked phone. I don’t think much of a security feature that only works in such limited circumstances.
I would prefer a security feature that wiped the phone at the time of tampering, personally, rather than one that offers no such protection yet forces me to use only Apple repair facilities.

A good idea! The ink was not water-based but it might’ve gotten sludgy, which could have at least kept it from splashing my face…

5 Likes

A Supreme Court ruling from 2014 would strongly disagree. http://www.nytimes.com/2014/06/26/us/supreme-court-cellphones-search-privacy.html?referer=&_r=0

3 Likes

That is fair. Anytime anyone works on your home, your car, your phone, or your computer, you are putting a certain level of trust in them. I do understand that, and I agree that if you have that level of concern you should only work with trusted vendors and technicians.

[quote=“AcerPlatanoides, post:203, topic:73276”]
Call me crazy, but…
[/quote]
Okay Crazy Butt! :stuck_out_tongue_winking_eye: haha…:slight_smile: oh damn, you remembered the comma…

no problem!

2 Likes

That is good news. Now I have to do some reading myself. I know that at the border there are signs up stating that they do not need a warrant to search your device. I’m guessing that, similar to how you don’t have to be charged with a crime to be detained under suspicion of terrorism, there are exceptions eked out in the laws for various circumstances… like I said, I’ll have to read more about this aspect, as that Supreme Court ruling was news to me. Thanks for the link.

1 Like

Phone calls and SMS use effectively no encryption due to known weaknesses that Stingray devices exploit. There’s absolutely no need to attack the baseband to get access to either.

For messaging, nearly every messaging app in existence uses encryption between clients or between the client and server, even if they don’t encrypt data at rest.

You have yet to give an example of some magic power attacking the baseband gives you that sniffing traffic at Starbucks or attacking an insecure ISP provided modem doesn’t.

Furthermore, Apple recognizes this Starbucks weakness and, by default, apps developed for iOS 9 enforce TLS 1.2 for all network connections. This is because developers are lazy and often won’t enforce security, even if supported.

(The secondary reason is that if a major encryption flaw like POODLE, FREAK, or Logjam is found in the future, Apple can quickly release an update to disable the affected ciphers in third-party apps that haven’t opted out.)
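The platform-level policy described above (App Transport Security, introduced with iOS 9) is essentially a TLS floor enforced for every connection an app makes. On iOS it is configured in an app's Info.plist rather than in code; as a rough analogy, here is what enforcing the same kind of floor looks like using Python's standard `ssl` module:

```python
# Analogy for a platform-enforced TLS floor, using Python's stdlib ssl module.
# iOS's actual mechanism (App Transport Security) is declarative plist config,
# not code like this.
import ssl

def make_strict_context() -> ssl.SSLContext:
    ctx = ssl.create_default_context()
    # Refuse anything older than TLS 1.2, mirroring the ATS default floor.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx
```

The design point is the same in both cases: the default is strict, and a lazy developer gets the secure behavior without writing anything, which is exactly the problem the iOS 9 default addresses.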

Those weren’t my goalposts, and that’s beyond the scope of this discussion. We are talking about the security of the phone itself, not comparing what you can get from hacking an ISP or cell provider to what you can get hacking a phone. (Although, to be fair, I did mention the mic, and @shaddack did mention the exploits against shared processors and memory.) Again, there are a ton of actively used, known baseband exploits; if you are curious about what each one can and cannot do, please feel free to read up on them, right after you post your source for “counterfeit TouchID sensors out there that approve every touch,” because I’m waiting to read up on those if they do indeed exist. It would be fascinating to read how they achieved this.

When you hear that an iPhone has been jailbroken due to a baseband exploit, it means that hackers have found a bug in the baseband OS or processor that gives them elevated access to the rest of the device.

Remember that these two OSes do have to communicate in order to work together. They aren’t strictly isolated from each other. We haven’t even dug into the third OS on your phone, the SIM OS. Some phones even have other embedded OSes for various hardware features. Pretty fun stuff.

2 Likes

There are guides on the internet that will tell you how to lift fingerprints from objects and transfer them to rubber for the purpose of fooling fingerprint readers.

3 Likes

FYI, The Guardian posted an article about this error, and quoted your comment (toward the end of the article).

3 Likes

From a couple of years ago:

https://www.blackhat.com/html/bh-dc-11/bh-dc-11-briefings.html#Weinmann

[quote]The Baseband Apocalypse

Attack scenarios against smartphones have concentrated on vulnerable software executed on the application processor. The operating systems running on these processors are getting hardened by vendors, as can best be seen in the case of Apple’s iOS, which uses both data execution prevention and code signing to make exploitation of memory corruptions and running malicious software harder. In contrast, the GSM/3GPP stack running on the baseband processor has been neglected. The advent of open-source solutions for running GSM base stations is a game-changer: malicious base stations are not considered in the attack model assumed by the GSMA and the ETSI; similarly, vendors of baseband stacks seem not to have taken malicious input from the network side into account. This paper explores the viability of attacks against baseband processors of GSM cellular phones, the focus being on smartphones.

We demonstrate the first over-the-air exploitations of memory corruption in GSM/3GPP stacks that result in malicious code being executed on the baseband processors. [/quote]

5 Likes

and this is why my real name isn’t on my account here now.

7 Likes

Hell, you don’t even need to Google; there are Mythbusters episodes showing how to fool fingerprint scanners.

As a rule of thumb (ha!), if there’s a fingerprint scanner involved, then the “security” is pure theatre.

6 Likes

All they do is pass the same hash for every input. You train it with your index finger, then you pass a hot dog over it, then you let your dog lick it: same hash every time…
They’re cheaper because, instead of an actual A/D chip with real code to figure margins for error and calculate a real hash, it’s just some low-level logic that spits back a set value when nudged.
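The claimed failure mode can be modeled in a few lines. This is purely illustrative (the thread itself notes that no source has been produced for such counterfeit sensors): a genuine sensor's output depends on the ridge data it actually scans, while the alleged fake ignores its input and emits one fixed value.

```python
# Toy model of the claimed counterfeit behavior. Hypothetical, not a real
# device interface: sha256 stands in for whatever template/hash a real
# sensor would produce.
import hashlib

def genuine_sensor(ridge_data: bytes) -> str:
    # Output varies with the input, so only the enrolled print matches.
    return hashlib.sha256(ridge_data).hexdigest()

def counterfeit_sensor(ridge_data: bytes) -> str:
    # Ignores the input entirely: hot dog, dog tongue, anything "matches".
    return "deadbeef" * 8
```

In this model the counterfeit defeats the matching step trivially, which is why the later question of whether Apple's hardware accepts output from an unpaired, unsigned sensor at all is the crux of whether such a device could actually work.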

I’m still trying to find a source for this information. I know there are third-party home button replacements without scanners that can’t unlock a locked phone. I’ve yet to find any info on ones that can pass a hash to the Apple equipment, because it only accepts the hash from a signed, paired scanner. You can’t even use a scanner from another identical iPhone. If anyone achieved this, I want to read up and learn how.

The sensor is designed to be secure and it is uniquely paired with each CPU.

Wait. Just so I’m understanding correctly, this is a webcam on someone’s RSA token? OMG.

3 Likes

I’m not sure what you mean?

I’ve looked around at the Secure Enclave stuff and I think most everyone in this thread should read this:

It pretty much lays out the hows and whats of what’s going on. The way I understand it, a malicious sensor is somewhat useless for doing anything, as it merely acts as a third party. The touch sensor is simply an input for the Secure Enclave; the data from the sensor does flow through the main processor, but it is encrypted, and no processing is actually done on it there. So there isn’t really a way for the touch sensor to directly interact with the OS. That means your fingerprint isn’t going to be showing up on a Russian server somewhere, and even if it did, that is only half of the data needed for the encryption, the other half being hardwired into and unreadable from the Secure Enclave.
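That data path can be sketched as follows. This is a simplified illustration under stated assumptions: the XOR "cipher" is a stand-in for the real session encryption between sensor and enclave, which this sketch does not model. The point is only that the application processor relays opaque ciphertext it cannot read.

```python
# Sketch of the sensor -> application processor -> Secure Enclave path.
# XOR stands in for the real session cipher; the AP never has the shared key.

def encrypt(shared_key: bytes, scan: bytes) -> bytes:
    # Sensor and enclave share a key the application processor never sees.
    return bytes(b ^ shared_key[i % len(shared_key)] for i, b in enumerate(scan))

def application_processor_relay(ciphertext: bytes) -> bytes:
    # The AP just forwards opaque bytes; a compromised OS has nothing to read.
    return ciphertext

def secure_enclave_receive(shared_key: bytes, ciphertext: bytes) -> bytes:
    # XOR with the same keystream is its own inverse.
    return encrypt(shared_key, ciphertext)
```

Even if malware on the main processor logged every relayed byte, it would capture only ciphertext, which matches the "fingerprint won't show up on a Russian server" point above.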

If your data was compromised, a new fingerprint would only be warranted if you kept using the same phone; replacing the phone would replace the other half of the data used in making the cryptographic signature/encryption.

A different point here about Error 53:
This is directly from Apple’s iOS Security Guide I linked to above:

For devices with an A7 or later A-series processor, the Secure Enclave coprocessor also utilizes a secure boot process that ensures its separate software is verified and signed by Apple. If one step of this boot process is unable to load or verify the next process, startup is stopped and the device displays the “Connect to iTunes” screen. This is called recovery mode. If the Boot ROM is not able to load or verify LLB, it enters DFU (Device Firmware Upgrade) mode. In both cases, the device must be connected to iTunes via USB and restored to factory default settings.

It seems rather odd to me that a non-matching Touch ID sensor causes such a serious error when it appears many other security-related errors force the phone into recovery mode, not into becoming a brick. Also, something I think few if any have pointed out: the Touch ID sensor obviously has an ID associated with it, and Apple obviously has a way to pair a new sensor or program it to talk to the phone (either programming of the phone or programming of the sensor), so what keeps a skilled third party from figuring this out? Realistically, I think the answer is that they would need your physical phone, and replacing the Touch ID sensor isn’t going to get them anywhere in cracking what’s on your phone.
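The chained verification quoted from the security guide can be sketched as a loop over boot stages, with a failure dropping into DFU or recovery mode rather than continuing. The stage names follow the quote; the signature check itself is simulated, so treat this as a reading aid, not a description of the real boot ROM.

```python
# Sketch of the secure boot chain described in the quoted guide text.
# Each stage verifies the next before handing off; failure stops startup.

BOOT_CHAIN = ["Boot ROM", "LLB", "iBoot", "kernel", "Secure Enclave firmware"]

def boot(signatures: dict) -> str:
    for stage in BOOT_CHAIN[1:]:
        if signatures.get(stage) != "signed-by-apple":
            if stage == "LLB":
                return "DFU mode"       # Boot ROM couldn't verify LLB
            return "recovery mode"      # a later stage failed verification
    return "booted"
```

Notice that every failure path here ends in a recoverable state (restore via iTunes), which is what makes the hard-brick behavior of error 53 look inconsistent with the rest of the chain.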

So, circling back: this is a way to screw non-Apple-authorized repair centers and to make people feel good about their “security”.

rant
Besides, what the hell do you people keep on your phones that’s so damn important? I don’t use Google Wallet/Apple Pay or access banking sites, and I rarely buy anything from my phone… At best you’d get my email, maybe crack into my Google account, and maybe get the low-limit credit card linked to it for the App Store… Of course, you could do all that without my phone as well. Now get off my lawn, you damn kids.

2 Likes

When I dug deeper into the problem, it appears that the baseband processor exploit is used for carrier unlocking, which is a separate problem from jailbreaking. The two processors may be separated.

…now, the question is whether the application processor is immune to malformed responses from the baseband one. If I had to sneak a deniable government backdoor into such an architecture, I’d put it here.

1 Like

I’m sure it’s upstream somewhere, but this isn’t a security measure, it’s a brute-force sales stimulator.

None of these examples protected the user’s data/biometric (HA!) data. All the examples I read, up to post 50 or so, had phones bricking long, long after being serviced.

No protection is no protection.

Any examples of a phone bricking upon being serviced? As in: it was serviced, and when powered on afterwards, it immediately went to error 53?

If the pushed update would allow that, then great, but the update should accept whatever components it finds and only brick if tampered with afterwards, AND/OR warn consumers that any service that voids the warranty would cause the device to brick upon accepting the update.

Without those simple consumer protections, Apple is just stealing its phones back and asking you to please deposit them in the garbage to save it the trouble.

Also, fingerprints may in fact be biometric data, but they’re not worth a damn; don’t fool yourself, it’s gimmickry, not security.

Now, if it makes you feel all cool to think you’re a secret agent or a high-value target, and that the gimmickry you employ proves it, that feeling has value to you, and hopefully that value exceeds the monetary value of your iPhone, so fantasize away.

6 Likes