Reasons (not) to trust Apple's privacy promises

Agreed, like, a million percent. Especially your point about reading source code. Just because you can read something doesn’t mean you can understand something. I sometimes wonder what many open source security advocates really expect those of us who don’t program to do. Just take their word for it? What do we do if they’re wrong?

When it comes to certain technical matters, if you cannot understand the discussion (and I rarely do, if ever), there’s no functional accessibility. It may as well be completely closed source; ramifications and implications are completely opaque if you’re not an expert.

But: I see the value in critiquing Apple here. (I also see the value in critiques of Google and MS and whomever at the same time; displays of … dis-favoritism can appear to be biased—and that’s a potential hindrance to the overall message.)

1 Like

I sometimes wonder what many open source security advocates really expect those of us who don’t program to do. Just take their word for it? What do we do if they’re wrong?

It’s an age-old problem. You either learn how it works yourself, or decide who you can trust and hope you chose wisely. The issue has existed far longer than the computer industry.

The main point of open source in security is the code can be reviewed by anyone, including others in the field, and problems are sorted out faster that way. Closed source is not open to review and critique beyond the organization that created it. Even if you don’t understand the code, that should be a red flag.

4 Likes

Excellent post! But one real reason – because they were the last to cave to PRISM (according to the Snowden documents), only did so a couple years ago, and may well be the first back out the door.

1 Like

No. 3 doesn’t exist. Why do Apple fans want to pretend that their amoral corporation of choice is somehow different? It isn’t. And sorry, but your #1? You have to reach back a decade (or more) for that one. And “perfectly willing to abuse their position to the detriment of users”? Get into the current decade and look up AAPL.

1 Like

It is. But can isn’t always the same as does. Code is no doubt easier to review if it’s open source, but is that really knowably better than closed source code that gets reviewed as a matter of company procedure?

How long did it take for someone to realize something was wrong with OpenSSL?

Whoa. Dude. Calm down.

3 Likes

It is. But can isn’t always the same as does. Code is no doubt easier to review if it’s open source, but is that really knowably better than closed source code that gets reviewed as a matter of company procedure?

How long did it take for someone to realize something was wrong with OpenSSL?

The OpenSSL issue was a failure in that it took as long as it did for the problem to be noticed. The success is that it eventually was.

All software is a work in progress, and there are going to be failures of multiple types, including security ones. I’d rather have a company put the security critical bits out there for people to bang on than not.

Having an issue sit undiscovered in open code is not an argument for keeping critical security software closed source, where problems are even harder to discover.

If you feel you can entirely trust a single company that hides security related code from the rest of the community over ones that share it, then that is your right.

4 Likes


To be fair to Apple, they have an interesting white paper that documents their security model on the iPhone & iPad. Assuming the document isn’t a complete work of fiction, they appear to have done a reasonable job of securing the phone when it’s powered off. For example:

  1. There is hardware-based AES encryption between the Flash memory and the microprocessor.

  2. One of the keys used to perform encryption is written into the silicon on a per-chip basis during manufacture. There is no way to recover the key in software.

This means that if TSA could slurp the contents of each iPhone during a baggage check, for instance, the information they gain is unusable, because they don’t have the key held in the phone hardware.
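The idea above can be sketched as a toy model. To be clear, this is an illustration and not Apple’s actual scheme: the UID value, the PBKDF2 derivation, and the HMAC-based keystream cipher are all stand-ins for the real hardware AES-256 path. The point it demonstrates is just that a raw flash dump is useless without the device-bound key.

```python
# Toy illustration (NOT Apple's implementation): data is encrypted under
# a key derived from a per-device hardware UID plus the user's passcode.
import hashlib
import hmac

def derive_key(device_uid: bytes, passcode: bytes) -> bytes:
    # Real devices use a hardware key-derivation path; PBKDF2 stands in here.
    return hashlib.pbkdf2_hmac("sha256", passcode, device_uid, 100_000)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Generate a keystream block-by-block with HMAC (toy stream cipher);
    # XORing twice with the same key decrypts.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

device_uid = b"\x13" * 32          # fused into silicon, never leaves the chip
plaintext = b"contacts, photos, messages"
key = derive_key(device_uid, b"123456")
ciphertext = keystream_xor(key, plaintext)

# An attacker who dumps the flash gets only ciphertext; even knowing the
# passcode, deriving the key with the wrong UID yields garbage.
wrong_key = derive_key(b"\x00" * 32, b"123456")
assert keystream_xor(key, ciphertext) == plaintext
assert keystream_xor(wrong_key, ciphertext) != plaintext
```

The design choice this mimics is key entanglement: because the effective key mixes a secret that exists only inside one chip, the encrypted data is cryptographically bound to that physical device.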

If the NSA have a known-plaintext attack against AES-256, then the iPhone security is broken - otherwise it looks pretty robust at the technical level, because the most effective way to get data from a powered-off iPhone is to subpoena the owner for the passcode.

In the USA, the 5th Amendment should let the owner refuse to provide a passcode without a jury being permitted to infer guilt from that refusal.

In the UK you might be screwed, since refusing to provide an encryption key when lawfully requested is a crime in itself that carries a five-year jail term, and the jury is allowed to infer guilt from silence.

Nevertheless, it significantly raises the cost of getting data from each person and frustrates automated mass surveillance - the targeted individual is at least aware that something is happening.

8 Likes

I have the 2nd option now. I will be going to the 3rd option today (to replace the phone I currently have that is running like crap).

You want complete privacy? Stay off of the Internet. Don’t use banks or credit cards. Don’t pay taxes. Live in the woods by yourself. Congrats, your info is now secure and safe.

2 Likes

I understand (and to an extent support the windmill tilting of) Cory’s thinking on this, but the practice leaves a lot to be desired. At best it looks goofy and incorrect to careful readers. At worst, like here, it’s actively harder to understand (especially in BoingBoing’s non-serif font), and really only serves to shift attention away from the thing he wants to call attention to in the first place.

2 Likes

It’s the basis of political systems, for one.

Until the Bigfoot hunters find you.

3 Likes

If end users can be liable for violating a company’s TOS, shouldn’t the company be liable for violating their privacy policy?

2 Likes

Ah, yes. Another round of Apple-bashing from Cory. A rant built largely of speculation. Yet he never comments on real, actual security problems with other vendors…like Android, for instance.

6 Likes

I’m not really a fan of iOS (as in, I’m not a fan of the UI, not some weird sort of argument thing) but this is one of the better discussion points on this page. :smile:

I think the point is that Apple’s announcement means approximately nothing. Though one could see it as worse than nothing, since the illusion of security is worse than no security.

Why bother caring about it, when the services you use claim to be secure?

1 Like

Probably none of them… Why trust any of them? Realize that ALL of them are compromised, and try to work within those confines.

At least #2 is somewhat open source, and its hardware allows third party distributions. I trust #2’s software more, but the hardware is probably where any potential problems will lie in any of these options.

Now danegeld, just what are you trying to do? Injecting real, relevant data into one of our host’s periodic Apple rants? That’s like trying to convince an anti-vaxxer or a climate-change denialist - facts aren’t going to do anything, as they’ll promptly be ignored. I’d argue that the Android web browser flaw that affects half of the extant Android phones is a lot more relevant (Ars Technica link) to security concerns. These rants pop up every time Apple gets some press. They’re indicative of either a blatant grab for clicks or a serious inability to contextualize. Either way I end up feeling sad, like when a favorite uncle, usually erudite and sane, happily tells you how he’s blatantly cheating on his taxes or launches into a conspiracy-theory diatribe over the Thanksgiving turkey…

4 Likes