On the Android security bug

Peter Biddle, who helped invent trusted computing when he was at Microsoft, discusses the serious Android security bug that was just reported. It’s a good, short read, and most alarming is the news that Google’s had information on this critical bug since February: "The entire value of a chain of trust is that you are…


Google’s had information on this critical bug since February:

I wonder if the NSA has enjoyed the fruits of this delay along with all the other miscreants?


Why would this matter to the NSA? When you are handed a master key, why bother to pick a lock?

Hello,

Given Google’s own policies about disclosure of security vulnerabilities, why didn’t they disclose this seven days after they were notified? Or am I misunderstanding how the policy is supposed to be applied?

Regards,

Aryeh Goretsky

There are roll-your-own versions of Android, where a corporation (or individual) can take an Android distribution, strip out all the existing keys, drop in their own, and be “assured” that only those who they want to have access to the device, have access to the device.
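A minimal sketch of that chain-of-trust idea: the device holds a trust store of keys, and a package is installed only if its signature verifies under one of them. HMAC stands in here for real asymmetric package signatures, and all key names are illustrative, not anything from an actual Android build.

```python
import hmac
import hashlib

# Toy trust store: in a roll-your-own Android build, the stock keys would be
# stripped out and replaced with the corporation's own (names are hypothetical).
TRUSTED_KEYS = {b"corp-release-key"}

def sign(package: bytes, key: bytes) -> bytes:
    """Produce a signature for a package under a given key (HMAC stand-in)."""
    return hmac.new(key, package, hashlib.sha256).digest()

def accept(package: bytes, signature: bytes) -> bool:
    """Install only if the signature verifies under some key in the trust store."""
    return any(hmac.compare_digest(sign(package, k), signature)
               for k in TRUSTED_KEYS)

pkg = b"custom firmware image"
assert accept(pkg, sign(pkg, b"corp-release-key"))      # signed by us: accepted
assert not accept(pkg, sign(pkg, b"attacker-key"))      # signed by anyone else: rejected
```

The point of swapping in your own keys is exactly this: the `accept` decision depends only on keys you control, so "only those who they want to have access to the device, have access to the device."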

The nature of this bug seems to be the ability to arbitrarily change the software version in the system manifest. That would seem to involve the hash-signature check performed when the code is installed, a check that should be accessible only to the system firmware running in a secure mode, and that is perhaps never performed again after install.
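The verify-once pattern described above can be sketched as follows. This is a generic illustration of the flaw the commenter suspects (an integrity check done only at install time), not the actual Android verification code; all function names are hypothetical.

```python
import hashlib
import os
import tempfile

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def install(path: str, trusted: dict) -> None:
    """Hash the package once, at install time, and record it as trusted."""
    trusted[path] = sha256_of(path)

def run(path: str, trusted: dict, recheck: bool = False) -> bytes:
    """Launch the 'app'. With recheck=False (the suspected flaw), a
    post-install modification goes completely unnoticed."""
    if recheck and sha256_of(path) != trusted[path]:
        raise RuntimeError("integrity check failed")
    with open(path, "rb") as f:
        return f.read()

with tempfile.TemporaryDirectory() as d:
    pkg = os.path.join(d, "app.bin")
    with open(pkg, "wb") as f:
        f.write(b"benign code")
    trusted = {}
    install(pkg, trusted)
    with open(pkg, "wb") as f:
        f.write(b"tampered code")     # attacker modifies after install
    run(pkg, trusted)                 # flaw: tampered code runs unchallenged
    # run(pkg, trusted, recheck=True) would raise RuntimeError
```

If the hash really is checked only once at install, anything that can rewrite the installed bits afterward inherits the original signature's blessing for free.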

That bug, along with the fact that some Motorola phones running Android have been discovered to phone home, tells you that Google isn’t exactly security-conscious with respect to your data.


A typical corporate response would be something like “We didn’t want to call attention to a viable security vulnerability”.
However, this breaks down when you realize that the malicious hacker community probably knew about this from day one, and in the interim many Android systems could have been compromised. The more likely answer would come from Google’s crack legal and marketing teams.

I’m shocked, I tell you, shocked!

Why would this matter to the NSA? When you are handed a master key, why bother to pick a lock?

I see your point, but there are many locks, not just one. And how do we know these are truly vulnerabilities and not just exposed backdoors? Doing it this way adds an element of plausible deniability for the corporations when the NSA wants even more illegal/unethical/controversial/unconstitutional access (foreign and domestic) beyond a few basic backdoors that can be sealed by knowledgeable targets.

If you doubt this happens, Microsoft (with ~95% OS market share) and others have already been exposed for doing this:

In other words, this is all to bypass security safeguards put in place by targets (a.k.a. innocent people who have proprietary business secrets, etc.).

If you own a business and don’t lock down your business secrets with your own methodologies and trusted, solid, well-researched third-party solutions, then it’s likely some quasi-governmental corporation already has your potentially valuable secrets in a database right now, ripe for the plucking.

It adds an entirely new meaning to security exploitation, doesn’t it?

Given Google’s own policies about disclosure of security vulnerabilities, why didn’t they disclose this seven days after they were notified? Or am I misunderstanding how the policy is supposed to be applied?

The “policy” goes out the window once you factor in the demands of NSA spying:

They feel like it’s “worth it” for your business to go under from rampant exploits in the name of “protecting you”. Basically, the same twisted logic of every other authoritarian police state that’s ever infested this Earth.

If you know what’s good for you, you’d better know they know what’s good for you.

Hello,

I’m sorry, Cowicide, but I don’t understand what responsible disclosure policies have to do with the National Security Agency. I was curious about whether or not Google was willing to follow its own stated policies, or if this was a prima facie example of a double standard.

I tend to believe that at any government agency, their job is to annoy, harass and generally interfere with citizens going about their business. Aside from having better capabilities and methods, the NSA does not sound too much different from any other bureaucracy in that respect.

Regards,

Aryeh Goretsky

I don’t understand what responsible disclosure policies have to do with the National Security Agency

I’m positing that responsible disclosure policy is affected by the demands of the NSA, as shown by the Ars Technica link I gave. Aside from that article, there’s also considerable other evidence that companies like Google work hand-in-hand with the NSA to put their profitable, corporatist interests above the average security and privacy concerns of the citizenry. That includes backdoors into devices and software, along with irresponsible disclosure of exploits.

Given that Android app distribution outside of the Google Play Store is always a tricky proposition with respect to malware, I fail to see this as such an earth-shattering security failure. There is no substitute for trusting your sources.

It certainly is annoying to have secure signing broken, but I wouldn’t scream that the sky is falling.

This topic was automatically closed after 5 days. New replies are no longer allowed.