CyanogenMod rolls out secure device-locating feature

1 Like

As a CM user (but waiting for something mildly stable for the HTC One), I like this a lot.

In the meantime I’m still using AndroidLost, which is powerful, useful and pretty transparent. The author claims that its use of your Google Account means only you can control your phone (see Privacy and Security).

It’s not really comparable, security-wise. It’s not open source, and if your Google account is compromised (by poor password hygiene, hackery, subpoena, or otherwise), so is the app.

It says this at the bottom of that section on the AndroidLost page.

Trust
Basically all of the above is just text. You will have to trust me that I am a nice guy and all that I say is true. If you do not trust me that is quite OK - then you should not install this app. No hard feelings from my part.

Hmmm… I agree with @kstop; not being open source is the deal breaker for me. Plus, Google will hand out your password like candy if the NSA asks for it (or maybe even without asking at all).

CyanogenMod’s offering seems much more legit to me.

I have/use a crapload of Google accounts and services, but when it comes down to trust, I’d believe in CyanogenMod long before Google…

5 Likes

At the risk of getting off-topic, I generally place more faith in these guys than Google, but there’s still a prevailing mentality that developers are more important than users, which really rubs me the wrong way.

Look at the direction they took over an experimental feature to give users fine-grained control over permissions: https://plus.google.com/100275307499530023476/posts/iLrvqH8tbce

While a lot of reasons were given, all the talk about “backlash from developers who don’t want their apps running in unpredictable environments” and a desire to avoid “app developers pissed off at CM and blacklisting us” really felt like the root motivation behind the decision to nuke the feature.

I want final say over what happens on my phone. If that upsets a software developer, too damn bad. Don’t release your code if you don’t want people running it any way they see fit.

1 Like

So, someone explain how this gets past Oracle’s supposed ownership of the basic nut of Android. This whole thing has me even more confused than usual.

Also, excuse my loose language. I don’t want anyone thinking I know what I am talking about here.

In its current state, the UX was awful. It was easy to wander into these settings and break things in weird ways that were not obvious at all.

That seems like excellent reasoning. Putting things into the Android core that could cause weird breaks unless developers handle appropriate feedback is one thing; putting things into a non-standard ROM (even one as widespread as CM) that could cause weird, seemingly random breaks for developers who are coding for standard Android devices is a different thing entirely.
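
To make that concrete, here’s a rough sketch (my own illustration in plain Android Java, not anything taken from CM or AOSP source) of how an ordinary location lookup can break once a ROM starts revoking permissions behind an app’s back. Depending on how the revocation is implemented, the call either throws a SecurityException the developer never planned for, or quietly hands back nothing and the app falls over somewhere else:

```java
// Hypothetical example: a location lookup written against stock Android,
// where a permission granted at install time is assumed to stay granted.
import android.content.Context;
import android.location.Location;
import android.location.LocationManager;

public class LastFixHelper {
    public static String describeLastFix(Context ctx) {
        LocationManager lm =
                (LocationManager) ctx.getSystemService(Context.LOCATION_SERVICE);
        try {
            Location fix = lm.getLastKnownLocation(LocationManager.GPS_PROVIDER);
            if (fix == null) {
                // Revocation schemes that feed apps empty data land here,
                // or the app never null-checks at all and crashes later.
                return "no fix available";
            }
            return fix.getLatitude() + "," + fix.getLongitude();
        } catch (SecurityException e) {
            // Revocation schemes that hard-deny the permission land here;
            // an app that never expected the call to throw simply dies.
            return "location permission denied";
        }
    }
}
```

Stock Android of this era never does either of those things to an app that was installed with the permission granted, which is exactly why most apps don’t guard against them.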

That’s not favoring the developers, that’s favoring “normal” users… and I say that as someone who really would like to have much more fine-grained permission control in Android.

1 Like

I’m glad CyanogenMod is doing this. I installed a tracking-and-wiping app a while ago, and while I’ve had no cause for complaints, it has worried me that I might be trading one set of vulnerabilities for another.

1 Like

But is CM even for “normal” users in the first place? Being able to break something is just a side effect of having more control over it. I’ll take features over stability any day.

1 Like

CM is used by tons of “normal” people simply as a way to get away from the bloated, crappy ROMs that come as defaults on provider phones. A huge part of their focus is on being stable enough to be a “daily driver” for users across a wide spectrum of equipment, and being easy for people to install and keep updated.

Features are fine, and the post you linked to even discusses ways to add features. But it’s not a “feature” to introduce things that can cause seemingly random crashes in apps that would work without issue across the rest of the ecosystem. That’s just going to cause both users and developers to avoid CM.

Something like the tracking-and-wiping in this article, which adds something new and needed without negative consequences? That’s a feature.

I’ve been using/promoting FindMyPhone for this application, because it is the only currently available solution I’ve found that doesn’t have a serious trust problem. It avoids the issue by not having a server-side component - the whole setup is a little daemon (GPLv2) running on the phone that responds to user-chosen code-words by ringing and/or replying with its current GPS coordinates.
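
For anyone curious what that looks like in practice, here’s a minimal sketch of the idea (my own illustration, not FindMyPhone’s actual GPLv2 source; the code word is obviously made up): an SMS receiver that ignores everything except the chosen code word and texts the last known fix back to the sender.

```java
// Hypothetical sketch of a code-word responder. Needs RECEIVE_SMS, SEND_SMS
// and location permissions in the manifest; no server-side component at all.
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.location.Location;
import android.location.LocationManager;
import android.os.Bundle;
import android.telephony.SmsManager;
import android.telephony.SmsMessage;

public class CodeWordReceiver extends BroadcastReceiver {
    private static final String CODE_WORD = "findme42"; // chosen by the user

    @Override
    public void onReceive(Context ctx, Intent intent) {
        Bundle extras = intent.getExtras();
        if (extras == null) return;
        Object[] pdus = (Object[]) extras.get("pdus");
        if (pdus == null) return;
        for (Object pdu : pdus) {
            SmsMessage sms = SmsMessage.createFromPdu((byte[]) pdu);
            if (!CODE_WORD.equals(sms.getMessageBody().trim())) continue;

            // Matching code word: reply to the sender with the last known fix.
            LocationManager lm =
                    (LocationManager) ctx.getSystemService(Context.LOCATION_SERVICE);
            Location fix = lm.getLastKnownLocation(LocationManager.GPS_PROVIDER);
            String reply = (fix == null)
                    ? "No GPS fix available"
                    : "I am at " + fix.getLatitude() + "," + fix.getLongitude();
            SmsManager.getDefault().sendTextMessage(
                    sms.getOriginatingAddress(), null, reply, null, null);
        }
    }
}
```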

This CM one is the first “service” type I’ve felt remotely comfortable about, assuming their assurance that a cryptographic client-side key is required to perform any action survives scrutiny.
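
If it helps, this is the property I’m hoping survives scrutiny, sketched in plain Java (my own hypothetical illustration, not CM’s implementation): the phone refuses any remote locate/wipe command unless it verifies against a key whose private half stays with the owner, so the server can relay commands but cannot forge them.

```java
// Hypothetical sketch: the device only obeys commands signed by the owner's
// private key, which never leaves the owner's hands. A compromised or
// subpoenaed server can still deliver messages, but cannot mint valid ones.
import java.nio.charset.StandardCharsets;
import java.security.PublicKey;
import java.security.Signature;

public class CommandVerifier {
    private final PublicKey ownerKey; // enrolled when the feature is set up

    public CommandVerifier(PublicKey ownerKey) {
        this.ownerKey = ownerKey;
    }

    /** Returns true only if 'command' was signed with the owner's private key. */
    public boolean isAuthorized(String command, byte[] signature) throws Exception {
        Signature verifier = Signature.getInstance("SHA256withECDSA");
        verifier.initVerify(ownerKey);
        verifier.update(command.getBytes(StandardCharsets.UTF_8));
        return verifier.verify(signature);
    }
}
```

A real implementation would also need a nonce or timestamp inside the signed command to block replays, but the trust argument is the same.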

Seems to me that unless you build it from source yourself, you are still trusting whoever made the build to have used the source they say they used.

Of course open source gives the chance to verify that their claimed source is correctly implemented.

This is why there are checksums on compiled builds and developer signatures on them. You can match the checksum listed for a compiled version of the code against the checksum of the copy you just downloaded, see that they are the same, and be pretty sure that this is, indeed, the software you intended to install. The developer signatures also mean that you can tell the file was compiled by the person who says they compiled it, and that it is as trustworthy as that person.

Of course, you don’t generally check the checksum or the developer signature yourself. Your package management software (for Android, think “Google Play”) does that for you under most circumstances, or at least it does after you have installed the basic operating system. That’s why the security-conscious always check the signatures and checksums by hand when downloading a ROM, or any software they intend to sideload onto their device.
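
For the “check it by hand” step, the whole exercise is just hashing the file you downloaded and comparing it against the digest the project publishes. A minimal sketch in Java (the file name and expected digest are whatever you pass in, not real CM artifacts):

```java
// Compute the SHA-256 of a downloaded file and compare it to the published
// digest, passed on the command line.
import java.io.FileInputStream;
import java.io.InputStream;
import java.security.MessageDigest;

public class ChecksumCheck {
    public static void main(String[] args) throws Exception {
        String file = args[0];     // e.g. the ROM zip you just downloaded
        String expected = args[1]; // the digest listed on the download page

        MessageDigest md = MessageDigest.getInstance("SHA-256");
        try (InputStream in = new FileInputStream(file)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                md.update(buf, 0, n);
            }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b & 0xff));
        }
        System.out.println(hex.toString().equalsIgnoreCase(expected)
                ? "Checksum matches: this is the file the project published."
                : "Checksum MISMATCH: do not flash this file.");
    }
}
```

Checking the developer’s signature on top of that then ties the file to a particular person, rather than just to a particular published hash.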

1 Like

Of course open source gives the chance to verify that their claimed source is correctly implemented.

And, that’s the crux of it right there. Why “trust” when you can verify?

1 Like

They also seem to be moving towards a user-focused stable version and a dev version (with test deployment keys for added hackability at the cost of security). So I wouldn’t be surprised to see that kind of functionality - albeit implemented differently - coming back in a more user-friendly form.

The signatures and checksums do not verify that what is in that build is what that person says is in the build. You are still taking that on trust.

That’s why we have the concept of Webs of Trust (sometimes called circles of trust). http://en.wikipedia.org/wiki/Web_of_trust

If you trust the people that trust the person who signed that package, then you can feel pretty comfortable that you are getting what you expect. If you don’t, then don’t install it.
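
Stripped of the crypto, the web-of-trust question is just reachability in a graph of who vouches for whom. A toy sketch (the names and the vouching map are made up, and real PGP-style trust also caps chain length and weighs how far you trust each signer):

```java
// Toy illustration of transitive trust: accept a signer if you trust them
// directly, or if someone you (transitively) trust has vouched for them.
import java.util.ArrayDeque;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class WebOfTrust {
    /** who -> the people/keys that person has vouched for */
    private final Map<String, List<String>> vouches;

    public WebOfTrust(Map<String, List<String>> vouches) {
        this.vouches = vouches;
    }

    /** True if 'signer' is reachable from 'me' through chains of vouching. */
    public boolean trusts(String me, String signer) {
        Set<String> seen = new HashSet<>();
        ArrayDeque<String> queue = new ArrayDeque<>();
        queue.add(me);
        while (!queue.isEmpty()) {
            String current = queue.poll();
            if (current.equals(signer)) return true;
            if (!seen.add(current)) continue;
            queue.addAll(vouches.getOrDefault(current, List.of()));
        }
        return false;
    }

    public static void main(String[] args) {
        WebOfTrust wot = new WebOfTrust(Map.of(
                "me", List.of("alice"),
                "alice", List.of("rom-maintainer")));
        System.out.println(wot.trusts("me", "rom-maintainer")); // true: alice vouches
        System.out.println(wot.trusts("me", "random-builder")); // false: no chain
    }
}
```

That’s the whole argument in miniature: the package doesn’t become trustworthy on its own; you extend trust to it along a chain of people you already trust.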