#1 By: Cory Doctorow, October 4th, 2013 10:01
#2 By: Box of Cotton Swabs, October 4th, 2013 11:34
The linked article needs to use the word “innovation” more often, as the author only hits it seven times. Innovation. Open innovation platform. The power of open innovation.
And the penultimate paragraph draws a bizarre comparison and conclusion.
A locked-down approach has worked for Apple in protecting iOS from malware because it controls both hardware and software toward the goal of maximizing its profits. In contrast, Google has used an open model to maximize Android market share, in which it licenses Android for free and controls neither the hardware nor the software ultimately sold to the end customer. This model has allowed for rapid innovation that resulted in a large market share, but has created the need for the open malware defense framework that Ludwig presented.
#3 By: agraham999, October 4th, 2013 11:57
Cory loves to post any story with the word "open" in it...in this case the "open" is the same company that has expressed that we have no right to privacy, and that has demonstrated its contempt for that ideal again and again.
According to Ludwig, only 0.12 percent of Android apps have
characteristics that Google thinks of as "potentially harmful"
Coming from the company that drives around collecting data from personal wifi, reads your email and the email of everyone who sends you email, has been caught snooping on mobile browsing by bypassing security settings, has been threatened with contempt of court for not deleting data it shouldn't have collected in the first place, and whose CEO has said that if you are worried about privacy...don't do anything you would be embarrassed about...on and on...I am not sure I trust Google's opinion on what's potentially harmful.
But hey...open is always better.
#4 By: Aryeh Goretsky, October 4th, 2013 12:44
I work for an anti-malware developer and am currently in Berlin for VB2013, which just finished up a couple of hours ago. This is a two-track conference, so while I didn't get to actually attend every presentation (including the one referenced in the article), there were at least seven specifically on Android, and many of the other presentations referenced it as well. You can view the conference program here.
One thing that Google got right is that most Android malware appears from outside of Google Play, from things like unofficial app stores run by third parties, direct download of APKs, and so forth. And most of that occurs in regions where Google Play is not always available, such as China and Russia. Android users in the Americas and Europe are far less likely to encounter Android malware.
There is an additional area of concern for users of the Android platform: potentially unwanted applications, which offer an app that performs some function for free, usually in exchange for performing targeted advertising. These are far more common than outright malicious apps, and because they occupy the grey(er) areas of software, they can often be much more difficult for analysts to classify, since it is entirely possible that the user agreed to being spied upon in exchange for the 'free' application.
It is also important to keep in mind that Google's mechanisms are relatively new: In January 2013, Google removed over 60,000 apps from their app store, which then had about 800,000 apps, giving a "removal ratio" of 1:13. By comparison, Windows Phone had just about 60,000 apps in its entire app store, of which around four have been removed, giving a "removal ratio" of 1:15,000. Perhaps there is some security through obscurity, at least in the marketplace.
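The "removal ratio" figures above can be sanity-checked with a quick back-of-the-envelope calculation (using exactly the app counts cited in the comment; the helper name `removal_ratio` is just for illustration):

```python
def removal_ratio(removed: int, store_size: int) -> float:
    """Express removals as 1:N, i.e. one app removed per N apps in the store."""
    return store_size / removed

# Google Play, January 2013: ~60,000 removed out of ~800,000 apps.
google_play = removal_ratio(60_000, 800_000)

# Windows Phone: ~4 removed out of ~60,000 apps.
windows_phone = removal_ratio(4, 60_000)

print(f"Google Play:   1:{google_play:.0f}")    # roughly 1:13
print(f"Windows Phone: 1:{windows_phone:.0f}")  # 1:15000
```

The two ratios come out at about 1:13 and 1:15,000, matching the figures quoted in the comment.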
In any case, I think that Google's acquisition of companies like GreenBorder and VirusTotal, as well as the fact that it is sending employees to VB to speak, says something about its recognition of the threats awaiting its customers, and its desire to protect them.
#5 By: Glaurung-Quena, October 4th, 2013 15:49
Especially rich was the claim that since the OS is open source, we shouldn't be worried that the NSA has been a major contributor to "improving" the security of the OS. Because it's so easy and simple to vet thousands of lines of code to see if there's a backdoor implementation hidden in there somewhere.
#6 By: Cory Doctorow, October 9th, 2013 10:01
This topic was automatically closed after 5 days. New replies are no longer allowed.