Most Android security apps are worthless when it comes to protecting you against mobile malware, according to a new study.
If you're worried about Android malware, choose wisely. There's a good chance that your trusted security app does little to protect you, says a new report from independent testing organization AV-Test.
In a preview of the study e-mailed yesterday, AV-Test's CEO Andreas Marx revealed that desktop antivirus vendors that have migrated to Android performed the best. Avast, Lookout, Dr. Web, Zoner, F-Secure, Ikarus, and Kaspersky detected 90 percent or more of the 618 types of malicious Android APK files that they were tested against. Lookout and Zoner are notable standouts because they are only available as mobile apps, and have no PC-based counterpart.
"Using these products you don't have to worry about your malware protection," wrote Marx. He also emphasized that the security apps that tested between 65 percent and 90 percent were very good and could easily improve their detection because several apps in this category missed one or two malware families. These malware families may not be threats in "certain environments," he wrote, which may account for the lower scores.
The apps in this second group include the PC antivirus vendors AVG, Bitdefender, ESET, Norton (Symantec), QuickHeal, Trend Micro, Vipre (GFI), and Webroot; and two mobile-only vendors, AegisLab and SuperSecurity.
A third group, made up entirely of PC security suite vendors, scored detection rates between 40 and 65 percent; it included Bullguard, Comodo, G Data, McAfee, NetQin, and Total Defense. The report suggests that their Android woes might also be due to insufficient sample-collection infrastructure; basically, they might be too new to the field.
Twelve more apps detected more than zero percent but less than 40 percent of the samples, and a final group of six apps detected nothing at all. While the report hedges its bets, saying it's possible that these apps detect threats that weren't among the 618 samples, it's more likely that they simply do not work at all. Alphabetically, the final six are Android Antivirus, Android Defender, LabMSF Antivirus beta, MobileBot Antivirus, MT Antivirus, and MYAndroid Protection Antivirus.
Overall, less than half of the 41 apps tested during February were found worthy; only 17 made the cut above 65 percent. The test was conducted using a combination of the Android SDK, which provides a scalable emulated environment, and an actual Android device for cases where the SDK wouldn't work. The SDK would not suffice when an app called for SMS activation, or when the 3G network was too finicky to provide a stable cloud connection. In the end, all results were cross-checked both in the SDK, emulating API level 10 (Gingerbread 2.3), and on real hardware: a Samsung Galaxy Tab running Froyo 2.2 and a Samsung Galaxy Nexus running Ice Cream Sandwich 4.0. Apps were allowed to update to their latest versions before testing, and to connect to the cloud during testing.
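To make that setup a little more concrete, here is a minimal, hypothetical sketch of how malware samples could be installed on both an emulator and a physical handset over adb so verdicts can be cross-checked. The sample paths and device serials are placeholders, and this is not AV-Test's actual harness, which has not been published.

```python
import subprocess

# Hypothetical APK sample paths; AV-Test's real corpus is not public.
SAMPLES = ["samples/fakeinst_variant1.apk", "samples/droiddream_variant2.apk"]

# Device serials as reported by `adb devices`: one emulator, one handset.
# Both identifiers are placeholders.
TARGETS = ["emulator-5554", "0123456789ABCDEF"]

def install_sample(serial: str, apk_path: str) -> bool:
    """Push one sample to a target and report whether the install succeeded.

    A security app on the device would be expected to flag the sample on
    install or during the next on-demand scan; collecting that verdict
    (e.g. from the app's logs or UI) is outside this sketch.
    """
    result = subprocess.run(
        ["adb", "-s", serial, "install", "-r", apk_path],
        capture_output=True, text=True,
    )
    return "Success" in result.stdout

def cross_check() -> dict:
    """Install every sample on every target so results can be compared."""
    outcomes = {}
    for serial in TARGETS:
        for apk in SAMPLES:
            outcomes[(serial, apk)] = install_sample(serial, apk)
    return outcomes

if __name__ == "__main__":
    for (serial, apk), installed in cross_check().items():
        print(f"{serial}: {apk} -> {'installed' if installed else 'blocked/failed'}")
```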
But is my favorite app really worthless?
The authors acknowledge in the report how challenging it is to correctly identify active, threatening malware on Android. They chalk this up to three factors: the relatively small number of malware samples; the difficulty of figuring out how prevalent malware apps are; and the fact that malware apps are removed fairly rapidly from the Android Market, and even from users' own devices.
Also, none of the ancillary security features were tested, such as remote lock and wipe, lost or stolen device location options, or data backup.
AV-Test concludes that it's possible for sample sets to be marred by malware that is no longer relevant, or never was. Why bother testing if the results are so hard to replicate? The report explains that to limit these variables, only the most widely known malware families were used, and only those discovered between August and December 2011. By looking at the family detection rates, AV-Test says, "it is still possible to get a fairly accurate picture of the absolute detection rate." For malware detection, AV-Test recommends any of the 17 apps that finished above 65 percent. It advocates using one because ostensibly benign apps can download malware after they've been installed.
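To illustrate the distinction the report is drawing, here is a small, hypothetical sketch of the difference between a per-sample (absolute) detection rate and a per-family detection rate; the verdict data below is invented purely for illustration.

```python
from collections import defaultdict

# Hypothetical per-sample verdicts: (malware family, detected?). The real
# corpus contained 618 samples drawn from well-known families.
verdicts = [
    ("FakeInst", True), ("FakeInst", True), ("FakeInst", False),
    ("DroidDream", True), ("DroidDream", True),
    ("GingerMaster", False), ("GingerMaster", False),
]

# Absolute detection rate: the share of individual samples flagged.
absolute_rate = sum(detected for _, detected in verdicts) / len(verdicts)

# Family detection rate: a family counts as covered if at least one of its
# samples is flagged, which smooths over variant-level misses.
by_family = defaultdict(list)
for family, detected in verdicts:
    by_family[family].append(detected)
family_rate = sum(any(flags) for flags in by_family.values()) / len(by_family)

print(f"absolute: {absolute_rate:.0%}, by family: {family_rate:.0%}")
```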
All these problems are symptomatic of the larger challenge that the security industry has in justifying its existence.