Posted by Virus Bulletin on Apr 13, 2010
One third fail to gain certification.
Virus Bulletin has completed its largest ever test of anti-malware products, with a phenomenal 60 products being tested on Windows XP.
40 of the products submitted for testing were awarded the VB100 certification, while the remaining 20 fell short of the required detection standards; Microsoft, Frisk, Norman and Fortinet were among the companies whose products failed to make the grade.
Stumbling blocks for the products in this test included failure to detect complex polymorphic viruses and false alarms produced on clean files from major providers including Adobe, Microsoft, Google and Sun.
VB's Anti-malware Test Director John Hawes said: "We put a huge range of products through their paces this month, and saw the usual problems with detection of complex viruses and false alarms on common software, with some splendid performances from some and pretty dire showings from others."
Hawes explains that VB's test team were also disappointed by the levels of instability in the products tested: "It was pretty shocking how many crashes, freezes, hangs and errors we encountered in this test. XP has been around for a long, long time now and is still the world's most widely used computing environment - so developers should be producing rock-solid software for it time after time. I'm sure any user who sees their system brought to a halt by their security software will vote with their feet and take their custom elsewhere."
The detailed review, available to Virus Bulletin subscribers, provides a wealth of data and a vital insight into how the various solutions stack up against each other across a range of measures, including detailed performance analysis as well as detection rates and the testing team's thoughts on the user experience.
The results of the RAP ('Reactive And Proactive') tests conducted at the same time showed a continuation of the trends and patterns seen in recent tests, with a cluster of vendors including Trustport, Kaspersky Lab, ESET, Webroot and Sophos vying for space in the top right-hand corner of the RAP quadrant.
VB's cumulative RAP quadrant gives a quick visual reference as to products' reactive and proactive detection rates.
Virus Bulletin's RAP testing measures products' detection rates across four distinct sets of malware samples. The first three test sets comprise malware first seen in each of the three weeks prior to product submission and measure how quickly product developers and labs react to the steady flood of new malware. A fourth test set consists of malware samples first seen in the week after product submission. This test set is used to gauge products' ability to detect new and unknown samples proactively, using heuristic and generic techniques.
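The scheme above can be sketched in code. The following is a minimal illustration of how a reactive/proactive score pair could be derived from the four test sets; the function names, sample counts and averaging are assumptions for illustration, not Virus Bulletin's actual scoring implementation.

```python
# Illustrative sketch of RAP scoring: three pre-submission weekly sets
# feed a reactive score, the post-submission set gives the proactive score.
# All names and figures here are hypothetical.

def detection_rate(detected: int, total: int) -> float:
    """Percentage of samples in one test set that the product detected."""
    return 100.0 * detected / total

def rap_scores(weekly_results):
    """weekly_results: four (detected, total) pairs, ordered week-3,
    week-2, week-1 (before submission) and week+1 (after submission)."""
    # Reactive: average detection over the three pre-submission sets.
    reactive = sum(detection_rate(d, t) for d, t in weekly_results[:3]) / 3
    # Proactive: detection of samples first seen after submission,
    # which the product can only catch heuristically or generically.
    proactive = detection_rate(*weekly_results[3])
    return reactive, proactive

# Hypothetical product: strong reactive detection, weaker proactive.
reactive, proactive = rap_scores([(950, 1000), (920, 1000),
                                  (900, 1000), (780, 1000)])
```

A product's position in the RAP quadrant is then simply its (reactive, proactive) pair, so vendors clustered in the top right-hand corner score highly on both axes.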
Virus Bulletin has been testing and certifying anti-malware products for more than ten years in the VB100 certification scheme. The stringent VB100 tests pit anti-malware products against a test set of malware from the WildList - a publicly available up-to-date list of the malware that is known to be circulating on computers around the world. To earn VB100 certification, products must be able to detect 100% of the malware contained in the WildList test set and must not generate any false alarms when scanning a set of clean files.
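The pass/fail criterion is strict and binary, which a short sketch makes plain; the function and the sample counts below are hypothetical, but the logic follows the certification rules as stated above.

```python
def vb100_pass(wildlist_detected: int, wildlist_total: int,
               false_positives: int) -> bool:
    """A product earns VB100 certification only if it detects 100% of
    the WildList test set AND raises zero false alarms on clean files."""
    return wildlist_detected == wildlist_total and false_positives == 0

# Hypothetical results: one missed sample or one false alarm both fail.
certified = vb100_pass(3000, 3000, 0)   # full detection, no false alarms
missed    = vb100_pass(2999, 3000, 0)   # a single missed WildList sample
fp_fail   = vb100_pass(3000, 3000, 1)   # a single false positive
```

Note there is no partial credit: a product detecting 99.9% of the WildList, or producing one false alarm on a clean file from a major vendor, fails outright, which is why well-known names can miss certification in any given test.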
The results of the April 2010 VB comparative review can be seen here.
The full review, including detailed results tables, is available to Virus Bulletin subscribers here. (Click here for details on how to become a Virus Bulletin subscriber.)
A full description of the RAP testing methodology can be seen here.