11 out of 26 anti-virus products fail VB100 certification

Posted by   Virus Bulletin on   Oct 9, 2009

VB reveals which products failed to meet the VB100 certification criteria, and presents an updated 'RAP quadrant' showing products' Reactive And Proactive detection abilities.

Virus Bulletin has revealed the results of its latest VB100 certification test on Windows Server 2008.

Of the 26 products tested, 11 failed to achieve VB100 certification. All of the failures were due, at least in part, to incomplete detection of one or both of a pair of highly complex polymorphic file-infecting viruses.

The results of the RAP ('Reactive And Proactive') tests conducted at the same time showed a continuation of the trends and patterns seen in recent tests, with dual-engine products from Trustport and G Data showing particularly remarkable scores.

Virus Bulletin's Test Director John Hawes said: "This month's test was a real challenge for the products, with two separate variants of a particularly tricky polymorphic virus included in our core WildList set. We used large numbers of samples of each to thoroughly measure accuracy of detection, and showed that many products continue to have trouble with these nasties."

Hawes continued: "On a brighter note, there were some quite impressive scores in our RAP test, showing that some vendors are doing a good job handling the large volumes of new malware appearing every day. Looking at the long-term picture, we can also see some products achieving high levels of consistency month on month, which is also a good indicator of a solid, well-run lab. We're looking forward to seeing if these trends continue with a wider range of products in our first comparative on Windows 7, due soon."

VB's cumulative RAP quadrant gives a quick visual reference as to products' reactive and proactive detection rates - with the better performing products placed in the top right-hand corner:

Virus Bulletin's RAP testing measures products' detection rates across four distinct sets of malware samples. The first three test sets comprise malware first seen in each of the three weeks prior to product submission. These measure how quickly product developers and labs react to the steady flood of new malware. A fourth test set consists of malware samples first seen in the week after product submission. This test set is used to gauge products' ability to detect new and unknown samples proactively, using heuristic and generic techniques.
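The scoring scheme described above can be sketched in a few lines. This is a hypothetical illustration only (the function names, data layout and weighting are assumptions, not VB's actual tooling): per-sample detection results are grouped into the three pre-submission weekly sets plus the one post-submission set, each set yields a detection rate, the three pre-submission rates average into a reactive score, and the post-submission rate stands alone as the proactive score.

```python
from statistics import mean

# Hypothetical sketch of RAP-style scoring, not Virus Bulletin's actual code.
# Sample sets are labelled relative to the product submission date:
# 'week-3', 'week-2', 'week-1' (reactive) and 'week+1' (proactive).

def rap_scores(detections: dict[str, list[bool]]) -> dict[str, float]:
    """Map each weekly set label to its detection rate (percentage).

    `detections` maps a set label to per-sample booleans
    (True = sample detected by the product).
    """
    return {label: 100.0 * sum(hits) / len(hits)
            for label, hits in detections.items()}

def rap_quadrant_point(scores: dict[str, float]) -> tuple[float, float]:
    """Return (reactive, proactive) coordinates for the RAP quadrant.

    Reactive: average rate over the three pre-submission weeks.
    Proactive: rate on the post-submission week's new samples.
    """
    reactive = mean(scores[w] for w in ("week-3", "week-2", "week-1"))
    proactive = scores["week+1"]
    return reactive, proactive
```

With this layout, a product detecting 90%, 80% and 70% of the three reactive sets and 60% of the proactive set would plot at (80.0, 60.0) on the quadrant, i.e. further right for faster reaction and higher up for stronger heuristic/generic detection.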

The results of the October 2009 VB100 certification review can be seen here.

The full review, including detailed results tables, is available to Virus Bulletin subscribers here or in PDF format here.

A full description of the RAP testing methodology can be seen here.



