AV-Test.org issues latest figures

Posted by Virus Bulletin on Mar 13, 2008

In-depth testing covers multiple factors.

Independent testing body AV-Test.org has released its latest set of results, with a large group of products tested against a number of criteria, including proactive detection, spotting and removing active infections, and outbreak response times, as well as simple detection rates.

The results show how companies and their products fare against the latest range of samples arriving at AV-Test, with detection of new arrivals used to gauge the accuracy of heuristics and the efficacy of behavioural detection systems. Updates were also monitored over the test period to determine when companies added detection for new items not spotted by heuristics or generic detection. Detection and effective removal of active malware, including rootkits, are also measured, as is the impact on system performance.
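
By way of illustration, the outbreak-response element boils down to the delay between a new sample arriving and detection for it appearing in a signature update. The following is a minimal sketch in Python, using the response-time bands from the index further down the page; the timestamps and function names are ours for illustration, not AV-Test's actual tooling:

    from datetime import datetime

    # Response time: the gap between a sample's arrival and the first
    # update that adds detection for it (illustrative, not AV-Test's code).
    def response_time(sample_arrived, update_detected):
        return update_detected - sample_arrived

    # Rating bands taken from the index further down the page.
    def response_rating(delta):
        hours = delta.total_seconds() / 3600
        if hours < 2: return '++'  # very good
        if hours < 4: return '+'   # good
        if hours < 6: return 'o'   # satisfactory
        if hours < 8: return '-'   # poor
        return '--'                # very poor

    arrived = datetime(2008, 3, 1, 9, 0)     # hypothetical sample arrival
    detected = datetime(2008, 3, 1, 14, 30)  # hypothetical update with detection
    print(response_rating(response_time(arrived, detected)))  # 5.5 h -> 'o'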

As in AV-Comparatives' recent figures, multi-engine products such as AEC's TrustPort, G DATA's AVK and the gateway scanning product WebWasher all performed very strongly in the pure detection test, with Avira's AntiVir also achieving very high scores in both the malware and 'potentially unwanted' categories.

The multi-engine products showed their weakness when it came to scanning times and false positives, however, and also fared poorly against rootkits, while Avira did well across the board, ranking 'good' or 'very good' in all categories. The only other product to achieve this feat was Sophos; Symantec and Panda were let down only by their response times to outbreaks, marked as merely 'satisfactory', and McAfee's only shortfall was in scanning speed.

The results of the tests are shown in full below.

Overall results


Product | malware on demand | adware / spyware on demand | false positives | scan speed | proactive detection | response times | rootkit detection | cleaning
AntiVir (Avira) | ++++ (*1)++++++++
Avast! (Alwil) | +++++o+oo
AVG | +++ (*1)++oo+o
AVK (G Data) | ++++o--+++---
BitDefender | +++o-++++o
ClamAV | --------++----
Dr Web | oooo+o++
eScan | +oo-+++----
eTrust / VET (CA) | ----++o---+++
Fortinet-GW | oo--++++n/a (*2)n/a (*2)
F-Prot (Frisk) | +o++-ooo
F-Secure | +o+o++++++
Ikarus | ++++o+++oo
K7 Computing | ----o-------
Kaspersky | +oo-+++++
McAfee | +++++o+o+++
Microsoft | +o++o---o++
Nod32 (Eset) | +++++++++++
Norman | oo+-+ooo
Norton (Symantec) | ++++++++o++++
Panda | ++++++o++o
QuickHeal (CAT) | --oooo-o
Rising | o++oooo+
Sophos | +++++++++++
Trend Micro | ++++++++++
TrustPort | ++++---++++----
VBA32 | -ooo+oo+
VirusBuster | ----+o-oo+
WebWasher-GW | ++++o++++++n/a (*2)n/a (*2)
ZoneAlarm | +oo-++++o

Index
Rating | Malware | Adware / spyware | False positives | Response time
++ = very good | > 98% | > 98% | no FP | < 2 h
+ = good | > 95% | > 95% | 1 FP | 2 - 4 h
o = satisfactory | > 90% | > 90% | 2 FP | 4 - 6 h
- = poor | > 85% | > 85% | 3 FP | 6 - 8 h
-- = very poor | < 85% | < 85% | > 3 FP | > 8 h

Notes
(*1) The free (personal) edition does not include adware and spyware detection, so the results would be '--'.
(*2) Not available (this is a gateway product).

Detection rates for malware, adware and spyware


Product | Malware samples | Adware and spyware
AntiVir (Avira) | 99.3% | 99.1%
Avast! (Alwil) | 98.8% | 97.9%
AVG | 96.3% | 98.6%
AVK (G Data) | 99.9% | 99.9%
BitDefender | 97.8% | 98.8%
ClamAV | 84.8% | 82.4%
Dr Web | 90.4% | 92.8%
eScan | 96.7% | 92.1%
eTrust / VET (CA) | 72.1% | 56.5%
Fortinet-GW | 92.4% | 91.2%
F-Prot (Frisk) | 96.7% | 92.0%
F-Secure | 96.8% | 93.5%
Ikarus | 98.0% | 98.8%
K7 Computing | 65.5% | 59.5%
Kaspersky | 97.2% | 92.0%
McAfee | 95.6% | 98.6%
Microsoft | 97.8% | 91.5%
Nod32 (Eset) | 97.8% | 96.3%
Norman | 92.8% | 91.9%
Norton (Symantec) | 95.7% | 98.6%
Panda | 95.6% | 95.6%
QuickHeal (CAT) | 85.7% | 86.7%
Rising | 94.1% | 95.9%
Sophos | 98.1% | 98.8%
Trend Micro | 98.7% | 95.1%
TrustPort | 99.6% | 99.8%
VBA32 | 89.9% | 92.1%
VirusBuster | 76.2% | 77.8%
WebWasher-GW | 99.9% | 99.9%
ZoneAlarm | 96.4% | 94.5%
Number of samples | 1,130,556 | 83,054
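
The index above maps these raw percentages onto the rating symbols used in the overall results. As a quick sketch in Python (the function name is ours, not part of AV-Test's methodology), the banding can be spot-checked against a few of the figures in the table:

    # Map an on-demand detection rate (%) onto AV-Test's rating symbols,
    # using the thresholds from the index above.
    def rating(detection_rate):
        if detection_rate > 98: return '++'  # very good
        if detection_rate > 95: return '+'   # good
        if detection_rate > 90: return 'o'   # satisfactory
        if detection_rate > 85: return '-'   # poor
        return '--'                          # very poor

    print(rating(99.3))  # AntiVir, malware on demand -> '++'
    print(rating(96.3))  # AVG, malware on demand -> '+'
    print(rating(84.8))  # ClamAV, malware on demand -> '--'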

Full details of the testing methodology are available from AV-Test.org.


Tags: AV-Test, testing
