AV-Test.org issues latest figures

Posted by Virus Bulletin on Mar 13, 2008

In-depth testing covers multiple factors.

Independent testing body AV-Test.org has released its latest set of results, with a large group of products tested against a number of criteria including proactive detection, spotting and removing active infections, and outbreak response times, as well as simple detection rates.

The results show how companies and their products fare against the latest samples arriving at AV-Test: new arrivals are checked on receipt to gauge the accuracy of heuristics and the efficacy of behavioural detection systems, and updates are then monitored over the test period to determine when companies add detection for items not spotted by heuristic or generic methods. Detection and effective removal of active malware, including rootkits, is also measured, as is the impact on system performance.

As in AV-Comparatives' recent figures, multi-engine products such as AEC's Trustport, G DATA's AVK and the gateway scanning product WebWasher all performed very strongly in the pure detection test, with Avira's AntiVir also achieving very high scores in both malware and 'potentially unwanted' categories.

The multi-engine products showed their weaknesses, however, when it came to scanning times and false positives, and also fared poorly against rootkits. Avira, meanwhile, did well across the board, ranking 'good' or 'very good' in every category. The only other product to achieve this feat was Sophos; Symantec and Panda were let down only by their response times to outbreaks, rated merely 'satisfactory', while McAfee also failed to excel in scanning speed.

The results of the tests are shown in full below.

Overall results


Product | malware on demand | adware / spyware on demand | false positives | scan speed | proactive detection | response times | rootkit detection | cleaning
AntiVir (Avira)++++ (*1)++++++++
Avast! (Alwil)+++++o+oo
AVG+++ (*1)++oo+o
AVK (G Data)++++o--+++---
BitDefender+++o-++++o
ClamAV--------++----
Dr Weboooo+o++
eScan+oo-+++----
eTrust / VET (CA)----++o---+++
Fortinet-GWoo--++++n/a (*2)n/a (*2)
F-Prot (Frisk)+o++-ooo
F-Secure+o+o++++++
Ikarus++++o+++oo
K7 Computing----o-------
Kaspersky+oo-+++++
McAfee+++++o+o+++
Microsoft+o++o---o++
Nod32 (Eset)+++++++++++
Normanoo+-+ooo
Norton (Symantec)++++++++o++++
Panda++++++o++o
QuickHeal (CAT)--oooo-o
Risingo++oooo+
Sophos+++++++++++
Trend Micro++++++++++
TrustPort++++---++++----
VBA32-ooo+oo+
VirusBuster----+o-oo+
WebWasher-GW++++o++++++n/a (*2)n/a (*2)
ZoneAlarm+oo-++++o

Index
Rating | malware on demand | adware / spyware on demand | false positives | response times
++ = very good | > 98% | > 98% | no FP | < 2 h
+ = good | > 95% | > 95% | 1 FP | 2 - 4 h
o = satisfactory | > 90% | > 90% | 2 FP | 4 - 6 h
- = poor | > 85% | > 85% | 3 FP | 6 - 8 h
-- = very poor | < 85% | < 85% | > 3 FP | > 8 h

Notes
(*1) The free (personal) edition does not include ad- and spyware detection, so the results would be "--".
(*2) Not available (this is a gateway product).
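
Applied to the detection figures published below, the index above is simply a set of thresholds. The following Python sketch is illustrative only: the thresholds are taken directly from the index, but the function names and structure are our own and are not part of AV-Test's methodology.

```python
# Minimal sketch: map AV-Test's published thresholds (see index above) onto
# the rating symbols used in the overall-results table. Function names and
# structure are illustrative, not part of AV-Test's methodology.

def detection_rating(rate_percent: float) -> str:
    """Rating for on-demand detection (malware or adware/spyware)."""
    if rate_percent > 98:
        return "++"   # very good
    if rate_percent > 95:
        return "+"    # good
    if rate_percent > 90:
        return "o"    # satisfactory
    if rate_percent > 85:
        return "-"    # poor
    return "--"       # very poor

def false_positive_rating(fp_count: int) -> str:
    """Rating for false positives found in the clean set."""
    if fp_count == 0:
        return "++"   # no FP
    if fp_count == 1:
        return "+"
    if fp_count == 2:
        return "o"
    if fp_count == 3:
        return "-"
    return "--"       # more than 3 FPs

def response_time_rating(hours: float) -> str:
    """Rating for outbreak response time."""
    if hours < 2:
        return "++"
    if hours <= 4:
        return "+"
    if hours <= 6:
        return "o"
    if hours <= 8:
        return "-"
    return "--"

# Example: AntiVir's 99.3% malware detection rate maps to '++',
# matching its entry in the overall-results table.
print(detection_rating(99.3))   # '++'
print(detection_rating(84.8))   # '--' (ClamAV's malware score)
```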

Detection rates for malware, adware and spyware


Product | Malware samples | Adware and spyware
AntiVir (Avira) | 99.3% | 99.1%
Avast! (Alwil) | 98.8% | 97.9%
AVG | 96.3% | 98.6%
AVK (G Data) | 99.9% | 99.9%
BitDefender | 97.8% | 98.8%
ClamAV | 84.8% | 82.4%
Dr Web | 90.4% | 92.8%
eScan | 96.7% | 92.1%
eTrust / VET (CA) | 72.1% | 56.5%
Fortinet-GW | 92.4% | 91.2%
F-Prot (Frisk) | 96.7% | 92.0%
F-Secure | 96.8% | 93.5%
Ikarus | 98.0% | 98.8%
K7 Computing | 65.5% | 59.5%
Kaspersky | 97.2% | 92.0%
McAfee | 95.6% | 98.6%
Microsoft | 97.8% | 91.5%
Nod32 (Eset) | 97.8% | 96.3%
Norman | 92.8% | 91.9%
Norton (Symantec) | 95.7% | 98.6%
Panda | 95.6% | 95.6%
QuickHeal (CAT) | 85.7% | 86.7%
Rising | 94.1% | 95.9%
Sophos | 98.1% | 98.8%
Trend Micro | 98.7% | 95.1%
TrustPort | 99.6% | 99.8%
VBA32 | 89.9% | 92.1%
VirusBuster | 76.2% | 77.8%
WebWasher-GW | 99.9% | 99.9%
ZoneAlarm | 96.4% | 94.5%
Number of samples | 1,130,556 | 83,054
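
To put the percentages into absolute terms, the rates can be applied to the sample counts in the table above. The short sketch below is illustrative only; it simply uses the sample totals as listed to convert a detection rate into an approximate number of missed samples.

```python
# Rough illustration only: convert the detection rates above into approximate
# numbers of missed samples, using the sample counts from the table.
MALWARE_SAMPLES = 1_130_556
ADWARE_SPYWARE_SAMPLES = 83_054

def missed(rate_percent: float, total: int) -> int:
    """Approximate number of undetected samples for a given detection rate."""
    return round(total * (100 - rate_percent) / 100)

# AntiVir: 99.3% of the malware set still leaves roughly 7,900 samples missed.
print(missed(99.3, MALWARE_SAMPLES))
# ClamAV: 84.8% corresponds to roughly 172,000 missed malware samples.
print(missed(84.8, MALWARE_SAMPLES))
# AVK and WebWasher: 99.9% on the adware/spyware set misses roughly 80 samples.
print(missed(99.9, ADWARE_SPYWARE_SAMPLES))
```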

The full testing methodology is available here.

Tags: AV-Test, testing
