VB Comparative: Windows XP - June 2006

2006-06-01

Matt Ham

Virus Bulletin
Editor: Helen Martin

Abstract

In Matt Ham's final comparative review for Virus Bulletin he puts 26 products for Windows XP through their paces. Two products enter the test line-up for the first time this month: TrustPort Antivirus and the rather better-known Microsoft OneCare. In his own inimitable style, Matt provides rude comments and/or praise for all products, as well as the all-important VB 100% results.


Introduction

Yet again the Windows XP comparative review is upon us, with the usual throng of products arriving to be tested and to test my patience. On this occasion two new products were submitted: TrustPort Antivirus and the rather more famous Microsoft OneCare. Rude comments and/or praise for these products can be found later in the review.

As this is the last review I will conduct for Virus Bulletin, I had hoped for an easy run overall – sadly this was not the case for several products. Although instability was less common than in previous tests, scanning speeds for some products were even slower than they have been in the past. There were also a number of products in this test whose feature sets can only have been designed by folk who are either totally ignorant of usability or bred for enhanced sadism.

The test sets

The test sets were aligned to the February 2006 WildList. As always, the contents of the WildList can be viewed at http://www.wildlist.org/.

When I first started anti-virus testing, the WildList consisted of some 300 different viruses, one third of which were boot sector types. I have none-too-fond memories of inserting 90 floppies into a machine for scanning on demand, then repeating the process on access. Thankfully for my successor, this month's tests saw a major, if long-foreseen, change: there are no longer any boot sector viruses considered to be in the wild. Similarly anticipated was the fact that all but a small number of macro viruses dropped out of the test sets this month, including all Excel and WM/ samples.

Numerous other files also dropped out of the test set this month – and, as ever, yet more were added to replace them. Overall numbers in the test set increased marginally; more than 100 samples were added and not quite as many removed. Samples of W32/Rbot, W32/Mytob and W32/Sdbot accounted for the majority of these changes and, together, these three fill around half of the space in the WildList.

AhnLab V3Pro 2004 6.0.0.574

Starting the line-up on this occasion, AhnLab's V3Pro managed one of the slowest installation routines I have witnessed. It also demonstrated some odd logging behaviour, so detection was ultimately determined by the deletion of infected files.

Unfortunately, a false positive and a suspicious file in the clean test set were sufficient to deny AhnLab a VB 100% this month, though scanning of these files was notably speedy. In addition there were numerous misses of samples in the In the Wild (ItW) test set, which suggests that slow updates could be the problem here.

Please refer to the PDF for test data

Alwil avast! 4.7.829

As ever, on-access detection for avast! was performed by copying the test set and deleting infected files – on-access scanning is not triggered simply by opening files. avast! also suffered from a round of false positives – a total of three being sufficient to dash any hopes of a VB 100%. However, there were no misses during the scanning of infected files in the ItW test set, and misses elsewhere were at the same low background level as ever.
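
Since this copy-and-delete approach crops up for several products in this review, it may be worth sketching. The following is a minimal Python illustration of how a tester might infer on-access detections when a monitor removes infected files as they are written rather than blocking them on open; the paths and the delete-on-detection setting are assumptions for illustration, not details of the actual test harness.

    # Hypothetical sketch of the copy-and-delete method: copy the test set
    # into a directory watched by the on-access monitor, let the product
    # delete what it detects, then see which samples survived.
    import shutil
    from pathlib import Path

    SOURCE = Path(r"D:\testsets\itw")       # pristine, read-only samples (assumed path)
    TARGET = Path(r"C:\scratch\onaccess")   # watched by the on-access monitor (assumed path)

    def run_copy_and_delete_test(source: Path, target: Path) -> set[str]:
        """Copy the test set, then report which samples survived."""
        # Copying triggers on-access scanning; with delete-on-detection
        # enabled, the monitor removes each detected file as it is written.
        shutil.copytree(source, target, dirs_exist_ok=True)
        survivors = {p.relative_to(target).as_posix()
                     for p in target.rglob("*") if p.is_file()}
        return survivors  # anything still present was missed on access

    if __name__ == "__main__":
        missed = run_copy_and_delete_test(SOURCE, TARGET)
        print(f"{len(missed)} samples missed on access")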

NOTE: 01 July 2006. After discussions with the developers and further investigation of the files, VB now considers that the files Alwil produced false positives on were inappropriate for inclusion in the clean test set. The files have been removed and VB extends its apologies to Alwil. With a faultless performance across the ItW test sets, and an admirable performance elsewhere, a VB 100% is belatedly awarded to avast!.

Please refer to the PDF for test data

Avira AntiVir 330 7.00.00.07

At first glance, AntiVir looked very much to be taking a step backwards in this version, since many options seemed no longer to be present. Happily, it turned out that these are merely somewhat hidden in the default interface view. With this minor hitch disentangled, AntiVir went on to detect all infected files in all test sets – a performance that earned the product a well-deserved VB 100% award.

Please refer to the PDF for test data

CA eTrust (InoculateIT engine) 8.0.403.0 23.71.145.0

Having progressed to version 8, both the eTrust products now rejoice in a new interface. However, the new interface seems to prioritise looking new and trendy over being intuitive and easy to use. Particularly irritating is the fact that the interface is launched as HTML in a browser window, which is almost unusable on lower-resolution screens.

I was hoping for an improvement in eTrust's reporting of infections. However, hard to credit though it is, on-screen reporting proved to be even worse than it had been previously. In this version of the product infections are reported in a tiny text box which, by default, is truncated and cannot be resized.

It is thus impossible to tell which files are infected from the on-screen display. This can be overcome by printing the log, though there is no obvious way of exporting a useful version of it as a file.

As in previous comparative reviews, this version of eTrust is not eligible for a VB 100% award, since the InoculateIT engine is not the product's default.

Please refer to the PDF for test data

CA eTrust (Vet engine) 8.0.403.0 12.4.2191.0

Of course, the comments made in the previous section also apply to this version of eTrust. As mentioned, the Vet engine is the default for scanning – in fact, eTrust reverts to Vet on each restart of the GUI.

Despite the interface woes, eTrust's detection rates were up to their usual good levels, and since no false positives were detected in the clean test set a VB 100% is the result. Scanning speeds were also good for both of the engines.

Please refer to the PDF for test data

CAT Quick Heal 2006 8.00

Problems for CAT started in the clean test sets, where a false positive immediately denied the product any chance of a VB 100%. On a truly bizarre front, Quick Heal reported internally that every scan of a clean object took exactly one hour. In reality, scanning speeds were good. Unfortunately, there was a second major disappointment for CAT: samples of W32/Bagle.X were missed in the ItW test set.

Please refer to the PDF for test data

Central Command Vexira Antivirus 2006 5.002 33

Vexira bears a very close resemblance to VirusBuster – which can be explained by the fact that it is a rebadged version of VirusBuster. Purists might point out that one product is red and the other blue, but my advanced skills of observation saw past this dissimulation.

Unfortunately, stability was not a strength of this product, which hung the test machine after on-access scanning. On demand, matters were substantially worse, with repeated crashes while scanning PowerPoint files. After this performance had been tolerated for long enough to obtain results, a number of ItW samples remained undetected, which prevented the product from obtaining a VB 100%.

Please refer to the PDF for test data

Command Authentium AntiVirus 4.93.7

Once again, the most irritating thing about this product was the log, which is available only in a very truncated RTF format. An extensive search of the machine failed to turn up anything more useful, so infected files were deleted to determine detection rates.

After the appropriate hoops had been jumped through, the scanning results were good, with only a very few infected files missed, none of them in the ItW set. As a result, Authentium earns itself a VB 100% award.

Please refer to the PDF for test data

Doctor Web Dr.Web 4.33.2

On the negative side, Dr.Web's on-access monitor SpIDer Guard lies about its configuration settings – option changes are only ever implemented after a reboot, a fact not reflected by the interface.

The story improved thereafter, with scanning being perfect on demand and only archived files missed on access. This performance was certainly ample for a VB 100% to be on its way to Doctor Web.

Please refer to the PDF for test data

Eset NOD32 1.1517

NOD32 was the first product in this month's test with which I could find no real fault. Full detection across all test sets and a lack of false positives leave me little to comment on and earn Eset a well-deserved VB 100% to add to its collection.

Please refer to the PDF for test data

F-Secure Anti-Virus Client Security 6.01

Another product that displayed no remarkably bad or notably new features, FSAV also obtains a VB 100% for its performance. Misses here were limited to viral code held in stored, rather than directly executable, form.

Please refer to the PDF for test data

Fortinet FortiClient 2.76 8.459

The trend of good results with few shocks is continued with Fortinet's offering. Although the product missed a noticeable number of polymorphic files, detection results across other test sets were very strong. As a result, FortiClient adds another VB 100% to its collection.

Please refer to the PDF for test data

FRISK F-Prot Antivirus 3.16f

Unfortunately, the run of products displaying excellent results and few faults is cut short here, since all was not perfection for F-Prot. Scanning speeds were fair, but a smattering of misses across the test sets included a sample of W32/Aimbot, which is classified as in the wild. A VB 100% award is therefore out of FRISK's grasp on this occasion.

Please refer to the PDF for test data

GDATA AntiVirusKit 2006 16.0.7

Despite a somewhat slow performance, GDATA managed full detection of all samples in all categories, with no false positives. AVK's developers should be pleased with this performance, and a VB 100% should add to their contentment.

Please refer to the PDF for test data

Grisoft AVG Anti-Virus 7.1.392

One of the more common user queries I have been faced with during my time at Virus Bulletin concerns how to delete infected files using AVG. Having tried to do so myself, I am no longer surprised by the frequency of the complaints. Numerous files, although flagged as infected, were not subject to any automated deletion or disinfection.

Apart from this there were no surprises in either the clean or infected test sets, with a VB 100% being the pleasing result for Grisoft.

Please refer to the PDF for test data

Hauri ViRobot 5.0

Unfortunately, Hauri's chances of gaining a VB 100% evaporated with a false positive and suspicious file noted in the clean set – and scanning rates were not particularly speedy here either.

Misses in detecting infected files were plentiful too, although looking on the brighter side, none of the missed detections occurred in the ItW set.

Please refer to the PDF for test data

Kaspersky Anti-Virus 6.0.0.299

KAV includes various self-protection features which turn out to be a double-edged sword. The less welcome aspect is that the virus definitions are so well protected that, by default, they cannot be updated manually. Since the update function does not allow updates from a local folder, this is somewhat irritating.

There also seem to have been some changes in scanning methods, the effects of which are particularly unpleasant. On-access scanning was seemingly interminable, and the clean set scanning rate was fairly indicative of the speeds seen while scanning the infected sets. This was not an effect of low scanning priority, however – during scanning, KAV remained steadily at 99% processor usage.

All of this work was, at least, to good effect, as all files in all test sets were detected and no false positives were produced. A VB 100% award thus acts as a distraction from the various problems encountered.

Please refer to the PDF for test data

McAfee VirusScan Enterprise 8.0i 4400 4753

Happily, with VirusScan we return to a product that had no nasty surprises in store and gave a good performance, with full detection of infected samples across all test sets. With no false positives noted in the clean test sets either, VirusScan is awarded a well-deserved VB 100%.

Please refer to the PDF for test data

Microsoft Windows Live OneCare 1.0.0971.12

As might be expected of a Microsoft product, OneCare operates in the guise of a paranoid nanny. The user is not trusted to make many decisions of their own, which made certain parts of the test process frustrating.

The progress counter displayed during scans is particularly laughable, reaching 99% in ten minutes and then remaining at that point for another 20 minutes or so. This is a result of the automatic disinfection and quarantine (the user has no say in the matter). Indeed, Microsoft's idea of quarantining is somewhat novel, consisting of appending what looks like a checksum to the end of the file name.
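
For illustration only, a rename-based quarantine of the kind described might look something like the following Python sketch. This is a guess at the general shape of such a scheme, not Microsoft's actual implementation; the choice of hash and suffix length are assumptions.

    # Hypothetical illustration of quarantine-by-rename: the file stays in
    # place and a checksum-like suffix is appended to its name, so it no
    # longer runs under its original name. Not OneCare's actual scheme.
    import hashlib
    from pathlib import Path

    def quarantine_by_rename(path: Path) -> Path:
        digest = hashlib.md5(path.read_bytes()).hexdigest()[:16]
        quarantined = path.with_name(f"{path.name}.{digest}")
        path.rename(quarantined)  # e.g. worm.exe -> worm.exe.3f2a9c1b7d40e8aa
        return quarantined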

What with its constant resetting of the areas to be scanned and a hang after the on-access scan, this product cannot be said to be one of my favourites. However, its detection rates were sufficient for a VB 100% to be in order.

Please refer to the PDF for test data

MicroWorld eScanWin 8.0.659.1

eScan is a rebadged version of GDATA's AntiVirusKit, so it should come as no great surprise that the results for eScan include full detection of samples across all test sets, a VB 100% award and no adverse comment.

With little else to say, let's move on to a product that behaved badly instead.

Please refer to the PDF for test data

Norman Virus Control 5.81

Having been a source of frustration in previous reviews (see VB, April 2006, p.17), Norman Virus Control continued to manifest new problems on this occasion.

On-access scanning was subject to repeated crashes, whether dealing with infected or previously disinfected files. The effects were sufficient to reduce Windows to a state of complete paralysis, in which only a hard reboot had any effect on the test machines.

Upon reboot the splash screen displays the question 'Would you go for anything but green?' (green being Norman's corporate colour). My answer would be that anything would be better than this.

Unfortunately for the forces of truth and justice, after strenuous efforts, scanning results were sufficient to warrant a VB 100% for this shockingly behaved product.

Note: 1 August 2006. In VB's June 2006 comparative review it was reported that the Norman product behaved badly, with repeated crashes on dealing with infected or previously disinfected files. VB would like to note that since then, neither Norman's developers nor VB's new resident product tester have been able to reproduce the bad behaviour described.

Please refer to the PDF for test data

New Technology Wave (NWI) Virus Chaser 5.09

Since Virus Chaser is a rebadged version of Dr.Web, it should come as little surprise that it shares both the irritations and praise of that product.

With faultless detection rates across all the test sets and no false positives noted in the clean test set, a VB 100% can be included in the shared experience.

Please refer to the PDF for test data

SOFTWIN BitDefender 9 7.06632

There were few notable moments during the testing of BitDefender, though the scanning of clean executables was certainly slow enough to be tedious to oversee.

As far as detection was concerned, BitDefender had a small number of missed detections, although no real pattern was discernible among them. Happily for SOFTWIN, however, there were no misses in the ItW set and no false positives were picked up in the clean test set, so BitDefender also earns a VB 100%.

Please refer to the PDF for test data

Sophos Anti-Virus 5.2.0

Sophos's product was as well behaved as ever. Whether through practice with the GUI or some small changes to it, its use seemed very much simpler than I can remember it being recently, which is always a plus point. With an admirable performance across the test sets, a VB 100% is in order for the Sophos product.

Please refer to the PDF for test data

Symantec AntiVirus 10.0.0.359

The Symantec GUI has remained the same for many years and on this occasion the product's full detection rate across all test sets leaves little scope for discussion.

Not even my pathological hatred of the colour yellow can detract from the fact that the product's performance was ample for SAV to be awarded a VB 100%.

Please refer to the PDF for test data

TrustPort Antivirus 1.6.0.807

Since this product is based on a combination of BitDefender and Norman scanning engines, I was fearful, when I first launched TrustPort, that its scanning performance would resemble blue whales forced into pogo-stick races. Thankfully, scanning speeds were not absolutely terrible, just pretty bad.

The combination of the two engines may be responsible for one of TrustPort's oddities, namely that it reported many more files as having been scanned than actually existed in the test sets.

A further mystery was the sheer variation in the actions taken upon detection of a virus. Using the default settings, samples were deleted, disinfected, quarantined, renamed and simply left to fester – all in the course of one scan.

All this aside, the detection rates demonstrated by the product came close to decent, but there were certainly a sufficient number of ItW misses to deny TrustPort a VB 100%. Maybe my successor will see an improved performance in the not too distant future.

Please refer to the PDF for test data

VirusBuster VirusBuster Professional 2006 5.2 33

Not surprisingly, VirusBuster suffered some of the same woes as Vexira, though thankfully to a lesser extent. Instability on demand meant that, once a running scan had aborted in progress, scanning was simply no longer available; only a reboot remedied this broken state. Misses of samples in the ItW test set merely added to these woes, and VirusBuster was denied a VB 100% on this occasion.

As a side note, after discussion with the developers, the cause of the scanning speed issues that plagued VirusBuster in the Linux comparative review (see VB, April 2006, p.13) was determined to be the handling of alert messages. By default, alerts are sent to the client; if the client is configured not to accept them, the send blocks until it times out. Since the client is, by default, set not to accept these alerts, scanning rates slow dramatically. Clearly the problem can be solved easily by a simple change in the client or scanner configuration.
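
A minimal Python sketch of the mechanism described: if each detection triggers a blocking alert send and the client never accepts the connection, every alert costs the scanner a full network timeout before scanning can continue. The host, port and five-second timeout below are illustrative assumptions, not VirusBuster's actual values.

    # Sketch of a per-detection alert send that stalls scanning when the
    # client refuses connections: each failed send wastes a full timeout.
    import socket

    ALERT_HOST, ALERT_PORT = "client.example.local", 3000  # assumed endpoint
    SEND_TIMEOUT = 5.0  # seconds lost per alert when the client refuses

    def send_alert(message: str) -> bool:
        try:
            with socket.create_connection((ALERT_HOST, ALERT_PORT),
                                          timeout=SEND_TIMEOUT) as sock:
                sock.sendall(message.encode())
            return True
        except OSError:
            # Client not accepting alerts: the scanner has already stalled
            # for up to the full timeout before giving up on this detection.
            return False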

Please refer to the PDF for test data

Conclusion

My final words should be statements, grave judgements and moments of prescience, so as to leave a lasting memory of the quality of my reviews. Unfortunately for this line of thinking, the only thoughts I have to offer are of a cynical nature.

The names and descriptions of the threats may change, but the anti-virus industry remains pretty much the same as it ever has been. The major companies are the same, user ignorance is unchanged and the hyperbolic press releases are the same. Even the claims that 'soon all will change' are simply repeats of the past.

If I should return to the anti-virus field in the future, I really don't think it would take more than a few minutes to become re-acclimatised – I just hope that NetWare is extinct by then.

Technical details

Test environment. Identical 1.6 GHz Intel Pentium machines with 512 MB RAM, 20 GB dual hard disks, DVD/CD-ROM and 3.5-inch floppy drive running Windows XP Professional SP2.

Virus test sets.  Complete listings of the test sets used can be found at http://www.virusbtn.com/Comparatives/WinXP/2006/test_sets.html.

Results calculation protocol.  A complete description of the results calculation protocol can be found at http://www.virusbtn.com/virusbulletin/archive/1998/01/vb199801-vb100-protocol.
