VB Comparative: Windows XP - June 2005

2005-06-01

Matt Ham

Virus Bulletin
Editor: Helen Martin

Abstract

This month's testing process proved to be relatively plain sailing for VB's resident reviewer Matt Ham. Find out whether it was such a breeze for the 28 products on test.


Introduction

VB's last comparative review on Windows XP (see VB, June 2004, p.12) was carried out at around the same time as the release of XP Service Pack 2. Fortunately for the products, the release date of SP2 was just after the deadline for the comparative, thus the products were spared the challenge of having to perform on the newly updated platform. Having had close to a year in which the products could adapt to the new features in SP2, this month's review was expected to bring few surprises and not to be too taxing.

Happily, this was indeed the case. The testing process was the smoothest that I can remember, with only a handful of crashes to mar the plain sailing. Considering the instability problems I usually encounter on other platforms, this is convincing evidence that Windows XP bears the bulk of testing, whether in-house by developers or at the hands of end users. All but one of the products on offer integrated fully with the Windows Security Center interface, which was a slightly higher proportion than I had expected.

However, there were problems in two other areas. Of more immediate importance to users, there was a significant upsurge in the number of false positives generated while scanning the clean sets. This meant that a VB 100% award was denied to more than one of the products in the review. On a more personal level, the logging attempts by some products ranged from the downright disgraceful to the perplexingly cryptic.

The test sets

The test sets were aligned to the February 2005 WildList, with a product submission deadline of 3 May 2005. This time lag should have been enough for all but the most tardy developers to catch up with detection, thus high detection rates were expected. The additions to the In the Wild (ItW) test set were a dull bunch, as ever, and possibly the most uninspiring yet. The predominance of various W32/*bot samples does not give cause for further comment.

AhnLab V3 Pro 2004 6.0.0.383

Please refer to the PDF for test data

Testing of V3 Pro this month progressed with few problems, and detection rates proved perfect for ItW viruses. There were no other problems that were relevant to VB 100% status, thus AhnLab is in receipt of the award this month.

However, problems were encountered during on-access testing of V3 Pro. Somewhat unusually, the 'leave as is' option for on-access detection does not deny access to infected files. For testing purposes, therefore, infected files were deleted rather than logging denied access attempts. V3 Pro is also unusual in that it does not scan archives by default; this option was activated when scanning archives during the clean set timings.

 

Alwil avast! antivirus 4.6.654[vps 0518-1]

Please refer to the PDF for test data

On-access scanning with avast! started problematically, with an error proclaiming that ashEnhcd was out of memory. Scanning also seemed very slow. As has been noted in previous reviews, this was due to the fact that all viruses detected on access are added to the quarantine area, even when the quarantine option is not activated. In this case it seemed that the resultant filling of the OS partition also denied the system virtual memory, hence the error.

The timing function within the product was also rather eccentric. Since these timers are often flawed, external timing is used for the clean set scans and then compared against the product's listed timings. In the case of avast! it seems that the internal timer starts not from zero, but from five seconds, thus adding considerable illusory overhead to fast scans.
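
By way of illustration only, the sketch below (in Python) shows how such an external timing comparison might be scripted; the scanner command line and the 'reported' figure are placeholders rather than avast!'s actual interface, and this is not VB's own tooling.

    import subprocess
    import time

    def timed_scan(cmd):
        # Run a scan command and measure its wall-clock duration externally.
        start = time.perf_counter()
        subprocess.run(cmd, check=False)
        return time.perf_counter() - start

    # Hypothetical command line and reported figure - placeholders only.
    external = timed_scan(["scanner.exe", r"C:\cleanset"])
    reported = 17.0  # seconds, as listed by the product's own timer
    print(f"external {external:.1f}s, reported {reported:.1f}s, "
          f"apparent overhead {reported - external:+.1f}s")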

Aside from these oddities, avast! performed admirably on other fronts, and obtained a VB 100% award easily.

 

ArcaBit ArcaVir 2005

Please refer to the PDF for test data

ArcaVir was the odd man out in this test as far as stability was concerned, with the on-access scanner crashing repeatedly after 300 or so infected files had been thrown at it. To circumvent this problem the tests were performed with the scanner set to delete infected files, and repeated until no further infections were logged.

The product was troubled in other areas too: AntiCMOS.A was missed on access in the boot set, and the .HTM form of W32/Nimda.A was missed In the Wild. That Nimda can cause problems so long after its release is an enduring mystery to me. A false positive in the clean test sets completed ArcaVir's woes; together with the miss of the ItW Nimda sample, it was enough to deny the product a VB 100% award.

 

Authentium Command AntiVirus 4.92.91

Please refer to the PDF for test data

Command AntiVirus is a long-standing entrant in VB's testing regime and thus it comes as no surprise that problems with the product were few and far between. However, a number of issues with the log file caused some grief. First, the log is available only as an RTF file, which increases its size appreciably. This might matter less were the log not truncated before it can be exported, since a more compact log would be expected to suffer less truncation, if any at all. Due to these logging problems the on-demand tests were performed by deleting infected files and examining those left behind. While logging was problematic, the other aspects of testing were not, and a VB 100% award was the result.
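
For readers curious about this workaround, a minimal Python sketch of the approach follows; the test set path is a placeholder and the snippet is illustrative only, not a description of VB's actual test scripts.

    import os

    def snapshot(root):
        # Record every file path currently present under the test set root.
        return {os.path.join(d, f) for d, _dirs, files in os.walk(root) for f in files}

    test_set = r"D:\testset"        # hypothetical location of a copy of the test set
    before = snapshot(test_set)
    # ... run the on-demand scan with 'delete infected files' selected ...
    after = snapshot(test_set)

    detected = sorted(before - after)   # files removed by the scanner
    missed = sorted(after)              # anything left behind was not detected
    print(f"{len(detected)} detected, {len(missed)} missed")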

 

Avira Avira 1.00.00.64

Please refer to the PDF for test data

With most products in this test the installation procedure either mentions a reboot explicitly or ignores the issue entirely. In the case of Avira a reboot is deemed to be recommended, but not vital - which makes it a little unclear as to what might be changed by the reboot process. VB's test procedures include a reboot in any case. Detection rates have improved once more for Avira, and are now very good, with no misses either on access or on demand. With no false positive detections either, the result is a VB 100% award for Avira.

 

BLC Win Cleaner 7.03

Please refer to the PDF for test data

The detection rates of Win Cleaner and its parent product Quick Heal remain high, though the results in this test were rather overshadowed by the issue of false positives. In total 28 false positives were generated during clean set scanning - certainly enough to give cause for concern and equally sufficient for a VB 100% to be denied. Of some note was the presence among these false positives of a detection for 'Hoax.Pompol'.

These two products also share the dubious distinction of being the last to present log file entries in a strict 8+3 format, a feature which complicates parsing of the logs no end.
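
For the curious, the Python sketch below shows one way the 8.3 log entries might be mapped back to the long file names of the test set on Windows; the test set path is a placeholder and the snippet is illustrative only, not part of VB's actual tooling.

    import ctypes
    import os
    from ctypes import wintypes

    _GetShortPathNameW = ctypes.windll.kernel32.GetShortPathNameW
    _GetShortPathNameW.argtypes = [wintypes.LPCWSTR, wintypes.LPWSTR, wintypes.DWORD]
    _GetShortPathNameW.restype = wintypes.DWORD

    def short_name(path):
        # Ask Windows for the 8.3 form of an existing path.
        buf = ctypes.create_unicode_buffer(260)
        if _GetShortPathNameW(path, buf, len(buf)) == 0:
            raise ctypes.WinError()
        return buf.value

    # Build a lookup from 8.3 basename to the real test set path.
    test_set = r"D:\testset"            # hypothetical test set location
    lookup = {}
    for dirpath, _dirs, files in os.walk(test_set):
        for name in files:
            full = os.path.join(dirpath, name)
            lookup[os.path.basename(short_name(full)).upper()] = full

    # A log entry such as 'W32HEI~1.EXE' can now be resolved to its long name.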

 

CA eTrust Antivirus (I) 7.1.192

Please refer to the PDF for test data

CA's eTrust Antivirus supports two engines; this submission was tested with the optional InocuLAN engine activated. Updating was particularly seamless - so fast, and so unintrusive to the on-access scanner, that I initially assumed it must have failed. As ever, all is well with the product until the log files are encountered. These are so outrageously poor that the designer should be chained to a rock and his liver devoured by eagles in the ancient fashion. Not only do the results for single files stretch over several lines due to word wrapping, but the wrapping continues over several columns, fragmenting the results beyond any ease of parsing, whether automatically or by eye.

 

CA eTrust Antivirus (Vet) 7.1.192

Please refer to the PDF for test data

With the same interface as the preceding product, this is the version with the default engine setting, and is thus eligible for a VB 100% award. Since the scanning results were good and no false positives arrived to spoil the proceedings, the award is duly made. The logging was, however, the same abomination as with the alternative engine.

 

CA Vet Anti-Virus 10.66.0 11.8.00

Please refer to the PDF for test data

Another of those products about which there is nothing to be said but words of praise, Vet Anti-Virus is destined for a short write-up confirming that its performance was worthy of a VB 100% award. Vet remains unique in that an out-of-date version of the product refuses to scan, forcing the user either to update or to have no scanning functionality at all. Quite how effective this is with real users - who are not always known for choosing security over convenience - is a matter for conjecture.

 

CAT Quick Heal 7.03

Please refer to the PDF for test data

Quick Heal is identical in all but appearance to its daughter product, BLC Win Cleaner, so the comments made for that product apply directly to Quick Heal. Sadly for CAT, this includes the withholding of a VB 100% award due to the generation of 28 false positives in the clean test set.

 

Doctor Web Dr.Web 4.32b

Please refer to the PDF for test data

Dr.Web remains admirable in every way other than the configuration of its on-access scanner. This requires a reboot after any configuration change, including such relatively minor matters as changing the default log size. The tray icon for the scanner also vanished at one point, seemingly the result of a configuration change triggered merely by opening a dialog rather than actually altering any settings. However, this is minor stuff in comparison with the detection rates shown by Dr.Web, which gains yet another VB 100%.

 

Eset NOD32 1.1087

Please refer to the PDF for test data

The results for NOD32 were, once more, somewhat perplexing for a product which claims not to scan within archives. Despite this claim it detected samples of W32/Heidi.A in their zipped form, suggesting that such scanning may be activated by default. On this occasion Eset's scanner missed two samples in the standard set, though this was not sufficient to deny the company another VB 100%.

 

Fortinet FortiClient 2.27 8.812

Please refer to the PDF for test data

FortiClient's VB 100% aspirations were not to be realised this month after the product became another victim of ancient viruses and, like ArcaVir, was unable to detect the .HTM form of W32/Nimda.A In the Wild. Other than this (admittedly rather major) flaw, FortiClient's performance was good.

 

FRISK F-Prot Antivirus 3.16b

Please refer to the PDF for test data

During normal testing FRISK's submission demonstrated no problems whatsoever, the result of which is a VB 100% award for F-Prot. However, an error on my part highlighted an odd feature of the product. As a matter of routine, on-access scanners are deactivated during testing of on-demand functionality. In theory this should make no difference, as one would expect a scanner not to scan on access a file which it is itself opening to scan on demand. In practice, however, this is not always the case. When the F-Prot on-access scanner was inadvertently left running during an on-demand test, several files were reported as blocked by the on-access scanner rather than scanned. This behaviour has been observed in other products in the past, but usually goes unnoticed due to the testing methodology.

 

F-Secure Anti-Virus Client Security 5.55 SR1

Please refer to the PDF for test data

F-Secure Anti-Virus is very much in a state of predictability these days, with all but full detection in the test sets. Only W32/Heidi.A concealed within zipped files and W32/Nimda.A in its TMP file form were missed. Since both samples require a degree of interaction to turn into an infectious object, such misses can hardly be considered a problem. Part of the predictable nature of FSAV is its string of VB 100% awards, to which it adds another on this occasion.

 

GDATA AntiVirusKit 15.0.5

Please refer to the PDF for test data

Continuing in its successful ways, AVK once more detected all files in the test set both on access and on demand. Clearly, the combination of engines used by AVK is capable of good protection, though speed issues might be a problem for some impatient users. A VB 100% award is duly winging its virtual way to GDATA.

 

Grisoft AVG Anti-Virus 7.0.308

Please refer to the PDF for test data

Another product which causes no problems and produces no surprises is Grisoft's AVG, which earns another VB 100% award. Only one feature was irritating enough for me to note, which was that the timings for scans are not kept on screen after the scan has completed. With this the most serious problem encountered, it can be appreciated that the product is not brimming with faults.

 

H+BEDV AntiVir 6.30.00.18

Please refer to the PDF for test data

The results for AntiVir are identical to those for sister product Avira. The only differences noted were cosmetic, with the graphics in Avira being noticeably more up to date than those sported by AntiVir. The two products' similarities extend to the award of a VB 100% for their excellent performance.

 

Hauri ViRobot Desktop 5.0 149168

Please refer to the PDF for test data

Testing of ViRobot started well but was beset by a number of problems later in the process. The clean test set saw the first problems, with a number of false positives emerging. Intriguingly, one of these was a detection of the confusingly named 'Not-A-Virus.15718'. There was also a total inability to detect the floppy disk-based samples in the test sets.

Logging proved annoying, since exporting to file seemed not to work at first. The export did complete eventually, though only after an interval of several minutes, by which time the testing process had moved on to other matters. On-access scanning was also tricky, with most of the usual avenues used in these tests blocked. In the end the scanner was set to disinfect and CRC testing was used to determine which files had been affected. Despite good detection rates in the file-based sets, the false positive detections and the missed floppy samples mean that ViRobot is denied a VB 100% award in this test.
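
For illustration, a minimal Python sketch of such a CRC comparison follows; the test set path is a placeholder and this is not VB's actual test harness.

    import os
    import zlib

    def crc32_of(path):
        # CRC-32 of a file's contents, computed in chunks.
        crc = 0
        with open(path, "rb") as fh:
            for chunk in iter(lambda: fh.read(65536), b""):
                crc = zlib.crc32(chunk, crc)
        return crc & 0xFFFFFFFF

    def crc_map(root):
        # Map every file under the test set root to its CRC-32.
        result = {}
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                full = os.path.join(dirpath, name)
                result[full] = crc32_of(full)
        return result

    test_set = r"D:\testset"            # hypothetical test set location
    before = crc_map(test_set)
    # ... run the scanner with disinfection (or deletion) enabled ...
    after = crc_map(test_set)

    # Files that vanished or changed were acted upon; unchanged files were missed.
    acted_upon = [p for p in before if p not in after or after[p] != before[p]]
    missed = [p for p in before if p in after and after[p] == before[p]]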

 

Kaspersky KAV Personal 5.0.227

Please refer to the PDF for test data

KAV has now settled back into its traditional pattern of repeated VB 100% awards after suffering a brief glitch a few months ago. While perfect detection across all test sets will be gratifying to Kaspersky, it leaves me little to say, other than to reveal that I was wearing Kaspersky socks while performing the tests.

 

McAfee VirusScan Enterprise 8.0.0 4400 4483

Please refer to the PDF for test data

VirusScan showed few problems in detection rates during these tests, though there were some noteworthy irritations with the interface. Since choosing an area to scan requires both selecting dropdown menus and browsing, the process is tedious to perform on multiple occasions. McAfee has opted not to scan archives by default, and this lack of archive scanning was responsible for two of the three misses observed (detection of archived versions of W32/Heidi.A being impossible without handling the zip files in which they are located). These quibbles aside, good detection rates and the lack of false positives earn VirusScan a VB 100%.

 

MicroWorld eScan Internet Security 2.6.522.9

Please refer to the PDF for test data

As a rebadged version of GDATA's AVK, the detection rates for eScan were expected to be very good. It was a little strange, however, to see that eScan missed samples of three W97M viruses which presented AVK with no problems at all.

There were also initial problems with the interface, with the 'leave alone' option on detection seeming to have no effect. This proved only to be momentary, however. With no other problems eScan gained a VB 100% award despite being somewhat enigmatic.

 

Norman Virus Control 5.80 5.82.01

Please refer to the PDF for test data

With the complexity of its sandbox emulation engine, the slow scanning speeds of Norman's product are an expected, if irritating, feature. The technology does not detect all of the more complex polymorphics in the test set, but NVC's detection elsewhere is good. No false positives were detected (in fact none have been seen for the product in living memory), meaning that Norman earns its latest VB 100% award.

 

NWI Virus Chaser 5.0

Please refer to the PDF for test data

Virus Chaser is a rebadged version of Dr.Web, the similarities between the products being obvious from installation onwards. The similarities include the requirement for a reboot between changes to settings and the need to activate the on-access scanner after installation. However, the installation routine does not mention this at all - and no reboot is prompted after reconfiguration, leaving the user never quite sure of the current settings. The log too is rather less than desirable, splitting file names from paths and thus complicating the use of the results. Despite these problems, detection is very good and no false positives were generated, thus Virus Chaser earns a VB 100% award.

 

SOFTWIN BitDefender 8 Professional Plus 7.01144

Please refer to the PDF for test data

There was just one area where problems occurred with BitDefender. During on-access testing, the 'deny access and continue' option produced file errors, rather than simply 'access denied', when attempts were made to access the infected files. This problem also occurred with the use of Xcopy, even when the 'ignore errors' switch was used. As a result, deletion was chosen as the setting, with a note being made of the files which were not deleted. Even here there were problems, since some files were disinfected rather than deleted, so CRC checking was also used to determine which files had, in fact, not been detected. These problems should not seriously impact a normal user, however, and do not prevent the award of a VB 100% to BitDefender.

 

Sophos Anti-Virus 5.0.1

Please refer to the PDF for test data

Sophos Anti-Virus has undergone something of an image change in the latest release, which might not be altogether a good thing. Rather than being ugly to look at but pleasant to use, the product is now pleasant to look at but ugly to use. The default location of the log file has also changed, to deep within the Documents tree. This may be in line with Windows good practice on file locations, but it always enrages me when I am searching for logs. The shock of the new aside, Sophos Anti-Virus remains its usual self as far as detection is concerned, and gains a VB 100% as a reward.

 

Symantec SAV 9.0.0.338 51.1.0.15

Please refer to the PDF for test data

Symantec SAV demonstrated a very good detection rate with no false positives, thus earning another VB 100% award. However, the results were marred by the program's logging facilities. The whole application crashed during the creation of the log file, although the file itself was created correctly. The log file can be viewed in several different places in the GUI, but the division as to which of these views allow the results to be exported seems arbitrary. More disturbing is the treatment of some viruses in the log file: as has been noted before, seeing a reference to a virus discovered in file '????????' is not particularly useful.

 

UNA UNA 1.83 265

Please refer to the PDF for test data

UNA distinguished itself on this occasion with the dubious honour of being the only product not to be recognised by the Windows Security Center, which resulted in rather more irritating pop-up bubbles than usual while testing proceeded. It also managed 14 false positives in the clean test set, mostly claiming the presence of HLLO.NumberOne.K and HLLP.Jacklyn.12416. There were also issues with the on-access scanner: although no reboot is required after installation, the scanner does not seem to operate until the machine has been rebooted. Detection rates continue to improve, but there is still some way to go before a VB 100% award is achieved by UNA.

 

VirusBuster VirusBuster Professional 2005 5.0.163

Please refer to the PDF for test data

The last product in the line-up is another which does not scan archives by default. This became more noticeable during scanning of the clean test sets, when several configuration switches were required. These changes seemed very sluggish, with an irritating delay between updating the settings and the GUI updating itself. These are not major problems, however, and the detection rate was good. One file was flagged as suspicious in the clean sets, but this was not a full-blown false positive, thus a VB 100% can be awarded.

Conclusion

As mentioned in the introduction, this was one of the least problematic of the recent reviews from a technical point of view. Log files remain an issue which seems destined never to vanish, though numerous companies have changed their logs for the better. The increase in false positives is something of a worrying trend; whether it will turn out to be a momentary blip or will increase in future tests remains to be seen.

Technical details

Test environment.  Identical 1.6 GHz Intel Pentium machines with 512 MB RAM, 20 GB dual hard disks, DVD/CD-ROM and 3.5-inch floppy drive, running Windows XP Professional SP2.

Virus test sets.  Complete listings of the test sets used can be found at http://www.virusbtn.com/vb100/archive/2005/06/testsets.

Results calculation protocol.  A complete description of the results calculation protocol can be found at http://www.virusbtn.com/virusbulletin/archive/1998/01/vb199801-vb100-protocol.
