VB Comparative: Windows NT 4 Workstation - February 2006

2006-02-01

Matt Ham

Virus Bulletin
Editor: Helen Martin

Abstract

Matt Ham fully expected a bumper harvest of VB 100% awards this month, simply due to the familiarity of the Windows NT platform to the developers.


Introduction

Windows NT is such an ancient platform that writing a review for it seems more akin to writing about history than about present-day affairs. However, the platform is still used by a fair number of people the world over, so the review will be relevant to many.

For a reviewer, both very old and very new platforms are of great interest. When products are tested on very new platforms one tends to see many oddities as developers struggle to accommodate unexpected technology, while products tested on very old platforms have the potential to be utterly broken by those very struggles. Symantec, for example, no longer supports Windows NT in its most recent product line (SAV 10), and thus SAV 9 was submitted for testing here.

That said, I was fully expecting a bumper harvest of VB 100% awards on this occasion, simply due to the familiarity of the platform to developers.

Test sets

The test sets were aligned to the October 2005 WildList, which was the most recent edition available on the product submission date, 9 January 2006.

The overwhelming majority of new samples in the test sets were of W32/Mytob, with close to 50 new variants added this time. Other additions were also predominantly bot-related – perhaps VB should consider Bot Bulletin as an alliterative name change.

Alwil avast! Professional 4.6.750 0602-1

avast! is the first of a small number of products in this test in which archive scanning is not activated by default. Where scanning and detection were concerned, however, default settings seemed to have been well chosen.

A selection of files were missed – primarily polymorphics and some macro samples – though none of these were in the In the Wild (ItW) test set and no false positives were generated when scanning clean files. avast! is thus the first product to receive a VB 100% award in this test.

Please refer to the PDF for test data

Avira Avira Desktop 1.00.00.80

With a string of recent good results behind it, Avira had ample opportunity in this test to fall from grace and little room to improve. In the event, however, the test results were exactly the same as the last time the product was tested: full detection in all sets. With no false positives in the clean test sets, this performance gains Avira another VB 100%.

Please refer to the PDF for test data

CA eTrust Antivirus 7.1.501 (InoculateIT engine 12.4.2034)

As usual, this version of eTrust is included here for information only, since the InoculateIT engine is provided within the eTrust installation but not activated by default. Also as usual, the logging functions within eTrust remain utterly abominable – screenshots being more useful than the dumped logs available from within the scanner. It was notable that the InoculateIT version of eTrust detected more viruses than the default Vet engine on this occasion.

Please refer to the PDF for test data

CA eTrust Antivirus 7.1.501 (Vet engine 23.71.42)

Much of the comment about eTrust has already been made in the previous section and, with the interface here being identical, there remains little to note other than the scanning results.

With 100% detection of samples in the ItW test set and no false positives generated, the results were sufficient to guarantee a VB 100% for eTrust when using its default Vet engine.

Please refer to the PDF for test data

CAT Quick Heal 2006 8.00

Quick Heal remains a fast and easy product to test, and its performance was once more sufficient to earn a VB 100% award. There was little else to note about CAT's product, so this remains a short write-up.

Please refer to the PDF for test data

Command AntiVirus 4.93.6

Logging became a problem while testing Command AntiVirus, with logs available only in RTF format – one of the least friendly formats for automated parsing. Since all logs were truncated in any case, they were sufficiently useless that parsing was not attempted. Thankfully, the number of misses was small enough that manual inspection of the truncated logs, combined with scan summary information, could easily pin down the files missed on demand. Such a small number of misses is always a promising sign, and indeed Command AntiVirus receives a VB 100%.
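
As an aside, RTF logs can at least be reduced to greppable plain text before parsing. The sketch below is a minimal, crude RTF stripper in Python; the log file name and the 'Infection:' line marker are hypothetical illustrations, not taken from the product itself.

```python
import re

def rtf_to_text(rtf: str) -> str:
    """Crudely strip RTF markup to recover plain text.

    Good enough for grepping log lines; not a full RTF parser
    (it ignores \\* destinations and other subtleties).
    """
    text = re.sub(r"\\par\b", "\n", rtf)            # paragraph marks become newlines
    text = re.sub(r"\\'[0-9a-fA-F]{2}", "?", text)  # hex-escaped chars -> placeholder
    text = re.sub(r"\\[a-zA-Z]+-?\d* ?", "", text)  # drop remaining control words
    return text.replace("{", "").replace("}", "")   # drop group delimiters

# Hypothetical usage: pull out the lines that report an infection.
with open("command_av_log.rtf", encoding="latin-1") as fh:
    for line in rtf_to_text(fh.read()).splitlines():
        if "Infection:" in line:
            print(line.strip())
```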

Please refer to the PDF for test data

Dr.Web Dr.Web Scanner 4.33

Before the installation of Dr.Web would complete, a version of psapi.dll needed to be installed on the machine in order to make on-access scanning possible. A welcome change was noted in the on-access scanner: it seems that a reboot is no longer necessary after changes are made to its configuration. After many years of constant restarts during testing, this came as a happy event.

Probably more happily for the developers, the number of missed files remains very low – the only misses occurred during on-access scanning, and then only for files in EML or ZIP format. It comes as no surprise, therefore, that a VB 100% award is in order.

Please refer to the PDF for test data

Eset NOD32 1.1358

NOD32 was the second product in this test that required archive file scanning to be activated when testing the zipped clean files. With an otherwise uneventful set of tests, I was able to come up with only one event of note: the log file for on-demand scanning was 1337 KB in size – clearly this is highly significant if one favours conspiracy theories or numerology. Less shocking will be the news that Eset gains a VB 100% as a result of the tests.

Please refer to the PDF for test data

Fortinet FortiClient 2.0.180

FortiClient's performance was sufficient for another VB 100% to be added to Fortinet's collection. The misses that remain are scattered through the test sets to such an extent that no real pattern emerges. One suspects that results will improve gradually.

Please refer to the PDF for test data

FRISK F-Prot Antivirus 3.16f

Perhaps to make the log files seem more interesting, a very large amount of extraneous information was included – though it was at least easy to filter out during parsing. Only one sample was missed during on-demand scanning, in the standard test set, though a few more misses were added on access. None of these are currently rated as In the Wild, however, so a VB 100% is awarded.

Please refer to the PDF for test data

F-Secure Anti-Virus 5.44 11411

FSAV is a product for which very small numbers of misses have become something of a habit. On this occasion the product missed only the stored .TMP sample of W32/Nimda.A on demand. On access, the total was increased by the two zipped samples of W32/Heidi. However, since these samples all currently reside in the standard test set (not In the Wild), F-Secure is also the recipient of a VB 100%.

Please refer to the PDF for test data

GDATA AntiVirusKit 14.1.2

Continuing the theme, AVK managed to miss even fewer samples than the previous products – no samples at all went undetected. With no false positives either, it goes without saying that these results earn AVK a VB 100% award. However, the product's detection performance does come at a small cost, with a slightly slow scan rate as a side effect. The trade-off between detection and scanning speed is a common dilemma for anti-virus developers: many of the misses in these tests occur as a result of pragmatism, with developers opting for faster on-access scanning at the cost of some detection when files are merely being manipulated rather than executed.

Please refer to the PDF for test data

Grisoft AVG Anti-Virus 7.1 371

AVG missed a number of files in the various test sets, though none were classified as In the Wild. Of those missed, the majority were polymorphic in nature or packaged in slightly unusual formats. With no false positives and full detection of ItW viruses, AVG earns itself a VB 100%.

Please refer to the PDF for test data

Hauri ViRobot Desktop 5.0

ViRobot held the dubious distinction of having by far the largest number of false positive detections in this test. Six files were reported as infected, while a further clean file was declared to be suspicious. As a result, the product does not qualify for a VB 100% this month. This will be something of a disappointment, since detection rates were respectable.

Please refer to the PDF for test data

H+BEDV AntiVir 6.33.00.02 1127

Other than a lower price and older graphics, AntiVir is essentially identical to Avira internally, and thus similar scanning results were expected. This was indeed the case. Minor variations in scanning throughput rates were noted, though with Windows being host to numerous unpredictable background processes, it would be surprising if the results here were found to be identical.

Please refer to the PDF for test data

Kaspersky Anti-Virus Personal 5.0.388

The ever-productive interface developers at Kaspersky have been at work once more for this version. Personally, I am less of a fan of this latest incarnation than of the previous interface, though this is due more to unfamiliarity than to any obvious faults. The only oddity noted was in the 'time remaining' bar of the scanning interface, which demonstrated some interesting time dilation and compression phenomena.

On the detection front, however, there were few changes to be seen. Two zipped W32/Heidi samples on access were the sum total of missed files, leaving Kaspersky the holder of a VB 100% yet again.

Please refer to the PDF for test data

McAfee VirusScan Enterprise 8.00 4400 4669

VirusScan was the third and last of the products in this test to require manual activation of archive file scanning during the clean set tests. It also showed notable differences between scanning on access and on demand, with several samples of W32/Etap missed on access. No misses were noted on demand, however, and no false positives surfaced either. McAfee thus receives a VB 100% award for VirusScan's performance.

Please refer to the PDF for test data

MicroWorld eScanWin 8.0.641.1

As a rebadged version of the GDATA product, eScanWin might be expected to show similarities to that product, despite being blue in places rather than yellow.

Somewhat disturbingly, however, there was one major difference: the on-access scanner crashed during testing. This occurred only once, though, so did not seem to be easily reproducible. Happily, the differences in performance did not extend to detection capabilities and, with 100% detection of ItW viruses and no false positives, eScanWin also gains a VB 100%.

Please refer to the PDF for test data

New Technology Wave Inc. Virus Chaser 5.0a

Installation of Virus Chaser failed initially because a newer version of mfc42.dll was required. Installing the redistributable C++ libraries on the machine solved this problem.

Somewhat less easily solved was the total lack of control over on-access scanning available within the program. In the end, scanning was performed by locking an appropriate key in a depressed position – scanning in this way took a little over 24 hours to complete. On demand, scanning progressed more easily, though the logs must have been the creation of either a sadist or a fan of complex logic problems. Overall, the product displayed an impressive degree of user unfriendliness.

Such irritations aside, the product's scanning performance was less than awesome, with a smattering of misses across the test sets. The fact that samples of W32/Yaha.G and W32/Yaha.E were missed on access was sufficient to deny Virus Chaser a VB 100% on this occasion.

Please refer to the PDF for test data

Norman Virus Control 5.81

Minor improvements seem to have been made to the creation of new tasks in Norman Virus Control of late, since the process seemed less painful than it has done in the past. Of course, this could merely be due to the fact that I have gained familiarity with the interface, but either way the effect was appreciated.

When the logs were analysed the results were much as expected: some polymorphic samples and a few others were missed, but with no ItW samples missed and no false positives, NVC is a VB 100% winner.

Please refer to the PDF for test data

SOFTWIN BitDefender Professional Plus 9 9

BitDefender continues to be a solid performer in our tests, with little in the way of comment necessary. It is presumably this solidity which has led to its being the basis of detection in several other products, including Hauri's offering in this test. SOFTWIN will be pleased that none of the false positive issues apparent with that derived product were present in BitDefender, thus entitling it to a VB 100%.

Please refer to the PDF for test data

Sophos Anti-Virus 4.5.8 2.32.6 4.01

Of note in Sophos's clean file scans was the fact that archive scanning is now activated by default – a recent and much appreciated configuration change. With both detection and lack of false positives in their usual respectable state, Sophos earns itself a VB 100% award. That said, the logging functions were not without their niggles: various unnecessary spaces are added to lines, serving no purpose but to make parsing a little more complex. Meanwhile, archives are designated merely by appending \[archivename] directly to the path in which the infected archive is located. This makes parsing these entries more complex still, and would have been an ideal place to use the spare spaces just mentioned.
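
To illustrate the wrinkle, the sketch below separates the container archive from the member it holds when the two run together as a single path. The sample path and the set of archive extensions are hypothetical, not taken from Sophos's actual log format.

```python
# A minimal sketch: split a log path into (container, member) when an
# archive name is embedded directly in the path. Extensions are assumed.
ARCHIVE_EXTS = {"zip", "arj", "cab", "rar"}

def split_archive_entry(path: str):
    """Return (container, member) if the path passes through a component
    with an archive extension; otherwise (path, None)."""
    parts = path.split("\\")
    for i, part in enumerate(parts[:-1]):   # the final part is the file itself
        if "." in part and part.rsplit(".", 1)[1].lower() in ARCHIVE_EXTS:
            return "\\".join(parts[:i + 1]), "\\".join(parts[i + 1:])
    return path, None

print(split_archive_entry(r"C:\samples\collection.zip\sample.exe"))
# -> ('C:\\samples\\collection.zip', 'sample.exe')
```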

Please refer to the PDF for test data

Symantec AntiVirus 9.0.0.338 51.3.0.11

SAV's scanning speed was far slower with some settings activated than with others. The default scanning settings are not pleasant when large numbers of infected files are present, though acceptable when files are mostly clean. Unfortunately, it seems that on access no POT or PPT files were checked in the default mode, resulting in samples of O97M/Tristate.C being missed in the ItW test set and no VB 100% being awarded this time.

Update: Subsequent to this review test it was discovered that the incorrect version of SAV had been provided for testing. The correct version - v9.0.5 - was retested and achieved VB 100% status satisfactorily.

Please refer to the PDF for test data

UNA UNA 1.83

The good news for UNA is that scanning was fast and no false positives were flagged. The bad news is that there was a multitude of missed detections in every test set. Although only two In the Wild files were missed, both on access and on demand, this is ample reason to deny a VB 100%.

Please refer to the PDF for test data

VirusBuster Professional 2005 5.001 41

VirusBuster was perhaps the most troubled of all the products. First, it required mfc42.dll to be installed – a hurdle that was easily cleared. When scanning on access, however, the scanner failed repeatedly. The failure was silent, with no indication other than the fact that no files were being checked, and it seemed reproducible simply by passing around 6,000 infected files through the on-access scanner. Woes continued in the clean sets too, where a suspicious file was noted. Matters on the detection front were no more inspiring: despite .EML files being flagged for scanning, the .EML version of W32/Nimda.A was missed both on access and on demand. A VB 100% award is thus out of reach for VirusBuster this month.

Please refer to the PDF for test data

Conclusion

The biggest surprise for me in this test was not the products that failed to detect virus samples, but the issues concerning operating system support. NOD32, for example, included the Microsoft C++ foundation classes as part of its installation package and asked whether they should be installed. Several products, however, were missing DLLs when installed onto the Windows NT platform. This shows a certain lack of care for Windows NT, even if the platform is aged and mostly ignorable as far as new installations are concerned.

The instabilities noted with on-access scanning are more worrying, and are presumably due to the operating system rather than to any basic software flaws, since the same issues have not been noted with these products on other platforms. Essentially, developers are caught between the most modern and most ancient incarnations of the NT family of operating systems and the desire to produce one package that will install on every variant. With the differences apparent between Windows XP and Windows NT, this is obviously easier said than done.

While Microsoft can drop support for a platform, the same is not true for developers. Without the Microsoft monopoly to back them up, anti-virus developers can gain customers by their range of supported platforms, and lose them if they cut back when a customer demands support for machines of more historical than practical interest. One wonders whether Microsoft's entry into anti-virus, currently restricted to Windows-only platforms, will be influenced by this in future.

Technical details

Test environment. Three 1.6 GHz Intel Pentium 4 workstations with 512 MB RAM, 20 GB dual hard disks, DVD/CD-ROM and 3.5-inch floppy, all running Windows NT 4 Workstation SP 6.

Virus test sets. Complete listings of the test sets used can be found at http://www.virusbtn.com/Comparatives/WinNT/2006/test_sets.html.

Results calculation protocol. A complete description of the results calculation protocol can be found at http://www.virusbtn.com/virusbulletin/archive/1998/01/vb199801-vb100-protocol.
