With a double whammy of a brand new platform and a record-breaking haul of 43 products to test, the VB test team had their work cut out this month. John Hawes has all the details of how the products fared on the shiny new Windows 7 platform.
Copyright © 2009 Virus Bulletin
So Windows 7 is finally with us. The hordes of users and admins who have put off migrating away from the stalwart XP can breathe a sigh of relief and finally start using a modern operating system. Vista can be consigned to the scrap heap of history, with the best of its innovations living on in its successor and the rest swiftly forgotten.
Perhaps that’s going a little far; as a new and untried entity, Windows 7 will at least have to do a little work to earn the approval and trust of cautious users. Initial impressions have generally been fairly positive, with speed, stability and style impressing many early adopters. Some teething problems were noted with many security products, but that was way back at the public beta stage and by now they should all have been resolved. We can only hope as much anyway, as this month’s comparative takes place on the new platform, with the deadline for product submission having been just days after its official public release.
Unlike the general consensus elsewhere, our initial impressions of Windows 7 were not entirely favourable. A trial installation of the Ultimate edition – to see how it got on with our hardware and tools, and to get a feel for what changes we needed to be aware of – proved somewhat problematic. A troublesome install process finally got us to a fully operational set-up, but Explorer seemed prone to odd behaviour, displaying only blackness within its shimmery semi-transparent framing until the right combination of clicks restored it to life. Meanwhile, the first blue screen was achieved within half an hour of installation.
Fortunately, the Pro edition selected for our tests proved more robust and well behaved. Getting all our test systems installed, activated and backed up with images was not an arduous task, with most of the steps fairly standard (although finding our way to some of the configuration controls proved a little bewildering thanks to some unnecessary adjustments to the layout).
With our lab hardware fully supported from the off, few changes were required to the standard installation besides a couple of handy tools to be used during testing – an archiving package to access submissions sent as archives and a PDF reader to check out manuals in case of unclear or unfamiliar products. Being rather simple folk easily overwhelmed by fancy graphics, we opted to revert the display to the plain, unflashy ‘classic’ style, intending to check out each product briefly in the context of the snazzy ‘Aero’ options, just to make sure they didn’t look too out of place.
Getting the test sets and associated tools put together and onto the systems was also a relatively simple task. The test set deadline was 24 October, and the latest WildList available on that date, the September list, provided few surprises. The most dangerous of the Virut strains which rocked the last comparative was retired from the list, and our troublesome large set of samples thus removed to the polymorphic set. Additions to the WildList were dominated by online gaming and social networking threats, along with a sprinkling of autorun worms and Conficker variants. The polymorphic set was enlarged in terms of numbers of samples, but not greatly in terms of entirely new items, while the set of worms and bots was trimmed of some older items and enhanced with a selection of more recent arrivals. As usual, the trojans set was compiled entirely afresh, mostly with samples gathered during September while we were busy working on the last comparative. The RAP sets were populated as usual in the few weeks before the test, and in the week following the 28 October deadline for product submissions – meaning that testing could not start until well into November.
The deadline day proved a busy one, with products coming in thick and fast – a few new arrivals to spice things up, the usual flood of familiar faces, many of them providing both suite and AV-only variants, and one even submitting a free edition alongside the standard paid-for version. Many of our occasional entrants failed to materialize, perhaps put off by the potentially tricky new platform, but nevertheless the numbers stacked up to a monster 43 products. With a record field to test on what was likely to be a difficult platform, we knew that time would be against us.
Noting this time pressure, and having put together a fairly large and challenging set of infected samples to test against, we decided to make things extra hard for ourselves by expanding and deepening our performance tests. The standard speed sets were enhanced with a selection of files from the new operating system, while the clean set got a fairly large addition from CDs provided with hardware devices and magazines, and popular and recommended downloads from various software sites.
The speed tests were extended to take into account the performance-enhancing caching technologies included in many products these days. While in the past only one set of figures was reported for default handling of the speed sets, for this test we decided to include both ‘cold’ and ‘warm’ figures – that is, for the initial encounter with the files, and for subsequent rescans of the same items, measured multiple times and averaged to minimize anomalies. These measurements were taken both on access and on demand, although the on-demand figures are perhaps somewhat less useful – most products will have been updated at least once between on-demand scans of the same items, which should mean that any cached data should be purged and items looked at afresh in case improved detection powers lead to something being spotted. The on-access data is much more relevant, as files may be accessed numerous times between updates and checking known files faster will significantly reduce the system footprint of the security solution.
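The cold/warm measurement described above can be sketched in a few lines. This is an illustrative harness only, not VB’s actual tooling: the `scan` callable and the `measure_cold_warm` name are assumptions standing in for whatever mechanism drives the product’s scanner.

```python
import statistics
import time

def measure_cold_warm(scan, repeats=3):
    """Time one 'cold' pass and several 'warm' rescans of the same data.

    `scan` is any callable that performs the scan over a fixed file set.
    The warm figure is the mean of `repeats` further runs, measured
    multiple times and averaged to minimize anomalies.
    """
    start = time.perf_counter()
    scan()
    cold = time.perf_counter() - start

    warm_times = []
    for _ in range(repeats):
        start = time.perf_counter()
        scan()
        warm_times.append(time.perf_counter() - start)

    return cold, statistics.mean(warm_times)
```

A product with effective result caching would show the warm average coming out well below the cold figure; one that rescans everything afresh would show the two roughly level.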
We also introduced an update to our on-access measuring tool, opening files with the execute flag set to spark detection in a fuller range of products, and also taking MD5s of each file encountered and granted access to, in order to keep better track of unwanted changes to the testbeds. During testing we also gathered some more detailed performance measures, including records of CPU and memory consumption under various conditions, but given the heavy workload this month it was not possible to wrestle these figures into presentable shape in time for inclusion in the final report.
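The checksum side of such a tool is straightforward to outline. The sketch below (the function name `snapshot_md5` is our own invention, and the Windows-specific execute-flag file open is omitted) records an MD5 for every readable file under a testbed directory, so that snapshots taken before and after a product run can be compared to spot unwanted changes such as disinfection or deletion.

```python
import hashlib
import os

def snapshot_md5(root):
    """Walk a testbed tree and record the MD5 digest of every file.

    Files that cannot be read (for example, those blocked by an
    on-access scanner) are recorded as None, which is itself useful
    information when comparing before/after snapshots.
    """
    digests = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, 'rb') as f:
                    digests[path] = hashlib.md5(f.read()).hexdigest()
            except OSError:
                digests[path] = None
    return digests
```

Comparing two such snapshots then reduces to a dictionary diff: any path whose digest differs, vanishes or becomes None between runs has been touched by the product under test.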
With all these schemes ready to go, and a tally of 43 products to get through, we shut ourselves away in the lab for what we hoped would be a long and arduous but productive month of testing.
AhnLab’s offering kicks off this month’s review with few changes from its last few appearances. The installation process is fairly smooth and speedy, with minimal interruption from Windows 7’s UAC system – a single prompt for confirmation on commencing the install. The interface is fairly pleasant and reasonably usable, with a few quirks likely to fool the unwary, but generally simple to navigate and operate. Running through the tests proved unproblematic, although matters were slightly complicated by the separation of logging into items categorized as mere ‘spyware’ from those definitely malicious. After some careful merging of logging data some reasonable scores were recorded across the detection sets.
In the speed tests, scanning speeds were pretty decent but on-access overheads were a trifle heavy. No false positives were recorded, but in the WildList set a single sample of the last remaining W32/Virut strain was missed, thus denying AhnLab a VB100 once again.
This may be the last appearance in VB’s tests of the current version of Alwil’s popular avast! product, with a long-anticipated new edition due for release very soon. The install is uncomplicated and fairly speedy but does require a reboot of the system to complete, while the design of the interface remains somewhat unusual but provides a good range of fine-tuning for the more demanding user if switched to the advanced version. Running individual scans is a little fiddly, and logging can be problematic – initially limited to a fairly small size and, if a non-existent folder was mistakenly selected to write logs to, the process was silently disabled.
Detection rates were pretty solid across the test sets, with a steady decline as expected across the RAP sets but a strong starting level making for a very respectable overall average. Speeds were excellent, with some impressive improvements on access when files had been checked before. The WildList presented no difficulties and with no false positives either, Alwil earns this month’s first VB100 award.
It has been a while since ArcaBit made an appearance in VB100 testing. The product’s installer defaults to Polish, but is otherwise straightforward and very speedy, the installation process requiring less than a minute all told (although a reboot is required at the end). Running the tests proved a little more arduous, with multiple UAC prompts presented at various stages of accessing and adjusting the controls and extremely long pauses waiting for browser windows to be presented. Nevertheless, scanning speeds were decent – fast on demand and overheads not too heavy on access.
Detection rates were not bad in general. There was a marked decrease in coverage in the more recent weeks of the RAP sets, but the WildList was covered without problems despite the large numbers of previously unseen Virut samples. With the clean sets throwing up no show-stoppers either, ArcaBit earns its first VB100 award after a handful of sporadic appearances; we hope to see the product becoming a more regular entrant in the future.
Authentium’s product goes very much for simplicity, with a pared-down interface providing the bare minimum of control options, all of which are reasonably easy to find. Opening reports proved slow in the extreme, most likely thanks to the unusually large size which would not be experienced by normal users, but otherwise testing progressed without major difficulty.
Scanning speeds were on the good side of medium and pretty light in terms of on-access overheads. Detection scores were fairly decent, with an especially strong showing in the proactive week of the RAP sets, and with no problems in the WildList and no false positives, Authentium safely qualifies for a VB100 award.
AVG’s product had a very lengthy and complicated installation process, with numerous components to be put in place and configured. When the product is finally installed, it demands to be allowed to make an ‘optimization scan’. If delayed, this scan is run anyway before any scheduled scan can take place – as we discovered when we set a scheduled job to run overnight, only to find on arrival the next morning that the optimization process was still running, and the requested job was yet to begin. Perhaps not helped by the incomplete optimization process, on-demand scans showed no sign of speeding up when run again over previously scanned data, and on access only a minimal improvement was observed on revisiting previously scanned files.
The interface occasionally proved rather slow to respond, especially when updating its display during large scans, but was generally reasonably easy to navigate, and a decent although not exhaustive level of configuration was available. Detection results were pretty solid, with no problems in the WildList and an excellent showing in the reactive portion of the RAP sets. With no false positives in the clean sets either, a VB100 is duly earned by AVG.
Perhaps responding to the increased interest in free solutions of late, Avira opted to enter its free version in this month’s test, and the product did not disappoint. The basic design and layout was pretty familiar to us from having used the professional edition, with a few minor adjustments, starting with the personal usage terms and conditions presented during the snappy install process. A few other areas also seemed different, with the default scanning depths perhaps a trifle less strict, and the on-access scanner lacking an option to simply block without prompting for an action. In the on-demand area, the GUI seemed to provide no option to scan a folder, offering to scan only entire drives or partitions, but a context-menu scan option provided more flexibility. These issues proved a little frustrating during our intensive on-access test, but not too upsetting, and otherwise the depth of configuration proved admirable.
Performance was excellent, with some very fast scanning speeds both on access and on demand, while detection rates proved as splendid as we have come to expect from the company. The test sets were demolished without apparent effort, with even the proactive portion of the RAP sets handled impressively. With no problems in the WildList, and no false alarms, Avira’s free Personal edition comfortably earns its first VB100 award.
The full paid-for version of AntiVir, as mentioned above, is pretty similar to the free one on the surface, but with a wider range of options and a deeper level of control available. The set-up process is similarly simple although some post-install options are presented, including some extras such as detection of suspicious iframes. Logging is also clearer and more sophisticated than in the Personal edition, as befits a product intended to be put to use in a business environment.
Otherwise, little difference was observed – detection rates were identical to the free edition, while speed measures were as superb. Again no problems emerged in the WildList and no false positives were presented, and Avira adds a second VB100 to this month’s haul.
BitDefender’s 2010 edition provides another redesign and another unusual look and feel. The install process is rather lengthy and features a number of command prompt windows flashing into view and disappearing again in an instant. A reboot is needed to complete the process. The new GUI has a simple, straightforward, rather chunky appearance, with the layout variable for each of a selection of user profiles – an interesting and effective approach to allowing the advanced user a decent level of control while avoiding frightening the novice. A number of other interesting features are included, such as home network configuration controls, vulnerability management and system configuration options, alongside the core anti-malware protection elements which proved as solid as ever.
Detection rates were excellent across the test sets, while in the performance measures scanning speeds proved fairly slow on first sight of files but improved notably on revisiting them, with a particularly impressive improvement on access. The WildList was handled comfortably, and with no false positives BitDefender earns a VB100 award.
Incorporating the BitDefender detection engine, Bullguard’s product proved much faster and easier to install, but again a reboot is needed for full operation. Its overwhelmingly red interface felt a trifle cluttered, but with a little exploration proved nicely laid out and fairly simple to use – although the process of setting up and running a custom scan is a little long-winded, and requires the approval of a UAC prompt.
Detection rates, as expected, were along the same lines as those achieved by BitDefender – a very respectable showing – while in the speed tests medium rates were recorded with no change on second viewing of the files. No problems cropped up in the WildList or the clean sets, and a VB100 is duly earned by Bullguard.
CA’s home-user offering arrives following a major overhaul, with a redesigned interface promising some stylistic innovations. The installation begins with some extremely large icons, and after a long and slow process requires a reboot before presenting a final interface which is equally large-featured. The design is indeed unusual, with its swirling 3D tabs and icons apparently inspired by computer systems used on the TV show CSI: Miami. Clearly, it is intended to provide a simple and user-friendly experience for the most inexperienced users. For us, however, it proved baffling in the extreme, with the tiny amount of configuration available proving both tricky to find and perplexing to make use of; perhaps with experience its mysteries will be unravelled.
An attempt to run scans from the GUI – when the appropriate area was at last uncovered – proved very slow to access the filesystem browsing details. A context-menu entry is provided for simpler initiation of specific scans, but is also somewhat confusing, with multiple nested options and the option to exclude an area from scanning given prominence over the scan itself. Scanning speeds seemed remarkably fast – as we have come to expect from CA solutions – but on repeated attempts showed some worrying oddities. Most rescans proved slightly faster than the first attempt, as might be expected, but some were significantly slower and apparently scanning at a greater depth (with no change to the options). On one occasion a component of the useful Sysinternals suite was alerted on as a potential hacking tool, despite having been missed on two previous scans and going unnoticed again on two subsequent runs.
In the infected sets, detection was less than excellent, with three items in the WildList set not detected: an autorun worm and a pair of online gaming password-stealers. Furthermore, while running the performance tests a .DLL file included with the Windows 7 operating system (in the system32 folder) was alerted on as a ‘Startpage’ trojan; CA’s new-look product is thus denied a VB100 award this month.
UPDATE: re-analysis of the false positive alerted on by CA Internet Security Suite Plus 2010 was carried out, and it was found that while the false positive existed in the installer submitted by the company, it was no longer present in the updated definition set also included in the submission. As users would be updated to the fixed protection level prior to running any scans, the issue should not emerge in the real world. However, since the product is confirmed to have also missed samples in the WildList set, the correction does not alter its failure to qualify for a VB100 award. The revised RAP graphic is shown below.
According to the vendor, CA’s business product is no longer to be referred to as ‘eTrust’ – but despite this it continues to carry ‘eTrust’ branding at various points and persists in using a rather old-fashioned and less than satisfactory interface. However, we understand that the long-awaited redesign is on the horizon at last.
We have learnt through long and painful experience how to cope with the quirks and oddities of this product’s layout, although the responsiveness issues noted in previous tests were less evident here than on some other platforms. Some particular areas of frustration remained, including the reverting of some option selections from scan to scan, the absence of archive scanning on access despite the provision of a setting to enable it, and the awkward logging which put such a strain on the interface trying to interpret and display the data that on one attempt the machine overheated and rebooted.
Eventually, though, it did manage to display its own logs in a fairly usable format – a first for the product – and detection rates seemed somewhat better than previous rather disappointing levels. However, despite the autorun worm being handled properly this time, the two gaming trojans were missed once again. In the clean sets there was no sign of the false positive found by the consumer product, but nevertheless, CA’s business solution is also denied a VB100 award this month.
The Blink product submitted for this month’s test is a late-stage beta, due for final release around the time this review will be published, and thus a few oddities are only to be expected. After a fairly straightforward and reasonably pacey install process, some areas of the nicely designed interface failed to operate properly, presenting some rather stark messages reading simply ‘Parameter is incorrect’. However, after a reboot, and with some patience, testing was completed without serious problems. We noted that the firewall bundled with the product is disabled by default, but some of the other additions, such as the vulnerability scanner and intrusion-detection controls, impressed us greatly. The anti-malware component is only a minor part of the offering, and is thus granted less space in the configuration areas than might be desired by more demanding users.
The product incorporates the Norman engine, and the implementation of sandboxing of unknown files may well account for some rather sluggish scanning speeds over executable files on demand. The sandbox came into its own in the detection tests, with the on-demand results proving rather better than the on-access ones, where less intensive scanning is provided. This was something of a problem for eEye in the WildList set, though, where a handful of W32/Virut samples were missed by the on-access component, although spotted by the sandbox on demand. In the clean sets, the same .DLL file which caused trouble for the CA consumer product was alerted on. Thus, despite a generally solid performance, eEye does not qualify for a VB100 award this month.
The latest version of eScan has another rather lengthy installation process with a number of long pauses, and a reboot to cap things off. When up and running, the interface proved somewhat poorly laid out but fairly usable with a little practice. Once again there were problems accessing browser windows when setting up scans. The product includes a number of extra features, including controls for managing removable USB devices and application control.
During the process of running some of the more demanding scans of the infected sets, an error window was presented, warning the user that the product had stopped working. However, scanning seemed to continue unimpeded and further investigation showed that on-access protection was also fully operational. Scanning speeds in the clean set were slow in the extreme, with no sign of speeding up on repeated runs, but the product remained solid and well behaved throughout. Detection rates continue to impress with strong scores across all sets, and with no issues in the WildList or clean sets a VB100 award is well deserved.
ESET’s product remains much as it has been for some time: pleasantly designed with an efficient and lucid layout. The install process is simple and needs no reboot, and protection is up and running with ease. Configuration is as in-depth as could be desired, although options to enable the scanning of archives on access seemed to produce no increase in scanning when enabled.
At one point during the most intensive scan of the infected sets the product became a little overwhelmed, consuming rather more than its share of memory and requiring a reboot to return the system to a functioning state. In more normal activities no problems were observed however, with scanning speeds unaffected by repeated runs but fast enough to be beyond complaint. Detection rates were very solid, with a commendable regularity across the reactive part of the RAP sets and still fairly strong in the proactive portion. With no trouble handling the WildList or clean sets, ESET adds yet another VB100 award to its tally.
Filseclab’s product has a slow installation process and requires a reboot to complete. The interface is pleasantly designed and simply laid out (although the configuration screen is rather cluttered with a wealth of options described in less than helpful language). It seemed splendidly stable and responsive throughout testing. On-demand scanning proved fairly slow and showed no sign of speeding up once familiar with files, while the on-access protection did not appear to fully intercept file accesses, merely logging detections after allowing them to be accessed. As a result, the on-access speed measurements may appear faster than they ought.
Detection rates were generally fairly good, with solid scores in the trojans set and decent levels across the RAP sets despite a steady decline as the samples grew fresher. In the WildList set a number of items were not detected, including a fair number of samples of the W32/Virut strain, a weakness also evident against the other polymorphic strains in the detection sets. In the clean sets a small number of false positives were noted, with some components of the popular freeware image manipulation solution The Gimp misidentified rather vaguely as ‘Trojan.Obfuscated’ – clearly a very generic detection algorithm applied slightly too severely in this case. Between them these issues are enough to deny Twister a VB100 award once again, despite continuing signs of improvement.
FortiClient proved a little tricky to install on Windows 7, with two UAC prompts before the installer got started on a process doomed to fail very shortly. Re-running the installation numerous times while applying varying options to the useful compatibility troubleshooting tool provided by the operating system eventually got things rolling. When the product was finally installed and running the interface offered excellent clarity of design and a fairly thorough selection of options – appropriate for a predominantly business-focused solution. One issue observed with the GUI was that the ‘restore defaults’ control failed to reset changes made in advanced subsections.
Scanning speeds were in the mid-range, but stability was maintained even under pressure and detection rates showed notable improvement over recent tests. No issues were observed in the WildList or clean sets, and a VB100 award is duly earned.
F-PROT continues to offer icy minimalism, with a swift and straightforward install process impeded only by a single UAC prompt and the need for a reboot to complete. The interface provides few options but caters for the basics in an admirably clear way. Scanning speeds were fairly reasonable but showed no sign of advanced caching of known-clean files, and detection rates were decent but not overly impressive.
With full coverage of the WildList set and no false positives, F-PROT also earns a VB100 award this month.
F-Secure’s latest product version features a notable redesign, starting with a heavily automated install process requiring minimal user intervention – even offering to remove any existing protective solutions – but taking some time and needing a reboot. On restarting the system a notable heaviness was apparent, with Windows taking some time to come back to life, and a number of large and intrusive pop-ups from the HIPS system warned of potentially unwanted behaviour on the part of several standard Windows components, including the Malicious Software Removal Tool (although such behaviour may have been influenced by the lack of an Internet connection to check with cloud-based systems).
Our first attempt at running the test proved fruitless as the on-access component appeared completely non-functional, but on reinstalling on a second test machine the issue did not recur. Once everything was working properly testing proceeded without further interruption, with some fairly decent scanning speeds and splendid detection rates. Even the highly inefficient and precarious logging system proved more reliable on this occasion. There were no problems in the WildList and no false positives in the clean sets, and as a result a VB100 award is easily earned.
F-Secure’s second submission this month is the company’s rebrandable version provided to users via ISPs and so on. It is fairly similar to the 2010 version in design and user experience, even down to the annoying pop-ups warning about Windows components. Scanning speeds were similarly reasonable and detection rates likewise excellent, and with an identical showing in the core sets a second VB100 award goes to F-Secure this month.
G DATA’s 2010 edition has a rather higher than usual number of steps to its installation process, including the set-up of the malware feedback system for reporting detections back to base. The latest version of the interface is clear and uncluttered with a pleasantly logical layout. Configuration is made available at a reasonable depth – with some more specialist requirements perhaps missing, but quite ample for the average user.
A few oddities were observed, with the most notable examples being a somewhat low default limit on archive scanning (300KB) and the intrusion of a UAC prompt before any on-demand scan can be run. Logging is also a little frustrating, with reports stored in an awkward format which proved something of a strain for the product to interpret into human-readable form if allowed to grow too large. Initial scanning speeds were fairly slow, as expected from a multi-engine approach, but on repeat viewing of previously seen files speeds proved lightning fast, with the same pattern of improvement showing again in the on-access tests, demonstrating some sterling effort at keeping overheads down through caching.
Detection rates, as we have come to expect from G DATA, were stratospheric, setting a seriously tough benchmark for others to aim for across all the sets, with even the proactive portion of the RAP sets handled admirably. With barely a whisper of a miss in the standard sets the WildList proved something of a breeze, and with no false alarms either G DATA easily earns another VB100 award for its effort.
K7’s installation process is nice and speedy, with a single UAC prompt at the start, a standard set of stages including a check for conflicting third-party software, and no reboot required. The interface is simple and pleasant, providing an ample level of configuration for the average home user in a rational and usable layout. Logging was a minor problem, with the viewer window freezing on attempting to view unusually large logs, but this minor issue is unlikely to affect the majority of users. The only other oddity observed was the occasional zero missing from scan duration times, which was no more than a little confusing.
Detection rates proved pretty decent, with most of the older sets handled with aplomb and a decent score in the trojans set, while the RAP scores proved a little uneven, with the ‘week +1’ set handled marginally better than the ‘week -1’ set. The WildList presented no difficulties however, and with no false positives in the clean sets either, K7 wins a VB100 award and our gratitude for a nice easy run through the tests.
Kaspersky’s latest consumer offering is as glossy and shiny a beast as ever; the install is no slower than the average and getting at the new-look interface didn’t take long. The redesign caused a few moments of confusion on first approach, but soon became familiar and simple to use. A vast wealth of fine-tuning options are provided under the attractive surface, including some interesting features like the keylogger-proof ‘virtual keyboard’.
Scanning speeds were pretty slow in some areas, especially over the sets of media and documents which most products fly through. While they did show some signs of improvement on second and subsequent attempts, the rescans still took a long while.
On the other hand, detection rates proved superb pretty much across the board, and with no issues handling the core sets and no false alarms, Kaspersky comfortably earns a VB100 for its 2010 edition.
Kaspersky’s second offering this month has a slightly more businesslike name and is presumably a corporate version, but in look and feel it is not so very different from the home-user edition – somewhat plainer perhaps, and with some of the advanced features absent. Again the wealth of configuration options is a pleasure to behold and the user experience is extremely smooth and trouble free. Scanning speeds were much faster this time too, and showed signs of considerable improvement on repeat attempts thanks to the ‘iSwift’ and ‘iChecker’ technologies mentioned in the control system.
Detection rates were excellent in all sets, and no problems were encountered in the certification requirements, thus earning Kaspersky a second VB100 award this month.
Kingsoft’s Advanced edition has a fairly straightforward installation process: fast and unchallenging with only the mention of cloud-based intelligence worthy of comment; no reboot is required to complete. The interface is simple and unflashy, presenting all the required controls without fuss but occasionally looking a little sparse thanks to the use of some rather odd fonts.
Logging proved sturdy and responsive – something of a rarity for this month’s test and certainly worthy of praise. Scanning speeds were middle of the road and detection rates proved rather unpredictable, with problems being caused by both polymorphic viruses and samples that were less than a few weeks old. No such issues were encountered in the WildList however, despite the Virut strain in there, and with no false alarms generated either, Kingsoft earns a VB100 award for its Advanced edition.
Kingsoft’s Standard version is, as usual, identical to the Advanced edition – on the surface at least. In the past we have noted a sizeable speed difference between the two, but this time the two performed much on a par with each other. In terms of detection, however, a fairly major difference was observed, with much lower scores here in the trojans and RAP test sets – once again seeing that rather surprising jump up and down across the RAP weeks – and a similar level of polymorphic misses too. However, with no issues in the WildList and no false alarms, Kingsoft’s second entry also makes the required grade for a VB100, which is duly awarded.
Kingsoft’s ‘Swinstar’ version is apparently a preview of upcoming technology, and is indeed quite different from its predecessors in many respects, starting with an installer package of not much over half the size of the previous two versions. The install is even faster and simpler, and the interface a little more glitzy and stylish but still fairly simple and easily navigated. More sensible default settings and a greater range of configuration are available. Scanning speeds are also a little better.
Again no false alarms were generated in the clean sets, but in the WildList set a single sample out of several thousand of the W32/Virut strain was missed, thus denying Kingsoft the chance of a hat trick this month.
McAfee’s home-user product was one of several this month which required Internet access during the set-up phase; in this case, not only do updating and activation take place online but so does the entire installation process. For me this would be entirely unacceptable; the several systems I use for my own purposes are all regularly reimaged to a known-clean state, and wherever possible I scrupulously avoid connecting to the web until security is installed and active (preferably fully updated too). It could, of course, be that I have grown paranoid from long experience in the security industry and exposure to too many scare stories, but such factors seem not to have influenced the designers at McAfee.
Once the product is installed, after a fairly drawn-out process, it presents a rather drab, grey outlook on the world which the test team found rather depressing. Although well stocked with buttons to click, the product provides virtually no control over its behaviour, merrily skipping through our test sets deleting and disinfecting samples without hesitation or approval. Again this would be less than ideal for my personal needs – fear of false positives and sloppy disinfection of precious files makes many users prefer quarantining and manual checking before any permanent damage is done. Logging also proved an issue, capped at a very small fixed level which cannot apparently be adjusted, so although the product reported having spotted and destroyed numerous files and threats, it could provide no details of what it had done and where.
Scanning speeds were mediocre and showed no signs of improvement over time, but we finally got through the test. Numerous reboots were required as, lacking the ability to disable the protection, we were forced to boot into another operating system to replace destroyed sets. Results were obtained by laboriously checking the files left behind on disk and counting only those left in place unchanged as misses. A satisfactory level of detection was observed, solid across most sets. The WildList presented no difficulties and there were no false alarms, so McAfee’s consumer offering is adjudged (just about) worthy of a VB100 award.
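This sort of result reconstruction can be approximated with a short script: record a hash of every sample before the scan, then afterwards count as misses only those files still present on disk with unchanged contents. The sketch below is a hypothetical illustration of the approach, not the lab’s actual tooling; the function names and manifest format are our own assumptions.

```python
import hashlib
import os


def build_manifest(sample_dir):
    """Record the SHA-256 hash of every sample file before scanning."""
    manifest = {}
    for root, _, files in os.walk(sample_dir):
        for name in files:
            path = os.path.join(root, name)
            with open(path, 'rb') as f:
                manifest[path] = hashlib.sha256(f.read()).hexdigest()
    return manifest


def count_misses(manifest):
    """After the scan: a file still present with an unchanged hash was
    neither deleted nor disinfected, so it counts as a miss."""
    misses = []
    for path, original_hash in manifest.items():
        if not os.path.exists(path):
            continue  # deleted by the product: counted as detected
        with open(path, 'rb') as f:
            if hashlib.sha256(f.read()).hexdigest() == original_hash:
                misses.append(path)  # left in place unchanged: missed
    return misses
```

Files that were deleted, or disinfected in place (and therefore changed), are treated as detections; anything untouched is a miss.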
The VirusScan product for the corporate market has a much more grown-up attitude to its users, providing a more solid and sensible approach. The installation process is simple and clean, with the offer to disable Windows Defender a highlight, and the product itself is similarly businesslike, unflashy and properly thought out.
It ran through the tests in good time without problems, showing excellent stability and general good behaviour throughout. In the final verdict it actually scored slightly lower than its wayward consumer sibling in the newer test sets, thanks to the daily offline updater having been taken from a somewhat earlier point than that at which we were able to install, update and snapshot the full product, but scores remained pretty decent. The WildList proved not much of a challenge, and with no false alarms VirusScan ably earns itself a VB100 award and much gratitude for a relatively painless experience.
The Forefront product requires a rather complex install process thanks to our hermetically sealed lab, with multiple reboots to get the various components in place. This non-standard set-up prevents us from properly commenting on the process as would be experienced in the real world. Once up and running however, the product is pleasantly simple to use, the very minimal configuration provided making for light work as no in-depth measurements could be taken.
Parsing the results, we saw some pretty decent scanning speeds and fairly lightweight on-access figures, with a very noticeable increase in speed once files had been initially processed and remembered. Detection scores were a little less pleasing though, with levels much lower than expected in most areas. Thinking at first some error had been made when applying updates, the tests were re-run but the same results were obtained. On checking the version information displayed, the updates appeared to be from several days prior to the deadline for the test – suggesting that the wrong updates had been included with the submission. With a number of W32/Bagle samples recently added to the WildList not detected, Forefront is regrettably ruled out of contention for a VB100 award this month.
UPDATE: After closer analysis, it was spotted that Microsoft Forefront Client Security had not been run with the default settings, as per the standard procedures of the VB100. When the tests were rerun with the correct settings the product was found fully capable of detecting all samples on the WildList, and a VB100 award is thus granted to the product retrospectively. Detection scores in the trojan and RAP sets were also affected by the adjustment; the revised RAP graphic is shown below.
Microsoft’s new, free home-user solution was reviewed in these pages just last month (see VB, November 2009, p.18), so its layout and usage provided no surprises. The design is simple but perfectly workable, with enough options and sensible default behaviour to satisfy our requirements comfortably. It ran through the test without hindrance or upset, running for what seemed like a rather long time over the infected sets, but which would later prove to be not so bad compared to some others in the field this month. In the proper speed tests, rates were pretty impressive, with some good use of caching to lighten on-access overheads once files had been confirmed safe.
After the problems noted with the corporate product there were some worries about detection rates, but clearly the submission for the Security Essentials product had been made more carefully; scores proved very solid indeed, with a very gentle decline across the RAP sets and a fairly sharp drop in the proactive week but remaining highly competitive. False positives being absent, and the WildList handled ably, Security Essentials comfortably takes its first VB100 award.
This was Nifty’s second appearance in our tests, and once again the product was only available in Japanese. Installation proved fairly simple – a little slow, but running through the familiar gamut of steps before demanding a reboot. With the GUI still trying to summon some of its display fonts from the operating system (where they were sadly not to be found in our test set-up) navigating proved somewhat difficult, especially since the guides provided by the developers on the previous occasion had been rendered out-of-date by changes to the interface and the operating system alike.
Nevertheless, we bravely soldiered on, eventually obtaining results through various techniques after one of the longest spells spent on a single product in VB100 history. Scores, as expected from the Kaspersky engine incorporated into the product, were pretty decent. Speeds were somewhat sluggish on first attempt but, as we had surmised they might be, considerably quicker on repeated scans. Easily satisfying the technical if not aesthetic demands of the VB100, an award is duly earned by the Nifty Corporation.
Norman’s suite solution has caused a few headaches in the past, and we were most grateful to see a considerably redesigned version submitted this month. The new version, after a very speedy install indeed, proved much more usable, stable and responsive, although the apparent absence of the ability to run a manual scan, either from the GUI or the context menu, set things back a little as well as provoking some bewildered amusement.
Another issue which seemed to defy all logic was the scheduled scan, confidently timed for late on a Friday night so that the bulk of the scanning would be complete by Monday. On arriving back after the weekend, we found the scan had uncovered an item of potentially aggressive commercial software early in the job, and had sat waiting for instructions for two days without continuing its scanning, leaving the vast bulk of the scheduled job still to run.
Having shaken our heads a little at these quirks, we did eventually manage to gather the required data, which showed some solid scores, aided by the sandbox. However, as expected after having seen the results of the Blink product, there was a slight failing on access with the Virut samples, although on-demand coverage was better. This was enough to deny Norman a VB100 award this month.
PC Tools’ product range has had a pretty shaky time in recent VB comparatives – seemingly coinciding with the company having been taken over and the product ceasing to incorporate a third-party engine. Running through the familiar installer, which took rather a long time and needed a reboot to finalize things, we were a little worried that nothing had changed this time, but running through the tests on the top-of-the-range Internet Security suite product proved much more satisfactory than on the last few occasions, with no problems with stability or bad behaviour of any kind. The interface, which has become more usable through familiarity and seems pretty much unchanged since the last submission, is fairly appealing and has a decent range of controls, most of which are sensibly located and labelled.
Under the hood though, it is clear that some great strides forward have been taken. Above and beyond the solid stability, detection rates have soared since the rather pitiful efforts of just a few months ago, possibly aided by the experience of the company’s new owners, and in the main sets – particularly the trojans – some truly excellent scores were achieved. The RAP sets were also handled fairly well, steady across the reactive weeks and with a steep dip into the proactive set, but overall not bad at all. Scanning speeds were somewhat mediocre, and especially slow handling .JAR archive files, but the WildList was handled impeccably and without false positives PC Tools is firmly back in the VB100 award winners’ camp.
The second PC Tools entry this month is essentially the same as the suite product minus a few of the extras, and has the same fairly slow installation process, punctuated this time by the offer of a Google toolbar. The product also presents a very similar-looking interface. This time, however, all was not so well, with the first install seeming to have a partially functioning on-access component. While malicious code was detected on execution, the on-read and on-write protection boasted of in the interface appeared to be completely absent, despite numerous restarts and adjustments of the settings. Finally, however, the right combination of clicks managed to get it up and running, and on a second install on fresh hardware it seemed happier to start of its own accord.
Scanning thus proceeded without further interruption, with the same excellent detection rates as the IS product, and also the same fairly slow scanning times. The core requirements of the VB100 were easily satisfied, and a second award is thus earned by PC Tools, along with some compliments on the developers’ sterling efforts at improving the product.
A newcomer to this month’s comparative, Preventon provides its own version of a third-party engine which appears generally to be sold via ISPs and other rebranding sales channels. Our first impressions were good, with a nice, simple install process, and a well-designed GUI aiming firmly for the simple end of the market. The simplicity did nothing to impair performance or usability however, with a sensible set of defaults and a sprinkling of useful controls that were easy to find in the bright, colourful interface. One issue that did perplex us was the pair of arrow buttons provided, which we assumed would move us left and right through the tabs but seemed not to; we eventually divined that they were actually browser-style forward and back buttons rather than simple left and right.
This minor moment of confusion aside, a few problems with auto-quarantining – which slowed things down considerably in the larger infected sets – and limited logging were easily overcome with some advice from the vendor and a little care in running jobs, and results were easily acquired. Scanning speeds were fairly decent, and detection rates pretty solid, with a fair-sized decline in the more recent weeks of the RAP sets. Without false alarms and with complete coverage of the WildList, Preventon is a worthy winner of a VB100 award on its first attempt.
A second newcomer to this month’s test, and like the previous entrant Qihoo was a surprise last-minute appearance with a third-party engine (BitDefender in this case). Qihoo hails from China and, this being a fairly new product, the company has yet to translate its product interface into other languages. Aided by a thorough guidebook and a little inspired guesswork, the team found the install fast and simple and the interface clearly and rationally designed, allowing some options to be discovered simply through logic without recourse to understanding the markings.
Scanning speeds were no more than mid-range but detection rates, as demonstrated by other incarnations of the same engine, were splendid, with solid scores across the sets. The WildList and clean sets proved little problem bar a handful of files marked merely as 'suspicious', and Qihoo also makes the VB100 grade at first attempt.
Quick Heal’s product offers a pre-installation scan along with the usual set of steps, but is still in place in excellent time. The design is bright and eye-catching, the layout reasonably rational and not too tricky to find one’s way around, and a fair level of controls is provided for most needs, so testing proceeded apace.
Speeds were not as rapid as we have come to expect from the product in the past, but still perfectly decent, and detection rates were fairly decent too, with a steady decline observed across the RAP sets. The WildList and clean sets were handled well, so Quick Heal also wins a VB100 award this month.
With the latest edition of this product Sophos again introduces some additional functionality without noticeably affecting the user experience. In this case we understand that encryption features have been merged into the company’s corporate offerings, but after another fairly lengthy install process the interface seemed unchanged, at least at a cursory glance.
The GUI is simple and logical and presents an excellent range of options, as demanded by the product’s business audience – although some items, such as always scanning memory and boot sectors when running a manual scan, are tucked away in a super-advanced section alongside other controls of a far more technical nature. We noted a few quirks in the layout which had the potential to confuse, such as the separation of scan settings into two areas, and also spotted some disagreement in data presented when opening the scan interface part-way into a running scan. While the newly opened scan window reported one set of figures, these seemed only to measure activity from the point at which the window was opened. Meanwhile, the display in the main interface offered a different set of statistics for the same scan.
These minor quibbles aside, scanning speeds proved pretty decent and detection rates solid, particularly in the RAP sets, where some excellent figures were noted, especially in the proactive set; we observed enormous numbers of detections being covered by a relatively tiny number of unique identities, so it seems Sophos’s focus on generic coverage is paying dividends. With no problems in the WildList and no false positives, Sophos earns another VB100 award after a minor upset last time around.
Perhaps one of the most long-anticipated VB100 appearances, Sunbelt’s Vipre has been around for a few years now. The product was featured in a standalone review in these pages last summer (see VB, July 2008, p.16), and has been building a strong reputation for itself despite little participation in the standard tests. For some time we have been getting regular enquiries from our readers as to why Vipre has yet to appear in the VB100, and it is with great excitement that we finally get to record and report some results. Given the company’s marketing of the product as lightweight and easy on resources, we were particularly interested in its performance figures.
The installation process runs along fairly typical lines, at a rapid pace, but requires a reboot to complete. The interface is fairly clean and attractive and provides a reasonable range of configuration options, although we could not find a way to protect against more than the default set of file extensions on access – or indeed, to delve into archives on demand.
Stability proved solid though, and speeds were pretty decent too, with an impressive improvement on access once files had become known to the product. Detection rates were not bad either, with a few issues in the polymorphic set mostly explained by rare and obscure items not covered at all, and scores in the trojans and RAP set fitting into the better end of the middle of the field. The WildList proved no obstacle despite the set of tricky Virut samples, and with no false positives either Vipre earns a VB100 on its first appearance; we hope to see many more.
Unlike many of its competitors, Symantec continues to enter only its corporate product for most of our comparative reviews – although we do hope to see some more regular appearances from the ubiquitous Norton consumer solutions in future.
The corporate product is a little less sober and businesslike than it used to be. After a fairly unflashy, somewhat slow install which requires a reboot to complete, a curvy and colourful interface appears, with a fairly simple layout. Some in-depth configuration is provided in more serious-looking ‘advanced’ areas – although some administrators may wish for a little more depth. In places options need to be set multiple times for minor variations on the same theme, making the process of setting up an on-demand scan something of a chore.
We’ve noted before that scanning infected items can be rather slow with this product – something which may be due to the intensive logging that is carried out as scanning proceeds. Where many other products this month have frustrated us by limiting their logs to unusably small sizes, Symantec has gone the other way and provided almost 2GB of information for us to plough through. On one occasion we had a more serious issue with the logging system, when a scan seemed to get snagged somehow, spending more than 30 minutes on a single file. Rebooting the system seemed to clear the jam, but the product insisted that the scan was still running, and thereafter refused to add any information about more recent jobs to the history display system.
These were fairly obscure issues of course, that are unlikely to be encountered in real-world day-to-day use, and in the core data all seemed to be fine. Scanning speeds were pretty good, and on-access overheads excellent, while detection scores were splendid up until a fairly steep decline in the latest weeks of the RAP sets. No problems were encountered in the WildList or clean sets however, and Symantec duly earns another VB100 award.
Trustport’s installer follows the standard paths, with a few sidetracks for some set-up of the multi-engine system, and does so at a fair speed, finishing with a reboot. The multi-GUI control system is not best suited to UAC-affected systems, as numerous prompts for confirmation must be endured to access the various components, and again some problems were observed opening browser windows for on-demand scans, which could take an excessively long time. We also noted the system was quite clearly slower to come to life on reboot, and after a number of on-access detections there seemed to be some oddity with pop-ups, which kept reappearing at regular intervals long after they had been observed and acknowledged, even after the system was rebooted.
Scanning speeds were fairly sluggish, but in some areas did show some improvement the second time over the same files on access. On the positive side, detection rates were outstanding as usual, with the highest scores overall this month in the trojans set and no issues at all elsewhere. With the core requirements easily met, Trustport comfortably earns a VB100 award.
VirusBuster’s installation is fast and easy, although the interface when it comes up looks increasingly elderly and in need of updating. The design is somewhat clunky and unintuitive, with on-demand scans requiring repeated recourse to advanced tabs, which must be called up separately in each of the numerous stages. There are also a few snags and glitches in the display, with lines and text boxes overlapping and poorly laid out on screen.
Otherwise everything proved pretty plain sailing, with some fairly decent scanning speeds and reasonable detection rates too, declining steadily into the final portions of the RAP sets. The WildList and clean sets presented no difficulties, and a VB100 award is duly earned.
The final entry on this month’s monster roster of products, Webroot’s installation process kicks off with a very busy page covering registration code, EULA, install options and the offer of a (free!) Ask toolbar, all at once. The install process is then fairly brisk until a reboot is demanded, and some post-install set-up of community scheme participation is also required.
The product itself is slowly revealing its mysteries thanks to long exposure, but remains something of a challenge to navigate and control properly, with custom on-demand scans a particularly arduous chore. GUI buttons can take a huge amount of time to respond, particularly at the end of a scan when it sometimes feels like it would be quicker to allow the product to destroy our test sets than to wait for the ‘deselect all’ and ‘quarantine selected’ buttons to respond – even with little or nothing selected. Logging is also severely restricted, although a custom fix from the developers provided us with a way around this. On-access scanning appears not to function on-read by default, with an option to enable it buried deep in the elaborate configuration structure. In most cases scores were divined by a mixture of logging and checking copied test sets for files either not written or allowed to write only after disinfection.
In the end, scanning speeds were fairly good. On-access overheads were heavier than expected, but detection rates pretty decent, as one would expect from the Sophos engine that underlies the product. The WildList proved no major challenge, and with no false positives either Webroot also takes away a VB100 award.
Crawling exhausted from the lab after our biggest month of testing ever, with a mind-numbing 43 products crammed into a mere three weeks of testing, we found it surprisingly difficult to draw any specific conclusions from such a large and varied set of data. As usual there were some excellent performances and some disappointments, some high scorers and some fast speeds counterbalanced by some at the other ends of both scales.
Generally we found Windows 7 a fairly amenable platform, afflicted by a number of fairly basic bugs which will hopefully be ironed out in the first service pack (which surely cannot be long in coming). Our poor test hardware, battered from some seriously heavy usage, began to show signs of wear, with some of the more heavyweight products causing one system in particular to overheat regularly. The range of products under test had few specific issues running on the new operating system, although a few had some problems getting installed and for many some more thought is needed as to how to interact with the UAC system less intrusively.
In terms of passes and fails, this has been a good month for most products, with a fairly small number of false positives – perhaps thanks in part to the tightening of our own rules concerning what is considered ‘fair game’ for the clean sets. The WildList, despite more rapid changes in its makeup, presented few major challenges, but continues to be a good gauge of which products are consistently up to the mark. Some further improvements to the complexity of the list are expected soon, which should make it a much more complete and challenging measure. We had a number of new faces in the test this month, several of whom will be able to present themselves to their customers with certain proof of their bona fides – a valuable thing in these days of rogue products flooding the Internet with their deceitful claims.
What issues were observed with products mainly confined themselves to frustrations and irritations rather than outright show-stoppers. Curious and inexplicable time lags were frequent, especially when trying to browse local filesystems, and many of the interfaces proved far less responsive than most users will accept. With a mix of corporate and consumer products being tested, we saw some vast differences in the approach to user interaction, with many at the home-user end trying to take responsibility and control away from the user entirely – an approach which seems to limit their market somewhat to only the least engaged audiences.
One of the biggest issues we had this month was with logging, with problems arising both from the lack of complete data and data being obscured and/or encrypted. Some products which store their data in proprietary formats and rely on parsing and processing raw data into humanly readable forms can easily get overwhelmed by logs over a certain size. Meanwhile, others seem to think it acceptable to simply destroy any data once a certain size threshold has been reached; if software has been doing things to my computer, I want it to be able to tell me about it and account for its activities, whether or not it has been busy doing other things since. Aside from this worry, it renders testing rather difficult, and we may have to impose some stricter requirements on logging provision for future comparatives.
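Logs need not overwhelm a parser, however large they grow, if they are processed as a stream rather than loaded wholesale. A minimal sketch of the idea, assuming a simple tab-separated, line-oriented format (the real products’ proprietary formats are not documented here, and the function names are our own):

```python
def iter_detections(log_path):
    """Stream a line-oriented scan log, yielding one record at a time
    so memory use stays flat regardless of log size."""
    with open(log_path, 'r', errors='replace') as log:
        for line in log:
            line = line.strip()
            if not line:
                continue
            # assumed format: "<verdict>\t<path>"
            verdict, _, path = line.partition('\t')
            yield verdict, path


def summarize(log_path):
    """Count records per verdict without holding the log in memory."""
    counts = {}
    for verdict, _ in iter_detections(log_path):
        counts[verdict] = counts.get(verdict, 0) + 1
    return counts
```

Because the file is iterated line by line, a multi-gigabyte log costs no more memory than a small one – the approach that a fixed-size, data-destroying log cap makes impossible.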
Something else which will have to wait is the introduction of our additional performance measures. A vast horde of data was gathered during this month’s test, but as deadlines closed in on us and the slower, more recalcitrant products took longer and longer to provide usable data, we had to make a decision to put off the lengthy job of processing and interpreting all this information for presentation to our readers. Hopefully we will be able to make it available soon, and having gone through the process of preparation we should be able to include it regularly in comparatives from now on.
Looking to the future, the next test will be our annual excursion on Linux – surely a blessing for our tired eyes and weary fingers thanks to the less well-populated field of potential competitors. After that we will be back up to full speed for another XP comparative, and what looks likely to be another challenge to this month’s record-breaking haul of submissions. We can only hope that on a more seasoned and familiar platform, and with some points taken on board from this month’s comments, products will be better behaved and easier to push through our ever-growing range of tests.
Test environment. All products were tested on identical systems with AMD Athlon64 X2 Dual Core 5200+ processors, 2 GB RAM, dual 80GB and 400GB hard drives, running Microsoft Windows 7 Professional, 32-bit edition.
Any developers interested in submitting products for VB's comparative reviews should contact email@example.com. The current schedule for the publication of VB comparative reviews can be found at http://www.virusbtn.com/vb100/about/schedule.xml.