2014-07-22
Abstract
The VB test team put 35 products through their paces on Windows 7. John Hawes has the details.
Copyright © 2014 Virus Bulletin
The timing of this comparative coincided with some rather major structural changes within VB, which have delayed the publication of this test report considerably, but all of the testing was performed on schedule, with a product submission deadline of 19 February and testing completing around the start of April. To avoid further delays, the write-up will be kept as short and to the point as possible.
Product submissions were, thankfully, not too numerous, with some 35 products making the grade for the final report (plus, as usual, a few extras which were entered for testing but which were found not to be testable for one reason or another). The platform, Windows 7, is one of the most widely used at the moment, and we might have expected a rather larger turnout, but the majority of our regulars made an appearance nevertheless.
We have run many tests on Windows 7 in the past, and for this one little changed in our standard system set-up processes, with the base operating system installed from MSDN media with no further updates applied, unless required by individual products. We added a few simple tools for unpacking archives and viewing instruction documents, but otherwise kept the machines as simple and standard as possible.
As usual, the test sets were built around the test deadline, with the February 2014 WildList released just in time to be used. There were minor additions to, and a little housekeeping within, the clean sets on the deadline date as well, keeping the total at just below one million files. The sets used for speed measurements were rebuilt from scratch, but we used the same approach as in previous iterations: a selection of machines with varying degrees of usage was scraped for all the files we could use, and the files were then organized by type to give some indication of how different solutions handle different types of file. The main difference between this test and earlier ones is the absence of a Windows XP system from the mix, which has been replaced with more emphasis on the newer Windows 8.1. Our speed and performance measures are presented as comparisons against a system running Windows Defender, as this should be considered the norm for desktop platforms. Comparable baselines on unprotected systems were also recorded, but have been omitted from this report for space reasons; we hope to make use of this data for more in-depth analysis in the future.
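As a rough sketch of what a comparison against a Windows Defender baseline amounts to (the function name and figures here are invented for illustration, not VB's actual tooling):

```python
# Illustrative only: a product's timing expressed relative to the same task
# measured on a system running Windows Defender. The names and numbers are
# hypothetical, not taken from VB's test harness.
def relative_overhead(product_seconds, defender_seconds):
    """Return the product's slowdown as a multiple of the Defender baseline."""
    return product_seconds / defender_seconds

# A task taking 90s with a product installed, versus 60s under Defender,
# shows a 1.5x relative overhead.
print(relative_overhead(90, 60))
```

A figure above 1.0 means the product was slower than the Defender baseline; a figure below it means the product was faster.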
We also introduced a slightly more advanced process of snapshotting our test systems, but again the fundamental approach remained much the same, the changes mainly focusing on improved automation.
With everything set up and ready to go, we settled down to plough through the list of products under test.
Main version: 4537.670.1937
Update versions: N/A
Last 6 tests: 4 passed, 0 failed, 2 no entry
Last 12 tests: 5 passed, 1 failed, 6 no entry
Agnitum’s Outpost suite has returned to being a regular participant in our tests after a brief absence while the development team picked up the work of maintaining their own engine, inherited from the now defunct VirusBuster. The product has had something of a face lift too, but the overall experience is much the same, with a fairly lengthy installation process requiring a reboot to complete. The interface is clean and simple with a good level of configuration provided for its many features, and it responded well throughout testing with no stability issues to report.
Scanning speeds were not super-fast, but sped up considerably in the warm runs in some sets. Overheads were a little high, but again showed the benefit of some optimization once the product had settled in. RAM use was low, CPU use a little high, and our set of standard activities completed in reasonable time – a little slower than with Defender in place, but not excessively so.
Detection was a little disappointing, with relatively low scores in our daily Response sets, but the WildList was well covered and there were no false alarms, earning Agnitum another VB100 award to maintain its strong recent record.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 2014.02.17.33 (3.1.8.1 build 342)
Update versions: 2014.03.10.02, 2014.03.19.05, 2014.03.25.05
Last 6 tests: 1 passed, 0 failed, 5 no entry
Last 12 tests: 4 passed, 0 failed, 8 no entry
AhnLab’s appearances in our tests have been rather unpredictable of late, with many of them not entered, but when it does take part, its results are usually decent. The current product is little changed from previous occasions on which we’ve seen it on the test bench, installing rapidly with no restart needed. The interface is well laid out, with clear information and simple controls at the top level, and a decent degree of fine-tuning under the covers.
Scanning was a little slow, overheads a little high, but our set of tasks ran through very quickly with little increased RAM use and CPU use not too high either. A single, fairly minor stability problem was noted, with one of the updates refusing to complete properly, but re-running after a restart proved much more successful.
Detection was pretty decent, only tailing off into the second part of the proactive sets, and the core certification sets were well handled, earning the product a VB100 award and maintaining a decent record of passes from only occasional appearances.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 2014.02.19
Update versions: 2014.03.11, 2014.03.18, 2014.03.26
Last 6 tests: 1 passed, 0 failed, 5 no entry
Last 12 tests: 1 passed, 0 failed, 11 no entry
Arcabit has been off our radar for a while now, with no appearances in our tests since 2011, but it returns with a completely new product on show. Gone is the old in-house engine, replaced by the popular Bitdefender, and the front end is completely revamped too. Installation is reasonably speedy, and the interface follows the recent Windows 8-inspired trend of sharp angles and large fonts. It seemed clear and usable, with a good level of configuration, and was mostly fairly stable, with just a couple of incidents of the GUI shutting down unexpectedly and taking a few minutes to come back to life.
Scanning speeds were impressive, overheads a little on the heavy side, and despite low resource usage, our set of activities took quite some time to complete.
Detection was very strong, with only the last part of our RAP sets showing any significant decline. The core certification sets were handled well, with no WildList misses or false positives, thus earning Arcabit a VB100 award on its first appearance in some time.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 2014.9.0.2013
Update versions: 140219-0, 140307-0, 140307-0, 2014.9.2016, 140324-0
Last 6 tests: 4 passed, 1 failed, 1 no entry
Last 12 tests: 9 passed, 2 failed, 1 no entry
Avast very rarely misses a test (although it was absent from the last Linux comparative), and it generally does well all round. The product – the free edition, which usually appears in our desktop tests – always wins praise for its pleasant design and is simple to operate despite a wealth of configuration options for the wide range of components included (an impressive selection for a free solution). Installation is pretty fast and easy too, and there were no problems with stability even under heavy stress.
Scanning speeds were very good, overheads not bad, especially with the default settings which only cover some file types on-read. Our set of tasks took a little while to get through, but used little more memory or processor time than Windows Defender.
Detection was decent, if not as good as one might hope, and while there were no false alarms in our clean sets we did observe some misses in the WildList sets – apparently due to communication issues with the WildList operators. This was enough to deny Avast a VB100 award this time, despite an otherwise good showing.
ItW on demand: 99.74%
ItW on access: 99.87%
False positives: 0
Stability: Solid
Main version: 2014.0.4335
Update versions: 3705/7100, 3722/7163, 3722/7204, 3722/7240
Last 6 tests: 5 passed, 1 failed, 0 no entry
Last 12 tests: 10 passed, 2 failed, 0 no entry
AVG is a very reliable participant in our comparatives, and it tends to pass most tests with only the odd problem. The product is another that has adopted the boxy Windows 8 styling, installing in good time and not requiring a restart. The interface for the suite is fairly attractive, if a little dark in places, and provides a good set of controls. It remained pretty stable for the most part, although one of our RAP scans did freeze up completely, requiring a reboot to get things moving again.
Scanning speeds were impressively fast, overheads a little high initially, but not bad at all after the initial runs. Resource use was low, and our set of activities ran through pretty quickly.
Detection was very strong indeed in the reactive sets, dropping away somewhat into the proactive parts of the RAP test, but still decent even there. The WildList was well handled, and the clean sets only threw up a few warnings of possibly corrupt or over-compressed files. AVG earns another VB100 award, and keeps up its decent record.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 14.0.3.338
Update versions: 7.11.132.74, 7.11.135.236, 7.11.137.240, 7.11.138.196
Last 6 tests: 3 passed, 0 failed, 3 no entry
Last 12 tests: 5 passed, 0 failed, 7 no entry
As usual, Avira submitted both its free and paid-for solutions for this test, with the free personal version up first. It installs fairly rapidly, and presents a slick and attractive interface which is simple to operate and provides a fairly complete set of controls. It maintained decent stability, although a few scans did crash out, on some occasions claiming to have been cancelled despite no interaction from the test operator.
Scanning speeds were decent, overheads very light thanks to some changes to the on-access component which mean that simple read operations are no longer monitored. Resource use was pretty low, but our set of activities took a little while to get through.
Detection was as excellent as ever, and this solid coverage extended to the certification sets, earning the product a well deserved VB100 award. Entered only in desktop tests, this free edition has maintained an excellent record of passes.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 14.0.2.286
Update versions: 7.11.132.74, 7.11.135.236, 7.11.137.240, 7.11.138.196
Last 6 tests: 5 passed, 0 failed, 1 no entry
Last 12 tests: 9 passed, 0 failed, 3 no entry
The premium version of Avira’s product is pretty similar in most respects, with the main difference we noted being the addition of a few more fine-tuning options. The interface has the same professional feel and sensible layout, and the installation process is also very speedy and simple. Again, we saw some minor issues with completing some scans, but no serious problems.
Scanning speeds were decent, and our file access measures showed very little lag time, once again thanks to the lack of full on-read scanning. This probably also contributed to the low resource use recorded, but our set of tasks took some time to complete.
Detection was not a problem though, with excellent scores across the board, and with no issues in the core sets another VB100 award is easily earned, keeping up a very strong record of passes.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 1.8.0.52023
Update versions: 2014.01.23/1839, 2014.03.07/1416, 2014.03.20/0602, 2014.03.25/0602
Last 6 tests: 1 passed, 1 failed, 4 no entry
Last 12 tests: 1 passed, 1 failed, 10 no entry
Baidu’s previous appearance in our tests was with a rather different solution: an international version based around the Avira engine. This time, we got to try out the version marketed mainly to Chinese users, with no translation of the interface available and the Kaspersky engine under the covers.
Installation was very speedy and simple. The GUI was something of a surprise – it shuns the busy, cluttered look so common in Chinese products in favour of a very sparse, pared-down layout with excellent clarity and what looked like a decent set of controls.
Stability was for the most part decent, but several scans of our clean sets did seem to lock up near the end, freezing at around 99%, and we also had trouble with updates completing on a couple of occasions, just nudging our stability score into the ‘Fair’ category.
Scanning speeds were good. File access times looked very speedy thanks to a lack of proper on-read protection, and our set of tasks blasted through in impressive time, with slightly high CPU use thanks to the frenzied activity, but RAM use remaining low.
Detection was reasonable, remaining pretty steady right up to the last part of the proactive set in our RAP test. The WildList was mostly well covered, but a few items did seem to be missed from time to time, possibly down to further instability. There were also a couple of FPs in our clean sets, on components of OpenOffice and The Gimp, so no VB100 award can be granted to Baidu’s Chinese edition on its first appearance.
ItW on demand: 99.11%
ItW on access: 97.75%
False positives: 2
Stability: Fair
Main version: 17.25.0.1074
Update versions: 7.53300, 7.53586, 7.53707, 7.53830
Last 6 tests: 6 passed, 0 failed, 0 no entry
Last 12 tests: 12 passed, 0 failed, 0 no entry
Bitdefender comes into this test riding a wave of excellence stretching back to the summer of 2010, with every VB100 test entered and passed since then. The Antivirus Plus product is familiar from many recent comparatives, offering a pretty fast install time and presenting a dark, brooding interface with sharp angles and a few flashes of colour to give it some impact. The layout is slightly quirky, but becomes fairly intuitive after a little exploration, and configuration is provided in some depth. Stability was impeccable throughout testing.
Scanning speeds were pretty good, and overheads were not bad initially and super-light in the warm runs. In the activities test, we once again saw some extremely long completion times, indicating an issue with part of the process. The test includes a wide range of common tasks such as downloading, moving files and archives around, and software installation. We have added some extra steps to the process to allow finer-grained measurement of each separate step, which we hope will enable us to pin down the issue (which has been apparent over the last few tests). The extended time taken also meant that an abnormally low figure was recorded for CPU use, which is measured at regular intervals during the activities and averaged out – the long periods of relative idleness contributed to the figure being lower than our baselines.
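The averaging artifact described above can be illustrated with a small sketch (the sampling figures here are invented; this is not VB's actual measurement code):

```python
# Illustrative only: periodic CPU sampling averaged over a run. When a run
# stalls, the long stretch of near-idle samples drags the mean down, so a
# slow run can paradoxically report *lower* average CPU use than a fast one.
def average_cpu(samples):
    """Mean of a list of periodic CPU-use samples (percentages)."""
    return sum(samples) / len(samples)

busy = [80, 75, 85, 90]        # hypothetical % CPU while tasks actually run
stalled = busy + [2] * 20      # the same work, padded with near-idle samples

print(average_cpu(busy))       # high average for the quick run
print(average_cpu(stalled))    # much lower average, despite identical work
```

This is why the extended completion time produced a CPU figure below even the baseline measures.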
Detection, on the other hand, was splendid, very strong indeed in all areas, and with no issues in the clean or WildList sets, a VB100 is comfortably earned, maintaining that strong run of form.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 14.0.277.1
Update versions: 14.0.278.4, 14.0.278.5, 14.0.278.5
Last 6 tests: 5 passed, 0 failed, 1 no entry
Last 12 tests: 10 passed, 0 failed, 2 no entry
BullGuard also has an excellent record in our comparatives, with no fails since 2008 and a fairly complete set of appearances, entering and passing every test for the last three years with the exception of those on Linux platforms.
The current product is a little different from those we’ve previously looked at, with a GUI split into sections covering a wide range of components. It installed in reasonable time, and maintains a slightly unusual but fairly navigable layout, providing a good basic level of fine-tuning controls. Stability was very good throughout testing.
Scanning speeds were pretty good, overheads not bad initially and very light in the warm runs. RAM use was very low, and our set of activities sped through very quickly indeed.
Detection was again very strong, with good scores everywhere and no problems in the core sets, earning BullGuard another VB100 award to add to its long line of successes.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 13.0.092.000
Update versions: 8.3.3.9
Last 6 tests: 2 passed, 2 failed, 2 no entry
Last 12 tests: 2 passed, 2 failed, 8 no entry
Check Point’s appearances in our tests are rather sporadic, but have become a little more regular in the last year or so. The installation is a little slow, but not horribly so, and the product looks much the same as ever – it is starting to look a little old-fashioned, but provides a reasonable set of options in an accessible format.
Stability was a little problematic, with a number of jobs freezing up or crashing out, and a couple of installs saw logging data failing to make it to the proper log file and having to be ripped out of a separate database instead.
Scanning speeds were rather sluggish initially, but blazing fast in the warm runs. Overheads were very high too, and again showed some improvement in the warm runs, though this time they only came down to normal levels. Resource use was fairly low, and our set of tasks got through very quickly, helped by that smart optimization.
Detection was decent if not stellar, but there were no problems in the core sets and a VB100 award is easily earned, giving Check Point a 50-50 split over the last year which we hope to see improved with more regular participation going forward.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Fair
Main version: 10.0.0.1
Update versions: 4180.645.1856
Last 6 tests: 1 passed, 0 failed, 5 no entry
Last 12 tests: 1 passed, 0 failed, 11 no entry
Cranes Software is the latest incarnation of Proland, whose Protector Plus has featured in VB100 reports since as long ago as 1998, although with several fairly long periods of radio silence in between. The vendor returns with an all-new product featuring the rejuvenated Agnitum engine, with a rather slow install and an interface closely modelled on Agnitum’s own, meaning a clear and standard design allowing for easy navigation of its decent set of controls. It remained stable and responsive at all times, although during heavy on-access bombardment it did seem to prevent just about any other activity on the system.
Scanning was slow to start with, but sped up nicely later on, with overheads also a little heavy but again better once the product had familiarized itself with the system. Resource use was a little above average, and our set of tasks was a little slower to complete than the baseline measures.
Detection was a little disappointing, but at least the core sets were properly covered, earning a VB100 award for Cranes, its first in some time.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 8.1.1
Update versions: 4318.687.1936
Last 6 tests: 2 passed, 0 failed, 4 no entry
Last 12 tests: 3 passed, 1 failed, 8 no entry
Another product using the Agnitum engine and interface, Defenx managed a good run of success in our tests until an absence while the ownership of the engine was transferred from VirusBuster, but has returned in good form. Installation this month was a little slow, but not too bad. The interface is clear and simple with good controls, and stability was flawless with no problems to report.
Scanning was slow to start with but faster later, with similar improvements in the on-access overhead measures, again from a slowish start. Resource use wasn’t bad, and our set of activities not too slow.
Detection wasn’t great, but with no issues in our core sets a VB100 award is earned, putting Defenx on track for another good run of passes.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 8.1.0.40
Update versions: N/T
Last 6 tests: 4 passed, 1 failed, 1 no entry
Last 12 tests: 7 passed, 3 failed, 2 no entry
Emsisoft’s history in our tests shows a sharp turnaround about 18 months ago when the underlying engine was switched to the hugely popular Bitdefender, which is included alongside some of the company’s own detection technology. The product has seen a strong run of passes since that point. The current version installs in reasonable time, with no reboot required, and presents a familiar interface with large, friendly icons and plenty of information about the decent range of controls.
There were a few issues with stability, including a number of problems getting through our on-access measures, where the contents of the standard WildList sample set are accessed in rapid succession or copied to the C: partition of the machine. In both approaches, the protection locked up or simply shut down, on one occasion sparking an ‘Access Violation’ error message, and after much effort we were unable to complete a run over the set with reliable results. This should not be too much of a problem in everyday use though, as few real-world systems will be subjected to such an intense barrage of threats.
Scanning speeds were a little slow but fairly consistent, with on-access lag times also not too fast but at least showing a little improvement in the warm runs. Resource use was low, and our set of tasks completed in good time.
Detection was excellent, with only the latest parts of the proactive sets falling below very high standards, and the certification sets were well handled for the most part. However, thanks to the repeated problems with the on-access component, we were unable to show full detection of the WildList, and thus cannot grant a VB100 award this month. We expect to see Emsisoft back to its previous strong form soon.
ItW on demand: 100.00%
ItW on access: N/A
False positives: 0
Stability: Fair
Main version: 14.0.1400.1568 DB
Update versions: N/A
Last 6 tests: 5 passed, 1 failed, 0 no entry
Last 12 tests: 11 passed, 1 failed, 0 no entry
No VB100 has been without a product from eScan since 2009, and the product has achieved a pass in a very high proportion of its entries. The product had a major overhaul a couple of years ago to match Windows 8 styling approaches, and we found its grey-on-grey colour scheme a little murky, but it has since had a minor face lift, adding some welcome brightness.
Installation is a little slow at times, but faster at others, with updates also adding a good few minutes to the initial set-up time. The layout of the GUI is fairly clear and usable – more so now that a little more colour has been added – with a very comprehensive set of fine-tuning controls. Stability was decent, with a couple of moments of unresponsiveness but nothing serious or unrecoverable.
Scanning was not too slow, but our overhead measures were very high, with a noticeable sluggishness about the whole system at times. However, our set of activities didn’t take too long to complete, with fairly low use of RAM and CPU.
Detection was excellent throughout, and with no issues in the core sets, a VB100 award is well deserved.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 7.0.302.26
Update versions: 9443, 9513, 9550, 9583
Last 6 tests: 6 passed, 0 failed, 0 no entry
Last 12 tests: 12 passed, 0 failed, 0 no entry
ESET’s VB100 record speaks for itself, with nothing but green ticks in our results table going back more than a decade – a remarkable run of excellence. This month the product set up in good time, presenting the usual slick and professional interface with clear, simple usability on the surface and a comprehensive set of options underneath for those who want to fine-tune things. Stability was once again impeccable.
Scanning was fairly slow initially, but very quick indeed in later runs. Overheads on file access were pretty light and our set of activities completed in decent time, with low RAM use and fairly reasonable CPU use too.
Detection was very good across the board, and with no problems in the core sets yet another VB100 award goes ESET’s way, adding to its splendid tally.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 2.5.0.23
Update versions: 13.3.21.1/523521.2014021915/7.53302/11472351.20140219, 13.3.21.1/524719.2014031120/7.53583/11582753.20140311, 13.3.21.1/525112.2014031819/7.53689/11589856.20140317, 13.3.21.1/525637.2014032622/7.53829/11647930.20140326
Last 6 tests: 5 passed, 0 failed, 1 no entry
Last 12 tests: 6 passed, 2 failed, 4 no entry
ESTsoft has become a familiar name on our test bench over the last few years, and in the last 12 months or so has built up a good run of passes. The product installs rapidly, although updates can sometimes take a while. The interface is fairly pleasant and clear, although in some places the language used is a little open to misinterpretation. Stability was decent, although a couple of scans did freeze up and on one occasion logging failed to export properly.
Scanning speeds were a little sluggish initially, better in the warm runs but still not super-fast, while overheads showed slightly elevated impact initially but very quick speeds later on. Our set of activities zoomed through very quickly indeed though, with low resource use too.
Detection was very strong, with only a slight decline into the later sets, and with no issues in the WildList or clean sets, ESTsoft earns another VB100 award, keeping up its recent run of success.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 5.07.333
Update versions: 5.147/21.663, 5.147/21.773, 5.147/21.818, 5.147/21.871
Last 6 tests: 5 passed, 0 failed, 1 no entry
Last 12 tests: 10 passed, 0 failed, 2 no entry
Fortinet has also been doing well lately, having entered all but our Linux tests over the last few years and with passes to go with each entry. The installation proved extremely slow on every attempt, most of the time being spent downloading components. The interface, when finally available, is very minimal with very little by way of configuration available, but what is available is simple to operate and clearly laid out.
Stability was mostly good, but a couple of scans failed to complete properly and on one occasion we saw a rare blue screen event during the scanning of part of our clean sets, which was rather hard to recover from. This pushes Fortinet’s stability score to the very brink of ‘Buggy’ territory, landing just on the ‘Fair’ side of the line.
Scanning was OK in some areas, but a little slow over binaries; overheads were not too bad initially and very light indeed later on. Our set of tasks didn’t take too long to get through, with fairly low use of resources.
Detection was as excellent as ever with only a slight decline through the RAP sets, and the core certification sets were well handled with no problems to report. Fortinet thus adds another VB100 to its lengthy run of success despite a few stability issues.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Fair
Main version: 24.0.3.4
Update versions: AVA 24.646/GD 24.147, AVA 24.936/GD 25.2949, AVA 24.1058/GD 25.2983, AVA 24.1180/GD 25.3022
Last 6 tests: 5 passed, 0 failed, 1 no entry
Last 12 tests: 9 passed, 0 failed, 3 no entry
G Data’s record over the last few years has also been very strong, with only a handful of no-shows and no problems passing any of the tests entered. The current product installs in reasonable time and presents a crisp, clear and simple interface with easy navigation of its wealth of controls. Stability was once again flawless.
Scanning speeds were pretty good from the off, and very fast indeed in the warm runs, with overheads a touch high initially, but much better later on. Resource use was fairly low and our set of activities got through very quickly.
Detection was very strong indeed, with very commendable scores even in the most recent parts of the proactive sets. The certification sets were easily brushed aside, comfortably earning G Data another VB100 award to add to its recent run of success.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 5.0.8.0202
Update versions: 5.0.7.2/12.163, 5.0.8.0304 5.0.7.3/12.163
Last 6 tests: 5 passed, 0 failed, 1 no entry
Last 12 tests: 9 passed, 0 failed, 3 no entry
A new name coupled with a well-populated history is explained by yet another identity change: Total Defense, which inherited a product from CA (a company which had in turn picked up other companies in the distant past), has split in two, with the corporate offering now operated by iSheriff. The product itself remains unchanged though, with the Total Defense name still adorning the main interface and even some mentions of CA still lurking in places within the product’s files.
Installation was very fast indeed, beating 10 seconds on one attempt, but updates add a fair whack to the overall set-up time. The interface is split between a local client and a web-based management portal, both of which are accessed via a browser, but the design is reasonably usable and a decent set of options for control can be achieved by combining the two. Stability was reasonable, with some crashes of the browser showing the GUI, and some scans locking up, but no serious issues.
Scanning was fairly quick, but overheads a little heavy perhaps. Our activities measure showed a decent time, with low resource use.
Detection was very strong indeed, and the core sets presented no problems, comfortably earning iSheriff its first VB100 award under its new name, and maintaining a good run of passes for the same product under its previous identity.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 14.1.0216
Update versions: 9.176.11205, 9.176.11397, 9.176.11468, 9.176.11551
Last 6 tests: 2 passed, 1 failed, 3 no entry
Last 12 tests: 4 passed, 2 failed, 6 no entry
K7 is not the most regular of participants in our tests but has a reasonable recent record. The latest version installs nice and quickly, and has a very crisp and rugged-looking interface with a decent set of options provided. Stability was almost perfect, with just a single, very minor incident – a scan which froze up for a few minutes before righting itself with no external input.
Scanning wasn’t the fastest, and overheads were a little high initially, but sped up nicely with some good use of optimization in the warm runs. Resource use wasn’t bad at all, and our set of activities completed in good time.
Detection was very strong, tailing off a little into the proactive sets, but the core sets were properly dealt with and a VB100 award is well deserved.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 14.0.0.4651(e)
Update versions: 14.0.0.4651(f)
Last 6 tests: 4 passed, 1 failed, 1 no entry
Last 12 tests: 10 passed, 1 failed, 1 no entry
For simplicity, this month’s submission from Kaspersky is recorded in the mainline history for the company, which has entered a wide range of products over the years and which seldom misses a test, failing almost as rarely. The set-up isn’t too slow, and the interface is the usual very clean and well-built affair with the company’s traditional green colour scheme. The layout is a little funky but reasonably easy to find one’s way around, with a complete set of controls available after a little exploration. Stability was good, with only a few minor issues of scans not quite completing properly.
Scanning speeds were reasonable to start with, and very fast in the warm runs. Overheads were a little high initially, but again much better after initial familiarization. Our set of activities did take a little extra time to complete, but resource use was low throughout.
Detection was pretty good, with only a slight dip into the later parts of the RAP sets. The core sets were handled well, and a VB100 award is well deserved, adding further to an impressive test history.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 2013.SP6.0.021400
Update versions: 2013.SP6.0.021400, 2013.SP6.0.031411, 2013.SP6.0.032110
Last 6 tests: 5 passed, 0 failed, 1 no entry
Last 12 tests: 7 passed, 0 failed, 5 no entry
Kingsoft's is another product with a Chinese-only interface to delight and bewilder the test team. Its test history since the switch from its own in-house technology to the Avira engine towards the end of 2012 has been very strong indeed.
The installation this month was pretty speedy, enlivened by a groovy progress bar, and the interface seemed reasonably clear and well laid-out, with most basic functions easily found despite the language barrier. Stability was mostly good, but a few scans claimed completion despite clearly having ignored some of the areas they were instructed to inspect.
Speeds were a little on the slow side, but overheads weren’t too bad, and our set of tasks ran through very quickly with minimal resource use.
Detection was very strong indeed, with excellent coverage just about everywhere. The core sets presented no issues, thus Kingsoft earns another VB100 award to add to its impressive recent tally.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 11.1.5354.0
Update versions: N/A
Last 6 tests: 3 passed, 0 failed, 3 no entry
Last 12 tests: 3 passed, 1 failed, 8 no entry
Lavasoft’s participation in our tests has become quite predictable of late, entering all of our desktop tests over the last year and doing well in all of them. This month, set-up was a little slow to complete, but the update part at least was fairly speedy. The interface is clear and attractive after a fairly major redesign, which also took in the addition of the Bitdefender engine. Stability was rock solid, with the product remaining firm and responsive under even the heaviest of stress.
Scanning speeds weren’t bad, overheads just a touch high, and RAM use was fairly low. However, in the activities tests we saw another extreme slowdown, accompanied by very low average CPU use during that time, indicating a good deal of waiting for something external to feed back; we will investigate this issue more closely with the developers to see if it can be explained by an oddity in the design of our test processes.
Detection was as splendid as one would expect, with very good scores everywhere. With no issues in the clean sets or WildList, a VB100 award is easily earned, along with several compliments from the test team on the new design.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 4.5.212.0
Update versions: 1.1.10302.0/ 1.167.15.0, 1.167.1334.0, 1.167.2147.0, 1.1.10401.0/ 1.169.671.0
Last 6 tests: 4 passed, 0 failed, 2 no entry
Last 12 tests: 6 passed, 0 failed, 6 no entry
Microsoft’s business solution enters our tests fairly regularly, alternating with the vendor’s free home-user products, and has a very strong record of passes over the last few years. This month, set-up was very speedy as usual, and the interface was in place quickly, with its look and feel fitting in very nicely with its surroundings. The layout is fairly simple to navigate, with a decent set of controls provided for those who look for such things, and stability was impeccable throughout testing.
Scanning was on the slow side in most sets, and overheads a little higher than expected initially, but much improved in the warm runs. Our set of activities took only very slightly longer to complete than with our Windows Defender baselines, and resource use was also fairly low throughout.
Detection was reasonable, if unlikely to threaten the best in the field, tailing off a little into the proactive sets. The core sets presented no problems and a VB100 award is well deserved.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 1.1.107.0
Update versions: 1, 86869, 86951, 87056
Last 6 tests: 0 passed, 3 failed, 3 no entry
Last 12 tests: 1 passed, 4 failed, 7 no entry
MSecure hasn’t been too lucky lately, with false positive issues and some detection problems meaning it has failed to achieve a pass in the last year of testing, but it keeps coming back for more. This month, the product installed very quickly, with speedy updates too. The interface is crisp and clean, if not quite as professional-looking as some others this month. The layout is fairly intuitive though, with a decent basic set of options available.
Stability was something of an issue: there were some major problems with the on-access component, which appears to ignore many significant file types despite them clearly being listed in the set of extensions to be covered. This list can be bypassed by setting the product to scan all file types, but the option requires a reboot to take effect – which is not indicated at the time of setting it. There were also some more minor issues including failing to log properly, and at one point an alert was shown concerning a newly inserted USB device, despite no such device being attached to the test system.
Scanning speeds were distinctly slow, especially in the sets of binaries, but overheads weren’t too bad, doubtless helped by the failure to cover a range of important file types. Resource use was low and our set of tasks completed very rapidly, again perhaps pushed along by the lack of dependable on-access protection.
Detection was very strong though, with good scores in most places on demand, and the WildList was all but fully covered. A single minor FP was recorded in the clean sets, and this, combined with the failure to provide reliable on-access coverage, means no VB100 award can be granted to MSecure this month.
ItW on demand: 99.98%
ItW on access: N/T
False positives: 1
Stability: Buggy
Main version: 10.1
Update versions: 7.03.02
Last 6 tests: 3 passed, 2 failed, 1 no entry
Last 12 tests: 7 passed, 4 failed, 1 no entry
Norman appears in most of our comparatives, with a reasonable record of passes, and has shown some strong improvements over the last year or so. This month the set-up process for the suite was a little slow, but once up, the interface proved clear and well put together, as well as considerably more responsive than previous versions. Responsiveness was maintained throughout testing, earning a ‘Solid’ rating for stability.
Scanning was fairly zippy and overheads a little on the high side, but improving in the warm measures. Our set of activities completed in very good time, and resource use was decent too.
Detection was excellent in the reactive sets, tailing away somewhat into the proactive parts of the RAP sets, but the core sets presented no difficulties and Norman (which was taken over by Blue Coat Systems during the course of the comparative) earns another VB100 award, along with a nod of approval for the continuing improvements observed.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 11.1.2 Build 3387
Update versions: 0.12.0163/255.1.0.19
Last 6 tests: 2 passed, 0 failed, 4 no entry
Last 12 tests: 3 passed, 1 failed, 8 no entry
Another product with only a sparse history in our tests, Optenet has also had a bit of an overhaul in the last year or so, with a new engine – once again that widely deployed Bitdefender one – now powering things under the hood. The set-up isn’t the fastest, and on a few occasions we had trouble getting it to complete happily. The interface is a browser-based affair with all the lagginess and wobbles one expects from such an approach – several scans saw it complaining about scripts stopping, and on a few occasions it froze up and refused to respond, but thankfully behind the scenes everything kept running happily by itself. These fairly minor issues add up to only a ‘Fair’ rating for stability.
Scanning was rather slow and overheads fairly heavy, but our set of tasks got through very quickly with very little resource use. Detection showed the excellent scores we expect from the engine. These extended to the core sets and Optenet earns another VB100 award, keeping its record pretty decent.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Fair
Main version: 2.3.0
Update versions: N/A
Last 6 tests: 3 passed, 2 failed, 1 no entry
Last 12 tests: 6 passed, 2 failed, 4 no entry
Panda’s cloud-based products have become regulars over the last few years, and have generally done well, barring a few unlucky moments with some fairly minor false positives. This month, as usual, set-up was lightning-fast, with only minimal local components required. The interface is bare and simple with very few options available, but does the job nicely and is easy to operate. Stability was a little shaky, with a number of scans failing to complete – mostly simply freezing up and refusing to make any progress for long periods before we gave up on them.
Scanning speeds were reasonable – a little better in the warm runs than the cold – with overheads mostly fairly light thanks to a lack of full on-read protection. Our set of activities ran through very quickly indeed, with very low resource usage too.
Detection was very strong in the reactive sets, and could not be recorded in the retrospective component of the RAP sets thanks to the cloud-based approach. The core sets were also well handled, earning Panda another VB100 award and getting it back on track for another run of strong showings.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 1.0.0.44
Update versions: N/A
Last 6 tests: 2 passed, 1 failed, 3 no entry
Last 12 tests: 3 passed, 1 failed, 8 no entry
PC Pitstop’s products have been appearing in our tests on and off for a little over a year now, and have managed a few passes with the ThreatTrack engine bundled into the company’s range of offerings. This month sees something different though, with a whitelisting solution also included. This operates on access only, so at the developers’ request, our usual testing patterns were reversed, with the bulk of the detection tests performed on-read, and on-demand scans used only where absolutely necessary.
The installation process is pretty fast and simple. The interface is fairly basic, but reasonably usable, with only a minimum of options to worry about. Stability wasn’t perfect, with the product collapsing several times under the pressure of dealing with lots of files, even clean ones, and shutting down protection completely several times during the test process.
Scanning speeds were very slow on demand, and overheads fairly high on access, with a hefty impact on our set of tasks but not too much use of memory or processor cycles.
Detection was superb in the RAP sets, almost perfect in the reactive sets, and very impressive indeed in the retrospective component. Sadly, this high detection rate comes at a price, with the whitelisting causing alerts on a large number of samples in our clean sets, including components of software from HP, IBM, Microsoft, ATI, Lenovo, Samsung, Lexmark, SAP, Sony and Oracle, as well as a raft of others from smaller software houses and open-source projects. There were also a handful of misses in the WildList sets, in both modes, meaning there is no VB100 award for PC Pitstop this month, despite an interesting effort.
ItW on demand: 99.89%
ItW on access: 99.79%
False positives: 1065
Stability: Fair
Main version: 4.9.0.4110
Update versions: N/A
Last 6 tests: 4 passed, 1 failed, 1 no entry
Last 12 tests: 8 passed, 1 failed, 3 no entry
Qihoo is another name that has become familiar on the test bench over the last few years, its first appearance having been back in 2009. Since then, it has put in some decent performances. The current version installs very speedily, with no restart needed. The GUI is bright and shiny and fairly appealing, with a sensible, standardized layout and a decent set of options available. Stability was reasonable, with a few fairly minor snags when running big scans and a few quirks in the interface.
Scanning was distinctly slow, and overheads looked very light thanks to an odd approach to on-read protection: files appear to be allowed to be accessed, or even written, with scanning backgrounded for a while before an alert is eventually presented, warning that the file has been spotted and claiming to have blocked it, albeit rather late. Under heavy bombardment, these alerts could take several hours to show up, making the testing process a little lengthier than ideal. Our set of tasks took a little while to get through, but resource use wasn’t excessive.
Detection was excellent, with good scores everywhere, and with (eventually) no problems to report in the core sets, Qihoo earns another VB100 award and maintains a pretty decent recent record.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Main version: 15.00 (8.0.4.0)
Update versions: N/A
Last 6 tests: 5 passed, 1 failed, 0 no entry
Last 12 tests: 9 passed, 2 failed, 1 no entry
Quick Heal’s history in our tests dates back to 2002, with a very solid record of participation and some pretty long chains of passes too, including a current one running for over a year. The latest solution installs in reasonable time, and presents an attractive green-and-grey interface with a touch of Windows 8 in its styling. The layout is easy to navigate, and options are provided in decent depth. Stability was very good, with no issues noted.
Scanning speeds were pretty good too, overheads a little on the high side, but our set of tasks got through in good time and resource use was reasonable too.
Detection was not the strongest, but remained fairly steady until the very last part of the retrospective sets. There were no false positives, and detection of the WildList seemed good on access. On-demand scores were oddly low though, and the cause proved difficult to diagnose; we eventually traced it to a problem with the logging system, which omitted several detections from the beginning of a scan. As detections can only be counted if they are recorded, these have to be counted as misses despite the rather odd nature of the issue, and no VB100 award can be granted on this occasion.
ItW on demand: 99.81%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 2.5.0.23
Update versions: 13.3.21.1/523521.2014021915/7.53302/11472351.20140219, 13.3.21.1/524719.2014031120/7.53583/11582753.20140311, 13.3.21.1/525112.2014031819/7.53689/11589856.20140317, 13.3.21.1/525637.2014032622/7.53825/11642333.20140326
Last 6 tests: 5 passed, 0 failed, 1 no entry
Last 12 tests: 6 passed, 2 failed, 4 no entry
Roboscan is a spin-off of the ESTsoft product, and tends to share in its successes as well as its problems. Fortunately, the successes have taken centre stage over the last year or so. Installation was speedy, but updates a little slow. The interface looks fairly pleasant and is for the most part clear and usable, and stability was very good with no problems emerging.
Scanning was only reasonable to start with, but much better in the warm runs. Overheads were a little high, but also improved considerably in later runs. Resource use was low and our set of tasks completed in good time.
Detection was excellent (that Bitdefender engine doing its job once again), and the certification sets presented no issues, earning Roboscan another VB100 award and keeping its recent run of form going.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 8.9.25001.501
Update versions: N/A
Last 6 tests: 5 passed, 0 failed, 1 no entry
Last 12 tests: 9 passed, 0 failed, 3 no entry
Another Chinese-only product, Tencent’s solution uses the Avira engine and has used it wisely, recording a steady stream of passes since its first appearance in our tests a couple of years ago. The set-up process is very speedy, but initial updates proved very slow, taking more than 20 minutes on some occasions; this may be attributable to the distance of our lab from the company’s target market.
The interface is fairly minimal by Chinese standards, not even particularly cluttered by ours, and seems to provide a wealth of components and controls. Stability was impeccable, with no problems observed.
Scanning speeds were reasonable, if a little slow over binaries, and file access lag times looked low thanks to an absence of on-read protection. Our set of tasks took quite a while to get through though, with low resource use.
Detection was very strong indeed, with little missed, and with no problems in the core sets Tencent earns another VB100 award, keeping up its flawless record.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Solid
Main version: 14.0.2.5250
Update versions: N/A
Last 6 tests: 5 passed, 0 failed, 1 no entry
Last 12 tests: 9 passed, 1 failed, 2 no entry
The final entry in this month’s report is another old hand. TrustPort has been a regular in our comparatives since 2006, frequently topping our detection charts with its dual-engine approach. The set-up is pretty quick and simple, and the interface has been refreshed and is a little more centralized, with a touch of that familiar Windows 8 sharp-box styling and a few splashes of colour. Configuration is provided in good depth, and stability was very good, with only a few very trivial issues noted in the workings of the new interface.
Scanning speeds were distinctly slow, and file access lag times rather high to start with, though they benefited from some impressive optimization once warmed up. Our set of tasks took a little time to complete, but resource use remained low.
Detection was remarkable, as ever, remaining very strong well into the retrospective sets, and the core sets were managed well with no issues, earning TrustPort another VB100 award and keeping up a solid recent record of passes.
ItW on demand: 100.00%
ItW on access: 100.00%
False positives: 0
Stability: Stable
Several additional products were submitted for this month’s comparative but are not included in the final report for various reasons, including instability or other problems that prevented complete testing. These included products from BizSecure, iYogi and Zillya.
A much delayed report is finally complete, with the next one not far from ready too, and the following test already well under way. There were few big surprises this month, with a smattering of issues denying a few participants the awards they hoped for, many of them more technical than usual. False positives were lower than usual, except in a single case where a high number was recorded thanks to the inclusion of a fairly uncommon whitelisting component.
To cope with this variation from the traditional approaches we’re used to, some small tweaks were made to the way the test was performed. We’re looking at ways of making more comprehensive changes in this direction to allow a much wider field of participation, as many modern products now claim to offer protection as good as, if not better than, that of traditional approaches, using all manner of alternative methods. Comparing such offerings directly with the rest of the field will not be easy, but is surely worth the effort.
We also saw good levels of stability this month, which is pleasing both as a reflection of improving quality in the industry and as a reduction in the number of headaches for the test team. We remain hard at work through the changes in the structure of VB, and look forward to having those changes reflected in the future path of our testing efforts.
Test environment. All tests were run on identical systems with AMD A6-3670K Quad Core 2.7GHz processors, 4GB Dual DDR3 1600MHz RAM, dual 500GB and 1TB SATA hard drives and gigabit networking, running Microsoft Windows 7 Professional, 32-bit edition, with Service Pack 1.
Any developers interested in submitting products for VB's comparative reviews, or anyone with any comments or suggestions on the test methodology, should contact [email protected]. The current schedule for the publication of VB comparative reviews can be found at http://www.virusbtn.com/vb100/about/schedule.xml.