Glad to hear it, as a lot of users ask for better scanning speed…
My experience is that, for on-demand scanning, avast! is not the fastest; but on-demand is on-demand, and not that much trouble if you can run it overnight.
It is good that avast! came 7th; however, one of the problems with this type of scanning test is that it is often difficult or impossible to get all scanners scanning exactly the same files.
avast! is a very configurable anti-virus, allowing very flexible on-demand settings: standard or thorough, with or without archives, whilst many of the others may be less flexible. I didn't notice how they catered for this, or for the sensitivity settings, etc. So this may fall into the category of yet another AV comparison.
I agree - I also have (big) doubts about the value of this test.
Actually, this test was done by Virus Bulletin, but its purpose was not to evaluate the speed of the scanners; rather, it was a test for false alarms, and the table of times and speeds was just "additional information". So you should be careful when interpreting these results.
The article doesn't mention the scan settings, and the scanner speed will differ significantly with different settings (in avast!, you can set whether whole files or just the important parts are scanned, whether archives are detected and unpacked, etc.)
It's absolutely pointless to treat a scan time of "4 seconds" as meaningful. If you scan just a few files, most of the time is overhead (scanner initialization, etc.), so the scanner's actual speed barely affects the result. Besides, measuring times in whole seconds (in the range 1-10) and then stating the resulting speed to 5-6 significant digits is… erm, strange.
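To illustrate the point about significant digits: if timings are recorded only in whole seconds, a reported "4 s" could really be anywhere between roughly 3.5 s and 4.5 s, so any throughput derived from it carries a large uncertainty. A minimal sketch of that error propagation (the 554.1 MB test-set size is a made-up illustration, not a figure from the VB review):

```python
# How much does a +/-1 s rounding error move the computed "speed"?
data_mb = 554.1  # hypothetical test-set size in MB (assumption for illustration)

for measured_s in (3, 4, 5):  # whole-second timings a "4 s" scan could plausibly round to
    speed = data_mb / measured_s
    print(f"{measured_s} s -> {speed:.4f} MB/s")
```

The computed speed swings by well over 25% across neighbouring whole-second values, which is why quoting it to 5-6 significant digits gives a false impression of precision.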
This particular test is from VB June 2005. If you take a look at the similar tables in VB October 2005, you'll notice that, e.g., the avast! scanner speed when scanning ZIPped executables is higher than when scanning the (unpacked) executables themselves. That's also rather suspicious ;), and raises more questions about the test methodology…
Sorry, but the first reaction of many people to someone's creative work is negation. It's easier to say "Don't trust these tests!" than to try to understand them.
Having read the whole Wilders thread and a little of the Johannsen review, it seems the devil in these reviews/tests is in the detail. In future, maybe you could avoid this hassle by providing the detail as well as the findings.
Interesting to see Rejzor's comments there as well. Any AV that can scale down the intensity of its scanning, compared to those that have only one level, is going to suffer if they are all tested on their "out of the box" settings.
Not here in avast! forums. Look, you’re not talking to newbies in this field. By the way, Igor has some experience and knows exactly what he’s posting.
As for your first two posts, you must admit they lacked any kindness or friendliness.
I hope you can reconsider that attitude and contribute to the avast! forums 8)
I don’t think the test published at the www.Anti-Malware.ru is wrong.
There's something strange with the Nod32 results in another test, but during a long discussion at the Wilders Forum we couldn't find the reason why it happened.
(My view is that there were some obscure problems with Nod32.
Anyway, that's no reason to assume this test is bullshit.)
If, for example, avast! had taken first place there, all its fans would be praising the test.
Unfortunately, another AV won this test…