AV-Comparatives May 2008

http://www.av-comparatives.org/

Yes, the old retrospective tests, where the virus signatures are three months old, to see what is caught. Not a very good test for those AVs without heuristic detection; whilst the generic signatures should catch some of the newer malware, signature-based AVs don't do particularly well in this test.

However I feel this test is somewhat artificial as I don’t go on-line with my AV 3 months out of date.

Bad results.

However I feel this test is somewhat artificial as I don't go on-line with my AV 3 months out of date.

But you do go on-line (potentially) exposed to zero-day malware.

The test simulates exposure to zero-day malware three months ago, with the hope that this predicts detection of zero-day malware today.

Please read the report.
And by the way, the AVs were updated on the 4th of February, but the new samples are at most one week old (they appeared between the 5th and ~12th of February).
The aim of a retrospective test is not to show how an AV scores if a user does not update his product, but how well a product is able to detect new samples. (The retrospective method allows products to be tested against new samples without, e.g., the unethical need to create new/artificial malware to test proactive detection capabilities.)
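For anyone unclear on the mechanics, here is a minimal sketch of the retrospective calculation in Python (the cutoff date matches the test; the sample data and function are hypothetical, not AV-Comparatives' actual tooling):

```python
from datetime import date

# Retrospective idea: products are frozen on a cutoff date, then
# scanned against samples that only appeared *after* that date.
FREEZE_DATE = date(2008, 2, 4)  # signatures/engine frozen here

# Hypothetical data: (first_seen, detected_by_frozen_product)
samples = [
    (date(2008, 2, 5), True),
    (date(2008, 2, 7), False),
    (date(2008, 2, 12), True),
]

# Only samples unknown at freeze time count toward proactive detection.
hits = [detected for first_seen, detected in samples if first_seen > FREEZE_DATE]
proactive_rate = sum(hits) / len(hits)
print(f"proactive detection: {proactive_rate:.0%}")  # 67% for this toy data
```

The point is that nothing about the user's update habits is being measured; the freeze simply guarantees the samples were unknown to the product at test time.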

23 false positives & only 1 star, too bad :-[ :-[, even AVG got Advanced :o
Let’s hope Avast will improve soon :slight_smile:

I’m not surprised: it has been a downgrade to ‘Standard’ level.
Too many false positives these last days.
The test was made with version 4.7 ???
Avira got the top award (even so, 74% on the overall average… no software is perfect).

That’s the version which was available at that time (the 4th of February).

These tests don’t mean much to me.

What interests me are:

  1. Am I infected? No.

  2. Am I having any problems with avast!? No.

  3. Do I have good support? Yes.

  4. Good updates? Yes, the ALWIL team will get out needed detections for any looming dangers. Unlike some, avast! will sometimes update multiple times a day.

Whilst the intention might not be to show how things would be if a user didn’t update his signatures, that is in fact what it amounts to when the signatures/program are frozen at a point in the past ???

That surely is why there is such a disparity between the on-demand and retrospective results. Effectively the generic signatures are the only means of detecting new samples if you don’t update the signature database. This would also account for the potential increase in false positive detections, as most come from the -gen signatures.
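To make that trade-off concrete, here is a toy Python sketch (the pattern and files are entirely made up; real generic signatures are far more sophisticated than a substring match) of why a broad -gen pattern catches unseen variants but also raises the false-positive risk:

```python
# Toy illustration only, not how a real engine works.
GENERIC_SIG = b"\xde\xad\xbe\xef"  # hypothetical family marker

def gen_detect(file_bytes: bytes) -> bool:
    # A broad pattern matches any file containing the marker,
    # including brand-new variants the lab has never seen.
    return GENERIC_SIG in file_bytes

new_variant = b"junk" + GENERIC_SIG + b"payload"   # unseen family member
clean_file = b"header" + GENERIC_SIG + b"legit"    # unlucky benign overlap

print(gen_detect(new_variant))  # True: proactive detection
print(gen_detect(clean_file))   # True: false positive
```

The broader the pattern, the more new family members it catches proactively, and the more benign files it risks flagging.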

This is what the retrospective test measures (how many unknown/new samples are detected proactively). If you want to know how much is detected in general with up-to-date signatures and heuristics, you look at the tests of February and August.

I’m aware of the on-demand scans of February and August; the problem is that many will see the retrospective results as bad, and you would have to say they are.

Those same people aren’t really familiar with, or don’t care about, what the retrospective test does or is attempting to show, or about the previous on-demand results; as far as they are concerned it is a bad set of results.

I can’t do more than write on every page that the full report should be read carefully.
(P.S.: this is also a reason why direct links to the results page are not allowed, and why journalists should show us in advance what they are going to write, so that we can proof-read it and check that they interpreted the results correctly.)

Thanks for jumping in Andreas.

avast! is slowly showing its lack of heuristics. I hope the behavior blocker will be ready soon.

Rather than getting caught while we wait, ThreatFire (formerly Cyberhawk) is free for home use. Tech may not have been overly impressed, but it works well with avast! for me.

I know ThreatFire (even from its original days as Cyberhawk) but it still doesn’t support Vista 64-bit…

That is indeed a disincentive! :frowning: