The following chart shows the RAP results obtained between February and June 2009, with average reactive scores plotted against average proactive scores for each product. (The detection figures from any test during which a product generated false positives are omitted, for that product, from the average calculations.) The chart will be updated on a bimonthly basis.
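For anyone curious how those chart points come about, here is a minimal sketch of the averaging and exclusion rule described above. The product names and all numbers are invented; only the "skip tests with false positives, then average" logic comes from the quoted description.

# Minimal sketch of how the RAP averages described above could be computed.
# Product names and figures are invented for illustration.
results = {
    "ProductA": [
        # (reactive %, proactive %, generated a false positive in that test?)
        (92.1, 61.3, False),
        (94.5, 58.9, True),   # excluded from ProductA's averages
        (90.8, 63.0, False),
    ],
    "ProductB": [
        (88.7, 70.2, False),
    ],
}

def rap_point(tests):
    """Average reactive vs proactive score, skipping any test in which
    the product generated false positives, as the quote describes."""
    clean = [(r, p) for r, p, fp in tests if not fp]
    if not clean:
        return None  # no usable tests for this product
    n = len(clean)
    return (sum(r for r, _ in clean) / n,   # average reactive
            sum(p for _, p in clean) / n)   # average proactive

for name, tests in results.items():
    point = rap_point(tests)
    if point:
        # On the RAP chart each pair is one dot: proactive on the x axis,
        # reactive on the y axis.
        print(name, "-> reactive %.1f%%, proactive %.1f%%" % point)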
Great News!^^ ;D
-AnimeLover^^
I think avast gets these good results on proactive tests from its generic detection? If so, avast 5 will really rock. ;D
I’ll be expecting better results with the help of avast 5’s heuristics.
A nice graph like this gives a better indication of the overall effectiveness of an AV than having to view two separate graphs or sets of results.
It also gives the lie to heuristics being the panacea for better detection: a number of AVs in this test that have heuristics didn't do as well as avast, which currently doesn't. So the old cry of "when will avast get heuristics, since it can't be as good as the AVs that have them" has hopefully been firmly put to rest; heuristics aren't the be-all and end-all of protection.
Though avast's generic and algorithmic signatures could reasonably be deemed heuristic, because avast doesn't call them heuristics users feel it is somehow lacking.
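To make that terminology point concrete, here is a toy sketch of the difference: an exact signature matches one specific sample, while a generic signature matches a whole family of variants, which is the kind of detection that can reasonably be called heuristic. The hash and byte pattern below are entirely made up for illustration.

# Toy contrast between an exact and a generic signature. The hash and the
# byte pattern below are invented; no real malware is referenced.
import hashlib
import re

# Exact signature: matches exactly one known sample (hypothetical hash).
KNOWN_MD5 = "9e107d9d372bb6826bd81d3542a419d6"

# Generic signature: a byte pattern with wildcards, so repacked or slightly
# modified variants of the same family still match despite differing hashes.
GENERIC_PATTERN = re.compile(rb"\x4d\x5a.{8,64}EVIL_DROPPER_v\d+", re.DOTALL)

def exact_match(data: bytes) -> bool:
    return hashlib.md5(data).hexdigest() == KNOWN_MD5

def generic_match(data: bytes) -> bool:
    return GENERIC_PATTERN.search(data) is not None

# A new variant: same family marker, different bytes around it.
variant = b"\x4d\x5a" + b"\x00" * 16 + b"EVIL_DROPPER_v7" + b"\x90" * 8
print(exact_match(variant))    # False: the hash no longer matches
print(generic_match(variant))  # True: the generic pattern still catches it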
AVG is very good on reactive detection.
Avira's false positives weren't taken into account in this test.
It's comforting to see avast doing better than Norton, McAfee, and Kaspersky.
And ahead of NOD32 (AKA ESET), which many rave about ;D
Only on reactive detection. ESET is a little ahead of Alwil on proactive in this test.