AV-Comparatives File Detection Test, March 2013 - Avast ranked 9th, only 97.8%

http://www.av-comparatives.org/comparativesreviews/detection-test

Avast! missed 2.2% of the samples (the 4th worst result out of all tested AV products!)
There were 14 false positives (exactly in the middle, i.e. average, of all tested AV products)

As a result, Avast! again failed to gain the Advanced+ award and achieved only Advanced.

These v8 results are ‘bad’, and hopefully this is the first and last time v8 fails to stay in the top 5.

discuss …


It’s also worth reading the AV-Comparatives False Alarm Test, March 2013:
http://www.av-comparatives.org/comparativesreviews/false-alarm-tests/259-false-alarm-test-march-2013

From that, it’s clear the majority of the FPs come from generic detections: Malware-gen and some Trojan-gen.
Only 3 out of the 14 FPs were non-generic.


While the results aren’t spectacular, they need to be weighed against the recent real-world results
http://av-comparatives.org/charts/chart1.php
where it’s nearly 100% (98% detected + 2% user interaction).
But please discuss the results of the AV-Comparatives Dynamic Tests in their own topic here: http://forum.avast.com/index.php?topic=120974.msg926069#msg926069

It’s a file detection test. Looking at it realistically, this test completely excludes Evo-Gen, the Auto Sandbox and all the FileRep detection capabilities. From my perspective, it’s an absolutely useless test as far as home end users are concerned…

I agree, and Avast still has room to improve its AV software, IF it finds the time to fix the problems and maybe add some new features.