Breathing life into a topic that is 4 months old isn’t advised, even more so when it relates to a security-based issue/test (even older). A week is a long time in the AV world, a month a very long time and 4 months a very, very long time.
Looking back this far when avast 8 has just been released is unrealistic, so we will have to see how the new detection functionality gets on after the next test that includes avast 8.
Most test results are meaningless since they don’t really measure real-world effectiveness.
The following is a lot more meaningful: http://forum.avast.com/index.php?topic=119458.0
How does your link measure real-world effectiveness? The Lifehacker editorial board is by no means experts, and they use AV-Comparatives as their main argument for choosing Avast over MSE. We shouldn’t cherry-pick studies that suit the narrative we want unless you have specific complaints about the test’s methodology.
In fact I have always had complaints about AV-Comparatives’ methodology; for example, in the past they allowed Norton and Avira to set their heuristics to “high” instead of the default settings. That’s very problematic because it doesn’t necessarily reflect the true effectiveness of the product, considering that probably 95 percent of people will never fiddle with their AV settings.
My almost ten years of using avast! Free certainly does.
Lifehacker does happen to be a respected and trusted publication.
There are many other reviews and tests that rank avast! at or among the best AVs available.
It also qualifies as an AV that’s extremely light on system resources.
For the latest test scores, use Google, it’s a great tool.
This still remains an old and outdated thread that you originally chose to post in.