system
41
Hi Vlk, I disagree with you.
- Only about half point directly to binaries/files; the rest are exploits. Among your misses you surely also encountered some exploits, not only direct links. The “problem” is (and it is even written in the report) that practically all products (including, of course, Avast) are good at blocking/detecting exploits/drive-by downloads. That’s also why the percentages are so high. If you look at the latest research from Microsoft, the biggest issue for users is not 0-day exploits (according to their paper it’s even close to 0%) but social-engineered malware, which also includes tricking users into clicking on links pointing to files. If you miss malware from the web, the test will and does reflect that. But I am glad to hear that the next version will improve further in this regard.
- Too few samples: others use 10 samples for such a test and base ratings on that. We usually use 50x that size. Arguing that the sample size is too small doesn’t sound fair. If it were 1 million, someone would say “who surfs to 1 million malicious sites…?”, missing the whole point.
- How user-dependent cases are interpreted is up to the reader. I do not believe that a product which asks the user about everything should get the same rating as a product which is able to distinguish between malware and goodware without leaving the decision to the user (a sketch after this list illustrates how such a weighting plays out). Anyway, only in chart 2 can you sort based on the green bar; in chart 3 you can combine blocked + user-dependent.
- I expected that Whole Product Dynamic Tests would also be criticized (like any other test) in the future whenever the scores are unfavorable for someone, despite the internal promotion of such sophisticated tests.
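To make the weighting point concrete, here is a minimal sketch of how a combined “blocked + user-dependent” score shifts with the credit given to user-dependent cases. This is illustrative only, not AV-Comparatives’ actual scoring formula, and the product counts are hypothetical:

```python
# Illustrative sketch only: hypothetical counts, not actual test data.
# `weight` is the credit given to user-dependent cases
# (1.0 = count like blocked, 0.5 = half credit, 0.0 = no credit).
def protection_rate(blocked, user_dependent, missed, weight):
    total = blocked + user_dependent + missed
    return (blocked + weight * user_dependent) / total

# Hypothetical product: 470 blocked, 12 user-dependent, 18 missed.
for w in (0.0, 0.5, 1.0):
    print(f"weight={w}: {protection_rate(470, 12, 18, w):.1%}")
```

A product that leaves many decisions to the user looks much better under weight 1.0 than under 0.0, which is exactly why the interpretation matters.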
system
42
VLK spoke about script blocking etc. I myself install avast! on my relatives’ PCs, and I recently got a big thank-you because of avast!. avast! indeed blocked a Facebook link that tried to exploit something and install the ZeroAccess rootkit. The link was blocked at its root as soon as it was clicked. That’s the true protection avast! offers 
The same link, on the other hand, was opened on a PC that had no protection at all. The rootkit was hard to remove because it made things very difficult inside Windows ^^
Vlk
43
Hi IBK,
Many thanks for coming here and taking the time to respond. It’s always good to see you here (for starters: IBK is the person behind Av-Comparatives.org).
Fair enough. Social engineering is certainly an important attack vector, and can indeed lead to users directly running binaries. However, I don’t think a typical social-engineering attack does that (i.e., has the user download and manually run a binary).
BTW would you mind sharing a link to that MS report you’re referring to?
All I was saying is that Avast missed 18 samples while e.g. product B and product C missed 11 and 10, respectively. Without talking about other tests (which of course deserve the same, or even bigger, criticism), I’m just questioning the statistical relevance of these numbers. No pun intended.
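To put the statistical-relevance question in concrete terms, here is a small sketch computing approximate 95% Wilson confidence intervals for the miss rates. The sample size of 500 is an assumption based on IBK’s “50x that size” remark above (the exact size isn’t stated in this thread); under it, the intervals for 18 and 10 misses overlap substantially:

```python
import math

def wilson_ci(misses, n, z=1.96):
    """Approximate 95% Wilson score interval for a miss rate of misses/n."""
    p = misses / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

N = 500  # assumed sample size, see note above
for name, misses in [("Avast", 18), ("product B", 11), ("product C", 10)]:
    lo, hi = wilson_ci(misses, N)
    print(f"{name}: {misses}/{N} missed, 95% CI: {lo:.1%} - {hi:.1%}")
```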
This is probably the part where I’m most frustrated with the test. I just somehow disagree with the yellow category, simply because it tries to encompass all cases where the user has some control over the final decision. In the case of the avast! autosandbox, the message is so imperative, and has such a clear recommended action, that I don’t quite see a user deliberately overriding the default decision and actually getting infected. But anyway, as I’ve already said, we’re refining the autosandbox in v7, so we’ll probably move these cases to the green category.
And that’s fine, isn’t it? Criticism is generally a good thing, if it is material.
Thanks
Vlk
Asyn
44
system
45
I know about the numbers; that’s why I personally wait for the overall report, which is released after 4 months (see the comment on page 9 of the previous overall WPDT report), and only then start with the statistics.
Dwarden
46
@IBK I have a question about the avast! settings used in the test(s).
I assume the defaults were used.
That means heuristics for the File and Web shields at Normal (not High), and the same for the on-demand settings.
That means PUP detection disabled (potentially unwanted programs).
Right?
@Vlk will these avast! false positives be fixed (or are they already)?
http://www.av-comparatives.org/images/stories/test/fp/avc_fp_aug2011.pdf
system
47
The point I find interesting is that Symantec, aka Norton, is missing from the latest 0-day malware test.
http://www.av-comparatives.org/images/stories/test/ondret/avc_retro_nov2011.pdf
avast3
48
Also interesting is this! There is more going on here than meets the eye!
Note 2 on the page, and I quote:
“AVG, K7, PC Tools, Symantec, Trend Micro, and Webroot decided to not get included in this test and to renounce to get awarded”
Notice this is spyware and not virus testing going on here. I believe ALL A/V programs are somewhat weak at spyware. The list of A/V companies above knew it and bailed. I heard a senior TCP/IP Cisco expert (an “ex” is a has-been, and a “spurt” is a drip under pressure) say that 60% is average for A/V stopping spyware. I don’t necessarily endorse this statement, but it sure kinda fits this test. We are having higher success than that number.
As for stopping viruses spreading through a network (the key ingredient of A/V), I haven’t had a virus spread through an avast!-protected network in 8 years. That is the IT nightmare. I have replaced many other A/V products with avast! after those networks were destroyed by virus infections spreading to every single node and server.
VLK - we have to manually go in and set every system to High on the Web Shield. Then we have to enable PUP detection on all the shields. Why can’t these be the defaults?
DavidR
49
Many people simply don’t understand what a PUP is, even less what a particular detection might be, e.g. what the hell the file is or does and whether it is legitimately on their system.
So a huge majority of your average-Joe users aren’t equipped to make a decision about PUP detections and would generally delete them (or rather let avast! send them to the chest, if that is the default action).
You only need to browse the viruses and worms forum to see this in action, so avast! has erred on the side of not having the user confused and making a bad decision. Whilst that is OK for most average users (who don’t want to be constantly bombarded with decisions to make), it can reduce the detection rate in some tests like this.
system
50
You are mixing up various tests in this thread, which is causing a lot of confusion.
Anyway, PUPs are not included/used in the tests.
avast3
51
I understand your statement. How about a checkbox called “increased protection” that turns on PUP detection and sets the Web Shield to High? That covers the best of both worlds.
avast3
52
It is relevant: it makes avast! more resistant to spyware infection, which is what this thread is about!
Dwarden
53
My primary question was about the sensitivity used in the tests … not PUPs …
But I still assume it was just the Normal default 
Speaking of which, I think avast! should make High (not Normal) the new default in the next updates 