If you have been looking over Matousec’s Proactive Security Challenge, you will see them claim that Avast offers practically no protection at all.
You will also notice that Comodo has been on top of that list for a long time.
Question is, how accurate and valid are these results?
As an Avast IS user, I have never had any security-related issues, so are these guys at Matousec even testing with real-life situations?
Somehow I doubt it.
They write a lot about dynamic testing, but if you dig into their testing material, you will find a synthetic test kit that gets some additions from time to time.
That’s about as dynamic as running 3DMark or other benchmark applications to test your computer’s performance, isn’t it?
Neither have I seen people getting attacked with the methods used in these tests.
So is it better to actually stop real-world attacks or to tailor your suite to pass these tests?
There are no AV detection tests or performance tests in their challenge either.
Not to mention bugs.
I stopped using Avast for a short period of time, since version 5 had some performance issues, but version 6 is lightning fast and shines.
The “award-winning” suite I switched back from got slower with time and even introduced a few odd bugs.
Yet Matousec puts products in their top 10 that have bugs, are slow, and even have a high rate of false positives.
Sorry for ranting, I just had to get it off my chest. Thanks.
Gizmo criticizes Matousec’s tests. It’s a technical read, but it seems fair (http://www.techsupportalert.com/content/matousec-personal-firewall-tests-analyzed.htm).
Others point to a conflict of interest in Matousec’s tests, which arguably reduces their independence (http://smokeys.wordpress.com/2008/04/20/matousecs-firewall-challenge-wrinkle-conflict-of-interests/).
Leak tests are popular mainly because they are very easy to perform: you simply run a program, and it tells you if it passed or failed the test. However, life is not that simple, unfortunately.
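For context, the core idea of a leak test boils down to something like the sketch below. This is a hypothetical, trivial illustration in Python, not one of Matousec’s actual tests: the program simply tries to push traffic out and reports whether anything stopped it.

```python
# Minimal sketch of the idea behind an outbound "leak test" (hypothetical example):
# attempt to get data out past the firewall's outbound filtering and report the result.
import socket

TARGET_HOST = "example.com"   # assumed stand-in for a "leak" destination
TARGET_PORT = 80

def run_leak_test() -> str:
    try:
        # Attempt a plain outbound TCP connection, as an untrusted process might.
        with socket.create_connection((TARGET_HOST, TARGET_PORT), timeout=5) as s:
            s.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
            s.recv(64)  # any reply means traffic got out and a response came back
        return "FAILED: outbound traffic was not blocked"
    except OSError:
        return "PASSED: the connection was blocked before data could leak"

if __name__ == "__main__":
    print(run_leak_test())
```

Real leak tests are more elaborate (they try to piggyback on trusted processes, inject code, and so on), but the pass/fail reporting is just as binary, which is exactly why the results are so easy to publish and so easy to over-interpret.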
The primary goal of a firewall is to keep hackers out of your system, that is, to prevent inbound attacks in the first place. It’s astonishing that many firewall “testers” focus only on outbound protection, completely ignoring the inbound part (which is absolutely vital). It’s as if they assumed inbound protection worked flawlessly in every product, which, unfortunately, doesn’t seem to be the case.
Next, outbound protection is of course also important, but so-called leak tests are not everything. There is a myriad of other things that a decent firewall should do, and these are usually not assessed by such tests. All I’m saying is that testing a firewall is a very complex task, and focusing on leak tests is a gross (and inappropriate) simplification.