avast! not in latest AV Comparatives Retrospective

Why did avast! decide not to have their scores published when tested on proactive detection by AV Comparatives? Makes me think the score sucked otherwise there would be no reason to hide it.

I started such a discussion in a reserved part of the forum before.
Only one avast team member posted there.
It seems they’re aiming for few false positives and AV awards rather than for a good score in that particular test.
Also, they’re considering the automatic + the manual blocking.

So avast! didn’t do so hot in that area?

Whereas the others did not get good grades on false positives :wink:

As I commented in another thread, Avast seems to be going backward to some extent. I know such testing does not indicate everything–but it does indicate something. Real-world testing is ideal, but my real-world usage is different from others’, etc, etc. The fact that I’ve had a couple of things slip by avast in the past few months tells me my real-world detection is a little off for such a careful user as myself. I can’t imagine what it would be like if I were going all over the web like some.

As I’ve said before, I love avast enough to pay for it, but I do hate to see detection and removal rates going backward not forward.

If I remember right, this test was “not” on the full product.
Which is why I feel this test is misleading.
Wait for the full product test which should be out later this month or early next month. :slight_smile:

edit In the full product test in the March-June tests avast scored very high receiving the “Advanced + ***”.
AVG “Standard *”
Avira “Advanced **”

“Real World” tests. :slight_smile:

Avast might do a little better at AV-TEST, if you consider 10th place better. It was behind two Panda versions, AVG, and several others. Most of the others were Internet Security suites, if that matters any, except for the free Panda AV and Qihoo AV.

I have no idea what tests you are looking at! ???

Try av-comparatives.org…click Comparatives\Reviews…scroll down to “Whole Product Dynamic (‘Real World’) Tests”. It will open a PDF file. Read and be enlightened. :slight_smile:

The next av-comparatives release will use avast free and not AIS. Both free and AIS use the same engines.

The AV-TEST I referred to is…
http://www.av-test.org/en/tests/test-reports/julaug-2011/

I’m aware of av-comparatives.org and have been reading their reports for several years, and thus have been enlightened! Good stuff there.

Try the test I referred to. The one you mentioned is also good. :slight_smile:

Yes, I’ve read that and avast overall does well, but even here in the charts you can see detections getting less impressive.

Clicking on http://chart.av-comparatives.org/chart2.php shows detections declining!

There may indeed be some variables that explain what seem to be detection rates going in the wrong direction, but on the surface, as I read it, avast is being overtaken by many others. The difference is not all that great in some respects, but it’s the trend that concerns me a little, as an avid avast lover. I’m just hoping the guys and girls at avast! can get those numbers back up.

Avast looks pretty good here.
http://www.virusbtn.com/vb100/RAP/RAP-quadrant-Apr-Oct11-850.jpg

I like comparing the three we’ve listed.

That chart is very difficult to read, and comparing just one area does not give anyone a very clear picture. That is why I prefer, and use, the whole-product tests based on real-world testing. With a whole-product test, a user gets the whole picture and can make an informed choice. :slight_smile:

For anyone interested, if the linked chart is not clear enough: Avast’s results in that test are about 95% reactive detection and about 85+% proactive detection. Note that I’m not saying anything about the quality of the test or comparing the products; I’m just posting Avast’s numbers from that graph, in case anyone curious wants them.

For what it’s worth http://www.avast.com/en-us/pr-avast-free-antivirus-wins-12th-straight-vb100-award
Also, while avast’s detection rate did lag for a while, it has gotten much better over the last couple of months. :slight_smile:

Thanks ad4um!

Another test to consider. I can’t comment as to how accurate it is, but when combined I think it helps one get a better picture. This is not a detection test.

http://www.passmark.com/ftp/antivirus_12-performance-testing-ed1.pdf

Was the test paid by Norton?

I would assume so given the wording and highlighting.

Why do we need a paid comparison? It’s not independent and, really, I can’t trust it at all.
Just advertising dressed up in technical blah-blah-blah.

Things are bad when a company feels the need to pay for a test. Norton sucks. ;D

Well, the test was certainly run without avast!’s caching feature enabled (e.g. in the PE scan test).

There were also some weird anomalies for sure
(or maybe avast! has a bug/deficiency in the media conversion test, because the result is absurdly wrong;
it’s nearly as bad as GData’s absurdly low performance on the 180k write/open/close-of-files test).

Testing installation size and the number of registry keys is weird too.