New AV-Comparatives test

On-Demand comparative

http://www.av-comparatives.org/index.php?option=com_content&view=article&id=144&Itemid=152

This was an On-Demand test and not a Prevention Test… I use a free anti-virus for prevention and use Malwarebytes and Hitman Pro for detection, so I am anxiously awaiting their Prevention Test…

Why not look at the previous reports? Most AVs take a hit on this: if they score Advanced+ in the on-demand test, many drop down to Advanced.

Obviously the previous retrospective tests were based on 4.8, so I'm hopeful that avast 5.0 will do better.

Unfortunately, Avast v5 doesn't seem to have improved over v4.8 at all… in on-demand scanning, at least. :-\ It isn't a deal breaker, but I was still hoping to see a noticeable improvement in pure detection.

Actually, you can hardly compare the numbers against the previous results (to conclude that there was no improvement over avast! 4.x).
The methodology of the test has changed this time - specifically, the way the malware samples are chosen. The sample set is new - and that's what affects the results heavily.

If you had two different sample sets at a given time (and by "different" I mean really different - coming from different sources, not just a random split of one bigger set) and ran two tests like this one, the results could be completely different (to the extent that 97% in the first test might be 1st place, and 97% in the second one might be last).
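As a rough illustration of that point, here is a toy Python sketch. All products and scores are made up purely for illustration - they are not taken from AV-Comparatives or any real test - but they show how the same 97% score can land in first or last place depending on which sample set was used:

```python
# Toy illustration only: hypothetical products and made-up scores,
# not real AV-Comparatives results.

def rank_of(product, scores):
    """Return the 1-based rank of `product` when scores are sorted high-to-low."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(product) + 1

# The same hypothetical product ("X") scores 97% against both sample sets,
# but the other products' coverage of each set differs.
test_on_set_a = {"X": 97.0, "P1": 96.4, "P2": 95.8, "P3": 94.9}
test_on_set_b = {"X": 97.0, "P1": 99.1, "P2": 98.6, "P3": 97.8}

print("Rank on set A:", rank_of("X", test_on_set_a))  # -> 1 (first place)
print("Rank on set B:", rank_of("X", test_on_set_b))  # -> 4 (last place)
```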

That’s right.
But why shouldn't Avast 5 do better against new samples, since new samples are more relevant than outdated ones? I really wish for Avast to finish in the top 3 as far as on-demand & real-time detection is concerned, but maybe that's asking for too much.
Still, many users including me expected Avast 5 to come out better than F-Secure, BitDefender & ESET in the on-demand test; that isn't wishing for too much.

OK, I guess I didn’t use the best possible wording.
By “new” I meant “different”… i.e. a “newly compiled” sample set - but I didn’t mean to say anything about the age of samples themselves.

In the previous test, the samples were also “new” to a certain extent… I’m not saying this has changed, just that the samples are different now, possibly coming from different sources and chosen with different priorities.

;D Well done to avast! What I am pleased with is the reduction in false positives. :)
Keep up the good work.

Well, even so, most of the other vendors still had results at least similar to the previous test, and avast is also similar to the last test, which is unfortunate. I was expecting a noticeable difference even if the methodology was changed; it doesn't seem to make much of a difference to the other vendors' results (avast's as well, it seems :()

Hello, you would do better to rely on the tests suggested by Vlk here: http://blog.avast.com/2010/03/03/malekal-2010-test/ … The direct link is here (Google translation): http://translate.google.com/translate?hl=fr&sl=fr&tl=en&u=http%3A%2F%2Fforum.malekal.com%2Fcomparatif-antivirus-gratuits-2010-t23535.html

The AV-Comparatives tests are obsolete (this is only my opinion) because they are only "on demand". I use the avast! Internet Security suite and I cannot endorse the results of the recent AV-Comparatives tests.

Thank you and wish you a good day.

PS: sorry for my bad English

They're not obsolete; they show a different area of the AV. What I use my AV for is to be able to detect; I don't rely on it alone to prevent, hence why I want to see detection capability. What's the point of even creating signatures and traditional methods of detection if you can just focus on prevention and not need signatures of any kind? That's why these static tests still have relevance in certain situations.

Good work to the ALWIL team. Just one thing: why did ALWIL want them to test the free version of avast and not the Pro?

The pro version wouldn’t have made any difference. The results would be the same in that on-demand test.

I don't use anything of avast on demand - except the uninstall utility.

Are they talking about how avast fares when a manual scan is run?
That is, outside of the resident protection running in real time, a scan is scheduled to give some idea of the performance of the antivirus?

I was not particularly happy with Avast's performance, especially seeing how Avira did fantastically well and also managed, for the first time, to really reduce their FPs. Avast was caught a bit short in Trojan detection, and most current malware is of that type. Avira did great.

I too ONLY use an AV for real-time and on-demand SIGNATURE AND HEURISTIC detection. I do not use shields like the Web Shield. (I do use zero-day behavioral detection, but IMO that should NOT be behavioral but classic HIPS, which is FAR STRONGER PROTECTION.) I detest suites and want an AV that has nothing except outstanding real-time, on-demand, and network scanning plus zero-day protection, preferably classic HIPS rather than behavioral in nature. I also do not use anything in the cloud, or anything that calls home all the time checking lists in the cloud or tattling to mommy in the cloud, etc.

Avira has recently stated that the free version cannot protect users anymore because it does what I think an AV should do and nothing more. But supposedly a real-time detector is deficient now, and one must use a web shield/detector to be protected. I think that is BS and designed (in the case of Avira) to push users away from the free AV. Avast has a Web Shield in the free version too, but I would never use it. I use the Proxomitron, and I cannot daisy-chain two web proxies together and expect to have any enjoyment of the internet. So, if AV vendors are now claiming that their products are effective only if you have the web protection part turned on, then I will stop using AV and use other types of protection. Thus, IMO, the AV-Comparatives tests are extremely valid and not the slightest bit old-fashioned. Never put all your eggs in one basket; it is very foolish to use an AV vendor's suite for that reason alone.

I realize that all AV is headed toward in-the-cloud, weak, behavioral-only detection (instead of strong classic HIPS), but as that gets further along, I'll probably be gone and using other protection. The current model of signatures and heuristics does have to change because of the sheer magnitude of the malware we see now; vendors cannot keep up with the onslaught using traditional methods.

It's not bad for avast!, though I had slightly higher expectations.

Me too.

On the other hand, looking at most of the samples missed (especially in the Trojans category - by far the largest), I don’t think we have a problem.

Moving forward, I think that Andreas (the person behind av-comparatives) will be putting more and more emphasis on the “dynamic” tests, and less and less on the traditional on-demand tests. While on-demand tests are interesting to look at, they only exercise certain parts of the product and the results don’t necessarily reflect the real-world scenarios.

In any case, congrats to Avira, they did fantastically this time.

Thanks
Vlk