So then preferring accommodation …
This is the AV-Comparatives Retrospective/Proactive Test February 2010, published in June 2010. Or am I missing something?
Greetz, Red.
Henrique, it's better to avoid criticism, in my opinion.
Everyone recognizes (the avast team included) that there is room for improvement.
The previous Retro test was in Nov 09. The Feb 2010 test was an on-demand test. This June test is a new Retro test.
Of all the AVC tests I believe that the Dynamic tests are the most likely to reflect “real world” conditions.
Avast was tied for third, along with MSE, in that last Dynamic test. The June test mentioned today would not cause me to dump Avast in view of previous on-demand and dynamic tests. It remains an excellent AV.
I think that Microsoft has a real winner in MSE. Avast Free and MSE knocked the socks off most of the paid AVs.
I am not sure if there is a clear choice between those two freebies.
Regards,
Jerry
I'm also disappointed in these results. However, in my daily use of avast I see it detect a lot of things, especially drive-bys, scripts, etc. I have seen WAY more pop-ups and blocked items than I ever did with Norton and its pretty over-sensitive SONAR detection, which surprises me. But all in all these tests do sadden me, and I sure hope they improve very soon. In the last tests avast did very well overall, IMO; they failed this test for me, as jerrym also said. I really like avast, so much so that I bought a number of licenses; I just hope it doesn't go downhill from here.
Please keep improving the software and forget about things like the ads and such until the program performs the best it can, IMO…
C'mon… let's not start talking about an ad that is not an ad…
Some of us feel it is… but that's beside the point. Work on the overall detection and the program BEFORE starting other things like that, is what I'm trying to say. Get avast back to the top spots and then worry about the other things later.
If it's beside the point, don't mention it to begin with…
This thread is in need of something positive. Avast's SuspBehav detection is working: avast was one of the few to detect this nasty dropper:
http://www.virustotal.com/analisis/09ff417954bb3996fb5d7737f3286eee2eb5e0a97aa7c6ebec2e55edc3df9d8c-1275672649
Guys, don't get too embarrassed. Avast has always performed well. See for example the Retrospective/Proactive tests of Nov 2009 (53%) and May 2009 (42%). All these tests are carried out with only a small subset of the vast pool of malware. Avast has always been a consistent performer. So one should not be too upset with this result.
Those who are praising MSE should know that it failed the last VB100 award on in-the-wild detection.
And also, don't make excuses about the early stages of v5 and so on, because all the products were tested at the same point in time. It's just a coincidence. Just tell me, how many threads are there in this forum that bash avast for not protecting against virus infections?
It is just as important for an AV to keep up with new malware by updating its signatures as it is to detect new things. They say in the test that they used the same signature base from February that they used back then for the other test. Also notice that the version of Avast! used is not the latest one, and they admit that behaviour analysis was not used. The second fact pretty much nullifies the test results for me. If they had tested the latest version with the latest signatures, I'm sure every sample would have been detected. Even the latest version with the February signatures would probably do better. I don't think we can give these results much credibility when they used a version of Avast! that is at least two updates behind.
Please, I think you guys are not understanding the problem. If they were to test with the latest Avast engines, they would have to write all the viruses themselves; otherwise the test would have no significance.
This test shows how avast's and all the other AVs' proactive defenses stood in Feb 2010, not now. Also, in these three months all the AVs have improved.
Has anybody noticed that in the Retrospective/Proactive Test May 2010 they were using the old Avast version 5.0.396?
I'll bet the latest Avast v5.0.545 might have improved the Retrospective/Proactive result. We have to understand that not all AV testers out there keep the software up to date before the testing starts. I have no doubt Avast will keep getting better each day; Vlk may release the next Avast version soon, and that may cover the problem, who knows?
The real question for everybody in here is: who do you TRUST?
My vote is Avast.
;D
[quote author=sanjose123 link=topic=60554.msg510988#msg510988 date=1276010384]
I think Avast is focusing on the program version, fixing plenty of bugs and adding languages, while they forgot to focus on the very basic functions of an AV: detecting, removing and blocking.
I'm very sad to see the result.
Avast beaten by AVG, which is crap!!!
[/quote]
Says who? The results do not say so!!! Unfortunately Avast did badly, but that is not the end of the world. We should not bash other products because avast did badly. Grow up, people, and stop murmuring.
Sorry for my English skills, but I need to write my opinion.
Avast is a good antivirus, but one big problem for me is that avast support is very slow. I often send about one virus sample per day to the avast lab, but the analysis takes about 1-2 days, which is too long. I send all samples to virus@avast.com. Avast support should be quicker and release an update for such a sample within half a day. Avast has big problems with pdfka detections and fakeav detection. Nowadays a great many infections focus on fake AV products and pdfka / trojans. Avast must be more proactive. I am waiting for changes.
Good job to the Microsoft developers.
Microsoft Security Essentials is a good free alternative product.
The testing is getting more and more problematic. And at each AV conference there are multiple papers about how to do proper testing (not that I think all of them make sense 8))
I have objections to all the AV-Comparatives tests performed, and also to the AV-Test ones, but those are less 'documented', so it's hard to tell where the deficiencies lie.
The usual points about static testing are:
a) the tests are carried out long after the real infection took place, so they're kind of useless from today's point of view
b) the tests are carried out without any context state information. Such information matters: if there is a file named "document.doc .exe" in an email, that alone is enough to block the execution (a small illustration of this kind of rule follows after this list)
c) the tests are carried out only with the signature engines - they don't test the other generic protection engines the products may have
d) the tests don't know anything about the relationships between the samples. If you detect the dropper, you don't have to detect the dropped binary.
e) the tests are too binary-centric and include only a small amount of script/pdf/flash malware, although these are among the main vectors for getting through to your computer.
f) there is little or no info on how the testbeds are created. All these 99.1% and similar scores are complete nonsense from my point of view. The overlap of the products' detections is not as great as the clementi/marx tests suggest.
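To make point (b) concrete, here is a minimal sketch of the kind of context rule such a filename would trip. It is purely illustrative: the extension lists, regex and function name are made up for this post and are not any product's actual logic.

[code]
import re

# Flag attachment names that hide an executable extension behind a
# document-like extension and padding spaces, e.g. "document.doc    .exe".
# The extension lists are illustrative, not exhaustive.
DISGUISED_EXE = re.compile(
    r"\.(doc|docx|pdf|xls|jpg)\s+\.(exe|scr|com|pif)$", re.IGNORECASE)

def looks_like_disguised_executable(filename: str) -> bool:
    """Return True for names that pad a fake document extension with spaces."""
    return bool(DISGUISED_EXE.search(filename))

print(looks_like_disguised_executable("document.doc    .exe"))  # True
print(looks_like_disguised_executable("report.docx"))           # False
[/code]

A static file-only test never sees this context (mail attachment, deceptive name), so a product enforcing such a rule gets no credit for it.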
This is not an excuse, it's an explanation of what you really should read into the static tests. Yep, it's nice to be in the top places, but the world does not end if you're not there.
Regarding the pro-active test, this is the most flawed test of them all. It does NOT test the ability of the product to protect you from unknown malware. It tests the ability of the signature engines to detect the samples AV-Comparatives got in the test's timeframe. For example, what if the engine authors already had the samples and wrote the detections, and AV-Comparatives added them later? We're back again at the 'testbed construction' problem.
coper: pdfka samples should be well covered, according to our internal stats… fake AV detections need to improve, that's right (but they're difficult to detect proactively - the Mystic compressor, e.g., used to wrap some rogues, brings new anti-emu tricks in each generation)…
I think it's about the solidarity of people who want to help the avast community with virus samples.
Coper, regarding the pdf detections… I dug out all the pdf samples we have received since the beginning of January.
That was 10104 unique samples, scanned with the latest signatures of the command-line scanners.
If we assume that there are no false alarms, and we say that one detection by any AV means the file is real malware (I know this is an oversimplification), then 9226 files are detected.
[table]
[tr][td]AV[/td][td]Detections[/td][td]Percentage[/td][/tr]
[tr][td]Avast[/td][td]8441[/td][td]91.5%[/td][/tr]
[tr][td]Kaspersky[/td][td]6908[/td][td]74.9%[/td][/tr]
[tr][td]Bitdefender[/td][td]6340[/td][td]68.7%[/td][/tr]
[tr][td]NOD32[/td][td]4772[/td][td]51.7%[/td][/tr]
[tr][td]Symantec[/td][td]4136[/td][td]44.8%[/td][/tr]
[tr][td]Microsoft[/td][td]4044[/td][td]43.8%[/td][/tr]
[tr][td]Avira[/td][td]2946[/td][td]31.9%[/td][/tr]
[tr][td]AVG[/td][td]2167[/td][td]23.5%[/td][/tr]
[/table]
Now tell me, what is your comment about our pdfka problems based on?
And this is completely 'honest', without any deliberate messing with the testbed, and using the latest signatures. Again - there is a nonzero possibility that the AVs with bad scores have some generic anti-exploit protection techniques in their full-fledged scanners and are able to protect their customers despite the 'bad score' in my 'test'.
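For anyone wondering how the percentages above are derived (e.g. Avast's 8441 out of 9226 ≈ 91.5%), here is a minimal sketch of that kind of aggregation. The data and names are made up; it only illustrates the arithmetic, not the actual tooling used.

[code]
def detection_rates(verdicts):
    """verdicts maps AV name -> set of sample hashes that engine flagged."""
    # Treat a sample as real malware if at least one engine flagged it
    # (the oversimplification acknowledged above).
    flagged_by_any = set().union(*verdicts.values())
    total = len(flagged_by_any)
    return {av: (len(hits), 100.0 * len(hits) / total)
            for av, hits in verdicts.items()}

# Toy usage with hypothetical hashes:
sample = {
    "Avast":     {"a1", "a2", "a3"},
    "Kaspersky": {"a1", "a3"},
    "AVG":       {"a2"},
}
for av, (hits, pct) in detection_rates(sample).items():
    print(f"{av}: {hits} detections ({pct:.1f}%)")
[/code]

The denominator is the union of everyone's detections, which is why a product can score well below 100% even if it catches everything in its own feed.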
I always read about Avast and Avira.
Avast is good at proactive protection but not excellent at on-demand scanning; I mean, not very good on detection rates.
Avira is good at on-demand scanning but not good at proactive protection, which causes plenty of FPs.
I hope in the next proactive test Avast will come in the top 5 at AV-Comparatives.
I know Avast is not the best on detection rates, so I always back it up with an anti-malware program.
I always use Avast and MBAM.
Avast for protection.
MBAM for detecting and removing the malware that avast missed.