Avast only scored a 42 and was rated “Standard”…even MS’s One Care scored higher. Avast seem to also have lots of false positives. Doesn’t seem like the Avast I know and use but those are the results. Has Avast let down its guard or made changes that have not worked well, or have the other companies been quicker to develop better products?
Well, the false positives are one side of it and caused avast! to get a lower award, but there is one other factor, and that's the other numbers…
avast! was usually No. 2 in detection rate, and the free version was the reason everyone liked it. No. 2 in detection rate plus a free version was what every user liked, and avast! reached 75M+ registered users.
But now… what has happened to the database, and why the score of 42%? We have seen some reports of missed samples on the forum, and of course, thanks to the ALWIL team, we have also seen some threats that ONLY avast! CAN DETECT. We should look at both the GOOD and the BAD.
I'm not much of an expert and don't know why Microsoft OneCare got Advanced+ (because of very few FPs) with a total score of 60%, while this nice AV (avast!) scored lower than it… friends, give me a true and honest review/opinion.
I’d say you should read the description of the test more carefully
This is a proactive test - i.e. the scan was performed (on new virus samples) with a three-month-old virus database.
What they are testing in this situation could be anything but a real user's situation… sorry… I can understand that you can't run a test against future versions of malware, but testing with an old virus database is not the common user's situation. Besides, we know that avast's heuristic and proactive detection isn't great compared to the others, especially in a test where, as in this case, it depends on signatures.
right… they were ALL out of date. Still, what’s the point? If you can browse the web and potentially get malicious software installed on your computer, then why would you have out of date definitions for your virus database? I mean, if you’re browsing the web, then your A/V program should have access to the internet too, and it would have updated itself by then.
The only reason avast may be low on the list is because the definitions were out of date, and it doesn’t use heuristics to find “potential” viruses. If it doesn’t know about them, it can’t detect them.
I think it's just a dumb test. It doesn't prove anything, other than how much new malware has appeared since the last time they updated the definitions on the test computers. So what?
The idea is that zero-day malware will not have signatures in the database immediately, but behavior blocking/heuristics can help the AV catch these, and this test attempts to measure that. Avast has routinely scored in the 40% range on this test, since some of the newer malware is a variation of older families and the existing signatures can catch some of those.
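As a toy illustration of that last point (all the sample names and patterns below are made up for the example; this is not how any real AV engine, avast! included, actually works): an exact-hash signature from an old database misses a new variant, but a generic family signature can still catch it, while genuinely novel malware slips past both.

```python
import hashlib

# "Old" database: exact hashes of known samples, plus one generic
# byte pattern shared by a hypothetical malware family.
EXACT_SIGNATURES = {hashlib.sha256(b"dropper_v1 payload").hexdigest()}
GENERIC_PATTERNS = [b"dropper_v"]  # matches any variant of the family

def detect(sample: bytes) -> str:
    # Exact signatures only catch byte-identical known samples.
    if hashlib.sha256(sample).hexdigest() in EXACT_SIGNATURES:
        return "exact signature"
    # Generic signatures catch new variants that reuse family code.
    if any(pattern in sample for pattern in GENERIC_PATTERNS):
        return "generic signature"
    return "missed"

print(detect(b"dropper_v1 payload"))  # known sample -> exact signature
print(detect(b"dropper_v2 payload"))  # new variant  -> generic signature
print(detect(b"brand-new threat"))    # novel malware -> missed
```

That third case is roughly what a retrospective test measures: how much of the genuinely new malware a product catches without fresh signatures.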
Yes, of course all antiviruses had old databases, I didn’t say it was just avast!. I was just trying to say that if you don’t update the virus database for 3 months, you can’t expect 99% detection on current malware (and only current - older was excluded, at least as much as possible).
Also, it’s nothing you can change by quickly adding submitted samples.
Thanks, now I get it,
but my question is why avast! dropped compared to other products in that test. I'm not asking why it isn't at 99%; I'm asking why it's lower than the other products. I know you work hard to add all the new virus samples, and I appreciate it.
The test strikes me as clueless, useless and pointless. It was, in any event, careless, since it neither included all the major players nor explained the reason for their exclusion.
There would be little point in using any of the tested products if the effective detection rate were genuinely some 70% at best. However, Avast does not give hackers a three months' head start, and Avast is consistently ranked in the top handful by meaningful tests.
I do not even have any confidence that the False Positive results are worthwhile.
If you read the report, there is a part explaining that antivirus programs nowadays use some type of heuristics, HIPS, behavior analysis, behavior blocking, and complex generic signatures, not only simple signatures, to detect new/unknown malware. In Avast's case, it uses generic signatures (if I am not wrong). Maybe the lack of some type of heuristics, or of better generic signatures, affected Avast's results.
I don't bother to read those test results… 'cos I just don't trust them! :-\
It doesn’t seem to matter… Where they’re from, who did them, how they were done, what they were done on.
They Always Smell - Fishy! >:(
Just For Instance!
Have any of you ever been to the Symantec forum?
And seen how many people are always getting infected there? Also!
I had NIS 09 on my laptop for approx 4 months … Had 3 False Positives :o
I’ve now had Avast Home for about the same length of time … False Positives = 0 8)
Experience and Forums - Speak Louder Than… Fishy Tests! ;D