Concerned about Avast's scores.

I installed Avast for testing, and I'm impressed with its flexibility, power options, and tweakability. I'm also impressed by my in-house testing so far, even though I've not really started testing deeply yet.

However, what alarms me is how BAD Avast did on the AV-Comparatives retrospective-scanning test. (wording clarified) Avast scored ZERO in most categories, which is very alarming.

http://www.av-comparatives.org/

Results can be seen there. It did OK in the on-demand test, nothing amazing, but the on-access parts are horrendous! Is this going to be addressed? I'd like to know if it will and how soon, as I hesitate to pay for it if it can't even keep up with the most basic AV product in terms of on-access.

Lastly, I used AVTest3.0 on Avast, and it passed the first two tests but failed on the last two: dynamically generated worms and encrypted worms. Most AVs pass all of these tests. I believe these tests are really designed to measure on-access/realtime heuristics, and in this case Avast failed miserably for me. AVTest3.0 can be downloaded here:

ftp://ftp.externet.hu/pub/mirror/sac/avir/avtst30.zip

I know it's a synthetic test, but it uses nearly identical signature variants of real worms, harmless of course. To let you know how deep some programs go, SOME AVs won't even let this product be downloaded or installed on your system, as they pick up bits of the signatures within the install/setup program itself.

Awaiting a response, regards.

I use Avast! on three of my home computers. I also use Nod32 and Kaspersky 5.0 (both are considered top-notch) on my work computers [company policy prevents me from using anything else].

NOD32 and KAV "failed" AVTest3.0 as well. Since they are independently considered excellent products, and both failed, I do not draw any conclusions about the overall performance of either, nor that of Avast!, which is an excellent product.

Hope that reassures you. :slight_smile:

My KAV 4.5 didn’t pick up any of the tests…

I forgot to mention: make sure you set your AV product to "DELETE" when it discovers a baddie, or AVTest will report the test as failed. That's in its help file, I believe.

NOD32 DOES fail on all counts if I remember correctly, but KAV 4.5 or KAV 5.0 will pass with flying colors.

Basically, most of the big-name AVs pass AVTest3.0; a couple don't, such as NOD32 and, I think, Panda. What the test basically does is generate a fairly random worm with a real signature and place it in "memory" in the form of a VB file or something. At any rate, all AVs should fire on all 4 tests; I've personally found it a good test for trojan heuristic testing. You need to have your AV set up a specific way for an accurate test, though. I've tried Avast in all configurations and can only get it to pass the first two. :frowning:

(just make sure to set your AV to delete on discovery)

Thanks for the input and test file Kobra.

Hi Kobra,

thanks for your post.

Results can be seen there. It did *OK* in the on-demand test, nothing amazing, but the on-access parts are horrendous! Is this going to be addressed? I'd like to know if it will and how soon, as I hesitate to pay for it if it can't even keep up with the most basic AV product in terms of on-access.

What test are you talking about, exactly? AFAIK Clementi (the author of those tests) hasn't done any on-access scanning tests… Or are you talking about the recent "Retrospective/ProActive" tests? If so, I'd strongly recommend reading up on what these tests are – these are on-demand tests too… (OK, I can tell you as well: what these tests actually do is scan a database of new viruses with a scanner using an old virus database. This shows how the scanner performs if you fail to update it. True, some scanners are more tolerant, some are less (the case of avast), but hey, that's why there's the cool auto-updater, right??). :slight_smile:
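To put that in concrete terms, here is a minimal sketch of what such a retrospective test boils down to (my own illustration, not Clementi's actual harness). The `scan_file` callback and the directory layout are assumptions standing in for a real command-line scanner run with updates disabled:

```python
# Retrospective test in miniature: freeze the scanner's definitions at an old
# date, then scan only samples that appeared AFTER that date and count how
# many are still caught by heuristics/generic detection alone.
from pathlib import Path
from typing import Callable

def retrospective_test(sample_dir: Path,
                       scan_file: Callable[[Path], bool]) -> float:
    """Return the fraction of post-freeze samples the frozen scanner detects."""
    samples = [p for p in sample_dir.iterdir() if p.is_file()]
    detected = sum(1 for p in samples if scan_file(p))
    return detected / len(samples) if samples else 0.0
```

The returned fraction is the "proactive" detection rate; with current definitions, the same harness would measure the ordinary on-demand rate.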

Lastly, I used AVTest3.0 on Avast, and it passed the first two tests but failed on the last two: dynamically generated worms and encrypted worms. Most AVs pass all of these tests. I believe these tests are really designed to measure on-access/realtime heuristics, and in this case Avast failed miserably for me. AVTest3.0 can be downloaded here:

This application has been discussed here on the forum multiple times. See e.g. http://forum.avast.com/index.php?board=2%3Baction=display%3Bthreadid=778%3B . All I can say is that testing AV software on non-existent, artificial samples just doesn't make any sense. We will NOT be adding any mechanisms to detect the AVTest-and-the-like samples into avast just to make it pass those tests…

Also, to get an idea of who you're dealing with, you may try to visit their "corporate website" http://www.damselsoft.freeservers.com :wink:

Hope this helps,
Vlk

Hello,

I am the owner of www.av-comparatives.org (A. Clementi).

Vlk is totally right. Please read everything carefully before you make such statements. All the tests I do are on-demand tests. A retrospective test like that is of interest to the AV companies, but not so much to users. All users have to do is update their scanner as often as possible. How well the scanners perform with current updates can be seen in the regular tests (February/August). And as you can see, Avast, for example, does not only protect you against ITW (in-the-wild) samples, but also against most (over 90%) of the zoo samples that you will probably never encounter in real life. All the tested products are already a selection of very good scanners.
Detection performance is not the only criterion when you are choosing an AV for your PC. You should also pay attention to other factors. I would recommend all the tested products, but it is your decision which one you choose based on other things (compatibility, speed, GUI, resources, etc.).

Regards,
andreas

Greetings Andreas, nice to have you here. How’d you find us? :slight_smile:

I don't take full stock in Clementi's tests, and I DO understand the methodology behind them to some extent, despite my confusing wording. =) Anyway, basically, what this test does is test generic detection and heuristics. Correct? Obviously, his way of doing this, by simply using retrospective databases, could probably be considered a flawed method, but I think it could also be considered a reasonable way to test heuristics and generic protection, no? What's most concerning is the fact that Avast, in some of the tests, appears to not exhibit ANY heuristics or baseline detections without definitions in place. If that statement is incorrect, please let me know, because on the surface that's what this appears to be showing. I did note very good on-demand, updated-definition performance from Avast in his tests, which is to some extent comforting - but still not up to the level of other similar products.

This application has been discussed here on the forum multiple times. See e.g. http://forum.avast.com/index.php?board=2%3baction=display%3bthreadid=778%3b . All I can say is that testing AV software on non-existent, artificial samples just doesn't make any sense. We will NOT be adding any mechanisms to detect the AVTest-and-the-like samples into avast just to make it pass those tests... Also, to get an idea of who you're dealing with, you may try to visit their "corporate website" http://www.damselsoft.freeservers.com ... ;)

Who produced it isn't as important to me as why generic signatures are missed, and when I combine this with Clementi's heuristic/generic comparison results, it only adds a bit to my worry. =) Some AVs pick up AVTest3.0's signatures as "Generic" or "Variant", indicating a heuristic hit - which IS comforting. I'm not so much interested in actual signatures being added for it as I am in heuristic pickup.

Basically, what I'm saying is, I'm looking for some "Confirmation" that Avast has a deep heuristic system at work somewhere. Can you point me to any specific tests in this regard, or data to shed some light on this? After a poor experience with some AVs, and after testing 15+ different products, I'm actually incredibly impressed with this latest Avast (as long as I turn off the basic interface lol!). But I want to make sure my impressions are more than skin deep - and the AH is there.

Thank you in advance!

PS: On the bright side, Avast has picked up every single badguy I've thrown at it, including rebased/repacked, masked, and altered Trojans, Trojan launchers and Trojan downloaders. It should also be noted that only 4 AVs in existence can pick up ALL of these samples (approx. 20 of them). These are in my personal collection, gathered from the internet in my encounters over the last months. NOD32 completely failed on every single threat. So far, my REAL tests have Avast pegged @ 100%, but I will be the first to admit my testing pool is limited to about 150 threats. Hope my above question can be answered!

Anyway, basically, what this test does is test generic detection and heuristics. Correct? Obviously, his way of doing this, by simply using retrospective databases, could probably be considered a flawed method, but I think it could also be considered a reasonable way to test heuristics and generic protection, no?

That's correct. And I'd say it's not a bad way to test heuristics, actually (much better than what this AVTest3.0 is doing). The figures clearly show that some products (DrWeb, Kaspersky etc.) have very strong heuristics/generic detection – in fact some of them (DrWeb, namely) often rely on it even for detection of known samples; and some have weaker heuristics (Trend, Sophos, avast etc.). In fact, avast doesn't have any heuristics (apart from the thing in the mail scanner – which is not heuristics in the classic AV sense of the word), but it does have quite a powerful generic detection engine (a similar thing) used mostly for the detection of Trojans. It also features a non-trivial unpacker engine that makes detection of Trojans much simpler.

On a side note, heuristic detection has one major flaw that is often overlooked. That is, every virus writer in the world can download the scanner and fine-tune the virus so that it goes undetected. And it’s often pretty simple to do so – tweaking a couple of instructions and here we go! Therefore, I’m personally not a big believer in heuristic detection - it’s just too fragile (in this sense, being a relatively small vendor actually helps - virus writers may test their code with most common scanners but fail to test them with the rest). :slight_smile:

Basically, what I'm saying is, I'm looking for some "Confirmation" that Avast has a deep heuristic system at work somewhere.

As I said, sorry, it does NOT have it. :slight_smile:

Sorry if I let you down – but it's certainly better to be frank than sorry. :slight_smile: If you feel you absolutely need a scanner with very strong heuristics (and are willing to see a false alarm from time to time), I'd recommend looking at something like DrWeb…

Cheers,
Vlk

avast! does not have heuristics in the sense that it tries to decide whether an unknown program is or is not an unknown virus. It has a lot of different methods that can help against unknown malware (very strong generic detection, family detection, code emulation, etc.).

BTW: If you take a look at the latest really big epidemics, you will see that heuristics simply do not work on most (if not all) of these viruses/worms…

Pavel

I noticed heuristics are mostly useful only for VBS files like LoveLetter was…

I appreciate the frank reply, as my tests were indicating something "fuzzy" about the heuristics in Avast, which is what brought this question up in my mind to begin with. I'm not one to discount small anomalies in my tests, and I tend to ask more questions than most people.

What I DO find interesting is that Avast is able to pick up clearly rebased and repacked samples that totally stump most other AVs. So in that light, I don't take lightly your statements that Avast uses generic detection (a form of heuristics, if we want to debate it with people), family detection and code emulation, because my tests - at least in my eyes - show that it DOES offer these features.

Now, I'm not one to say heuristics are the end-all-be-all feature of an AV, but I do like a decent heuristic engine to ride alongside the definitions and generic detection algorithms. Is this a feature planned for Avast anytime soon? I think if it did implement them, it would probably be incredible in detection levels! False alarms I can deal with, if they significantly improve my survival rating. ;D

Oh yeah, one more note: I noticed that several places in Avast itself refer to "Heuristics". That might cause confusion if it really doesn't have heuristics in the traditional sense!

Not trying to be inflammatory, just trying to understand. :slight_smile:

Well, one basic heuristic could be extension recognition.

For example if you download file that is named like this:
iLoveYou.jpg.vbs

In this case avast! could detect the dual extension (or even triple, or whatever) and warn the user about suspicious extensions, with a description of which extensions are fake and which one is real. I have never met multiple extensions together except on worms that spread via mail or P2P, so there would be practically no false positives. I'm not quite sure how this checking should work (separating extensions and handling extension length, because these days there are also extensions with more than 3 letters), but here the smart heads at Alwil come into action :slight_smile: I think the Internet Mail module already possesses this feature, but it could be useful for the On-Access scanner too. And this would be the first "heuristic", hehe :wink:
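Here is a rough sketch of the kind of check I mean (just my own illustration, nothing from Alwil; the extension lists are made-up examples):

```python
# Flag a filename that carries more than one extension where the last one is
# executable/script-like and an earlier one pretends to be a harmless document
# or image, e.g. "iLoveYou.jpg.vbs".
EXECUTABLE_EXTS = {"exe", "scr", "pif", "com", "bat", "cmd", "vbs", "js"}
DECOY_EXTS = {"jpg", "jpeg", "gif", "png", "txt", "doc", "pdf", "mp3", "avi"}

def suspicious_double_extension(filename: str) -> bool:
    parts = filename.lower().split(".")
    if len(parts) < 3:                      # need a name plus at least two extensions
        return False
    real_ext, decoys = parts[-1], parts[1:-1]
    return real_ext in EXECUTABLE_EXTS and any(d in DECOY_EXTS for d in decoys)

if __name__ == "__main__":
    for name in ("iLoveYou.jpg.vbs", "holiday.jpeg", "report.final.doc"):
        print(name, "->", suspicious_double_extension(name))
```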

What do you think?

Just got word that every module in the next release of NOD32 will have Advanced Heuristics… Sheesh, I have had some bad experiences dealing with NOD32 tech support and their poorly run forums, but man, it sounds like that might be the solution whenever they release it.

Attached shot shows their beta test version of AMON, the on-access realtime monitor, with Advanced Heuristics enabled.

Tempting… Very tempting… So honestly, there are no plans for AH/H in Avast at this time?

Hm, I'm currently working on one right now. You can check the discussion in the WISHLIST thread in this section of the forum:
http://forum.avast.com/index.php?board=2;action=display;threadid=57;start=390

It's a very basic heuristic, but it should do the trick with minimal complications and false positives :slight_smile: Something more like a passive heuristic :wink: I could also "program" a small simulation (presentation) of how this thing would work.

Yeah, heuristics has always been a very good marketing tool - and marketing people simply love it :wink: ;D!

The truth is that it might work sometimes and not work other times. What do you want to analyze heuristically in the packet sniffer? New buffer overflow threats? C'mon!

Did you ever try to "analyse" scripts? There are so many ways to fool the analyzer! If you focus on it, heuristics is a method that can be fooled quite easily.

So, to make a long story short - I do not believe heuristics can be very useful. But as an old friend of mine says: what is heuristics anyway? Actually, all current detection methods might be called heuristics ;D - so yes, in this way, AH/AP does contain some heuristic methods!

Ok, enough for now :wink:
Pavel

Good point, there IS a lot of hype in this field, most of it rather unfounded. Honestly, I feel most AV companies are behind the curve on this stuff - if you disagree, please feel free to correct me.

I'm thinking of a system like Norman's, with a virtual sandbox, where you actually "execute" the badguy and watch its reactions - basically let it play in the sandbox - then determine if it's a possible new threat (this explains how Norman's sandbox has defeated some new threats without definitions). That seems to be a very, very good system. In practice, I've yet to see more than a couple of AV testers put Norman through its paces, because it's slower due to this system, and when they test 200,000 baddies, it would take a year. =) Anyway, I like the idea behind that system. (Clementi won't test Norman because it's too slow, and I think that is doing the consumer a disservice, but I've put Norman through my 150 samples, and it was 100%, and I've heard similar stories from others.)

Another system I was thinking about is something like behavior analysis, where the AV product simply knows how the system should behave and how a given file should react, and any deviation from this causes further scrutiny. Artificial intelligence, if you will. Could that be the future? Rumor has it McAfee is feverishly working on this behind closed doors.
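To make that idea concrete, here is a toy sketch of such a deviation check (purely my own illustration; no vendor does it this simply, and the baseline table, action names, and weights are all invented for the example):

```python
# Score observed actions that fall outside a process's expected baseline;
# a high score means the file/process deserves further scrutiny.
BASELINE = {"notepad.exe": {"read_file", "write_file"}}
SUSPICIOUS_WEIGHT = {
    "write_registry_run_key": 3,
    "open_network_socket": 2,
    "modify_system_file": 4,
}

def anomaly_score(process: str, observed: set) -> int:
    expected = BASELINE.get(process, set())
    return sum(SUSPICIOUS_WEIGHT.get(a, 1) for a in observed - expected)

if __name__ == "__main__":
    actions = {"read_file", "open_network_socket", "write_registry_run_key"}
    score = anomaly_score("notepad.exe", actions)
    print("score:", score, "-> flag for scrutiny" if score >= 4 else "-> ok")
```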

Multi-engined systems intrigue me: double definitions, cross-checked/verified against each other, which would also mean double-layered heuristics. F-Secure impresses me with its unmatched detection/heuristics, but behind the good is a sloppily coded application. eXtendia AVK, using the KAV+RAV engines, is the most powerful single AV product I've seen, and the double heuristics work perfectly. Great system there, but the AV has a couple of bugs that bug me, and it lacks some features Avast has. Still, I like the multi-engine idea immensely.
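As a sketch of what I mean by cross-checking two engines (again, just an illustration; the engine callables are placeholders for whatever real scanners you would plug in):

```python
# Combine two independent engine verdicts, escalating to the worse of the two.
from typing import Callable

Verdict = str  # "clean", "suspicious", or "infected"

def combined_verdict(sample: bytes,
                     engine_a: Callable[[bytes], Verdict],
                     engine_b: Callable[[bytes], Verdict]) -> Verdict:
    verdicts = (engine_a(sample), engine_b(sample))
    if "infected" in verdicts:
        return "infected"
    if "suspicious" in verdicts:
        return "suspicious"
    return "clean"
```

Taking the worse verdict is what buys the extra detection, and presumably it is also why dual-engine products tend to pick up more false positives along the way.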

Clearly, though, I think it's time for something new, other than grinding out endless definitions and MD5-compare techniques. I applaud Avast's exceptionally feature-rich product; it really is a wonderfully designed and implemented piece of work! But I wish it employed something a bit more state of the art than a primary definition-table system.
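For contrast, the plain definition-table approach I'm grumbling about is essentially this (a toy sketch; the hash is a made-up placeholder, not a real definition):

```python
# Exact-match hash lookup: catches known samples byte-for-byte, misses anything
# repacked, rebased, or otherwise altered.
import hashlib
from pathlib import Path

KNOWN_BAD_MD5 = {"0123456789abcdef0123456789abcdef"}  # placeholder entry

def is_known_bad(path: Path) -> bool:
    return hashlib.md5(path.read_bytes()).hexdigest() in KNOWN_BAD_MD5
```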

Heuristics may be a marketing scheme, but I've seen them in action, and if done right, they are what separates the top-shelf products from the mid- or bottom-shelf offerings. Heuristics on my machine have stopped rebased trojans on more than one occasion, and I hate to think what would have happened with a product LACKING heuristic detection!

Just my thoughts, I guess… I'm pretty sad Avast lacks deep detection of unknown/rebased threats; I'm sad because it's a great product, has great features, and I absolutely want to run it.

A follow-up on AV-Comparatives: I don't fully agree with the testing methods, nor some of the rationale there. For example, why bother testing DOS viruses? These are mostly Win32 AV products here, and I know for a fact DOS samples would skew the results improperly.

Secondly, he doesn't test multi-engine products, or unique products like Norman AV - a corporate-strength sandbox architecture that is proven in the field. The various excuses are that multi-engined products wouldn't be fair, or would ruin the results, and that Norman at full settings is too slow to scan 300,000 files.

Honestly, remove the DOS viruses/trojans and maybe it won't take so long to scan. NOT testing some products doesn't sit well with me. But I understand he doesn't do this for a living and does it more as a hobby - still, the fact remains: if you are going to put your data out there for the world to see, expect criticism if something doesn't add up. Not trying to be harsh; I'm trying to be realistic here. Some of the most heuristically advanced products weren't even tested there!