Kobra's AV test on 6-14-04

The testbed consisted of 321 viruses, trojans, and worms, all for the Win32 environment, and all reasonably new samples. I don’t have any data on whether some of these are zoo or ITW, but they are all real threats I feel someone is likely to encounter, since I got them off the internet (and I’ve verified they are real: each sample must be detected by at least 4 AVs for me to consider it).

All scanners were installed on a clean system, without any traces of other anti-virus software; between tests, the system and directories were cleaned and the registry was swept. Each AV product got a double reboot, one before and one after installation. Each scanner was set to its highest possible settings and was triple-checked for proper options and configuration. Most products were the full registered version where possible; the others were fully functional, unrestricted trials. All products were tested with the current version as of 6-14-04 and the latest definitions for that date. Each product was run through the test set a minimum of 3 times to establish proper settings and reliability. The only product to show variance here was F-Secure, which had one scan come up short of the other two without any settings changes, indicating a possible stability issue.
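
The sample-validation rule above (a file only enters the testbed if at least 4 scanners flag it) can be sketched as follows. This is purely illustrative: the scanner names, the `is_valid_sample` helper, and the `MIN_DETECTIONS` constant are my own, not from any real tool.

```python
# Minimal sketch of the "detected by at least 4 AVs" validation rule.
# Scanner names and results are made up for illustration.

MIN_DETECTIONS = 4

def is_valid_sample(detections_by_scanner):
    """detections_by_scanner: dict mapping scanner name -> bool (detected?)."""
    hits = sum(1 for detected in detections_by_scanner.values() if detected)
    return hits >= MIN_DETECTIONS

results = {"ScannerA": True, "ScannerB": True, "ScannerC": True,
           "ScannerD": True, "ScannerE": False}
print(is_valid_sample(results))  # True: 4 of 5 scanners flagged it
```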

The final standings:

  1. eXtendia AVK
  2. Kaspersky 5.0/4.5
  3. McAfee VirusScan 8.0
  4. F-Secure
  5. GData AVK
  6. RAV + Norton (2 way tie)
  7. Dr.Web
  8. CommandAV + F-Prot + BitDefender (3 Way Tie)
  9. ETrust
  10. Trend
  11. Avast! Pro
  12. Panda AV
  13. KingSoft
  14. NOD32
  15. AVG Pro
  16. AntiVIR
  17. ClamWIN
  18. UNA
  19. Norman
  20. Solo
  21. Proland
  22. Sophos
  23. Hauri
  24. CAT Quickheal
  25. Ikarus

Heuristics played some role in this test: no AV had every sample in its definitions, and products with stronger heuristics were able to hold their positions towards the top. Double/multi-engined products put up strong showings as well, proving to me that the redundancy method works, and I think more AV companies should consider dual engines. The strongest heuristic AV I noticed was F-Prot/Command, picking up only 247 samples with definitions but powering through 67 additional hits on “Possible Virus” indicators. Very strong! Norton with BloodHound activated had 30 heuristic pickups, and Dr.Web rounded out the pack with 20. eXtendia AVK grabs the number one slot with double-engine scanning: anything the KAV engine missed, the RAV engine picked up, showing great redundancy in the dual engine/definition system. McAfee actually missed only 2 samples with its definitions, but picked those 2 up as “Suspicious File”, and therefore scores nearly perfect as well.
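
The double-engine redundancy described above amounts to taking the union of each engine's detections: the combined product misses a sample only if both engines miss it. A minimal sketch, with made-up sample IDs standing in for the two engines' results:

```python
# Each set holds the IDs of samples a (hypothetical) engine detected.
kav_detects = {1, 2, 3, 5, 6}
rav_detects = {1, 2, 4, 5}

# The dual-engine product detects the union of the two engines' hits.
combined = kav_detects | rav_detects
print(sorted(combined))  # [1, 2, 3, 4, 5, 6]

# Out of samples 1-6, only a sample missed by BOTH engines is missed overall.
missed = set(range(1, 7)) - combined
print(missed)  # set()
```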

The biggest disappointments for me were Norman and NOD32. Even with Advanced Heuristics enabled, NOD32 failed to pick up a large portion of the samples. Norman, while finding some of the toughest samples, managed to completely miss a large portion of them, showing that its sandbox-emulation system has great potential but is far from complete.

Actual test numbers were:

Found Samples/Total Samples (321 possible), Number Missed, Detection Percentage

  1. eXtendia AVK - 321/321 0 Missed - 100%
  2. Kaspersky 5.0 - 320/321 1 Missed - 99.68% (with Extended Database ON)
  3. McAfee VirusScan 8.0 - 319/321 + 2 (2 found as joke programs - heuristically) - 99%
  4. F-Secure - 319/321 2 Missed - 99.37%
  5. GData AVK - 317/321 4 Missed - 98.75%
  6. RAV + Norton (2 way tie) - 315/321 6 Missed - 98.13%
  7. Dr.Web - 310/321 11 Missed - 96.57%
  8. CommandAV + F-Prot + BitDefender (3 Way Tie) - 309/321 12 Missed - 96.26%
  9. ETrust - 301/321 20 Missed - 93.76%
  10. Trend - 300/321 21 Missed - 93.45%
  11. Avast! Pro - 299/321 22 Missed - 93.14%
  12. Panda - 298/321 23 Missed - 92.83%
  13. KingSoft - 288/321 33 Missed - 89.71%
  14. NOD32 - 285/321 36 Missed (results identical with or without advanced heuristics) - 88.78%
  15. AVG Pro - 275/321 46 Missed - 85.66%
  16. AntiVIR - 268/321 53 Missed - 83.48%
  17. ClamWIN - 247/321 74 Missed - 76.94%
  18. UNA - 222/321 99 Missed - 69.15%
  19. Norman - 215/321 106 Missed - 66.97%
  20. Solo - 182/321 139 Missed - 56.69%
  21. Proland - 73/321 248 Missed - 22.74%
  22. Sophos - 50/321 271 Missed - 15.57%
  23. Hauri - 49/321 272 Missed - 15.26%
  24. CAT Quickheal - 21/321 300 Missed - 6%
  25. Ikarus - Crashed on first virus. - 0%
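
For reference, the “Missed” and percentage columns above follow directly from the found/total counts; the percentages appear to be truncated (not rounded) to two decimals. A small sketch reproducing a few rows, with the `score` helper being my own naming (counts taken from the list above):

```python
TOTAL = 321  # total samples in the testbed

def score(found, total=TOTAL):
    """Return (missed, detection percentage truncated to 2 decimals)."""
    missed = total - found
    pct = int(found / total * 10000) / 100  # truncate, matching the table
    return missed, pct

for name, found in [("Dr.Web", 310), ("Trend", 300), ("ClamWIN", 247)]:
    missed, pct = score(found)
    print(f"{name} - {found}/{TOTAL} {missed} Missed - {pct}%")
```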

It is also interesting to note that the detection level of the US AVK version, with KAV+RAV engines, was higher than that of the German version with KAV+BitDefender engines. Several vendors have free versions of their paid AVs; we didn’t test the free versions, as it would serve no purpose for this test, but based on these results, none of them would have been very impressive anyway. The term “heuristics” should be taken very liberally, as some products that claim to be loaded with heuristics scored miserably on items they clearly didn’t have definitions for. Scanning speed was not measured, as it was irrelevant to my testing, and on-access scanners were not tested, as that would have been too time-consuming; but considering most products’ on-access engines are similar to their on-demand engines and use the same database, the results would most likely be very similar.

Cut through the hype, cut through the marketing schemes: this was a real test, with real samples, and none of these samples were provided to the antivirus vendors in advance. This is real world, and these are likely the bad guys you’ll encounter, since I got them in my real encounters; all were acquired on the internet in daily activities anyone might be involved in (installing shareware, file sharing, surfing, etc.). Keep in mind that with ITW tests the AV vendors have full disclosure of what they will be tested on in advance. Not so here, so heuristics and real detection algorithms play a big part, as does the depth and scope of the definition database.

Ooooooo, and where is NOD32’s excellent heuristic part? :stuck_out_tongue:
Second issue: you didn’t include legitimate samples. Those would produce a much higher false-positive count for heuristics-based antivirus programs :stuck_out_tongue:

These were all real viruses/trojans/worms, as indicated. No false samples, no cleaned samples, and no fake samples. So I’m not sure what you mean by “legitimate” samples? ???

NOD32 results were completely the same whether I used the normal scan or the shell-extension /AH scan. No difference.

So are you going to submit the threats that Avast missed, to the Avast team for consideration in their virus def updates?

Douglas

Honestly, I was HOPING to be surprised by a ton of things in this test, but really all I did was reinforce many of the other testing sites’ results. Mine are very close to theirs, which actually shocked me, because I’m sure my samples aren’t the same. This tells me that, overall, this might be a good gauge of these products.

Also, I wanted to test the multi-engined products against the others, since most testers seem to avoid testing them. Strong showings by F-Secure and the AVK brothers proved this idea works, and works incredibly well. The strength of the KAV engine cannot be denied either, since all but one of the top 5 products use it. :stuck_out_tongue: I forgot to add: one product I tested, called V-Catch, turned out to be a trojan downloader and spyware application masquerading as an AV product… LOL! Thankfully it was the last product I tested, and I just reformatted; I think it downloaded 30 trojans to my system. :sunglasses:

I did NOT test any DOS viruses, as it is pointless to test these in a Windows-based environment; it tells us nothing. I cannot understand why Clementi at AV-Comparatives bothers to test them; all they do is badly skew his test results. For example, in his test NOD32 scored 95.51%, but without DOS or other-OS samples, NOD32 scored only 87.71%, which, amazingly enough, is within 1% of MY results. So I’m baffled as to why he skews his own results for no real purpose. Who cares what a product scores on DOS?!? ???

Kobra, I admire the effort you put into these tests. But I find this

24) CAT Quickheal - 21/321 300 Missed - 6%
hard to believe! Quick Heal has the Checkmark Level One certification (meaning it detects all ITW viruses), and it had the VB 100% award.

The last time Quick Heal was reviewed by the VB guys, on XP Pro, this is what they had to say:

Summary:
* ItW Overall - 100.00%
* ItW Overall (o/a) - 100.00%
* ItW File - 100.00%
* Macro - 97.54%
* Standard - 80.67%
* Polymorphic - 91.08%

  Quick Heal has a tendency towards better detection of more recent viruses or those which are currently in the wild. This selectivity is commonly associated with a fast throughput rate for clean files, as was indeed the case for Quick Heal. With such selectivity the chance of false positives is reduced - Quick Heal generated none. With complete detection of viruses in the ItW test set, a VB 100% is netted by CAT.

Don’t get me wrong, I’m not saying anything bad about your tests; I’m just asking how CAT can get all those awards and yet miss 300 of your viruses?

Maybe those were not exactly ITW :stuck_out_tongue: Non-ITW samples are also very important.

Mac, people need to understand what “ITW” means… Of course any AV should score 100% on ITW tests, because ITW viruses are provided to the AV companies in ADVANCE of the test! Personally, I think ITW testing is virtually meaningless to real-world users. ALL the ITW testing tells me is how well a company maintains definitions for the ITW institute’s test sample set. Nothing more, nothing less.

I used to think “OMG, it scores 100% on ITW tests!” and started basing my choice of an AV on that, and let me tell you, it was a sad misconception on my part. People see ITW and automagically assume it means 100% of everything that’s out there. Hardly… LOL… I mean, look at NOD32, a product that scores in the 80-percent range on virtually every REAL test, but scores 100% on ITW. Why? Because they make sure they have all the ITW definitions in their database, and check them extensively to avoid ITW false positives.

Now, in addition, you’d be HORRIBLY mistaken to think that ITW covers trojans, worms, malicious downloaders/droppers, and other things. It doesn’t… ITW covers exactly what it says: VIRUSES. You’d be further mistaken to assume ITW covers all known circulating viruses; it doesn’t. It just covers what ONE organization of people considers to be the most prevalent circulating threats out there. In fact, I personally no longer use VB or Checkmark to make my AV decisions, because they are so limited in scope compared to what’s actually out there.

PS: Remember, my testbed included viruses, trojans, droppers, and worms. There are bigger threats out there than typical annoying viruses, and an AV that ignores those threats is a poor AV in my opinion.

Side note: Someone recommended I test KAV 4.5, and I did. It missed only 1 sample and scored 99.68%. Considering I’d put the margin of error at 1% either way, that’s a 100% product.

On the recommendation of KAV5 users, I’m retesting KAV5 with the extended database download, which should make it 100%, or very close, according to the people I’ve talked to who deal with KAV. KAV5 apparently defaults to the non-extended DB.

Edit: KAV 5.0 has now been tested with the extended DB option on, and it scores the same as 4.5, moving KAV5 up to second place alongside KAV 4.5.

Kobra, Quick Heal still detects 80% of standard viruses. BTW, did you turn on Quick Heal’s heuristics? They are not on by default.

Just for you, Mac: I retested, double-checked every setting, and re-checked my testing setup. Same results. Check the time/date stamp on the Quickheal interface.

http://home.comcast.net/~prolawn00/cat.JPG

Personally, I put ZERO stock in what Virus Bulletin says. Remember, these are the same guys that say NOD32 scores 100% in ALL categories, and that’s just flat-out BS with capital letters…

Side note: I reinstalled Avast again, went through the settings MANY times, re-checked, checked again, changed some things, and got Avast to detect 299 viruses, up from 292. Nothing I can do will increase this further. I will adjust its rating in the review. =) Avast! moves ahead of Panda in my test now. Also, someone requested a test of Ahn’s V3 Pro… Man, it has a great interface and tons of options, but it sure missed the detections!

I’m preparing to zip up and submit these missed samples to Avast as well. Here are the updated results:

  1. eXtendia AVK - 321/321 0 Missed - 100%
  2. Kaspersky 5.0 - 320/321 1 Missed - 99.68% (with Extended Database ON)
  3. McAfee VirusScan 8.0 - 319/321 + 2 (2 found as joke programs - heuristically) - 99%
  4. F-Secure - 319/321 2 Missed - 99.37%
  5. GData AVK - 317/321 4 Missed - 98.75%
  6. RAV + Norton (2 way tie) - 315/321 6 Missed - 98.13%
  7. Dr.Web - 310/321 11 Missed - 96.57%
  8. CommandAV + F-Prot + BitDefender (3 Way Tie) - 309/321 12 Missed - 96.26%
  9. ETrust - 301/321 20 Missed - 93.76%
  10. Trend - 300/321 21 Missed - 93.45%
  11. Avast! Pro - 299/321 22 Missed - 93.14%
  12. Panda - 298/321 23 Missed - 92.83%
  13. KingSoft - 288/321 33 Missed - 89.71%
  14. NOD32 - 285/321 36 Missed (results identical with or without advanced heuristics) - 88.78%
  15. AVG Pro - 275/321 46 Missed - 85.66%
  16. AntiVIR - 268/321 53 Missed - 83.48%
  17. ClamWIN - 247/321 74 Missed - 76.94%
  18. UNA - 222/321 99 Missed - 69.15%
  19. Norman - 215/321 106 Missed - 66.97%
  20. Solo - 182/321 139 Missed - 56.69%
  21. V3 Pro - 109/321 212 Missed - 33.95%
  22. Proland - 73/321 248 Missed - 22.74%
  23. Sophos - 50/321 271 Missed - 15.57%
  24. Hauri - 49/321 272 Missed - 15.26%
  25. CAT Quickheal - 21/321 300 Missed - 6%
  26. Ikarus - Crashed on first virus. - 0%

I didn’t think it was that easy to simply go and download malware. :-\

Did you find them all in one place? Can you supply a link/links?

Should the FBI be notified?

I think it only becomes an issue when you start “distributing” the viruses with malicious intent. Professionals use viruses for testing all the time; I use them to analyze, examine, and test as I run into them. Since I write for several technical sites, I’m well within my legal rights, especially considering my samples are marked read-only. ;D Remember, AV developers have to get their samples from somewhere, and most of the time they get them from hobbyists or users who send them in after finding them.
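
The read-only marking mentioned above can be sketched like this. On a 2004-era Windows box it would presumably have been done with `attrib +r`; this POSIX-style equivalent simply clears the write bits on every file under a directory. The `samples` directory name and `mark_read_only` helper are my own, for illustration only.

```python
import os
import stat

def mark_read_only(root):
    """Clear the owner/group/other write bits on every file under root."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))

# Demonstration with a throwaway file:
os.makedirs("samples", exist_ok=True)
open(os.path.join("samples", "sample1.bin"), "w").close()
mark_read_only("samples")
# After marking, no write bits remain on the file.
print(os.stat(os.path.join("samples", "sample1.bin")).st_mode & 0o222)  # 0
```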

On a side note, I’ve gone through all the logs again and found some issues with Avast “skipping” files I don’t want it to skip. I’ll have to take these up with the Avast guys, because if that’s the case, it would dramatically affect Avast’s scores. I’ve checked all my settings on my end, and there’s nothing I can do that I haven’t already done.

I wonder about this AV:

http://www.v-buster.com/

;D :smiley: 8) ??? :o

Kobra, I have notified the Quick Heal support staff (using my dad’s Quick Heal registration). I gave them the link to this thread. The reply is as follows:

Dear Kyle,

 We are working on more advanced heuristics. This new engine will be introduced in Quick Heal 7.02. Please tell Kobra to retest Quick Heal once the new engine is released. We really do not see how Quick Heal missed that many of his samples. If he would like, have him send the samples to you for submission (don’t forget to include your registration code in the email).

Sincerely,
The Quick Heal Team
http://www.QuickHeal.com

Any date/time on the new engine? The biggest thing I noticed with Quick Heal was that it claims to have heuristics, yet I witnessed NO heuristics in action. I’ve re-installed it twice now and re-tested twice to make sure.

Secondly, how big is their database? I’m guessing it’s pretty small, and with or without heuristics, a new AV company is at a severe disadvantage, because signatures take time to build unless they arrange to buy or rent definitions from another company. One product, Ahn’s V3 Pro, has an EXCEPTIONAL interface and layout, and incredible options. But its definition base is so small, it’s just not a viable product for most people.

I’ve got one more test I’m going to try with Quickheal, and that’s going to be on a Win98 machine to see if it behaves any differently; then I’m done with it for now.

Oh, it has heuristics (weak). However, it also has a heuristic-like worm detector called the Quick Heal Sensor, which runs at startup to check for suspicious changes in the registry and also watches in real time for methods commonly used by spreading worms.

SENSOR FOR NEW WORMS, TROJANS AND BACKDOORS

This new sensational technology is designed to fight the threats posed by new Trojans, Worms and BackDoors.

* Checks most sensitive areas of the system
* Traps and captivates any new Trojan, Worm, Backdoors and any other malicious code
* Powerful Protection from Internet Threats.
* Proactive Technology kills the malicious code before it can act.

The database is a decent-sized one; however, a lot of the old DOS viruses are omitted. (Most are extinct anyway.)

BTW, Quickheal has a personal firewall that is in BETA form right now. It will be $28 when finished. Maybe you could do a firewall roundup next?

You guys ready for this?

OK, the bad first:
This is probably the worst-laid-out and worst-implemented AV I’ve ever seen in my life. It’s horrible to install, horrible to run, and really is a DOS program overlaid with a really bad Windows GUI. It appears to have NO archive/packer support and cannot detect archived baddies whatsoever (well, it is a DOS program after all). Overall, it’s a gross-looking, old-school program to operate.

Now the good:
This thing is, without a doubt, the coolest and neatest little thing I’ve seen in a while. It is definition-less, but manages to detect 265 out of 321 baddies, and considering MANY are packed/archived, that’s probably a near-perfect score on the files it can actually read, right out of the box, without any connection to the internet and no ability to update. It’s scoring 82.55% without the ability to unpack/unarchive. Ironically, it finds most of the baddies with some type of heuristics or code emulation, and it’s very fast, alerting me with “Definitely an unknown Trojan” or “Strange Acting File, Probably Virus”.

I don’t think anybody could possibly like this program as far as running and using it goes; it’s rather a pain in the rear. But I cannot argue with its detection/heuristics and its ability to find new stuff. Maybe the guys at Avast should contact this dude and try to license his technology? :o

Another email from CAT, answering your question about a date/time for the new engine:

Hello Kyle,

Thanks for contacting Quick Heal support! The new engine is in its 4th beta stage. We run them through 5 beta versions to work out most of the bugs. The beta engine is only given to the public upon request. You should see the public release in late June to mid-July.

Sincerely,
The Quick Heal Team
Http://www.QuickHeal.com