New Virus.gr Tests! "2-16 April 2005"

Hi

This site, Virus.gr, ran tests on many antivirus programs,

and Avast came in at " 21. Avast version 4.6.623 - 76.65% " !! Why??

2-16 April 2005 (NEW!!!) :
http://www.virus.gr/english/fullxml/default.asp?id=69&mnu=69

*      The test was made on 02-16 April 2005, using Windows XP Professional SP1 on a P4 2600 MHz, 512MB DDRAM.
*      All programs tested had the latest versions, upgrades and updates and they were tested using their full scanning capabilities e.g. heuristics, full scan etc.
*      The 91202 virus samples were chosen using VS2000 according to Kaspersky, F-Prot, RAV, Nod32, Dr.Web, Sweep, BitDefender and McAfee antivirus programs. Each virus sample was unique by virus name, meaning that AT LEAST 1 antivirus program detected it as a new virus.
*      ALL virus samples were unpacked and the only samples that were kept were the ones that were packed using external-dos-packers (that means not winzip, winrar, winace etc).
*      The virus samples were given the correct file extensions using a special program (Renexts) and were unique according to checksum32 and file size.
*      Most "fake" virus samples were removed, as well as "garbage" files.
*      The program PER was not tested because there was no English demo version available.
*      The programs Extendia AVK, BOClean, VET, Titan, RisingAV and Freedom were not tested because there was no demo version available.
* The program InVircible did not include a "typical" scanner-function and could not be tested.
*      The program V-Catch checks only mail accounts and could not be tested.

The following file types were used.

BAT, BIN, CLA, CLASS, CLS, COM, CSC, DAT, DOC, ELF, EML, EXE, HLP, HQX, HTA, HTM, IMG, INF, INI, JS, MAC, MDB, MSG, OLE, PHP, PIF, PL, PPT, PRC, REG, SCR, SH, SHS, SMM, STI, TD0, TPU, VBA, VBS, WBT, XLS, XMI, XML.

The virus samples were divided into these categories, according to the type of the virus : 

*      File = BeOS, FreeBSD, Linux, Palm, OS2, Unix, BinaryImage, BAS viruses, MenuetOS viruses.
*      MS-DOS = MS-DOS and HLL* viruses.
*      Windows = Win.*.* viruses.
*      Macro = Macro and Formula viruses.
*      Malware = Adware, DoS, Constructors, Exploit, Flooders, Hoax, Jokes, Nukers, Sniffers, Spoofers, Virus Construction Tools, Virus Tools, Corrupted, Droppers, Intended, PolyEngines.
*      Script = BAT, Corel, HTML, Java, Scripts, VBS, WBS, Worms, PHP, Perl viruses.
*      Trojans-Backdoors = Trojan and Backdoor viruses.
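The uniqueness criterion in the methodology above (one sample per checksum/file-size pair) is simple to reproduce. Here is a minimal sketch in Python, assuming CRC32 as the checksum; the "checksum32" tool the testers used isn't specified in detail, so that choice is an assumption:

```python
import os
import zlib

def dedupe_samples(paths):
    """Keep one sample per (checksum, file size) pair, as in the test methodology.

    zlib.crc32 is assumed as a stand-in for the testers' "checksum32" tool.
    """
    seen = set()
    unique = []
    for path in paths:
        with open(path, "rb") as f:
            checksum = zlib.crc32(f.read()) & 0xFFFFFFFF
        key = (checksum, os.path.getsize(path))
        if key not in seen:  # first sample with this checksum/size wins
            seen.add(key)
            unique.append(path)
    return unique
```

Two byte-identical samples under different names collapse to one entry, which is presumably how renamed duplicates were filtered out of the 91202-sample set.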

A lot of these programs I’ve never heard of, but the methodology seems sound and the results consistent with other reviews and tests I’ve seen.

I know it’s a big disappointment for fans of Avast!, but it’s not the best at detecting viruses.

I can confirm this from my own experience: I was cleaning a computer which had been connected to the internet with XP SP1, no anti-virus and no firewall. As you might imagine, it had just about every virus, Trojan, worm and spyware program ever written on it.

I cleaned it with Avast!, Trend Micro Sysclean and TDS-3 and they all found heaps of malware, but a worm remained in memory which none could shift. The removal tool for that worm from F-Secure couldn’t remove it either.

Finally I downloaded Kaspersky and it removed that worm and a couple of other Trojans.

Of course that’s not the whole story: on an up-to-date system with a firewall, Avast! provides good protection. I use it myself. Kaspersky is also very expensive.

However, in terms of detection rate, I’m not surprised it’s No 1.

That’s what makes the “test” completely wrong. You simply cannot use the results of a few antivirus scanners to decide whether something is a virus or not, and then use these results to test the antiviruses! Of course, such an approach favours the antiviruses that were used to choose the samples and is disadvantageous to the other ones.
Also, if just 1 antivirus program detects the sample as virus, it’s quite likely that it’s a false alarm.

I’d like to hear how this was achieved. By manual verification of 91202 samples, analysing each of them and deciding whether it’s a virus or not? I certainly doubt it.

I believe similar “tests” were discussed here a number of times already. It’s simply not that easy to perform a good test of antivirus detection capabilities.

I completely agree with you Igor :smiley: :smiley:

If you choose 3 or 4 antivirus editors to select a list of test viruses, they will of course choose those that are detected by their own products!!! :slight_smile:

What I don’t understand is the detection difference between Virus Chaser (88.31%) and Dr.Web (78.71%). AFAIK they still use the same scanning engine and virus defs.

From Virus Chaser site:

The engine of Virus Chaser is developed as a home product by technical affiliation with the DrWeb engine of Dialogue Science. DrWeb is continuously ranking in 1st place in the benchmarking tests made by the world’s most authoritative computer virus magazine, Virus Bulletin

It’s easy to perform a good test of anti-virus detection capabilities. Run a scan with one anti-virus program and see what’s left: active worms and Trojans running as processes left behind means that anti-virus program’s detection capabilities leave something to be desired. Run another anti-virus program and if it finds those worms and Trojans, sorry but it’s a better program.

On a positive note, Avast!'s rate was better than AVG’s!

It's easy to perform a good test of anti-virus detection capabilities. Run a scan with one anti-virus program and see what's left: active worms and Trojans running as processes left behind means that anti-virus program's detection capabilities leave something to be desired.

Run another anti-virus program and if it finds those worms and Trojans, sorry but it’s a better program.

  1. Possible, yes. Easy, I don’t think so.

  2. This assumes the second AV would also have detected everything the previous AV detected.

In 2002, AVG was above Avast!, which confirms my belief that Avast! has improved greatly recently and is now probably the best free anti-virus you can get, and also a worthy pay program.

I can understand the argument that the big anti-viruses found the viruses in the first place, so they’re more likely to detect them in a test like this. But are these somehow not real viruses? Shouldn’t other companies keep up with definitions once a virus is identified? If my anti-virus program doesn’t detect something another program would remove, am I supposed to feel happier saying to myself ‘Well, that’s a Kaspersky virus, so of course Kaspersky will remove it’? Or am I right in thinking ‘why hasn’t my anti-virus program removed this ugly, three-headed worm squatting in the memory of this computer and grinning at me from Process Explorer?’

The big boys in the anti-virus field have huge budgets to spend on identifying new viruses: F-Secure has a team in the US as well as Finland, so they can work round the clock. It will always be difficult for smaller companies to have the same detection rate. But the difference is not that big: just a few percentage points. And the results of tests like this can change: I’ve heard the Avast! team is working on improving detection rates for version 5, and they’ve been doing such a good job with the program recently, I’m sure they’re going to close the gap even more!

Hi David,

Then do it in a lab, and do it both ways! Use one, see what’s left. Put your nasties back. Use the other.

All programs miss something of course. Even the best have only a 90% hit rate, and Avast! isn’t far behind.

It's easy to perform a good test of anti-virus detection capabilities. Run a scan with one anti-virus program and see what's left: active worms and Trojans running as processes left behind means that anti-virus program's detection capabilities leave something to be desired. Run another anti-virus program and if it finds those worms and Trojans, sorry but it's a better program.

This is nonsense ;), I have said this before.
The second AV could quite easily have missed a couple of the viruses the first AV missed. Of course, the only way you could tell is to not remove the viruses, but then you wouldn’t know which had the better removal rate.

Anyway, lets look at this logically for a second or two.

These AV testers are throwing something like 100 malware files at these anti-viruses and basing their overall view of the AV on it, which is near pointless.
I mean, there is so much malware out there; at a rough guess, say 300,000 malware files. So these testers are basing the AV detection rates on about 0.03% of the total malware out there. Surely you must see how fundamentally pointless this is.

Of course, this could just be my own opinion, but it just seems correct to me.

–lee

These tests are not trustworthy… The only ones you can safely trust are Virus Bulletin’s…

If you look here,
http://www.virus.gr/english/fullxml/default.asp?id=69&mnu=69
in the top 20 there are AVs that nobody knows exist

  1. AVK version 15.0.5 - 97.93%
  2. Virus Chaser version 5.0 - 88.31%
  3. CyberScrub version 1.0 - 87.87%
  4. Arcavir - 87.73%
  5. MKS_VIR 2005 - 87.70%
  6. RAV version 8.6.105 - 87.26%
  7. Command version 4.92.7 - 84.92%

What about NOD32 (position 18), which in my opinion is better than those above it…

You realise that tests like this are garbage

Plus this
The programs Extendia AVK, BOClean, VET, Titan, RisingAV and Freedom were not tested because there was no demo version available.

Rank

  1. Kaspersky Personal Pro version 5.0.20 - 99.28%

  2. AVK version 15.0.5 - 97.93%

;D ;D ;D ;D

I perform AV tests monthly using FreewheelinFrank’s logic. All my tests are done through real-world application, and each and every month Avast! proves to be the best in detection. During my daily, worldly journeys Avast! pops up with virus detection and eliminates the pest. At the end of the month I do a scan with KAV, BitDefender, a2 and Ewido; none of these find anything. :slight_smile:

The internet, like life, has a tiger lurking in darkness in some parts of the world. In the parts of the world I visit, that tiger doesn’t exist except under lock & key. Should I carry a gun in case he escapes?

The hardest part any AV company deals with is not the virus itself; misconceptions derived from opinions reported as fact are the hard part.

To those of you who wish to use my opinion as fact, feel free to do so; a new misconception might be a welcome change of pace.

http://img67.exs.cx/img67/2082/thcheer6mb.gif
Avast!
http://img67.exs.cx/img67/2082/thcheer6mb.gif

Never heard of disk imaging? Test a program on an infected hard disk. See what it cleans up. Restore the disk image and try another program.
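The snapshot/restore cycle being suggested could be sketched like this; a plain file stands in for the disk image, and all the names here are hypothetical (real imaging would work on a block device with a dedicated imaging tool, not a file copy):

```python
import shutil

def snapshot(disk, image):
    """Save the infected disk's current state before running any scanner."""
    shutil.copyfile(disk, image)

def restore(disk, image):
    """Put the nasties back so the next scanner sees an identical starting state."""
    shutil.copyfile(image, disk)

# Fair comparison: every scanner starts from the same baseline.
# snapshot("infected.img", "baseline.img")
# ... run scanner A, record what it cleaned ...
# restore("infected.img", "baseline.img")
# ... run scanner B against the identical starting state ...
```

That way neither scanner's result depends on what the other one already removed.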

Testing anti-virus programs against real malware is fundamentally pointless? Oh, come off it!

I agree that this test has a lot of unknown programs in it but I’ve said it before and I’ll say it again: throwing malware at an anti-virus program is a perfectly valid test, if done independently and scientifically.

Avast! provides a perfectly good level of protection and detection. The few things it does miss it probably catches a few days later. (I used Kaspersky to clean up a new variation of Codbot which Avast! couldn’t touch last week. The next day a new definition came through from Avast! for a Codbot variant. Betcha that was the same one!)

But anybody who claims in the face of ALL the tests and reviews published of anti-virus programs that Avast! has the best detection rate or is the best anti-virus program is living in cloud cuckoo land.

It’s like saying a Skoda (another fine Czech product I believe) is better than a BMW. Nobody will believe you. Tell me you drive a Skoda because it’s more economical, more practical around town, has more character, is less likely to get nicked and anyway most BMW drivers are knobheads and I’ll believe you.

I say this with the utmost respect for you guys’ knowledge and experience.

In my opinion, a test like this (even from the most respectable, like Virus Bulletin or ICSA Labs) tells almost nothing about the “real-world detection” of antivirus software.

In my opinion, “real-world detection” is not about the total number of malware samples detected in a test. It depends on the speed of reaction in releasing the proper database or proactive detection for new malware (i.e. the gap between when the malware is released, when your AV company releases a new database, and when your antivirus is able to update so it can detect that malware). That’s why (among other reasons) Kaspersky releases its database hourly, while NOD32 and Norman Sandbox depend heavily on proactive detection such as so-called heuristics.
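That reaction-speed point can be made concrete as an exposure window per sample: the time from a sample's release until a detecting definition actually reaches the user's machine. A toy illustration with hypothetical timestamps:

```python
from datetime import datetime

def exposure_window(malware_released, definitions_on_client):
    """How long a user stayed exposed to a new sample: from its release
    until a definition detecting it actually reached their machine."""
    return definitions_on_client - malware_released

# Hypothetical example: hourly updates close the window far faster than daily ones.
released = datetime(2005, 4, 2, 10, 0)
hourly_client = datetime(2005, 4, 2, 13, 0)  # definition arrived 3 hours later
daily_client = datetime(2005, 4, 3, 10, 0)   # definition arrived a day later
```

A static detection-rate test measures neither of those windows, which is the point being made above.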

Avast can be very fast to respond to the major malware but also slow with some lesser-priority malware.

I say this with the utmost respect for you guys' knowledge and experience.

We know; this is a discussion, the idea of which is to get your opinions voiced, so no offence taken :wink:

Anyway,

Never heard of disk imaging? Test a program on a infected hard disk. See what it cleans up. Restore the disk image and try another program.

Good point, didn’t think of that.

Testing anti-virus programs against real malware is fundamentally pointless? Oh, come off it!

I didn’t say that. I was saying that judging an AV’s detection rate using 0.03% of all known malware is fundamentally pointless (using my example, anyway).

Avast! provides a perfectly good level of protection and detection. The few things it does miss it probably catches a few days later. (I used Kaspersky to clean up a new variation of Codbot which Avast! couldn't touch last week. The next day a new definition came through from Avast! for a Codbot variant. Betcha that was the same one!)

I would have agreed with this anyway; it’s a logical statement.

But anybody who claims in the face of ALL the tests and reviews published of anti-virus programs that Avast! has the best detection rate or is the best anti-virus program is living in cloud cuckoo land.

I didn’t say it is the best; there is no ‘best’ scanner. They all find and do different things, and anyone trying to find the ‘best’ scanner is ‘living in cloud cuckoo land’. Again, my opinion.

It's like saying a Skoda (another fine Czech product I believe) is better than a BMW. Nobody will believe you. Tell me you drive a Skoda because it's more economical, more practical around town, has more character, is less likely to get nicked and anyway most BMW drivers are knobheads and I'll believe you.

Depends on what kinda cars you like, same can be applied to anything.

In my opinion, a test like this (even from the most respectable, like Virus Bulletin or ICSA Labs) tells almost nothing about the "real-world detection" of antivirus software.

True. I actually liked FastGame’s way of testing an AV; to me this is the most logical and scientific way.

Norman Sandbox desperately depend on proactive detection such as so-called heuristics.

Never really liked the idea of relying on heuristics myself.

Avast can be very fast to response to the major malware but also slow to some lesser priority malware.

Mostly true, but overall the detection rate is still very, very good, or it would be pointless using it.

–lee

What I find interesting is that product that came second (AVK) uses both the Kaspersky and Bitdefender engines and yet Kaspersky on its own beat AVK!!! ;D

And avast! beat 63.8% of the competition. Not too shabby. :smiley:

Maybe the reason KAV did slightly better is its use of the extended database? I don’t know if AVK uses it.

But anybody who claims in the face of ALL the tests and reviews published of anti-virus programs that Avast! has the best detection rate or is the best anti-virus program is living in cloud cuckoo land.

Sorry, Lee.

This one wasn’t aimed at you specifically, just at the general reaction to tests and reviews like this: that no such test can ever have anything to say about the quality of the program (and other anti-virus programs), or could ever reveal any one program to be less good in any specific area.

I think Avast! is a very good anti-virus product, and getting better. I’d recommend it to anybody. Indeed I put it on three computers last week. I hope those people like it enough to buy it, but at least I know that if they can’t afford to, they’ll still have excellent protection from the free version. But I couldn’t honestly say to somebody that there are no other choices, some of which may have an edge in some areas.

I don’t think anybody here is doing Avast! any favours by denying that any review that does not put Avast! at No.1 has any value.

I always try to point out the positive things these tests show: the difference between these tests is small, Avast! protected against 100% of nasties in a recent major computer mag. review (where other major anti-virus programs [Panda] did not,) Avast! is improving all the time and is (in my opinion at least) probably the best free anti-virus program now, and a worthy pay program.

It got a recommendation from the BBC only last week, both in free and pay form (‘a bargain’.)

But I still think there’s room for improvement. Why, if there is no room for improvement in Avast!'s detection rate, is the big drive for Version 5 (as I’ve read) an improved detection rate?