
2 Intern • 5.8K Posts • April 12th, 2010 21:00

Virus Bulletin: April 2010 VB100 Results for Win XP

"In VB's largest ever VB100 comparative review, a total of 60 products were put to the test on Windows XP, with two thirds of the products earning VB100 certification."
http://www.virusbtn.com/vb100/latest_comparative/index
(free registration required)

Comment:

Which means that fully one third (20/60) of AV programs failed to make the grade, including:
- Kaspersky Anti-Virus 6 for Windows Workstations (1 wildlist miss)
- Lavasoft Internet Security (2 wildlist misses, 2 false positives)
- Microsoft Security Essentials (1 wildlist miss)
- Norman (110 wildlist misses, 3 false positives)
- Check Point Zone Alarm Suite (1 wildlist miss)
- Emsisoft a-squared Anti-Malware (974 wildlist misses, 1 false positive)
- Sunbelt VIPRE AntiVirus Premium (2 false positives)
- Ikarus (973 wildlist misses)
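
Just to sanity-check the arithmetic, here's a rough tally of only the failures named above (the counts are copied straight from VB's summary and my list; the snippet itself is just an illustration, not anything VB publishes):

```python
# Rough tally of the failing products named above; counts copied from the post.
failures = {
    "Kaspersky AV 6 for Workstations":  (1, 0),    # (wildlist misses, false positives)
    "Lavasoft Internet Security":       (2, 2),
    "Microsoft Security Essentials":    (1, 0),
    "Norman":                           (110, 3),
    "Check Point ZoneAlarm Suite":      (1, 0),
    "Emsisoft a-squared Anti-Malware":  (974, 1),
    "Sunbelt VIPRE AntiVirus Premium":  (0, 2),
    "Ikarus":                           (973, 0),
}

total_products = 60
passed = 40                      # "two thirds ... earning VB100 certification"
print(f"Failed: {total_products - passed}/{total_products} "
      f"= {(total_products - passed) / total_products:.0%}")

# One wildlist miss or one FP is enough to fail, so severity varies wildly:
for name, (misses, fps) in sorted(failures.items(), key=lambda kv: -kv[1][0]):
    print(f"  {name:33s} misses={misses:4d}  FPs={fps}")
```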

On a more positive note, both Avira Personal and Alwil's avast! Free achieved certification. And to be fair, so did Kaspersky Anti-Virus 2010.

VB100 has strict standards, and I'm not sure how much importance to attach to a single wildlist miss, or a FP. But a-squared should seriously reconsider its partnership with Ikarus!

3 Apprentice • 15.3K Posts • April 13th, 2010 05:00

Joe,

(being too lazy to register)

did the article indicate which versions/builds of avast & avira were tested?

2 Intern • 5.8K Posts • April 13th, 2010 17:00

VB didn't mention which free version/build of either avast or Avira was tested.

In its description of testing methods, VB does mention that testing is done over the entire month prior to the month of publication (in this case, March), so I suspect they were the latest versions available in February: Avira 9.x and avast! 5.x (Avira 10 wasn't released until mid-March). The actual submission deadline wasn't mentioned.

Products and versions submitted for testing are chosen by the vendors, and I can't imagine they would submit anything but the latest.

2 Intern • 5.8K Posts • April 13th, 2010 20:00

RD:

Good links, and I agree with much of what is said.

On-demand and on-access detection is not the primary purpose of an AV, but it is still one function that can be objectively tested on a level playing field. It hardly tests the main purpose of preventing malware and virus infection in real-time use, and I would not use VB100 test results as a major factor in choosing an AV (or AV-Comparatives' results, for that matter, although I place more faith in their various other tests).

The claim in the Emsisoft thread that VB testing is "sponsored" and thus somehow suspect is false. Testing is free of charge.

Another opinion:

"The surprising thing is that while many criticize WildList based tests for being limited in scope (the WildList certainly is not a comprehensive list of malware) so many products fail to pass these tests. This perhaps more than anything highlights their usefulness as a baseline. If your product isn’t reasonably consistent in achieving the VB 100 Award, perhaps you should think about a different one. Often the problem is not detection so much as false detection, making the FP part of the test very important. Any product could detect 100% of all viruses very easily, it’s much more difficult to detect ONLY viruses, and nothing else."
- http://avien.net/blog/?p=479
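
To put that last point in concrete terms, here's a toy illustration (all file names and numbers are made up and don't represent any real product): a scanner that flags everything gets a perfect detection rate but fails on false positives, which is exactly what VB100's zero-FP requirement is meant to catch.

```python
# Toy illustration of the avien.net point: detection rate alone is meaningless
# without a false-positive check. Everything here is invented for illustration.

def score(detector, malware_samples, clean_files):
    detected  = sum(detector(s) for s in malware_samples)
    false_pos = sum(detector(f) for f in clean_files)
    return detected / len(malware_samples), false_pos

malware = [f"virus_{i}" for i in range(100)]
clean   = [f"goodfile_{i}" for i in range(10_000)]

flag_everything = lambda sample: True                         # "detects" 100% of malware...
selective       = lambda sample: sample.startswith("virus")   # stands in for a tuned engine

for name, det in [("flag everything", flag_everything), ("selective", selective)]:
    rate, fps = score(det, malware, clean)
    print(f"{name:16s} detection={rate:.0%}  false positives={fps}")
# VB100 requires 100% wildlist detection AND zero FPs, so only the second would pass.
```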

Thus, when I see Ikarus has yet to attain VB100 certification in 8 tries over many years, I'm hard-pressed to recommend it (and by extension, a2-anti-malware) as a resident AV, particularly when it racked up a total of 99 FPs across those 8 tests. My use of a2-am as an on-demand scanner confirms that FPs are frequent.

Conversely, NOD32 hasn't failed certification in 44 tests since 2002, with no FPs over that period. This again agrees with my experience: I can't recall any FPs in the 3 years I've used it.

In short, I think VB100 testing has some limited utility, but not as an overall test of effectiveness against infection.



2 Intern • 5.8K Posts • April 13th, 2010 23:00

Just an additional note on other testing done by VB: Reactive and Proactive (RAP) testing.

"Virus Bulletin’s RAP Testing measures products' reactive and proactive detection abilities against the most recent malware that has emerged around the world.

The test measures products' detection rates across four distinct sets of malware samples. The first three test sets comprise malware first seen in each of the three weeks prior to product submission. These measure how quickly product developers and labs react to the steady flood of new malware emerging every day across the world. A fourth test set consists of malware samples first seen in the week after product submission. This test set is used to gauge products' ability to detect new and unknown samples proactively, using heuristic and generic techniques."
- http://sunbeltblog.blogspot.com/2010/04/vbs-rap-on-vipre.html

There's a nice graphic at that link that plots reactive vs proactive detection for April; generally speaking, the better-performing AVs/suites are in the upper right-hand corner. FPs are not taken into account.
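
For what it's worth, the way I read VB's description, each product ends up with two numbers: the average detection rate over the three "reactive" (pre-submission) sets, and the rate on the "proactive" (week-after) set; those two numbers are what get plotted. A minimal sketch of that scoring, with invented product names and rates:

```python
# Sketch of how a RAP point appears to be computed from VB's description above:
# reactive score  = average over the three pre-submission weekly sets,
# proactive score = detection on the week-after-submission set.
# The products and rates below are invented, not real test data.

def rap_point(weekly_detection_rates):
    """weekly_detection_rates: rates for the [week-3, week-2, week-1, week+1] sets."""
    *reactive_sets, proactive_set = weekly_detection_rates
    reactive = sum(reactive_sets) / len(reactive_sets)
    return reactive, proactive_set

products = {
    "ProductA": [0.95, 0.93, 0.90, 0.82],   # hypothetical numbers
    "ProductB": [0.88, 0.85, 0.80, 0.60],
}

for name, rates in products.items():
    reactive, proactive = rap_point(rates)
    # "Upper right-hand corner" of the quadrant = high reactive AND high proactive.
    print(f"{name}: reactive={reactive:.1%}, proactive={proactive:.1%}")
```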
