
2 Intern • 5.8K Posts


April 25th, 2014 21:00

Recent AV-Comparatives Tests

File Detection Test March 2014
http://www.av-comparatives.org/wp-content/uploads/2014/04/avc_fdt_201403_en.pdf

Real-World Protection Test March 2014
http://www.av-comparatives.org/wp-content/uploads/2014/04/avc_factsheet2014_03.pdf

Comments:

- The File Detection Test (formerly called "On-Demand") basically tests an AV scanner's ability to detect malicious files usually obtained from sources other than the internet (such as USB devices and other media). Most AVs typically have detection rates approaching 100%. More interesting are the "False Positive" rates reported. Avast Free did not fare well in this department, with some 95 FPs this time around. AV-C provides a link to all FPs detected, and they appear related to the latest version of Avast (2014.9.0.2013). Most were reported as Win32:Evo-gen, which is a relatively new heuristic detection of Avast's (and has probably been corrected since). Panda Cloud AV Free and MSE had only 1 FP each.

- The Real-World Protection Test evaluates blocking of malware encountered while browsing. A total of 1264 malicious URLs were used. MSE, with an 88.4% protection rate, was used as a baseline for comparison. Panda Cloud AV Free protected against about 99%, and Avast Free 2014 blocked about 98% (both with no FPs). These results rivalled their paid counterparts.

- These results are but a snapshot in time, and may not represent long-term protection/detection rates. But I see nothing here to change my opinion that there are good free AVs out there, and that the average home user need not pay for AV protection.

3 Apprentice • 15.3K Posts

April 26th, 2014 06:00

I don't know that the avast F/P problem is inherent to the build (2013)... more likely, it was dependent on the virus database then under testing/consideration.   [For what it's worth, the current avast 2014 build is 2018.]

F/P's are the main reason I choose to have avast ASK me what to do when it finds a "problem". Of course, not everyone is in a position to investigate such issues and make the appropriate decision; most users rely on their anti-virus to proceed automatically... and as such, a large number of F/Ps is very problematic.

As Joe noted, Win32:Evo-gen [Susp] is a generic/heuristic detection used by Avast Antivirus products for a file that appears to have suspicious behavior. By its very nature, any generic/heuristic/behavioral detection is more prone to a F/P than a signature-based detection... so users need to take extra care whenever such a generic detection is asserted.

Suffice it to say, I'm disappointed by avast's showing in this test... and pleased by Panda's.
 

2 Intern • 5.8K Posts

April 26th, 2014 21:00

I don't worry too much about tests of FPs from on-demand scans when choosing an AV, for several reasons:

- I don't run too many on-demand scans, and certainly not deep or whole system scans.
- I have mostly never heard of the FP files AV-C detects in this test.
- Like ky331, I always configure my AV to notify only, not delete or quarantine anything.

Avast Free has no reputation for detecting lots of FPs; I too suspect the latest AV-C test was an aberration due to the database, as ky suggested. Six months ago, when AV-C last ran this test, the tables were turned, with Panda Cloud Free having 20 FPs and Avast Free only 10. (Again, most were generic heuristic detections.) I was using both of these free AVs at the time, but recall no FP detections by either.

The undisputed king of low FP detections for some years now is MSE (Defender in Win8). Balanced against this are its consistently lower protection rates in the Real-World tests. Despite the test results, none of these 3 free AVs has allowed any malware onto any of my systems. Depending on your safe-surfing habits and additional layers of security, YMMV.
