Sep 21, 2009

Are anti-malware products a commodity?

As any antivirus/security vendor will tell you: "No, we're not all the same. Mine is better." :-) Yet, in the September 2009 issue of Infosecurity Magazine, Forrester analyst Natalie Lambert suggests that all anti-malware products are essentially the same: "Generally speaking, antimalware is antimalware; what you get from one vendor is not much different than what you get from another."

It is perhaps understandable how one might believe this, given all the marketing and the sheer difficulty of empirically discerning otherwise (though not really for an analyst). Much of the published testing shows scores between 98% and 99%, and other long-standing organizations have essentially declared as much through their certifications. Dozens of products have achieved the Virus Bulletin VB100%(tm) award, still others tout the West Coast Labs Checkmark(tm) certification as a mark of distinction, and ICSA Labs has certified 52 antivirus products to be up to snuff. So they must all be great, right?

Wrong. This is where real-world, independent testing comes in: testing that actually measures meaningful differences, such as proactive protection (keeping malware off the machine), time to add protection, and protection maintained over an extended period. In our recent Group Test of corporate and consumer endpoint protection products using our Live Testing methodology, we found a dramatic stratification in products' ability to stop socially engineered malware (the kind that tricks users into clicking 'download and run'), currently the largest infection vector. Here are some key findings from the consumer report:
  • Proactive 0-hour protection ranged from 26% to 70%
  • Overall protection varied between 67% and 96% (over the course of 17 days of 24x7 testing)

Given such vast differences in real-world effectiveness, what value do these certifications offer? In our opinion, not much. The threatscape has accelerated, and some vendors are adapting faster than others. Unfortunately, testing has not kept pace, leaving a huge gap in trusted, real-world knowledge. The ensuing false sense of security creates greater risk for companies and consumers. We are filling that gap by delivering data based on our Live Testing methodology.
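
For readers curious how the two numbers above relate, here is a minimal sketch of how block rates like these could be computed from a live-test log. The Attempt structure, its field names, and the 0-hour scoring rule are illustrative assumptions for this post, not a description of our actual Live Testing harness.

    # Minimal sketch: computing 0-hour and overall block rates from a test log.
    # The data model below is hypothetical; it is not our Live Testing tooling.
    from dataclasses import dataclass

    @dataclass
    class Attempt:
        url_age_hours: float   # how long the malicious URL had been live when tested
        blocked: bool          # True if the product stopped the download/execution

    def protection_rates(attempts):
        """Return (proactive 0-hour rate, overall rate) as percentages."""
        zero_hour = [a for a in attempts if a.url_age_hours == 0]
        zero_hour_rate = 100.0 * sum(a.blocked for a in zero_hour) / len(zero_hour)
        overall_rate = 100.0 * sum(a.blocked for a in attempts) / len(attempts)
        return zero_hour_rate, overall_rate

    # Toy log: 2 of 3 brand-new URLs blocked, 8 of 10 blocked overall.
    log = [
        Attempt(0, True), Attempt(0, True), Attempt(0, False),
        Attempt(6, True), Attempt(12, True), Attempt(24, True),
        Attempt(48, False), Attempt(72, True), Attempt(96, True), Attempt(120, True),
    ]
    print(protection_rates(log))   # roughly (66.7, 80.0)

The overall figure exceeds the 0-hour figure because vendors add protection over time, which is exactly why we measure both the initial block rate and protection across 17 days of continuous testing.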

Since we performed these tests on our own, without any vendor funding, we are selling the Group Test report of corporate endpoint protection products. See all the anti-malware product reports.

The products we tested:
  1. AVG Internet Security, version 8.5.364
  2. ESET Smart Security 4, version 4.0.437
  3. F-Secure Client Security, version 8.01
  4. Kaspersky Internet Security 2010, version 9.0.0.459
  5. McAfee VirusScan Enterprise 8.7.0 + McAfee SiteAdvisor Enterprise 2.0.0
  6. Norman Endpoint Protection for Small Business and Enterprise
  7. Sophos Endpoint Protection for Enterprise - Anti-Virus version 7.6.8
  8. Symantec Endpoint Protection (for Enterprise), version 11
  9. Panda Internet Security 2009, version 14.00.00
  10. Trend Micro OfficeScan Enterprise, version 10

Sep 5, 2009

What % of threats do you expect your anti-malware product to stop?

We are about to publish a new round of anti-malware testing data and would like to compare perceptions with reality. I'm expecting some interesting results, to say the least.

7 simple questions here:
http://www.surveymonkey.com/s.aspx?sm=oiGBnkYL3i_2bBTEE4P24QNA_3d_3d

Thanks for your help.