Dec 10, 2009

Network IPS Group Test Results Available

Over the past few years of working with clients and testing IPS products, we began to notice some troubling trends. Some vendors missed more attacks than seemed acceptable. A number of vendors refused to participate in our testing, even at no cost. Enterprise readers asked tough questions about current products and issues. And breaches resulting in compromised data continued to increase. Meanwhile, we heard a consistent set of beliefs from our clients, industry analysts, and researchers, which can be summarized as follows:

· IPS is a mature market. There is relatively little difference between products; thus, management and price are the key purchasing factors

· Best-of-breed products are more effective than those from strategic vendors who provide a wider range of products

· The market leader (from an installed base perspective) provides the best protection

· Organizations are protected as long as they keep their IPS systems updated

This group test set out to determine whether these beliefs were correct. To garner the greatest participation and allay any potential concerns of bias, we invited all leading vendors to submit their products at no cost. Every vendor below brought and configured its best Network IPS products; all were generally available (GA) releases, and no beta or otherwise unavailable products were included. The following is the current list of products tested, sorted alphabetically:

1. Cisco IPS 4260 Sensor, version 444.0

2. IBM Proventia Network IPS GX4004, version 29.100

3. IBM Proventia Network IPS GX6116, version 29.100

4. Juniper Networks IDP-250, version 5.0.110090831

5. Juniper Networks IDP-600c, version 5.0.110090831

6. Juniper Networks IDP-800, version 5.0.110090831

7. McAfee M-1250, signature version 5.4.5.23

8. McAfee M-8000, signature version 4.1.59.23

9. Sourcefire 3D 4500, rules version 4.8.2.1

10. Stonesoft StoneGate IPS 1030, 5.0.1 build 5053 update package 261

11. Stonesoft StoneGate IPS 1060, 5.0.1 build 5053 update package 261

12. Stonesoft StoneGate IPS 6105, 5.0.1 build 5053 update package 261

13. TippingPoint (TP) 10 IPS, DV 2.5.2.7834

14. TippingPoint 660N IPS, DV 2.5.2.7834

15. TippingPoint 2500N IPS, DV 2.5.2.7834

Some vendors provided multiple test units of varying performance levels. Across the board, all vendors claimed identical protection across models.

The results will be shocking to many and have already generated plenty of buzz. The 1,159 live exploits used is the most ever tested. If you currently operate one of these products, or are considering investing in an IPS, the information in this exclusive test report is invaluable.

Dec 8, 2009

TippingPoint Tests

In response to inquiries about the blog posting made by TippingPoint President, Alan Kessler, we provide the following:

In August 2009, NSS Labs performed an independent test of the TippingPoint 10 product and determined it blocked only 39% of common exploits. Subsequently, as TippingPoint has stated, they engaged our lab for private testing and further assistance. TippingPoint customers can see the result in the spike of hundreds of new filters that appeared in October and November.

In early December, NSS Labs released its independent IPS group test of 15 IPS products submitted by 7 vendors, including TippingPoint. The product improved marginally, but is rated 'caution' due to its subpar protection in our tests. TippingPoint has now publicly complained on its blog that the test must be inaccurate because it did not correspond with the results of their private testing with NSS, with 'customer experience', or with their internal testing.

We have three points in response:
1. Modern IPS products are so complex that customers will rarely be in a position to question or test a vendor properly, and they rarely do when it is a brand name. Very few enterprises have the sophisticated testing tools, expertise, and access to exploits that the vendors and a professional security testing lab like NSS have. So having a lot of customers does not necessarily mean those customers are aware of the true protection they are receiving. In fact, not knowing is a liability in itself for all involved.

2. Regarding the private testing results: at NSS we do not use the same attack set in private testing as we do in public testing. That would be like handing out a copy of the test and the answers beforehand, and it would give private clients an unfair advantage over other vendors. We do test the same vulnerabilities, but the specific exploits we use vary. This should underscore the integrity of NSS Labs' testing principles and procedures. In general, differences in results can be attributed to signatures written too narrowly (e.g., for specific exploits rather than the underlying vulnerabilities) or to signatures written for a test lab environment.

3. We certainly cannot account for any vendor's internal testing procedures. However, the findings of our two previous tests were ultimately corroborated.

As for delaying the Network IPS Group Test Report: it would be unfair to enterprise readers not to disclose validated test results that could help them mitigate threats their defenses may not stop as expected (withholding the results could be considered irresponsible non-disclosure). A delay would also be unfair to the other 6 vendors who participated. As with the previous tests, NSS took great care to validate the results. Acting on them to deliver better customer protection is the next imperative.

Dec 7, 2009

Maintaining Test Integrity during Private Testing

NSS does not utilize the same exploits in private tests as in public tests or certifications. This maintains the integrity of the public tests and ensures that vendors engaged in private testing are not given an unfair advantage. Attacks used in private testing target the same range of vulnerabilities used in the public tests, but different versions of the exploits are used.

· The same range of vulnerabilities is represented in both the private and public tests. However, different exploit variants are used between the two types of test to ensure vendors are writing vulnerability-based signatures in order to adequately protect their customers, and not simply writing exploit-specific signatures to perform well in testing. For example, private tests utilize a higher number of Proof of Concept (POC) exploits and PCAPs, whereas public testing and certification relies exclusively on NSS' unique and comprehensive live exploit test harness.

· Vendors who write vulnerability-based signatures rather than exploit-specific ones will achieve similar results in both private and public tests.

· Vendors that write signatures to catch POC PCAPs, but not real exploits and their variants, may see different test results between private and public tests. (A simplified illustration of this distinction follows below.)
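To make the distinction concrete, here is a minimal, hypothetical sketch in Python. It is not drawn from any vendor's detection engine; the "vulnerable" field, the 256-byte limit, and the payload strings are all invented. It contrasts an exploit-specific signature, which matches one known PoC payload, with a vulnerability-based check, which flags any input that triggers the underlying flaw:

# Hypothetical illustration only: exploit-specific vs. vulnerability-based detection.
# The "vulnerability" is an invented overflow in a request field that a fictional
# server mishandles beyond 256 bytes; the payloads below are made up.

EXPLOIT_SPECIFIC_PATTERN = b"PUBLIC_POC_SHELLCODE"  # matches one known PoC only
MAX_SAFE_FIELD_LEN = 256                            # the condition the flaw depends on

def exploit_specific_signature(field: bytes) -> bool:
    """Fires only on the exact public proof-of-concept payload."""
    return EXPLOIT_SPECIFIC_PATTERN in field

def vulnerability_based_signature(field: bytes) -> bool:
    """Fires on any input that triggers the underlying flaw (overlong field)."""
    return len(field) > MAX_SAFE_FIELD_LEN

public_poc = b"PUBLIC_POC_SHELLCODE" + b"\x90" * 300
reworked_variant = b"\x90" * 300 + b"DIFFERENT_SHELLCODE"  # same flaw, new payload

for name, attack in [("public PoC", public_poc), ("reworked variant", reworked_variant)]:
    print(name,
          "| exploit-specific:", exploit_specific_signature(attack),
          "| vulnerability-based:", vulnerability_based_signature(attack))
# The exploit-specific check misses the reworked variant; the vulnerability-based
# check catches both, which is why the exploits vary between private and public tests.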


Of course, this policy means we have to keep investing in keeping things fresh, accurate and relevant. It's a never-ending job, and we have some of the best people in the industry doing it.

Dec 6, 2009

Raising the Bar in Testing


It is important for NSS Labs to periodically raise the bar as the industry advances, both offensively and defensively. In other words, we increase the difficulty of the test to match the threat landscape (as driven by cyber criminals' innovative new attacks) and the capabilities of vendors who have developed new approaches to countering those threats.

This is what NSS Labs has been doing over the past two years in particular: raising the bar. As such, our reports, and the scores vendors receive on them, deserve to be put into context. We have performed a number of different types of tests over the past few years:

1. Product Certification - a full review of the whole product, including security effectiveness, performance, manageability, and stability.

2. Group Tests - comparative testing of a class of products from leading vendors. Due to the number of vendors, the depth of the testing may be abbreviated and focused. The browser security, endpoint protection, and Network IPS tests are great examples of these.

3. Security Update Monitor (SUM) Tests - unique in the industry, NSS Labs has been testing IPS products on a monthly basis. Every quarter we tally the average scores and bestow awards. The attacks in this test set focus on vulnerability disclosures made the previous month, so the number of new additions is generally a couple of dozen per month. Overall, there are currently 300 entries.

4. Individual Product Reports - new, detailed reports created during group testing. They capture the nitty-gritty details that are rolled up into the group test.

5. Exposure Reports - an industry and NSS first. These reports list the specific vulnerabilities that are NOT shielded by an IPS. This information is critical in helping organizations know where their protection is and is not. Contact us for a confidential demonstration. (Knowing can help identify appropriate mitigations, such as writing custom signatures, implementing new rules, re-segmenting the network, or ultimately switching or adding a security product.)

By the Numbers:
Exploit selection and validation is a serious matter. Our engineers take care to identify the greatest risks to enterprises, based on commonly deployed systems, utilizing a modified CVSS (Common Vulnerability Scoring System) approach.
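As a rough, hypothetical sketch of what such risk-driven selection can look like (the weighting formula, field names, and candidate entries below are invented for illustration and are not NSS Labs' actual methodology), one might rank candidate exploits by their CVSS base score adjusted for how commonly the affected system is deployed:

# Hypothetical sketch of risk-driven exploit selection; the adjustment
# formula and example data are illustrative, not an actual methodology.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str          # exploit / vulnerability identifier
    cvss_base: float   # standard CVSS base score, 0.0-10.0
    deployment: float  # 0.0-1.0 estimate of how widely the target is deployed

def adjusted_score(c: Candidate) -> float:
    """Weight the CVSS base score by deployment prevalence (illustrative only)."""
    return c.cvss_base * (0.5 + 0.5 * c.deployment)

candidates = [
    Candidate("server-side RPC overflow", 9.3, 0.9),
    Candidate("niche SCADA parser flaw", 9.0, 0.1),
    Candidate("browser plugin memory corruption", 7.5, 0.8),
]

# Rank candidates so the highest-risk exploits are selected for the test set first.
for c in sorted(candidates, key=adjusted_score, reverse=True):
    print(f"{c.name}: adjusted score {adjusted_score(c):.1f}")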


Certifications performed in 2008 and 2009 tested 600 vulnerabilities. Compare this with other tests: the ICSA IPS test, for example, covers only 120 vulnerabilities (all server-side). Our set of 600 is not merely 480 larger; vendors also did not know in advance which vulnerabilities were included, which prevents 'gaming of the test'.

SUM testing generally adds 20-30 vulnerabilities per month. This set is relatively small and directly reflective of popular current attacks.

Our latest group test utilized 1,159 live exploits. That is more than 2x the number we previously used for certifications (and nearly 10x more than the next lab).

NSS Labs Tests are Harder Than The Other Guys'...
Difficult, real-world tests are an important part of raising the bar. Marketers like big numbers, and when it comes to scores, 100% is the target. Unfortunately, that is not the reality of information security products. We at NSS do not expect any product to catch 100% of the attacks in any of our tests. If one does, we probably are not working hard enough (or the bad guys gave up and went home - unlikely). There are more threats than can be protected against, and depending on the vendor, the gap can be acceptable or quite significant.

Take the difficulty of the test into consideration when comparing products. The lower the bar, the easier it is to score 100%, and the less meaningful that score is. A 70% score on an NSS Labs Group Test (1,159 exploits) still represents 6.7 times more validation than 100% of the 120 exploits in an ICSA Labs® test. And a 95% score on an NSS certification is 4.9 times more.
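A quick back-of-the-envelope check of that arithmetic, using only the exploit counts quoted above, compares the number of attacks each test actually demonstrates a product blocking:

# Rough depth-adjusted comparison using the exploit counts quoted above.
def blocked(score: float, test_size: int) -> float:
    """Number of attacks a product was demonstrably able to block in a test."""
    return score * test_size

nss_group = blocked(0.70, 1159)  # 70% of the 1,159-exploit NSS group test
small_test = blocked(1.00, 120)  # 100% of a 120-exploit test

print(f"NSS group test validates ~{nss_group:.0f} blocked attacks")
print(f"120-exploit test validates {small_test:.0f} blocked attacks")
print(f"ratio: ~{nss_group / small_test:.2f}x")  # roughly the 6.7x figure cited above

The certification comparison follows the same logic, with the certification test's exploit count in place of 1,159.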

When we at NSS Labs raised the bar on this Q4 2009 IPS Group Test, we really cranked it up. So, if you're wondering why a vendor who previously scored in the 90s is scoring lower on the group test, it's not necessarily because they are slacking off. In most cases it is quite the opposite. Thus, one should be sure to compare products within the same test set and methodology. The 80% product definitely bested the 17% product by a wide margin; be very cautious of the latter. And reward those vendors who submitted to this rigorous testing in the first place.

Dec 4, 2009

NSS Labs Mission Revisited

NSS Labs' mission is to help raise the bar of information security capabilities and practices, for enterprises and vendors alike.

For enterprises, that means helping them choose and implement better defenses. We do this by performing rigorous testing of leading products in various configurations and publishing test reports for purchase as an information service, much like other analyst firms such as Gartner, IDC, and Forrester. (However, we are the only one that actually performs hands-on, comprehensive testing of security products.) There are several types of reports: individual certifications (full 360-degree reviews), comparative group tests, Security Update Monitoring, and our new Exposure Reports. The reports in this information service help subscribers understand what is protected and what is not. Nothing protects 100%, so knowing the specifics is important.

For security product vendors, this means testing them against standardized evaluation criteria to establish a baseline, and drawing attention to key issues and requirements. We then test according to best-practice methodologies. Our reports also reward vendors that perform well, and they can use those results for marketing. When it comes to improving products, vendors have great resources and some of the smartest teams around. In addition, they often turn to outside experts for help. NSS is well equipped to assist with this type of private testing and consulting. However, we are always careful to maintain integrity during the process.