Re: [saag] Algorithms/modes requested by users/customers

Stephen Kent <kent@bbn.com> Tue, 19 February 2008 22:35 UTC

Message-Id: <p06240509c3e1030dc6fe@[128.33.244.170]>
In-Reply-To: <p06240804c3e0ad5d1fa4@[10.20.30.152]>
References: <8329C86009B2F24493D76B486146769A9429B7A8@USEXCHANGE.corp.extremenetworks.com> <p06240804c3de211f0592@[10.20.30.162]> <p06240504c3e09559649c@[192.168.0.102]> <p06240804c3e0ad5d1fa4@[10.20.30.152]>
Date: Tue, 19 Feb 2008 17:35:18 -0500
To: Paul Hoffman <paul.hoffman@vpnc.org>
From: Stephen Kent <kent@bbn.com>
Cc: "saag@mit.edu" <saag@mit.edu>, Randall Atkinson <rja@extremenetworks.com>

At 8:18 AM -0800 2/19/08, Paul Hoffman wrote:
>At 10:15 AM -0500 2/19/08, Stephen Kent wrote:
>>Can you share the reasons cited by vendors to support the notion 
>>that the FIPS 140 process is broken?
>
>Sure.
>
>- It takes way too long from submission of system to validation, 
>even if no problems are found.

I have gone through a Level 3 eval, and compared to ALL other 
security criteria evaluations, the time to perform the eval is much, 
much shorter. So, my guess is that anyone complaining about this has 
no experience with evals under criteria like the Common Criteria (CC).

>- If problems are found during evaluation, the restart time is too long.

ibid.

>- Some of the tests are fairly subjective, and it becomes a game of 
>fixing code to please the testing service, not to make the product 
>more secure.

It is true that some test criteria are not completely cut and dried. 
That is a problem with every one of these security eval criteria. My 
experience is that FIPS 140 is better in this regard than all the 
others.

>- An evaluated product cannot have its core firmware updated without 
>losing the validation. For example, if a customer asks for a new 
>feature that might touch the crypto, the vendor has to choose 
>between losing the validation (and paying for a new one, and waiting 
>for the results) or keeping the customer happy.

Absolutely right, and it should be that way! To expect otherwise is 
to fail to understand what a security eval is all about. However, the 
representation above is silly. The old product with the old feature 
set is still validated. It's the new version that cannot yet be sold 
as evaluated. When we had a Level 3 device, we also sold a version 
that was not yet evaluated, but had nifty new features, and explained 
the situation to clients, some of whom opted for the old version 
until the new version was evaluated.

>- The validation doesn't even check for on-the-wire 
>interoperability, which is what the customers care about most.

It's a crypto security eval criterion, not a protocol compliance eval 
criterion. The labs are not testers of IPsec or TLS; they are testers 
of crypto module functions, period. A buyer who doesn't understand 
that is misinformed.

>- The test process is too expensive for many low-end devices that 
>would be very useful in USGovt offices.
>
>- The system introduces silly modes that make the systems more complicated.

e.g., ?

>These are the top complaints I hear from both large and small 
>vendors. There are certainly others. "Oh, don't get me started" is 
>commonly heard when discussing FIPS 140 testing.

Whiners :-).

>>>   Compared to other security evaluation criteria, e.g. CC or the 
>>old Orange Book, most security folks I know view the FIPS 140 
>>evaluation program as well managed and not very onerous.
>
>"Security folks" are not good evaluators on how some processes 
>affect the market.

But we are experienced with various security eval criteria, and one 
ought to judge such criteria in context. In that context FIPS 140 is 
viewed as having a great track record.

>
>>Also, if buyers believe that a device that "could be evaluated" is 
>>good enough, they are being rather naive.
>
>Maybe. If a buyer cares about "does this box support the needed 
>crypto in an interoperable fashion", then the perception is fine. If 
>they care about all the details that FIPS 140 testing covers, then 
>of course it is naive.

Your statement seems to conflate two very separate issues. If a 
buyer is concerned primarily with interoperability or (non-crypto) 
standards compliance, then FIPS 140 is irrelevant. If they care 
about whether a product that purports to use crypto for security 
does so securely, then a FIPS 140 eval is a solid prerequisite. I 
think a client who cares more about the interoperability of a 
crypto-based security product than about the security of the product 
is naive.

>
>>Experience with the FIPS 140 program has shown that a significant 
>>fraction of products submitted for evaluation fail the process.
>
>This statement is usually bandied about without any quantification 
>of what failures were found, as it is here. As someone who creates 
>test suites, I can assure you that I can make a few picky rules for 
>systems that would cause some of those systems to fail, but most end 
>users who understood those few rules would say that those rules were 
>silly. Of course, different end users have different requirements. 
>It is generally felt by vendors that some/many of the rules for FIPS 
>140 are silly and unneeded; on the other hand, it is likely that at 
>least a few USGovt customers would really want some of those rules 
>for particular reasons.

I have served several terms on a committee operated by the National 
Academy of Sciences that provides an external review for the NIST 
Security Lab. In our review meetings I have seen the figures for the 
pass/fail rate, and have seen descriptions of the reasons for 
failures, some of which are hilariously bad. It is not only US 
Government clients who care about most of the "picky rules."  The 
banking community has mandated FIPS evaluation for years. However, if 
a client is more concerned about the performance of a VPN appliance 
than the security of the device, FIPS 140 is not a good criterion for 
that client.

>
>>Presumably a vendor submitting a product for evaluation believes 
>>that the product is ready, and will pass, otherwise the vendor is 
>>wasting money on the evaluation effort.
>
>Wrong, very wrong. You cannot tell what the testing agency might try 
>to knock you down on ahead of time. Of course, you cover what you 
>can, but you also know that there is a good chance they'll just 
>whack you a bit because it makes them look good.

Nonsense, utter nonsense (says the guy who went through this process, 
as opposed to the guy who is relating what he has been told by 
others). The criteria are complemented by guidelines for eval labs, 
and these guidelines are available to vendors. So, a smart vendor 
would study the criteria and the guidelines before designing a 
product that will be evaluated. A smart vendor will also discuss the 
criteria with the lab it chooses, to minimize uncertainty.

>>The fact that many products fail evaluation (for good security 
>>reasons), tells me that a vendor's claim that a product "could be 
>>evaluated" is not much of an assurance.
>
>If you believe in the testing regime, that's fine. Others don't, and 
>I believe for very good reasons in their own use scenarios.

What I said is true, irrespective of whether one believes in the 
criteria or the testing methodology. The reality is that products 
fail, and many (not all) of the failures are demonstrably due to 
serious security problems. So, given this experience, one really 
ought not believe a vendor (whose goal is to sell products) who says 
"oh yeah, our product would pass evaluation, but we don't think it's 
worth the investment." A client who believes this argument should buy 
a used car without a warranty or an inspection by a mechanic. The 
likelihood of a good outcome is similar.

Steve