Re: [Cfrg] likelihood that someone has a quantum computer

David McGrew <> Tue, 14 January 2014 14:43 UTC

Date: Tue, 14 Jan 2014 09:42:48 -0500
From: David McGrew <>
To: William Whyte <>
Cc: Dan Brown <>,,
Subject: Re: [Cfrg] likelihood that someone has a quantum computer

Hi William,

On 01/13/2014 09:15 PM, William Whyte wrote:
> Hi David,
>
>> I am skeptical about the approach of combining multiple public key
>> algorithms, especially for postquantum algorithms.  This is partly
>> because of the complexity of the specification and the implementation,
>> but also because it would require double the amount of work from
>> authors, reviewers, and implementers in order to be successful.
>
> I’m not sure it’s necessarily that complex. If we have an asymmetric 
> key transport mechanism KT that transports key material k and is 
> semantically secure, and we have a secure KDF, it seems that
> Transport: KT1(k1), KT2(k2), KT3(k3), …
> Secret k = KDF(k1, k2, k3, …)
> ...will also be semantically secure. Obviously this would need to be 
> demonstrated rigorously, but rather than seeing “double the amount of 
> work” from the standardization point of view, I see one task to define 
> a secure combination mechanism, and then a series of tasks to 
> standardize individual key transport mechanisms, where this series of 
> tasks would have to be done anyway by people who wanted their 
> algorithms adopted.
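As a concrete reading of the combiner sketched above, here is a minimal Python sketch. The HKDF construction follows RFC 5869, but the salt and info labels and the bare-concatenation encoding of the key shares are my own illustrative choices, not a standardized combiner:

```python
import hashlib
import hmac
import secrets

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # RFC 5869 HKDF-Extract with SHA-256.
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    # RFC 5869 HKDF-Expand with SHA-256.
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def combine_transported_keys(*key_shares: bytes) -> bytes:
    # Derive the session secret k = KDF(k1, k2, k3, ...) from the key
    # material transported by each independent KT mechanism.  A real
    # standard would fix the input encoding (e.g. length prefixes).
    ikm = b"".join(key_shares)
    prk = hkdf_extract(salt=b"hybrid-kt-combiner", ikm=ikm)
    return hkdf_expand(prk, info=b"session key", length=32)

# Each ki stands in for key material recovered from KT_i.
k1, k2, k3 = (secrets.token_bytes(32) for _ in range(3))
k = combine_transported_keys(k1, k2, k3)
```

The intuition is that the derived k remains secret as long as at least one KT_i resists attack, assuming the KDF behaves as a random oracle over its inputs.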

Let me explain my skepticism by pointing out some places where 
complexity could arise in this scenario.  What you describe would work 
for key transport, but not for key exchange, which excludes DH and 
ECDH.   For a protocol that negotiates algorithms, how would this 
combination be negotiated?   In TLS, for instance, would there need to 
be a ciphersuite for each possible combination of public key 
algorithms?    Or would there need to be two parallel negotiations, with 
an assurance that any possible combination of algorithms that might 
result from the negotiation can be used together?   If the protocol 
aimed to provide identity hiding, would it be expected that both KT 
algorithms would be run twice, once anonymously and once otherwise?  
Would the algorithms use the same nonces?   If so, what happens when the 
two KT protocols exchange different numbers of messages?
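To make the negotiation concern concrete: one ciphersuite per combination of public key algorithms grows multiplicatively with the algorithm lists, before even counting cipher and hash choices. The algorithm names below are placeholders, not proposed identifiers:

```python
from itertools import product

# Hypothetical algorithm lists, for illustration only.
classical = ["ECDHE_P256", "ECDHE_X25519", "RSA_KT"]
postquantum = ["NTRU_KT", "MCELIECE_KT"]

# One ciphersuite identifier per (classical, post-quantum) pair.
hybrid_suites = [f"{c}+{pq}" for c, pq in product(classical, postquantum)]

print(len(hybrid_suites))  # 3 classical * 2 post-quantum = 6 suites
```

The alternative, two parallel negotiations, avoids the multiplicative blow-up but raises the cross-compatibility question noted above.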

It is important to establish what the goals would be in using two 
different algorithms.  One goal that some users have is to protect 
traffic with two completely independent crypto implementations, e.g. 
sourced from two different vendors, and thus designed, implemented, and 
tested by two independent teams.   This is done today to provide a 
higher level of assurance using commercial cryptography, by running two 
products that implement standards like IPsec, TLS, or SRTP so that the 
encrypted traffic from one implementation is treated as the plaintext 
traffic by the other implementation. Using two layers protects against 
the possibility that one of the implementations contains an exploitable 
vulnerability or a significant flaw.  This idea is used by the US 
government's information assurance program (see for instance Section 4.4 
of ), and by other nations as well (though I can't point to public 
information for the other cases that I know of).   An important property 
of a two-layer scheme is the 
independence of the two crypto implementations, and it seems that this 
property would be lost in the parallel key transport idea that you 
outlined above.
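The layering described above can be sketched as follows. The XOR/HMAC "cipher" is a deliberately toy stand-in for two independent vendor implementations of a standard like IPsec or TLS; it is not secure, and all names are illustrative:

```python
import hashlib
import hmac

def keystream_cipher(key: bytes, label: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher built from an HMAC-SHA256 keystream.
    # Stands in for one vendor's crypto implementation; NOT secure.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, label + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out[: len(data)]))

def two_layer_encrypt(inner_key: bytes, outer_key: bytes,
                      plaintext: bytes) -> bytes:
    # Layer 1 (vendor A) encrypts the plaintext...
    inner_ct = keystream_cipher(inner_key, b"vendor-A", plaintext)
    # ...and layer 2 (vendor B) treats A's ciphertext as its plaintext.
    return keystream_cipher(outer_key, b"vendor-B", inner_ct)

def two_layer_decrypt(inner_key: bytes, outer_key: bytes,
                      ciphertext: bytes) -> bytes:
    # Peel the layers in reverse order.
    inner_ct = keystream_cipher(outer_key, b"vendor-B", ciphertext)
    return keystream_cipher(inner_key, b"vendor-A", inner_ct)
```

The point of the pattern is that a break of either layer alone still leaves the traffic protected by the other, which depends on the two implementations being genuinely independent.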

> For signatures, assuming we’re looking only at diversity in public key 
> algorithms and not hash algorithms, it’s the same: the message simply 
> takes two signatures, one from each algorithm.  Again, this would have 
> to be specified rigorously. I think diversity in signatures is less 
> urgent than for encryption algorithms, because of the risk of an 
> attacker storing messages for a long time and decrypting them when 
> possible, but it’s worth considering.

It seems that signatures would be easier to do, since signing could be 
done in parallel.  But what is not clear is that we would want or need a 
new signature standard to do this. A CA could create two certificates, 
one with each algorithm, using existing standards. (Probably the 
standards could be improved to better allow the selection of appropriate 
certificates by applications.)
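A sketch of the accept-only-if-both-verify policy for dual signatures follows. The MAC-based "signature" scheme is a toy stand-in for two real, independent algorithms (say, ECDSA plus a hash-based scheme), each certified under its own certificate; every name here is illustrative:

```python
import hashlib
import hmac

def make_toy_scheme(key: bytes):
    # Toy MAC-based 'signature' scheme; a stand-in for a real public
    # key signature algorithm, for structural illustration only.
    def sign(msg: bytes) -> bytes:
        return hmac.new(key, msg, hashlib.sha256).digest()

    def verify(msg: bytes, sig: bytes) -> bool:
        return hmac.compare_digest(sign(msg), sig)

    return sign, verify

# Two independent schemes, each notionally backed by its own certificate.
sign_a, verify_a = make_toy_scheme(b"cert-A-key")  # e.g. ECDSA cert
sign_b, verify_b = make_toy_scheme(b"cert-B-key")  # e.g. hash-based cert

def dual_sign(msg: bytes):
    # The message simply carries two signatures, one per algorithm.
    return sign_a(msg), sign_b(msg)

def dual_verify(msg: bytes, sigs) -> bool:
    # Accept only if BOTH independent signatures verify.
    sig_a, sig_b = sigs
    return verify_a(msg, sig_a) and verify_b(msg, sig_b)
```

Because the signatures are independent, a relying party remains protected so long as at least one of the two algorithms is unbroken.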

Speaking as an individual member of the RG: I am not opposed to work on 
the use of multiple algorithms, but I do not see it as a high priority 
right now, and I feel strongly that such work would be most impactful 
and beneficial if it enabled the use of two completely independent 
crypto implementations.



> Similarly, implementations have one additional task rather than double 
> the amount of tasks. Testing may be more complicated if all pairwise 
> combinations of algorithms are to be tested, but I expect the total 
> number of algorithms will never be too big.
> Finally, authenticating the public keys may require building support 
> for new algorithms into CAs, which could be laborious (though again, I 
> would argue that you should do this even when you don’t have to).
>
>> Patents could also be a significant issue, as you note.
>
> Yes, agreed. This would need to be looked into.
> Cheers,
> William
> *From:* David McGrew [ <>]
> *Sent:* Monday, January 13, 2014 7:18 AM
> *To:* William Whyte
> *Cc:* Dan Brown; <>; 
> <>
> *Subject:* Re: [Cfrg] likelihood that someone has a quantum computer
> Hi William,
> On 01/12/2014 11:16 PM, William Whyte wrote:
>     Hi all,
>     Sorry again for top-posting, that's gmail for you.
>     > I would put it more positively as: let's figure out how to use
>     post-quantum cryptography in practice, because if there is a
>     breakthrough in quantum computing, the mad rush will be ugly.
>     As I mentioned in a previous mail, I think the best way to do this
>     is to figure out how to avoid depending on a single public key
>     algorithm in any context, because all public key algorithms are
>     potentially vulnerable to technological and algorithmic
>     breakthroughs. So combining public key algorithms seems like a
>     prudent approach. I know there are Certicom patents on this but it
>     seems that this shouldn't be insuperable.
> I am skeptical about the approach of combining multiple public key 
> algorithms, especially for postquantum algorithms.  This is partly 
> because of the complexity of the specification and the implementation, 
> but also because it would require double the amount of work from 
> authors, reviewers, and implementers in order to be successful.   
> Patents could also be a significant issue, as you note.
> David