[CFRG] NSA vs. hybrid

"D. J. Bernstein" <djb@cr.yp.to> Fri, 12 November 2021 09:28 UTC

Date: Fri, 12 Nov 2021 09:28:11 -0000
Message-ID: <20211112092811.628364.qmail@cr.yp.to>
From: "D. J. Bernstein" <djb@cr.yp.to>
To: cfrg@irtf.org
Archived-At: <https://mailarchive.ietf.org/arch/msg/cfrg/T3XgKeJr4-PvmPrS5TwVNfW9t_w>
Subject: [CFRG] NSA vs. hybrid

This looks to me like something that should be discussed in CFRG rather
than LAMPS:

   https://datatracker.ietf.org/meeting/112/materials/slides-112-lamps-hybrid-non-composite-multi-certificate-00
   https://mailarchive.ietf.org/arch/msg/spasm/McksDhejGgJJ6xG617FEWLbBq0k/

This is one part of a big push by NSA across multiple non-CFRG venues to
convince everyone to

   * deploy small lattice systems---which _hopefully_ protect against
     quantum computers---and

   * _turn off ECC_---this is the scary part, since there's a serious
     risk that the small lattice systems are easier to break than ECC.

I would like to see CFRG instead advising integration of ECC into all
post-quantum deployments for the foreseeable future. There's no reason
that this advice has to wait for NISTPQC standards.

Specific comments follow.

> Rigorous, effective algorithm vetting is a must, NSA has confidence in
> the NIST PQC process

After evaluations from 2017--2020, the "NIST PQC process" selected
various third-round primitives for which NIST was subsequently so
surprised by attacks that NIST in 2021 is suddenly promoting an
"alternate" primitive and issuing a call for new primitives.

Given that 2017--2020 wasn't enough time for the process to reach safe
decisions, why should anyone believe that decisions made by the same
process in 2021 or 2022 are safe? It sounds bizarre to me that anyone
would be declaring confidence in the process. Where does this confidence
come from?

Most of NIST's evaluation process has been hidden from public view.
Anyone who skims

   https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8240.pdf
   https://nvlpubs.nist.gov/nistpubs/ir/2020/NIST.IR.8309.pdf
   https://nvlpubs.nist.gov/nistpubs/jres/104/5/j45nec.pdf
   https://nvlpubs.nist.gov/nistpubs/jres/106/3/j63nec.pdf

can see that NIST's reports on the post-quantum competition are much
less detailed than NIST's reports on the AES competition, even though
the post-quantum competition is much more complicated.

NIST refuses to call this a "competition", doesn't follow NIST's rules
for "competitions", and doesn't follow the transparency rules from
NIST's Dual EC post-mortem. FOIA requests have revealed _some_ useful
information, but the public can't see most of NIST's work. What the
public _can_ see is deeply concerning; see, e.g., Section 5 of
https://cr.yp.to/papers.html#categories.

If NSA's "confidence in the NIST PQC process" is based on NSA's special
access to that process (apparently stretching back to the outset in
2015, kept secret until July 2020) then NSA should publish the details
for everyone else to evaluate. NIST has recently stated in

   https://web.archive.org/web/20211027161303/https://www.nist.gov/blogs/taking-measure/post-quantum-encryption-qa-nists-matt-scholl

that "We operate transparently. We’ve shown all our work" so I don't
think NSA can point to NIST's privacy as a reason to avoid disclosure.

> NSA only anticipates using hybrid solutions to maintain
> interoperability during the transition (or where direct drop-in is not
> feasible)

This sounds reckless to me, especially in conjunction with NSA pushing
specifically for small lattice-based encryption systems. We've seen
breaks of "provably secure" lattice submissions, we've seen continued
degradation of security for all lattice submissions, and we've seen one
claimed "barrier" after another being broken by new lattice attacks. See

   https://ntruprime.cr.yp.to/latticerisks-20211031.pdf

for how complicated and unstable the security analysis is. 

> Any hybrid method adopted should allow for a quick transition to
> PQ-only solutions

Why is this particular transition important, and what concretely is this
goal supposed to mean? How would a choice of "hybrid method" end up
delaying such a transition beyond the natural costs of any upgrade?

_If_ everybody sees that big quantum computers have been built _and_ are
so cost-effective that ECC obviously no longer has any security value,
_then_ there will be a push to declare ECC obsolete and to remove it
from protocols, simplifying implementations. I don't see how any
particular choice of hybrid approach is an obstacle in this scenario.

> Ensure interoperability with PQ-only systems is included for forward
> compatibility and to allow for use of direct drop-in of PQ

Is this saying something different from the previous goal, aside from
adding the words "forward compatibility" and "direct"?

The slides continue by comparing two approaches to hybrid certificates.
The idea of having certificates work as usual, except for upgrading to a
P+Q signature system that internally requires pre-quantum signature P
and post-quantum signature Q to be valid, is labeled here as "composite
certs". The idea of requiring two separate certificates, one using P and
one using Q (maybe with some effort to compress redundancy), is what
seems to be labeled here as "non-composite certs".
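
Here's a minimal sketch of the structural difference, with verify_p
and verify_q as stand-ins for a pre-quantum and a post-quantum
verifier (the function names and cert fields are mine, not from the
slides):

   def verify_composite(tbs, composite_sig, verify_p, pk_p,
                        verify_q, pk_q):
       # One certificate, one P+Q signature field; the certificate
       # is valid only if BOTH component signatures verify.
       sig_p, sig_q = composite_sig
       return (verify_p(pk_p, tbs, sig_p)
               and verify_q(pk_q, tbs, sig_q))

   def verify_non_composite(cert_p, cert_q, verify_p, pk_p,
                            verify_q, pk_q):
       # Two separate certificates; it's the protocol's job to
       # insist that both of them validate.
       return (verify_p(pk_p, cert_p["tbs"], cert_p["sig"])
               and verify_q(pk_q, cert_q["tbs"], cert_q["sig"]))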

  [ under "PROS" of "composite" certs ]
> Matching security levels of algorithms is built into composite pairs

"Matching security levels" is usually a tool to reduce security rather
than to ensure security. See the first three paragraphs of Section 3.2
of https://cr.yp.to/papers/footloose-20210705.pdf.

As a concrete example (with encryption rather than signatures), the
typical concept of "matching security levels" would object to the new

   sntrup761x25519-sha512@openssh.com

as overkill, since _known attacks_ against sntrup761 are much more
expensive than attacks against X25519. However, an analysis that
properly accounts for _risks_ is much more complicated and isn't
structurally compatible with the "security level" oversimplification.
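
The general shape of such a hybrid is simply to feed both shared
secrets through one hash. Here's a minimal sketch in Python; the
actual sntrup761x25519-sha512@openssh.com specification pins down the
exact encodings and hash inputs:

   import hashlib

   def hybrid_shared_secret(ss_sntrup761, ss_x25519):
       # Predicting the output requires predicting BOTH inputs, so
       # an attacker has to break sntrup761 _and_ X25519.
       return hashlib.sha512(ss_sntrup761 + ss_x25519).digest()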

  [ under "CONS" of "composite" certs ]
> Can require reworking of certificate validation

Yikes---sounds scary given how tricky certificate validation is. But
wait a minute. Isn't certificate validation simply calling signature
verification as a black box? What has to be reworked here?
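
Here's the shape I have in mind, as a sketch that ignores expiry,
extensions, etc.; if validation takes the verifier as an ordinary
function argument, then "reworking" means handing it a different
function:

   def validate_chain(chain, root_pk, verify):
       # `verify` is a black box: it can check a single Ed25519
       # signature or a composite P+Q pair; the chain-walking
       # logic is identical either way.
       pk = root_pk
       for cert in chain:
           if not verify(pk, cert["tbs"], cert["sig"]):
               return False
           pk = cert["subject_pk"]
       return True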

  [ under "CONS" of "composite" certs ]
> Maintenance concerns surrounding deprecated algorithms

Why is this supposed to be more of an issue for "composite" certs than
for "non-composite" certs?

  [ under "CONS" of "composite" certs ]
> Requires another transition and set of standards from hybrid to PQ

Even _if_ this transition is something we definitely want and something
to be worrying about now (see above), how would "non-composite" certs
avoid the need for new standards and a transition? For interoperability
someone would have to say "Okay, clients have to stop requiring ECC"
followed eventually by "Okay, now servers should stop sending ECC";
sounds to me like transitioning to a new standard!

  [ under "PROS" of "non-composite" certs ]
> Computational processes remain unchanged (but perhaps multiple
> iterations)

This sounds weird, given that the whole point is to be adding new
post-quantum primitives into the picture. Perhaps I'm missing some
restricted definition of "computational process".

  [ under "PROS" of "non-composite" certs ]
> UDP-based protocols potentially avoid fragmentation issues

I'm puzzled. Are we supposed to be envisioning a post-quantum signature
system that just barely fits into a UDP packet but not if one adds 64
bytes for an Ed25519 signature? It would help to have pointers to the
specific protocol and specific post-quantum signature system showing
that this is a problem, and an explanation of why the correct fix isn't
to have the protocol upgraded to be able to use multiple packets so as
to support higher-security post-quantum primitives.
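
For scale, here's a back-of-the-envelope sketch, counting signature
sizes only (not whole certificates), with post-quantum figures from
the round-3 submissions:

   UDP_PAYLOAD = 1500 - 20 - 8   # Ethernet MTU minus IPv4+UDP headers

   ED25519_SIG = 64              # bytes
   PQ_SIGS = {"falcon-512": 666, "dilithium2": 2420}

   for name, pq in PQ_SIGS.items():
       print(name,
             "alone fits:", pq <= UDP_PAYLOAD,
             "plus ed25519:", pq + ED25519_SIG <= UDP_PAYLOAD)

The extra 64 bytes changes the answer only if the post-quantum data
already lands within 64 bytes of the limit, which neither of these
does.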

  [ under "PROS" of "non-composite" certs ]
> Ease of use for backward compatibility

How is a P certificate plus a Q certificate easier to use than a P
certificate plus a P+Q certificate? If this is supposed to be an
objection to the cost of the extra 64 bytes then there also has to be an
analysis of how much space is lost by the "non-composite" approach.
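
Here's a sketch of that accounting, with overhead as a stand-in for
the per-certificate cost of names, serial numbers, validity dates,
extensions, and DER framing (the keys and signatures are the same
bytes either way):

   def composite_bytes(pk_p, sig_p, pk_q, sig_q, overhead):
       # One certificate carrying both algorithms.
       return overhead + pk_p + sig_p + pk_q + sig_q

   def non_composite_bytes(pk_p, sig_p, pk_q, sig_q, overhead):
       # Two certificates, so the metadata is paid for twice.
       return 2*overhead + pk_p + sig_p + pk_q + sig_q

The difference is one extra copy of the certificate boilerplate,
which for typical X.509 certificates is easily a few hundred bytes,
considerably more than a 64-byte Ed25519 signature.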

  [ under "PROS" of "non-composite" certs ]
> Facilitates seamless transition to PQonly, no new standards needed

This sounds like another repetition of the ideas that (1) "PQonly" is
the desired goal and (2) specifying "composite" certificates somehow
interferes with this. See above.

  [ under "PROS" of "non-composite" certs ]
> Requires support for only two types of structures (traditional and PQ)

Not sure what this is supposed to mean.

> The quantum-resistant algorithms resulting from the NIST program are
> well-vetted, and NSA has no concerns in this area regarding their
> security.

Was GeMSS "well-vetted" in mid-2020? Is the same process suddenly
risk-free a year later?

> Based on our analysis and experience, NSA has decided that the NSS
> path forward is toward strictly-PQ solutions, and thus we require
> flexibility in the architecture of hybrid protocols.

As a procedural note, I don't believe that CFRG, IRTF, and IETF are
under any obligation to do what NSA wants, even if NSA labels this as a
requirement, even if NSA uses its power to skew the market. The top goal
here has to be security.

> As you pointed out, implementation errors in current cryptographic
> algorithms are quite common, and the statistics you provided on CVEs
> for 2020 are telling that issue is prevalent even today, with
> well-established algorithms. As such, if implementation errors are
> your largest concern, doubling up on algorithms introduces a larger
> surface for error.

No, it's not that simple. The fact that _both_ primitives are being
independently asked to protect the user's data means that one primitive
could save the user from errors in the implementation of the other
primitive.

Obviously a buffer overflow in one implementation can destroy security,
and there are always systems-level concerns regarding extra software,
but the risks from the existing X25519/Ed25519 software ecosystem are
much smaller than the risks being added by the much more complicated new
post-quantum software ecosystem.
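
Here's a toy demonstration of the point; broken_kem_secret stands in
for any implementation error that makes one primitive's output
predictable:

   import hashlib, os

   def broken_kem_secret():
       # A buggy implementation whose output the attacker can
       # predict (here, always zeros).
       return bytes(32)

   def healthy_kem_secret():
       # The other primitive doing its job.
       return os.urandom(32)

   # Even with the broken half fully known to the attacker, the
   # session key stays unpredictable as long as the other half is
   # secret.
   session_key = hashlib.sha512(
       broken_kem_secret() + healthy_kem_secret()).digest()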

> I will emphasize that our non-composite approach does propose changes
> to existing protocol logic; however, in most instances this is
> primarily a matter of duplicating existing processes, and not
> introducing entirely new structure.

So building a signature system that verifies a P signature and a Q
signature is "entirely new structure", while building a protocol that
verifies a certificate using a P signature and that verifies a
certificate using a Q signature isn't "entirely new structure"? I don't
get it.

---Dan