Re: Suggested changes for DSA2 (Daniel A. Nagy) Tue, 28 March 2006 09:14 UTC


On Mon, Mar 27, 2006 at 03:22:15PM -0800, "Hal Finney" wrote:
> David writes:
> > For implementation of signature verification you can just take p and q
> > straight from the public key.  You don't need to guess since the key
> > has all the information you need.
> With signatures, it is the verifier more than the signer who is vulnerable
> and who needs to be protected.  The problem is that as the verifying
> software it is my responsibility to provide some level of assurance to
> the user about how strong this signature is.

Right, but it still boils down to whether or not the verifier trusts a
certain public key. Thus, the decision needs to be made on a per-key, rather
than per-signature basis. I am not arguing here; it is just a remark.

> Right now at best we only report the key size.  I'd like to make sure that
> q is as strong as p.  Otherwise we might see a 4096 bit key with a 160 bit
> q, so it is really no stronger than a 1024 bit key.

That is not quite precise, either. Increasing the size of q and increasing the
size of p protect against two different attacks. Large q's protect against
generic discrete-logarithm attacks such as Pollard's rho, which take on the
order of sqrt(q) group operations, while large p's protect against sieve
methods (index calculus, the number field sieve). The relative strength of
the two attacks is not easily assessed, and sieve methods keep improving.
Thus, in the future it may very well happen that the balanced choice is 4096
bits for p and 160 bits for q.
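To make the trade-off concrete, here is a rough back-of-the-envelope sketch
(mine, not from the thread) using the textbook cost formulas: Pollard's rho
costs about sqrt(q) group operations, and the heuristic GNFS running time is
exp((64/9)^(1/3) (ln p)^(1/3) (ln ln p)^(2/3)). The exact constants are of
course debatable; this only illustrates how the two costs scale.

```python
import math

def rho_bits(q_bits):
    # Generic discrete-log attacks (Pollard's rho) need ~sqrt(q) steps,
    # i.e. about q_bits / 2 bits of work.
    return q_bits / 2

def nfs_bits(p_bits):
    # Heuristic GNFS cost exp((64/9)^(1/3) (ln p)^(1/3) (ln ln p)^(2/3)),
    # expressed in bits of work (time only, ignoring memory costs).
    ln_p = p_bits * math.log(2)
    work_ln = (64 / 9) ** (1 / 3) * ln_p ** (1 / 3) * math.log(ln_p) ** (2 / 3)
    return work_ln / math.log(2)

for p_bits, q_bits in [(1024, 160), (4096, 160), (4096, 256)]:
    print(p_bits, q_bits, round(nfs_bits(p_bits)), round(rho_bits(q_bits)))
```

Under these formulas a 1024-bit p is roughly on par with a 160-bit q today,
while at 4096 bits the sieve side pulls well ahead of the 80-bit rho cost --
which is exactly why the "balanced" pairing can drift over time.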

As desirable as describing strength by some one-dimensional quantity would
be, it is hardly possible. NIST's choice of matching modulus sizes and group
orders reflects a balancing of the time costs of state-of-the-art attacks,
with no regard to memory costs. (By the way, the same approach is reflected
in declaring 3DES as strong as a 112-bit cipher -- as if 2^62 bits of memory
were free.) This is a long-standing tradition in the mainstream crypto
community, but there is no universal consensus about it.
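The 2^62 figure in the aside checks out as follows (my own arithmetic, added
for illustration): the meet-in-the-middle attack behind the 2^112 time
estimate tabulates one 64-bit DES block for each of the 2^56 single-DES keys.

```python
import math

# Meet-in-the-middle on 3DES: tabulate all 2^56 single-DES encryptions
# of a known plaintext block (64 bits each), then match them against
# decryptions computed from the other side.
entries = 2 ** 56        # one table entry per single-DES key
bits_per_entry = 64      # one DES block
table_bits = entries * bits_per_entry

print(math.log2(table_bits))  # 62.0, i.e. 2^62 bits of memory
```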

> It is hard to report
> to the user how strong a signature by that key should be considered to be.

Yes, it is. That is one reason not to reflect _our_ judgement (even if we
could ever come to an agreement) about it in the standard.

> This problem goes away if we standardize on the q sizes that go with
> certain p sizes.  That's what I'd like to do.  Any keys that break the
> rules would be considered invalid. 

No, it won't go away. Moreover, why would you declare |p|=1024, |q|=160
valid but |p|=4096, |q|=160 invalid, when the second choice is clearly no
weaker than the first?

Putting lower limits on both the modulus size and the group order makes more
sense, but that also does not merit more than a passing remark in the
standard. IMHO, of course.
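A floor-based check of the kind suggested above might look like the
following sketch (the min_p/min_q defaults are purely illustrative, not a
proposal for the standard):

```python
def dsa_key_acceptable(p_bits, q_bits, min_p=1024, min_q=160):
    # Accept any key whose modulus size and group order both meet the
    # floors, without mandating fixed (|p|, |q|) pairings.
    return p_bits >= min_p and q_bits >= min_q

print(dsa_key_acceptable(4096, 160))  # True under this policy
print(dsa_key_acceptable(1024, 128))  # False: q below the floor
```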