Re: [dane] Digest Algorithm Agility discussion

Viktor Dukhovni <> Tue, 18 March 2014 14:14 UTC


On Tue, Mar 18, 2014 at 10:26:47PM +1100, Mark Andrews wrote:

> This whole argument of weakest vs strongest was had years ago in
> DNSSEC and quite frankly is a waste of time trying to pick the
> strongest as you are often comparing apples and oranges.

The client ranks the algorithms by preference, in much the same
way that TLS ranks cipher-suites by preference and in TLSv1.2 even
signature algorithms by preference.  It does not matter whether
the rankings are objectively justified.  The client uses its most
preferred digest (perhaps it has hardware support for it).
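The selection described above can be sketched roughly as follows. This is a hypothetical illustration, not any real validator's code: the record layout and the preference list are assumptions; only the matching-type numbers (1 = SHA2-256, 2 = SHA2-512) come from RFC 6698.

```python
# Hypothetical sketch of client-side digest agility: among the TLSA
# records for a name, keep only those whose matching type is the
# client's most preferred digest that is actually present.

# Matching type numbers from RFC 6698: 1 = SHA2-256, 2 = SHA2-512.
DIGEST_PREFERENCE = [2, 1]  # illustrative: prefer SHA2-512, then SHA2-256

def select_records(tlsa_records):
    """Return the TLSA records using the client's most preferred digest."""
    present = {r["mtype"] for r in tlsa_records}
    for mtype in DIGEST_PREFERENCE:
        if mtype in present:
            return [r for r in tlsa_records if r["mtype"] == mtype]
    return []  # no record uses a digest the client supports

records = [
    {"mtype": 1, "data": "aa..."},
    {"mtype": 2, "data": "bb..."},
]
# Only the mtype-2 (SHA2-512) records survive the filter.
print(select_records(records))
```

The point is that the ranking is purely local policy; whether SHA2-512 is "objectively" stronger than SHA2-256 never enters into it.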

> DNSSEC validators just have a way to say "we no longer trust this
> algorithm" and once that is set all records with that algorithm are
> ignored when doing validation regardless of whether there is code
> to support that algorithm or not.

If things are as you describe, this works poorly: clients continue
to accept weak signatures on zones signed with both a strong and a
weak algorithm (say RSA/SHA-256 and RSA/SHA-1).  If the strong
signature fails to validate, the client still trusts the weak one,
right?
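To make the weakest-link behavior concrete, here is a minimal sketch, with assumed names and a toy verify function, of the usual validator rule: an RRset is secure if any signature by a supported algorithm verifies.

```python
# Hypothetical sketch of why dual-signed zones validate under the
# weaker algorithm: a typical validator accepts an RRset as soon as
# ANY signature by a still-supported algorithm verifies.

RSASHA1, RSASHA256 = 5, 8  # DNSSEC algorithm numbers (RFC 4034 registry)

def rrset_secure(signatures, supported, verify):
    """Secure if any signature with a supported algorithm verifies."""
    return any(verify(sig) for sig in signatures if sig["alg"] in supported)

# Zone signed with both algorithms; the strong signature fails to verify.
sigs = [{"alg": RSASHA1, "ok": True}, {"alg": RSASHA256, "ok": False}]
verify = lambda sig: sig["ok"]

# While RSASHA1 remains in the supported set, the weak signature alone
# is enough for the RRset to validate.
print(rrset_secure(sigs, {RSASHA1, RSASHA256}, verify))  # True
# Only once validators drop RSASHA1 does the weak path stop working.
print(rrset_secure(sigs, {RSASHA256}, verify))  # False
```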

It becomes difficult to phase out a weak algorithm, because the
incentive to publish the strong algorithm alongside the weak one is
reduced: the benefit of doing so is deferred until clients actually
stop accepting the weak algorithm.