Re: [apps-discuss] [websec] [saag] [kitten] HTTP authentication: the next generation

Marsh Ray <> Mon, 13 December 2010 19:23 UTC

Date: Mon, 13 Dec 2010 13:24:34 -0600
From: Marsh Ray <>
To: Yoav Nir <>

On 12/13/2010 09:29 AM, Yoav Nir wrote:
 > On Dec 13, 2010, at 4:24 PM, Rene Struik wrote:
 >> Hi Yoav:
 >> Could you summarize the main problems with client certificates. To
 >> my knowledge, there are no technical problems with computational
 >> bottlenecks on the client side yet. The only problem area I could
 >> think of would be storage of private keys, but this seems
 >> solvable.
 >> Similarly, if you could point out main usability problems with
 >> certs that would be great (is this a general problem, or just an
 >> artifact of the way these are currently used?). Problems stem from
 >> realistic requirements not being met, so it is good to capture
 >> these.
 > I can see several problems with client certificates, most related to
 >  usability, but some also to security:
 > * Certificates do not authenticate the user. They authenticate a
 > device.

I don't think they do that exactly either. The client cert itself is 
generally public; its private key is a secret like a password, but one 
that's too hard to memorize.
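To put rough numbers on that comparison, here's a back-of-the-envelope 
sketch in Python (the 112-bit figure is the commonly cited security 
equivalence for RSA-2048; treat both numbers as illustrative):

```python
import math

# Entropy of a typical memorizable password: 8 characters drawn
# uniformly from 62 alphanumeric symbols.
password_bits = 8 * math.log2(62)   # roughly 47.6 bits

# An RSA-2048 private key is commonly rated at about 112 bits of
# security -- far beyond anything a person could memorize.
key_security_bits = 112

print(f"password: {password_bits:.1f} bits, key: ~{key_security_bits} bits")
```

So the private key behaves like a password dozens of orders of magnitude 
stronger, at the cost of needing storage rather than memory.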

 > Placing a certificate on my laptop is a close enough
 > approximation to authenticating *me*, but then I can't use the same
 > certificate on my home computer, my phone,

Why not?

 > or the computer at some
 > Internet cafe or hotel business center.

But there's nothing that you can do securely on an untrusted computer. 
Sometimes people propose the idea of a secure boot CD users carry 
around, but those can't defend against trojaned firmware or hardware.

 > Passwords, on the other hand,
 > are with me wherever I go, and can be used with any device.

They have some pretty significant limitations too.

 > * A possible solution to the first problem would be to issue multiple
 > certificates for use in phone, laptop and desktop. But this makes the
 > management of all these certificates even more complicated,

N users, M sites. N is billions, M is millions.

N * M = a big number.

Perhaps if we accept it as an inherently complicated problem then we can 
give people tools that they need?
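Spelling out that arithmetic (the user and site counts are rough, 
illustrative figures, not measurements):

```python
n_users = 2_000_000_000   # ballpark: billions of web users
m_sites = 1_000_000       # ballpark: millions of sites with logins

# If every user held a distinct credential per site, the number of
# user-site credential relationships to manage would be:
pairs = n_users * m_sites
print(f"{pairs:.1e} credential relationships")
```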

 > and increases the attack surface.

Honestly, could it be any worse?

 > * While some places of business, governments and militaries are
 > willing to spend the money and effort required to provision
 > certificates for all employees, I don't see companies doing it for
 > customers. It's never a good idea to bet that something is too big
 > for Google to do, but I can't imagine them issuing client
 > certificates to all gmail users.

Ha! I bet if we double-dog dared them, they just might.
Remember that they went all-TLS for gmail, when every other smaller site 
was saying it was un-economic.

 > Even banks, with real money at stake
 > don't do it, because the support costs would exceed the losses to
 > phishing.

My understanding is that some banks do, in fact, use client certs.

But from what I've seen, the economics and risk motivations of the 
banking sector are just really weird. It varies by country, and in the 
US it's still not entirely clear who's liable for losses due to online 
security. Banking is so different from everyone else that it's usually 
not helpful to use it as an example in general security discussions.

 > * Issuing certificates does not solve the problem with Internet
 > cafes. It makes no sense for me to install a browser certificate on
 > some random computer.

Again, there's nothing you can do to make a login session secure from an 
untrusted machine.
Perhaps we should throw this scenario out for the purposes of this 
discussion.

 > But don't take my word for it. Certificates are so inconvenient, that
 >  people would rather use some two-factor authentication solution,
 > where you type in a password, and then get a one-time code on either
 > a fob or through an SMS message to your phone, which is what Marsh's
 > company does.

The key thing is that the phone is something most people already have 
and are comfortable using. It's hard to overstate the value of that.

Nevertheless, what we (well, Marketing in particular :-) are continually 
up against is this: on some level, the entire purpose of strengthening 
the authentication is to make it harder to log in to the computer. We 
can do our best to minimize how much harder it is for the legitimate 
user, and maximize how much harder it is for the bad guy, but at the end 
of the day the system ends up having more ways to say 'no'. For most 
systems, the vast majority of 'no's are not actual attacks.

Theatrics that don't provide real security are obviously worthless, but
so are strong systems that people don't use. This trade-off between 
security and usability is very real, not theoretical.

Perfect example:

On 12/13/2010 06:14 AM, Carsten Bormann wrote:
 > As a webapp developer, I want to control the user interface during
 > authentication. Leaving the user alone with the bland and
 > unforgiving browser user interface for HTTP auth is suicide. I want
 > to provide extra buttons for forgotten passwords, maybe even
 > password hints, and a "sign up" method.

Yeah. How did the user select their password for the website in the 
first place, if not by an HTML form POST?

If it was good enough for the initial sign up, why should the web 
designer use something other than HTML form POST for the regular login?

(These are rhetorical questions.)

 > HTML/CSS/JS lends itself
 > well to providing such a friendly user interface.
 > As a security geek, I recognize this as exactly the problem that
 > creates the potential for phishing. Having the user type
 > credentials into a random form is never going to be secure, HSTS
 > notwithstanding.

Right. Any time there's an untrusted app painting into a rectangle (e.g. 
the browser window) the only way to communicate with the user securely 
is outside of that rectangle. Which is why the URL and lock icon are in 
the browser chrome (and why we're in the out-of-band authentication 
business). As people use more mobile devices with smaller screens, this 
chrome gets mostly optimized away (and may confuse the degree to which 
the phone is truly out-of-band).

There's clearly a large set of users for whom this "rectangle" principle 
is too complicated. (It probably doesn't help that browsers allow web 
pages to open new windows, set the title bar and icon, and that they 
repurpose the untrusted rectangle to discuss certificate errors.)

The most secure way I've seen to handle password entry is (don't laugh) 
MS Windows NT through Vista. Windows NT implemented Orange Book security 
and required a "secure attention key" sequence (ctrl+alt+del) before 
every password request. Vista UAC, though a failure in many ways, would 
switch the entire console session, going so far as to reinitializing 
hardware drivers in the process. It performs a whole-screen dimming 
effect which was said to be difficult to duplicate (I'm not sure in 

So IMHO, significant improvements in web authentication would be greatly 
beneficial. But they will have to:

* Require connection integrity. This probably means mandatory TLS.

* Require a user interaction that takes place outside the insecure 
browser rectangle and feels different enough that it's easy to explain 
the difference.

* Not leak info to untrusted parties. These days privacy is often more 
important than traditional security.

* Support browser vendors in making a UI that "sucks less" to have to 
use, or possibly have to use it less. Put users in control of their 
identity and auth credentials without nagging them repeatedly until they 
give in and click the "Yes to all" button.

* Represent an actual improvement in security over the current standard 
of HTML form POST password and secret HTTP session cookie.

- Marsh