RE: Minimizing/avoiding User-Agent, was: SPDY Header Frames

Anil Sharma <asharma@sandvine.com> Wed, 18 July 2012 11:34 UTC

From: Anil Sharma <asharma@sandvine.com>
To: Nicolas Mailhot <nicolas.mailhot@laposte.net>, Julian Reschke <julian.reschke@gmx.de>
CC: HTTP Working Group <ietf-http-wg@w3.org>
Date: Wed, 18 Jul 2012 11:33:32 +0000
Message-ID: <06923070D47780469EBD9A1CDABAB6D6197FD991@blr-exch-1.sandvine.com>
References: <34f9c6a0d9659dcfc9adcf38fdace0dd.squirrel@arekh.dyndns.org> <5005B6FD.2030706@gmx.de> <39a908269aed0fec3d0456ce7f7f38b2.squirrel@arekh.dyndns.org>
In-Reply-To: <39a908269aed0fec3d0456ce7f7f38b2.squirrel@arekh.dyndns.org>
Subject: RE: Minimizing/avoiding User-Agent, was: SPDY Header Frames
Archived-At: <http://www.w3.org/mid/06923070D47780469EBD9A1CDABAB6D6197FD991@blr-exch-1.sandvine.com>

I was trying to understand the impact of HTTP/2.0 or SPDY on intermediaries.

I think metadata will always be in cleartext, so redirects or proxying will still be low-cost operations.

Even if major traffic sources such as Google, Facebook, and Twitter go the TLS route, I think legitimate intermediaries (service providers doing traffic classification and policy control, CDNs doing embedded video optimization) can always use TLS proxy functionality to achieve their goals.

I'm not sure I understand the impact perfectly yet, but I think intermediaries won't be severely impacted even if most of the major players choose to use TLS.

Anil 

-----Original Message-----
From: Nicolas Mailhot [mailto:nicolas.mailhot@laposte.net] 
Sent: Wednesday, July 18, 2012 12:54 AM
To: Julian Reschke
Cc: Nicolas Mailhot; HTTP Working Group
Subject: Re: Minimizing/avoiding User-Agent, was: SPDY Header Frames


Le Mar 17 juillet 2012 21:03, Julian Reschke a écrit :
> On 2012-07-17 20:50, Nicolas Mailhot wrote:
>> Julian Reschke <julian.reschke@...> writes:
>>
>>>
>>> On 2012-07-17 15:38, Poul-Henning Kamp wrote:
>>
>>>> There must be a smarter way than "User-Agent:"...
>>>
>>> Actually one nice potential optimization is if the server can declare
>>> that it's not interested in the User-Agent at all; see
>>> <http://tools.ietf.org/html/draft-nottingham-http-browser-hints-03#section-5.7>
>>
>> The server may not be interested, but intermediaries may still be
>>
>> (while ugly user-agent special-casing is quite useful for proxy
>> operators
>> who have to contend with web clients that were never really tested with
>> proxies and misbehave in a big way)
>
> Could you elaborate? What kind of misbehavior are you referring to?

Typical misbehaviour is an inability to cope with intermediary-inserted
redirects and error codes, retrying in a loop instead (because the
developer couldn't be bothered to write error-handling code and considers
all errors on the Internet transient, so with enough retrying the client
will eventually get whatever it expected to get at first).
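An illustrative sketch of the retry discipline such clients lack: cap the number of attempts, back off between them, and treat 4xx responses (including proxy-inserted errors) as final rather than transient. The function names and thresholds here are hypothetical, not from the thread:

```python
def should_retry(status, attempt, max_retries=3):
    """Decide whether a failed request is worth retrying.

    Only 5xx (server-side, possibly transient) failures are retried,
    and only up to a hard cap; 4xx errors -- including error pages
    inserted by an intermediary -- are treated as final, so the client
    never hammers the proxy in a loop.
    """
    if attempt >= max_retries:
        return False
    return 500 <= status < 600


def retry_delay(attempt, base=1.0):
    """Exponential backoff so the retries that do happen are spaced out."""
    return base * (2 ** attempt)
```

A client using these helpers would give up immediately on a proxy-inserted 403 or 407 instead of looping, which is exactly the behaviour the proxy operator is asking for.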

Then you get web clients that try to interpret error pages as if they
were whatever they expected to get at first, sometimes with weird results.

Then you get the 'pump as much as you can' web client that will starve
everything else.

It's usually less invasive to blacklist or rate-limit just the specific
troublesome client than to target the URLs it tries to access or the
system it runs on: the buggy part is the web client, the user who
installed it may be doing legitimate work with other web clients at the
same time, the broken web client will misbehave the same way tomorrow
with other web sites, and this way the protection is in place before
other systems are infected with brokenware.
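The per-client special-casing described above amounts to matching on the User-Agent string at the proxy. A minimal sketch (the pattern list and client names are invented for illustration, not taken from any real deployment):

```python
import re

# Hypothetical deny-list of User-Agent patterns for clients known to
# misbehave behind a proxy. Blocking on the client signature is less
# invasive than blocking the URLs it fetches or the host running it.
BLOCKED_AGENTS = [
    re.compile(r"^BrokenFetcher/1\."),   # retries proxy errors in a tight loop
    re.compile(r"^GreedyDownloader\b"),  # saturates the uplink
]

def should_block(user_agent):
    """Return True if the request's User-Agent matches a known-bad client."""
    return any(p.search(user_agent or "") for p in BLOCKED_AGENTS)
```

Other clients on the same machine keep working, and the rule keeps protecting the proxy when the broken client moves on to other sites. This is also why a stable client signature to key on is useful, as noted below.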

I really hope HTTP/2 makes intermediaries first-class citizens and
clarifies intermediary/web-client signalling so such things happen less
often in the HTTP/2 future (though bugs will always exist, so a client
signature to home in on is nice)

-- 
Nicolas Mailhot