Re: thoughts on the Boston meeting

Ned Freed <NED@innosoft.com> Mon, 03 August 1992 14:22 UTC

Received: from NRI.NRI.Reston.Va.US by IETF.NRI.Reston.VA.US id aa06581; 3 Aug 92 10:22 EDT
Received: from venera.isi.edu by NRI.Reston.VA.US id aa09808; 3 Aug 92 10:22 EDT
Received: by venera.isi.edu (5.65c/5.65+local-6) id <AA15758>; Mon, 3 Aug 1992 00:38:41 -0700
Received: from THOR.INNOSOFT.COM ([192.160.253.66]) by venera.isi.edu (5.65c/5.65+local-6) id <AA15754>; Mon, 3 Aug 1992 00:38:35 -0700
Received: from INNOSOFT.COM by INNOSOFT.COM (PMDF #1336 ) id <01GN4J6B8F0C94DST0@INNOSOFT.COM>; Mon, 3 Aug 1992 00:33:29 PST
Date: Mon, 03 Aug 1992 00:33:29 -0800
From: Ned Freed <NED@innosoft.com>
Subject: Re: thoughts on the Boston meeting
To: KLENSIN@infoods.mit.edu
Cc: ietf@isi.edu
Message-Id: <01GN4J6B8F0E94DST0@INNOSOFT.COM>
X-Vms-To: IN%"KLENSIN@INFOODS.MIT.EDU"
X-Vms-Cc: IN%"ietf@isi.edu"
Mime-Version: 1.0
Content-Type: TEXT/PLAIN; CHARSET="US-ASCII"
Content-Transfer-Encoding: 7bit

> >actually, i have been doing a lot of thinking about standards lately.
> >i am not sure i am convinced that the IETF really *is* a "standards body".
> >as an example, consider server/client based mail. currently, the IETF has
> >several standards: POP2, POP3, PCMAIL, IMAP, and SMTP. in my mind, a
> >"standards body" would say: if you are doing mail, you do it this way. the
> >IETF seems to say, if you are doing mail like this, do it this way.

First of all, this statement is not, on the face of it, correct. Although SMTP
can play a part in client-server mail, it was and is the basis for peer-to-peer
mail transport, which is a very different beast. The client-server
implementations that use SMTP don't have corresponding functionality elsewhere,
so when SMTP is used it is orthogonal to the access protocols and there is no
overlap.
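
To make that division of labor concrete, here is a minimal sketch of the two
roles in Python, using its standard smtplib and poplib modules. The host
names, addresses, and password are invented for illustration, not taken from
any real system:

    import poplib
    import smtplib
    from email.message import EmailMessage

    # Transport: hand a message to an SMTP server, which relays it
    # peer-to-peer toward the recipient's mail system.
    msg = EmailMessage()
    msg["From"] = "ned@example.com"        # invented addresses
    msg["To"] = "klensin@example.edu"
    msg["Subject"] = "Re: thoughts on the Boston meeting"
    msg.set_content("Standards bodies and registration...")

    with smtplib.SMTP("smtp.example.com") as smtp:   # invented host
        smtp.send_message(msg)

    # Access: the recipient's client later pulls stored mail from its
    # own mailbox server. POP never moves mail between systems, so the
    # two protocols occupy disjoint niches rather than overlapping.
    pop = poplib.POP3("pop.example.edu")   # invented host
    pop.user("klensin")
    pop.pass_("not-a-real-password")
    count, _size = pop.stat()
    for i in range(1, count + 1):
        for line in pop.retr(i)[1]:        # retr returns (resp, lines, octets)
            print(line.decode("ascii", errors="replace"))
    pop.quit()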

POP2 is a historical, "not recommended" protocol. POP3 is an elective draft
standard. IMAP is experimental. PCMAIL has, I believe, been moved to
informational status, since it clearly was not advancing. Only its installed
base kept it from being moved to historical status. (One only has to look at
FIPS 98, the predecessor of X.400-1984, to see that support for old protocols
is an issue for everyone. There are a lot of systems out there that still use
FIPS 98.)

All this leaves us with one draft standard and one experimental protocol that
each occupy very different niches. Now, you can argue that two protocols are
not needed, but that's a design issue. It all depends on where you draw the
lines. Consider the ISO, which has P1, P2, P3, and P7. You can certainly argue
that a more general protocol could subsume P3 and P7 and perhaps even P1.
(In reality, nobody uses P3 and nobody cares about it.) But
the lines got drawn in a particular way and that's that.

> Funny you should say that.  Or "an interesting and widely-held
> misunderstanding".  Not about IETF, mind you, but about the bodies that
> are usually held up as prototypical "standards bodies".  It is almost
> always, even in ANSI-land and ISO-land, a matter of "if you are doing
> X like this, do it this way".  We have, for example, ANSI Standard
> Fortran, and COBOL, and PL/I, and Mumps, and Pascal, and C, and..., not
> "if you need a procedure-oriented programming language, use this one". 

Precisely correct. Speaking from some experience, there is little if any
thought given to making any particular language the be-all and end-all of
programming. It would be foolish to think that any language could ever aspire
to such a position, and the people on the committees are usually not this
foolish. Much more attention is paid to fitting in functionality the committee
has agreed is useful while maintaining the "style" of the language.

It is also rather astonishing how much influence the committee members have on
what ends up in the language. In many cases things are added simply to appease
one committee member with an axe to grind in a particular area. Language design is
particularly well-suited to this since it is so malleable. Communications
protocols are not nearly so forgiving and offer relatively fewer opportunities
for, shall we say, "the personal touch".

> The giant and growing family of 802.N standards are another good
> example: "if you decide to use this particular low-level protocol, this
> is its definition", not "choose this one".

>   There is lots of pressure, especially in ISO (ANSI gave up), to
> prevent overlap and provide clear "if you do it, do it this way" advice,
> but it doesn't go anywhere, probably for obvious reasons.  If there is a
> difference, it may be that the CCITT/ISO model tends to generate very
> complex, but single, standards that one has to select a profile to use
> (since you used mail as an example, X.400 comes to mind) while we tend
> toward lots of simple protocols and selection between them.

All true, but the simple protocols tend to dovetail rather well, in my
experience. Internet protocols are largely independent of each other (in the
sense of standing alone) but also largely orthogonal to each other. This
is partly a result of the try-before-accepting mechanism, which tends to block
the standardization of things that don't get used or overlap something else
that's already deployed. The need to show implementations rather quickly also
leads to the simplification of protocols, and standards are simpler when they
don't depend on other standards.

On the other hand, the large and comprehensive ISO protocols tend to cover too
much territory and incorporate functionality that is rarely used (like
X.400-P3). There is also a higher degree of overlap (look at character set
standards!) in many areas, since redundancy does not get caught by actual
implementations. And worst of all, it is extremely hard to pry a typical ISO
protocol away from its dependency on the ISO layers below it. Dependencies are
rampant and excessive. All this makes it more and more difficult to develop and
deploy things in a timely fashion. And much of this would not be the case if
a "try-before-accepting" rule existed.

> As self-righteous as some of us get sometimes about "simple protocols, few
> options", there may not be a lot of difference other than the thickness
> of the documents and the consequent amount of noise that must be
> traversed once appropriate profiles are chosen (just take that as a
> cynical observation, I'm not trying to start a religious war here).

One way to look at it is that the IETF's progression from proposed to draft
to full standard serves as a kind of profiling mechanism.

> >it seems
> >to me that, because of the current configuration of the IETF, that the IETF
> >is really a standards registering body. currently, there are really two
> >...

The IETF does things in two rather different ways. In some cases working
groups produce standards. In other cases documents produced by individuals
or concerns are accepted as standards. And there are various gradations
between these extremes as well. So the process in practice varies from
simple registration to pure development.

>    Mmm.  So is ANSI.  They really only do two things.  One is to 
> accredit standards developers, basically a process of figuring out who
> they are going to trust, pay attention to, etc.  And the other is to 
> "approve" standards.  But "approve" really means "register", to a much 
> greater degree than in IETF, since the only criteria that are ever
> applied at the ANSI level have to do with whether procedures were
> followed, whether dissenters were heard, etc.  ANSI's Board of Standards
> Review (the folks who are at the very end of the "authorizing
> registration" pipe) consists mostly of lawyers with no pretentions of
> technical or product-oriented expertise about *anything*.  And, since
> they didn't come up through the technical side, they don't have any
> trouble remembering that they aren't in the business of second-guessing
> technical decisions by those accredited standards developers. (No
> criticism intended; technical review *is* in IAB's and IESG's job 
> descriptions at the moment and most of the SDOs have more than
> sufficient layers of technical review internally.)

This is once again absolutely correct -- the IESG and IAB have significant
influence on the technical content of standards. As such, there is a lot more
happening at the upper levels of the process than simple registration (at
least there was in the case where I was involved).

The IETF even has mechanisms for providing feedback to standards developers
as to what's going to fly and what will not. Although these things are _very_
informal (formal mechanisms would be far less useful), they are nevertheless
an essential part of getting work done in a timely fashion. There would be
many more iterations between working group and IESG and IAB if it weren't
for these things.

When I compare this to the work I've been involved in with ANSI, the differences
are staggering. I don't know anybody one level up in the hierarchy; there is no
reason for me to know or care (I do happen to know people in other places in the
hierarchy, but that's not the point). Only my committee chairman interacts with
them directly. The committee gets occasional letters and memos from other parts
of ANSI that have to be dealt with, but if they come from higher up they are
always nontechnical in nature. (We do interact occasionally with other
technical committees on technical issues -- this is a Good Thing that there
should be a lot more of.)

When a procedural error occurs, the number of iterations needed to correct it
can be huge. It can literally take years to correct some oversight in the
paperwork. Fortunately this rarely results in overt delays in getting standards
out -- it usually just annoys the technical people and makes them less willing
to stay involved.

				Ned