Re: chalky cheese and slippery floors

Eddie Zedlewski <Eddie.Zedlewski@niss.ac.uk> Thu, 06 April 1995 12:29 UTC

Received: from ietf.nri.reston.va.us by IETF.CNRI.Reston.VA.US id aa01667; 6 Apr 95 8:29 EDT
Received: from CNRI.Reston.VA.US by IETF.CNRI.Reston.VA.US id aa01663; 6 Apr 95 8:29 EDT
Received: from norn.ncl.ac.uk by CNRI.Reston.VA.US id aa04310; 6 Apr 95 8:29 EDT
Received: by norn.mailbase.ac.uk id <NAA11262@norn.mailbase.ac.uk> (8.6.11/ for mailbase.ac.uk); Thu, 6 Apr 1995 13:07:46 +0100
Received: from tamarin.bath.ac.uk by norn.mailbase.ac.uk id <NAA11251@norn.mailbase.ac.uk> (8.6.11/ for mailbase.ac.uk) with ESMTP; Thu, 6 Apr 1995 13:07:42 +0100
Received: from niss.ac.uk (actually host hands.bath.ac.uk) by tamarin.bath.ac.uk with SMTP (PP); Thu, 6 Apr 1995 13:07:33 +0100
Sender: ietf-archive-request@IETF.CNRI.Reston.VA.US
From: Eddie Zedlewski <Eddie.Zedlewski@niss.ac.uk>
Date: Thu, 06 Apr 1995 13:07:33 -0000
Message-Id: <29088.9504061207@niss.ac.uk>
To: peterd@bunyip.com
Subject: Re: chalky cheese and slippery floors
Cc: nir@mailbase.ac.uk
X-List: nir@mailbase.ac.uk
Reply-To: Eddie Zedlewski <Eddie.Zedlewski@niss.ac.uk>
X-Orig-Sender: nir-request@mailbase.ac.uk
Precedence: list

> The original posting has already sparked some comments,
> but here are a few additional observations...
> 
> [  "C.K.Work" wrote: ]
> 
> } .  .  .  Unless I'm misunderstanding things,
> } this is not the case. WWW as a technology surely offers much more than
> } Gopher can - both now, but perhaps more importantly in terms of scope
> } for development. At present Gopher space is effectively a subset of Web
> } space, and this will remain the case until Gopher clients can read html
> } (in which case won't they be Web clients?).
> .  .  .
. . .

Some further observations.

As always, Peter, you present a persuasive argument.  I have been around
this loop a number of times with my colleagues in our efforts to plan the
future development of NISS, which serves 5-10,000 UK HE users a day via
telnet/X.29 and now carries an increasing amount of web traffic.

These days the primary focus of these discussions usually gravitates to
user demand, both in terms of access to and provision of information, and
much less towards technology issues, though those are important too.  This
suggests to me that, despite the additional burden (for now at least) of
data preparation in HTML, WWW will be more widely used than Gopher
technology for the foreseeable future - a view corroborated by all the
commercial interest and development (aka hype).  No doubt the situation
will change before too long, but for now it seems as though the battle has
largely been won in favour of WWW - more new recruits are joining the web
army, accompanied by a steady stream of surrenders.  Arguably the usability
of a VT-based Gopher client is better, but if it doesn't allow access to
HTML-based data then I don't believe it competes.  From a service
provider's point of view it is only worth supporting both (largely
overlapping) technologies if the respective user groups are similar in
size.

The bandwidth issue, however, is an increasingly important one - in my
opinion less in the low-bandwidth access links than in overall network
capacity.  If gopher/a.n.other browsers were to become as successful as the
web ones, they would face the same problems in making effective use of the
available bandwidth.  [... insert lots of debate about the technology
issues ...]

> Another important issue, and one often ignored in the "WWW
> versus Gopher" wars, is the question of maintainability.
> This has nothing to do with transport and little to do
> with rendering. The question is how easy is it to set up
> and operate a particular server?
> 

> And now a final comment on bandwidth. There are still
> _significant_ numbers of Internet users who do not have
> access to WWW, primarily because of their limited
> bandwidth capabilities. As the net continues to expand
> into more countries and more user communities, I think
> we're going to continue to have need for a system that not
> just tolerates, but supports slow speed links. Lynx makes
> the Web a little more accessible to VT100-capable users,
> but it's not as usable as some of the best Gopher browsers.
> 
> The Web is primarily popular because of the extra info
> sent on each page, primarily in the form of graphics, and
> this pushes a minimum usable Web connection to something
> around 14.4k. In my experience, anything less is
> _painful_. In terms of bandwidth used, the Web community
> boasted when their total backbone traffic (measured in
> bytes transported to port 80) exceeded that served by
> gopher (to port 70). What Web enthusiasts failed to point
> out was that they were doing this with an order of
> magnitude fewer packets (which implied they were still an
> order of magnitude below the number of gopher _references_
> at the time). I think their reference counts have now
> passed Gopher, and I'm _NOT_ questioning the Web's
> popularity, but we do need to look beyond the hype and see
> what people are actually using. Gopher is still popular.
> There are still significant numbers of Gopher users,
> because a) there's data they want and b) it's a system
> they can use.
> 

> Yes, _Mosaic_ has attracted tremendous commercial
> development and opened the net to a new class of user, but
> that says more about the integration of multiple protocols
> into a single client than it does about the Web itself.
> 
> I agree that the long term future of Internet information
> services requires a graphics capability and I also don't
> dispute the popularity of the Web, but I don't think we
> should disenfranchise those on slow speed links, and we

Agreed - but part of this issue comes down to how the data/service
provider presents their information and which data format standards they
adopt.

> shouldn't necessarily endorse a "one protocol/data
> model/browser model fits all" philosophy. That seems
> premature, to say the least.
> 
> Neither WWW nor Gopher are yet anywhere near being done
> (as but one example, neither is at all good at supporting
> reliable services, since neither has any support for
> detecting/correcting broken links short of "try it and see
> if it works"). Finally, there are still significant moves
> to happen in the content type side of things (for example,
> this week Adobe announced a deal with Netscape to provide
> PDF support. Ask yourself what this means for HTML and
> then ask yourself whether you care what protocol serves a
> PDF document to you). 
> 
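Incidentally, "try it and see if it works" is a fair summary of what link
checking means today.  Purely as an illustration (the host and path below
are made up, and this is only a minimal sketch, not anything we actually
run), it amounts to something like:

    # Illustrative "try it and see" link check: issue an HTTP HEAD request
    # and inspect the status code.  The host and path are made-up examples.
    import socket

    def check_link(host, path, port=80, timeout=10):
        """Return the HTTP status code for a HEAD request, or None on failure."""
        try:
            with socket.create_connection((host, port), timeout=timeout) as s:
                request = "HEAD %s HTTP/1.0\r\nHost: %s\r\n\r\n" % (path, host)
                s.sendall(request.encode("ascii"))
                status = s.makefile("rb").readline().decode("ascii", "replace")
                # The status line looks like "HTTP/1.0 200 OK"
                return int(status.split()[1])
        except (OSError, ValueError, IndexError):
            return None

    if __name__ == "__main__":
        code = check_link("www.example.org", "/index.html")
        if code is None or code >= 400:
            print("link looks broken")
        else:
            print("link OK (%d)" % code)

Anything beyond that - noticing that a document has moved, or repairing the
reference automatically - is exactly what neither protocol gives us yet.
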
> In the long run, users don't care two hoots about any
> particular technology (how many people know even the
> slightest thing about the engine in their cars, anyways?).
> As a developer/integrator I care only as insofar as it
> allows me to provide my customers with the tools they need
> at a price they can afford. I say, cutting off branches of
> the development tree at this point seems risky, to say the
> least.
> 
> I've said it before, and I'll say it again. If we declare
> victory today, we risk being accused of endorsing
> WorldWideWeb as the DOS of the '90s.  Sure, DOS works, but
> it could have been so much better if only they'd kept
> working on it, and if only it hadn't spread as quickly as
> it did...
> 
Agreed, there is a great deal left to do and we don't need another DOS,
but everyone is already feeling thinly spread and in danger of spending
more time sliding around than running simply to stand still.

Eddie Zedlewski

> 					- peterd