Re: Distributed Multimodal Synchronization Protocol [dmsp]

Graham Klyne <> Thu, 09 March 2006 13:11 UTC

Received: from [] ( by with esmtp (Exim 4.43) id 1FHKvI-0007A9-FU; Thu, 09 Mar 2006 08:11:20 -0500
Received: from [] ( by with esmtp (Exim 4.43) id 1FH2wv-0001OR-DG; Wed, 08 Mar 2006 12:59:49 -0500
Received: from ([]) by with esmtp (Exim 4.43) id 1FH2wu-0002OT-R3; Wed, 08 Mar 2006 12:59:49 -0500
Received: from [] ( []) by (8.12.11/8.12.11) with ESMTP id k28I03g6023535; Wed, 8 Mar 2006 10:00:04 -0800
Message-ID: <>
Date: Wed, 08 Mar 2006 17:48:56 +0000
From: Graham Klyne <>
User-Agent: Thunderbird 1.5 (Windows/20051201)
MIME-Version: 1.0
To: Chris Cross <>
Subject: Re: Distributed Multimodal Synchronization Protocol [dmsp]
References: <>
In-Reply-To: <>
Content-Type: text/plain; charset=ISO-8859-1
Content-Transfer-Encoding: 7bit
X-SongbirdInformation: for more information
X-Songbird: Found to be clean
X-Spam-Score: 0.0 (/)
X-Scan-Signature: 93e7fb8fef2e780414389440f367c879
X-Mailman-Approved-At: Thu, 09 Mar 2006 08:11:19 -0500
X-Mailman-Version: 2.1.5
Precedence: list
List-Id: Distributed Multimodal Synchronization Protocol <>
List-Unsubscribe: <>, <>
List-Archive: <>
List-Post: <>
List-Help: <>
List-Subscribe: <>, <>

Chris Cross wrote:
> Hi Everyone,
> I'm sending a note to your list to make the members aware of this
> activity. The I-D "Distributed Multimodal Synchronization Protocol "
> (dmsp) can be found at


Very interesting to see this pop up... I'm thinking about the web view
synchronization problem in a couple of completely different contexts, and I note
that the diagram on page 7 of your Internet draft is almost exactly what I have
been proposing in another context [1].

But this draft seems to combine the basic event propagation model with details
of the specific events used for media synchronization. I'd like to suggest
that, if this work goes ahead, the event distribution protocol be defined
separately from the details of the events themselves (e.g. like SMTP/RFC 2822
for email).  Other applications could then use the same event distribution
model for completely different purposes.  (E.g. I have some similar event
distribution requirements for web-based home-control applications.)  I also
note that existing work on publish/subscribe systems and/or instant messaging
presence protocols might be used for the protocol.

Turning to the events themselves, I think that to deploy in a Web context it
would be sensible for event types to be identified by URIs rather than binary
codes.  In my own private work to date, I am using a simple event model that
consists of event type and event source, both URIs, and arbitrary additional
payload based on the event type.
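
To make that concrete, here is a minimal sketch of such an event model
(hypothetical names and URIs of my own invention, not anything from the DMSP
draft): the event type and source are both URIs, and the payload is an opaque
blob that the distribution layer forwards without interpreting, which is what
keeps transport separate from event semantics.

```python
import json
from dataclasses import dataclass, field

@dataclass
class Event:
    event_type: str  # URI identifying the kind of event
    source: str      # URI identifying the component that emitted it
    payload: dict = field(default_factory=dict)  # interpreted per event type

    def serialize(self) -> str:
        # The distribution layer can forward this string as-is; only the
        # receiving application needs to understand the payload.
        return json.dumps({"type": self.event_type,
                           "source": self.source,
                           "payload": self.payload})

# Example: a GUI focus change that a voice server might react to.
ev = Event("http://example.org/events/focus-change",
           "http://example.org/devices/phone-gui",
           {"field": "destination"})
print(ev.serialize())
```

The point of the sketch is only the layering: nothing in Event or its
serialization depends on media synchronization, so the same distribution
machinery could carry home-control events just as well.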

Unfortunately, I shall not be present in Dallas to participate in the debate.  I
do think that the requirement that this proposal addresses is symptomatic of a
wider gap in Web/Internet application architecture, and it would be good to see
a simple common solution emerge.  I seem to recall that, some time ago, Lisa
Dusseault was discussing some thoughts about notification systems for Internet
applications, which may well have a part to play in this debate.



Chris Cross wrote:
> Hi Everyone,
> I'm sending a note to your list to make the members aware of this
> activity. The I-D "Distributed Multimodal Synchronization Protocol "
> (dmsp) can be found at
> We have
> a mailing list at and a twice monthly phone conference.
> Dmsp is being developed to enable distributed multimodal systems. I've
> been invited to present an overview at both the apps and rai general
> meetings in Dallas. We welcome your attendance at these presentations
> and interested individuals' participation in our mail list and
> conference calls.
> At the bottom of this note is a draft charter we've discussed on the
> list that gives a more detailed description of dmsp.
> thanks,
> chris
> Chris Cross
> Multimodal Browser Architect
> _________________________
> IBM Boca Raton
> voice 561.862.2102
> mobile 561.317.0700
> fax 501.641.6727
> The convergence of wireless communications with information technology
> and the miniaturization of computing platforms have resulted in advanced
> mobile devices that offer high resolution displays, application programs
> with graphical user interfaces, and access to the internet through full
> function web browsers.
> Mobile phones now support most of the functionality of a laptop
> computer. However, the miniaturization that has made the technology
> possible and commercially successful also puts constraints on the user
> interface. Tiny displays and keypads significantly reduce the usability
> of application programs.
> Multimodal user interfaces, UIs that offer multiple modes of
> interaction, have been developed that greatly improve the usability of
> mobile devices. In particular, multimodal UIs that combine speech and
> graphical interaction are proving themselves in the marketplace.
> However, not all mobile devices provide the computing resources to
> perform speech recognition and synthesis locally on the device. For
> these devices it is necessary to distribute the speech modality to a
> server in the network.
> The Distributed Multimodal Working Group will develop the protocols
> necessary to control, coordinate, and synchronize distributed modalities
> in a distributed Multimodal system. There are several protocols and
> standards necessary to implement such a system, including DSR and AMR
> speech compression, session control, and media streaming. However, the
> DM WG will focus exclusively on the synchronization of modalities being
> rendered across a network, in particular Graphical User Interface and
> Voice Servers.
> The DM WG will develop an RFC for a Distributed Multimodal
> Synchronization Protocol that defines the logical message set to effect
> synchronization between modalities and enough background on the expected
> multimodal system architecture (or reference architecture defined
> elsewhere in W3C or OMA) to present a clear understanding of the
> protocol. It will investigate existing protocols for the transport of
> the logical synchronization messages and develop an RFC detailing the
> message format for commercial alternatives, including, possibly, HTTP
> and SIP.
> While not limited to these, for simplicity of scope the
> protocol will assume RTP for carriage of media, SIP and SDP for session
> control, and DSR and AMR for speech compression. The working group will
> not consider the authoring of applications as it will be assumed that
> this will be done with existing W3C markup standards such as XHTML and
> VoiceXML and commercial programming languages like Java and C/C++.
> It is expected that we will coordinate our work in the IETF with the W3C
> Multimodal Interaction Work Group.
> The following are our goals for the Working Group.
> Date Milestone
> TBD Submit Internet Draft Describing DMSP (standards track)
> TBD Submit Drafts to IESG for publication
> TBD 2006 Submit DMSP specification to IESG

Graham Klyne
For email:

Dmsp mailing list