Re: Distributed Multimodal Synchronization Protocol [dmsp]

Chris Cross <> Sun, 12 March 2006 13:39 UTC

Received: from [] ( by with esmtp (Exim 4.43) id 1FIQn6-0004SU-LE; Sun, 12 Mar 2006 08:39:24 -0500
Received: from [] ( by with esmtp (Exim 4.43) id 1FIQn5-0004SP-IY for; Sun, 12 Mar 2006 08:39:23 -0500
Received: from ([]) by with esmtp (Exim 4.43) id 1FIQn3-0004Xb-55 for; Sun, 12 Mar 2006 08:39:23 -0500
Received: from ( []) by (8.12.11/8.12.11) with ESMTP id k2CDdK07031493 for <>; Sun, 12 Mar 2006 08:39:20 -0500
Received: from ( []) by (8.12.10/NCO/VER6.8) with ESMTP id k2CDgDfw138904 for <>; Sun, 12 Mar 2006 06:42:13 -0700
Received: from (loopback []) by (8.12.11/8.13.3) with ESMTP id k2CDdKKo019707 for <>; Sun, 12 Mar 2006 06:39:20 -0700
Received: from ( []) by (8.12.11/8.12.11) with ESMTP id k2CDdJEG019703 for <>; Sun, 12 Mar 2006 06:39:19 -0700
In-Reply-To: <>
Subject: Re: Distributed Multimodal Synchronization Protocol [dmsp]
X-Mailer: Lotus Notes Release 7.0 HF85 November 04, 2005
Message-ID: <>
From: Chris Cross <>
Date: Thu, 9 Mar 2006 11:29:58 -0500
X-MIMETrack: Serialize by Router on D03NM119/03/M/IBM(Release 6.53HF654 | July 22, 2005) at 03/12/2006 06:41:59
MIME-Version: 1.0
X-Spam-Score: 0.3 (/)
X-Scan-Signature: 27216fd639035830d9361a5ade4ff99c
X-Mailman-Version: 2.1.5
Precedence: list
List-Id: Distributed Multimodal Synchronization Protocol <>
List-Unsubscribe: <>, <>
List-Archive: <>
List-Post: <>
List-Help: <>
List-Subscribe: <>, <>
Content-Type: multipart/mixed; boundary="===============0874229479=="

It's a good suggestion to look at other publish/subscribe models for
guidance. Can you point to other RFCs in this domain?

I'm interested in the details of your URI event model. Have you published
anything? It's important to note that in a multimodal interaction,
synchronization of distributed modalities must take place with sub-second
latency for the usability of the application to be worth the effort. We
concentrated first on a binary message set to reduce the latency of a
distributed multimodal system implemented over a relatively low-bandwidth
cellular network. In that context I would be concerned that a URI scheme
would both inflate the message size and introduce additional turns in a
system that is very sensitive to network latency.
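To make the size concern concrete, here is a minimal sketch comparing a packed binary synchronization event with the same event expressed using URI-identified fields. The field layout and the example URIs are invented for illustration; they are not taken from the DMSP draft or any published specification.

```python
import struct

# Hypothetical compact binary encoding for one synchronization event
# (layout is illustrative only): 1-byte event code, 2-byte sequence
# number, 4-byte timestamp offset in ms, network byte order.
binary_event = struct.pack("!BHI", 0x12, 42, 1500)

# The same event with a URI-identified type and source, as text.
# (Placeholder URIs; any real scheme would define its own.)
uri_event = (
    "type=http://example.org/dmsp/events/focus-change;"
    "source=http://example.org/session/42;"
    "ts=1500"
).encode("ascii")

print(len(binary_event))  # 7 bytes
print(len(uri_event))     # substantially larger for the same information
```

On a low-bandwidth cellular link, that size difference per event is the inflation Chris describes; whether it matters in practice depends on how often events fire and whether the transport compresses.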


Chris Cross
Multimodal Browser Architect
IBM Boca Raton
voice 561.862.2102  t/l 975.2102
mobile 561.317.0700
fax 501.641.6727

From: Graham Klyne <>
To: Chris Cross/West Palm Beach/IBM@IBMUS
Date: 03/08/2006 12:48 PM
Subject: Re: Distributed Multimodal Synchronization Protocol [dmsp]

Chris Cross wrote:
> Hi Everyone,
> I'm sending a note to your list to make the members aware of this
> activity. The I-D "Distributed Multimodal Synchronization Protocol "
> (dmsp) can be found at


Very interesting to see this pop up... I'm thinking about the web view
synchronization problem in a couple of completely different contexts, and I
note that the diagram on page 7 of your Internet draft is almost exactly
what I have been proposing in another context [1].

But this draft seems to combine the basic event propagation model with
details of the specific events used for media synchronization, and I'd like
to suggest that, if this work goes ahead, the event distribution protocol
be specified separately from the details of the events themselves (e.g.
like SMTP/RFC 2822 for email). Then other applications could use the same
event distribution model for completely different purposes. (E.g. I have
some similar event requirements for web-based home-control applications.)
I also note that existing work on publish/subscribe systems and/or instant
messaging presence might be used for the protocol.

Turning to the events themselves, I think that to deploy in a Web context
it would be sensible for event types to be identified by URIs rather than
codes.  In my own private work to date, I am using a simple event model
that consists of event type and event source, both URIs, and an arbitrary
payload based on the event type.
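The event model described here (type and source as URIs, payload shaped by the type) might look like the following sketch. The URIs are invented placeholders, not values from any published work.

```python
from dataclasses import dataclass
from typing import Any

@dataclass(frozen=True)
class Event:
    event_type: str  # URI identifying what kind of event this is
    source: str      # URI identifying where the event originated
    payload: Any     # structure is determined by event_type

def is_type(event: Event, type_uri: str) -> bool:
    """Dispatch on the type URI rather than a numeric code."""
    return event.event_type == type_uri

e = Event("http://example.org/events/field-filled",
          "http://example.org/devices/phone-1",
          {"field": "city", "value": "Dallas"})
print(is_type(e, "http://example.org/events/field-filled"))  # True
```

The trade-off is the one raised in the reply above: URI-keyed types are extensible and self-describing across the open Web, at the cost of larger messages than a fixed binary code set.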

Unfortunately, I shall not be present in Dallas to participate in the
debate.  I do think that the requirement this proposal addresses is
symptomatic of a wider gap in Web/Internet application architecture, and it
would be good to see a simple common solution emerge.  I seem to recall
that, some time ago, Dussault was discussing some thoughts about
notification systems for applications, which may well have a part to play
in this debate.



Chris Cross wrote:
> Hi Everyone,
> I'm sending a note to your list to make the members aware of this
> activity. The I-D "Distributed Multimodal Synchronization Protocol "
> (dmsp) can be found at
> We have
> a mailing list at and a twice monthly phone conference.
> Dmsp is being developed to enable distributed multimodal systems. I've
> been invited to present an overview at both the apps and rai general
> meetings in Dallas. We welcome your attendance at these presentations
> and interested individuals' participation in our mail list and
> conference calls.
> At the bottom of this note is a draft charter we've discussed on the
> list that gives a more detailed description of dmsp.
> thanks,
> chris
> Chris Cross
> Multimodal Browser Architect
> _________________________
> IBM Boca Raton
> voice 561.862.2102
> mobile 561.317.0700
> fax 501.641.6727
> The convergence of wireless communications with information technology
> and the miniaturization of computing platforms have resulted in advanced
> mobile devices that offer high resolution displays, application programs
> with graphical user interfaces, and access to the internet through full
> function web browsers.
> Mobile phones now support most of the functionality of a laptop
> computer. However, the miniaturization that has made the technology
> possible and commercially successful also puts constraints on the user
> interface. Tiny displays and keypads significantly reduce the usability
> of application programs.
> Multimodal user interfaces, UIs that offer multiple modes of
> interaction, have been developed that greatly improve the usability of
> mobile devices. In particular, multimodal UIs that combine speech and
> graphical interaction are proving themselves in the marketplace.
> However, not all mobile devices provide the computing resources to
> perform speech recognition and synthesis locally on the device. For
> these devices it is necessary to distribute the speech modality to a
> server in the network.
> The Distributed Multimodal Working Group will develop the protocols
> necessary to control, coordinate, and synchronize distributed modalities
> in a distributed Multimodal system. There are several protocols and
> standards necessary to implement such a system including DSR and AMR
> speech compression, session control, and media streaming. However, the
> DM WG will focus exclusively on the synchronization of modalities being
> rendered across a network, in particular Graphical User Interface and
> Voice Servers.
> The DM WG will develop an RFC for a Distributed Multimodal
> Synchronization Protocol that defines the logical message set to effect
> synchronization between modalities and enough background on the expected
> multimodal system architecture (or reference architecture defined
> elsewhere in W3C or OMA) to present a clear understanding of the
> protocol. It will investigate existing protocols for the transport of
> the logical synchronization messages and develop an RFC detailing the
> message format for commercial alternatives, including, possibly, HTTP
> and SIP.
> While not limited to these, for simplicity of scope the protocol will
> assume RTP for carriage of media, SIP and SDP for session
> control, and DSR and AMR for speech compression. The working group will
> not consider the authoring of applications as it will be assumed that
> this will be done with existing W3C markup standards such as XHTML and
> VoiceXML and commercial programming languages like Java and C/C++.
> It is expected that we will coordinate our work in the IETF with the W3C
> Multimodal Interaction Work Group.
> The following are our goals for the Working Group.
> Date Milestone
> TBD Submit Internet Draft Describing DMSP (standards track)
> TBD Submit Drafts to IESG for publication
> TBD 2006 Submit DMSP specification to IESG

Graham Klyne
For email:

Dmsp mailing list