Re: [MMUSIC] What is an m-line?

Bernard Aboba <> Wed, 15 May 2013 22:44 UTC


Harald said:

    "I take a somewhat different approach:

    If it's possible to *write an application* that uses the WebRTC
    interface, does not mangle the SDP, and sends that to an existing
    implementation (supporting SRTP and ICE), and has something useful
    happen, I think it's the success we can hope for."

[BA] Since existing non-WebRTC audio applications typically have to do at least some SDP mangling, WebRTC would be at parity if its SDP mangling is no worse than what is typically required today. If we can reach that point with an audio application (which I think is plausible), the situation would be tolerable, albeit not perfect. The SDP mangling in the WebRTC sample application is at that level -- not pretty, but not shocking by any means.

For video, the non-WebRTC interop situation is much worse -- both SDP mangling (nastier than for audio) and RTP/RTCP mangling are typically involved, even between implementations of the same codec. Since RTP/RTCP mangling is the really expensive operation, avoiding it brings the cost of interop way down. In the non-WebRTC world, we are just reaching the point where this has become feasible, so I would set the WebRTC bar in roughly the same place: we may end up with more SDP mangling than is comfortable, but we need to avoid RTP/RTCP mangling if at all possible.

Harald also said:

    "This may mean having the application use content settings to force
    tracks to different m-lines, keep low the number of tracks offered,
    and other tricks that make sense in the context of *that

    It's the *application* that needs to interwork, it's not "WebRTC in
    general". WebRTC just needs to make it possible."

[BA] Right. The reality is that most "applications" (90+ percent of them) won't need to put in this effort, so it just needs to be possible to achieve for those that do. And as long as it is not necessary to touch every RTP packet in the process, we are just talking about a one-time application development cost that is "just software".

Harald finally said:

    "I see the idea that *any application* should produce SDP that's
    useful in such a scenario as an illusion, and pursuing that illusion
    is just going to hurt us."

[BA] Agree. s/illusion/delusion/ :)




    What I *would* like is for the RTP/RTCP produced by WebRTC
    implementations to not require a compute-intensive media
    gateway/transcoder/munger to enable media to be handled by existing
    systems, at least when the same video codecs are available (and no,
    that shouldn't necessarily mean "video codec implementations based
    on the same source code", because the spec should be good enough to
    enable interoperability of independent

I find nothing to disagree with in this paragraph.
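For readers wondering what the "SDP mangling" discussed above looks like in practice, here is a minimal, hypothetical sketch (the function name, codec numbers, and SDP text are invented for illustration, not taken from this thread or from any WebRTC implementation). It shows a typical app-level munge: reordering the format list of an m=audio line to prefer one payload type before handing the SDP back to setLocalDescription or a remote peer.

```typescript
// Sketch only: reorder the <fmt> list of an m=audio line so that one
// payload type is preferred. Names and SDP content are illustrative.
function preferAudioPayload(sdp: string, payloadType: string): string {
  return sdp
    .split("\r\n")
    .map((line) => {
      if (!line.startsWith("m=audio ")) return line;
      // m=<media> <port> <proto> <fmt> <fmt> ...
      const parts = line.split(" ");
      const fmts = parts.slice(3);
      if (!fmts.includes(payloadType)) return line; // nothing to prefer
      return [
        ...parts.slice(0, 3),
        payloadType,
        ...fmts.filter((pt) => pt !== payloadType),
      ].join(" ");
    })
    .join("\r\n");
}

// Example: move payload type 0 (PCMU) ahead of 111 (Opus).
const sdp =
  "v=0\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111 0 8\r\na=rtpmap:111 opus/48000/2";
const munged = preferAudioPayload(sdp, "0");
// munged contains "m=audio 9 UDP/TLS/RTP/SAVPF 0 111 8"
```

This is the mild, string-level kind of munging the thread calls tolerable; the expensive case it argues against is rewriting RTP/RTCP packets themselves, which touches every packet on the media path rather than one blob at session setup.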


