Re: [codec] A concrete proposal for requirements and testing

Paul Coverdale <coverdale@sympatico.ca> Fri, 08 April 2011 01:25 UTC

From: Paul Coverdale <coverdale@sympatico.ca>
To: 'Koen Vos' <koen.vos@skype.net>, 'Roman Shpount' <roman@telurix.com>
References: <BANLkTinTZUyRBYUQq7igHB74r8cXsUMPCg@mail.gmail.com> <68265148.2587531.1302224259820.JavaMail.root@lu2-zimbra>
In-Reply-To: <68265148.2587531.1302224259820.JavaMail.root@lu2-zimbra>
Date: Thu, 07 Apr 2011 21:27:27 -0400
Cc: codec@ietf.org, 'Stephen Botzko' <stephen.botzko@gmail.com>
Subject: Re: [codec] A concrete proposal for requirements and testing

To me, the “finished product” means the final, frozen design, complete with all of the inherent trade-offs that may have been made. Formal testing, based on an agreed test plan, can then be carried out to determine whether that design does indeed meet all of the initial requirements.

 

Regards,

 

…Paul

 

From: codec-bounces@ietf.org [mailto:codec-bounces@ietf.org] On Behalf Of Koen Vos
Sent: Thursday, April 07, 2011 8:58 PM
To: Roman Shpount
Cc: codec@ietf.org; Stephen Botzko
Subject: Re: [codec] A concrete proposal for requirements and testing

 

Roman:

How could what Stephen Botzko describes as "a codec characterization/quality assessment that will be done on the finished product" possibly improve coding efficiency?

best,
koen.



  _____  

From: "Roman Shpount" <roman@telurix.com>
To: "Koen Vos" <koen.vos@skype.net>
Cc: "Stephen Botzko" <stephen.botzko@gmail.com>, codec@ietf.org
Sent: Thursday, April 7, 2011 5:50:06 PM
Subject: Re: [codec] A concrete proposal for requirements and testing

Quoting Nokia's paper: "Codecs done without thorough standardization effort like Speex and iLBC offer significantly reduced efficiency, probably due to much lesser optimization, listening tests and IPR free design." Testing is an important part of the development and standardization process for a CODEC.

It is in everybody's interest to produce the best CODEC possible. Comments such as "we listen to it and we like it" and "you are free to run your own tests", even though valid, are not very productive. I think the best possible course is to treat testing the same way as we treat development. We need to collaborate in putting together a comprehensive test plan, and then in the actual testing. I am not sure I can contribute a lot to creating a test plan, but I can definitely contribute to testing if such a plan exists. I think this is a common situation for a lot of people in this group.
_____________
Roman Shpount



On Thu, Apr 7, 2011 at 8:24 PM, Koen Vos <koen.vos@skype.net> wrote:

Stephen Botzko wrote:
> This is not a debugging task, it is a codec characterization/quality assessment that will be done on the finished product. 

Does such testing have a realistic chance of revealing something that should alter the course of the WG?  
If not, then the testing can be done separately.
If yes: how precisely?

Is the concern that the codec may not be good enough to be worth publishing, given the other codecs already out there?
--> In the tests so far, Opus has consistently outperformed iLBC, Speex, G.722.1, etc.  To me it seems unrealistic to expect that pattern to reverse with more or "better" testing.

Or is the concern that for certain input signals or network conditions the codec produces results that are somehow unacceptable?
--> Opus has gone through all kinds of testing during development.  Feel free to do some more; you don't need test plans or consensus for that.  Remember that there will always be untested cases (speech with a dog barking in the background over dial-up hasn't been tried yet, I believe), and that's ok.

Characterization may be useful, but I don't see why it should be a deliverable of the WG.

best,
koen.



  _____  

From: "Stephen Botzko" <stephen.botzko@gmail.com>
To: "Ron" <ron@debian.org>
Cc: codec@ietf.org
Sent: Thursday, April 7, 2011 8:46:05 AM


Subject: Re: [codec] A concrete proposal for requirements and testing

Hi Ron

This is not a debugging task, it is a codec characterization/quality assessment that will be done on the finished product. 

In my view, the proper way to go about it is to first get consensus on the tests that need to be run, what results are needed, and how the tests need to be conducted in order to meaningfully compare the results.  Then get folks signed up to actually do the tests.

This is not about collaboration vs competition; rather, it is about running distributed tests with results that can be integrated and scientifically analyzed/compared.  And recording the test methods, in order to allow other people in the future to re-do the tests and duplicate the results.
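To give a sense of what "integrating" distributed results could look like: a minimal sketch that pools per-lab listening-test scores and reports a mean with a 95% confidence interval. The lab names and MOS values below are purely hypothetical, and the normal-approximation interval is just one possible choice of statistic, not a prescription for the WG's test plan.

```python
import math

# Hypothetical per-lab listening-test scores (MOS, 1-5 scale) for a
# single test condition; labs and values are illustrative, not real data.
scores = {
    "lab_a": [3.8, 4.1, 3.9, 4.0, 4.2],
    "lab_b": [3.6, 3.9, 4.0, 3.7, 3.8],
}

def pooled_mean_ci(groups, z=1.96):
    """Pool all labs' scores; return (mean, 95% CI half-width)."""
    data = [s for lab in groups.values() for s in lab]
    n = len(data)
    mean = sum(data) / n
    # Sample variance, then a normal-approximation confidence interval.
    var = sum((x - mean) ** 2 for x in data) / (n - 1)
    half_width = z * math.sqrt(var / n)
    return mean, half_width

mean, hw = pooled_mean_ci(scores)
print(f"MOS = {mean:.2f} +/- {hw:.2f} (95% CI)")
```

Running the same statistic over every lab's submission is what makes the distributed results directly comparable; recording the method alongside the numbers lets others reproduce the analysis later.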

Regards,
Stephen Botzko