[codec] Testing, requirements, and IETF process

Cullen Jennings <fluffy@cisco.com> Fri, 15 April 2011 00:55 UTC

From: Cullen Jennings <fluffy@cisco.com>
Date: Thu, 14 Apr 2011 18:55:07 -0600
Message-Id: <8FDB3712-6A59-4E6E-872B-B69C4E574257@cisco.com>
To: codec@ietf.org

Just wanted to comment on a few bits from other threads. 

Requirements, and testing to see whether a technology meets those requirements, can serve many purposes. One is to help guide development and select between alternative proposals. Sometimes we use requirements to drive design decisions; other times we use testing against requirements to evaluate whether a given codec meets our needs; and other times we use requirements and test results to help a particular implementer or user decide which technology is best for the product they are developing or deploying. I think there are several points that many people would agree with:

1) test results are good

2) understanding how a test was done and being able to reproduce it increases the confidence in test results 

3) we can't delay forever doing testing

4) perfection is the enemy of good

When this WG decides whether Opus is ready to publish, the key question will be whether people think this specification should be published. Clearly, to make an informed decision about that, many people will want to know things like which requirements it meets and which it does not, what sort of testing has been done, and the results of that testing. At that point Opus might proceed to become a Proposed Standard. Note that it is *proposed* - after further deployment experience, testing, and possible fixes it might become a Draft Standard, and after even more work it could finally become an Internet Standard. In parallel with all this, as someone else pointed out, testing will continue, and that testing will help inform people about how Opus compares to other codecs so they can decide in which applications Opus might be an appropriate codec to use.

In the last meeting there was some discussion of the idea that there are key use cases that are important to lots of people, that these use cases imply certain requirements, and that prioritizing testing of these requirements was probably a good idea. Some people would be interested in how Opus compares to codecs that are not royalty free, and I'm sure that test results with non-RF codecs will help inform some decisions. However, the WG would like to produce a royalty-free codec and is particularly interested in how Opus compares to royalty-free codecs. Whether Opus is royalty free is not a question this WG is debating, though the IPR disclosures are noted. The question of how Opus compares to royalty-free codecs is one that many people see as critical to the decision of whether to publish Opus.