Re: [codec] Summary of test results

Jean-Marc Valin <jmvalin@jmvalin.ca> Mon, 27 June 2011 14:35 UTC

Return-Path: <jmvalin@jmvalin.ca>
X-Original-To: codec@ietfa.amsl.com
Delivered-To: codec@ietfa.amsl.com
Received: from localhost (localhost [127.0.0.1]) by ietfa.amsl.com (Postfix) with ESMTP id EAD5C21F864E for <codec@ietfa.amsl.com>; Mon, 27 Jun 2011 07:35:43 -0700 (PDT)
X-Virus-Scanned: amavisd-new at amsl.com
X-Spam-Flag: NO
X-Spam-Score: 0.001
X-Spam-Level:
X-Spam-Status: No, score=0.001 tagged_above=-999 required=5 tests=[BAYES_50=0.001]
Received: from mail.ietf.org ([64.170.98.30]) by localhost (ietfa.amsl.com [127.0.0.1]) (amavisd-new, port 10024) with ESMTP id v7O6LfjGJ8Im for <codec@ietfa.amsl.com>; Mon, 27 Jun 2011 07:35:42 -0700 (PDT)
Received: from smtpi4.usherbrooke.ca (smtpi4.USherbrooke.ca [132.210.236.3]) by ietfa.amsl.com (Postfix) with ESMTP id 3233621F8651 for <codec@ietf.org>; Mon, 27 Jun 2011 07:35:38 -0700 (PDT)
Received: from localhost (www09.sti.USherbrooke.ca [132.210.244.33]) by smtpi4.usherbrooke.ca (8.13.8/8.13.8) with ESMTP id p5REZIL7015470; Mon, 27 Jun 2011 10:35:19 -0400
Received: from 207.61.160.13 ([207.61.160.13]) by www.usherbrooke.ca (Horde Framework) with HTTP; Mon, 27 Jun 2011 10:35:18 -0400
Message-ID: <20110627103518.13554r8fe6qf38kk@www.usherbrooke.ca>
Date: Mon, 27 Jun 2011 10:35:18 -0400
From: Jean-Marc Valin <jmvalin@jmvalin.ca>
To: Christian Hoene <hoene@uni-tuebingen.de>
References: <027A93CE4A670242BD91A44E37105AEF18634B6957@ESESSCMS0351.eemea.ericsson.se> <1826386229.2244245.1308749493214.JavaMail.root@lu2-zimbra> <002501cc33f0$5f1c2770$1d547650$@uni-tuebingen.de>
In-Reply-To: <002501cc33f0$5f1c2770$1d547650$@uni-tuebingen.de>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; DelSp="Yes"; format="flowed"
Content-Disposition: inline
Content-Transfer-Encoding: 8bit
User-Agent: Internet Messaging Program (IMP) H3 (4.3.7)
X-Originating-IP: 207.61.160.13
X-UdeS-MailScanner-Information:
X-UdeS-MailScanner-ID: p5REZIL7015470
X-UdeS-MailScanner: Aucun code suspect détecté
X-MailScanner-SpamCheck: n'est pas un polluriel, SpamAssassin (not cached, score=-7.499, requis 5, autolearn=not spam, BAYES_00 -2.60, RDNS_NONE 0.10, UDES_MONBUREAU01 -5.00)
X-UdeS-MailScanner-From: jmvalin@jmvalin.ca
Cc: codec@ietf.org
Subject: Re: [codec] Summary of test results
X-BeenThere: codec@ietf.org
X-Mailman-Version: 2.1.12
Precedence: list
List-Id: Codec WG <codec.ietf.org>
List-Unsubscribe: <https://www.ietf.org/mailman/options/codec>, <mailto:codec-request@ietf.org?subject=unsubscribe>
List-Archive: <http://www.ietf.org/mail-archive/web/codec>
List-Post: <mailto:codec@ietf.org>
List-Help: <mailto:codec-request@ietf.org?subject=help>
List-Subscribe: <https://www.ietf.org/mailman/listinfo/codec>, <mailto:codec-request@ietf.org?subject=subscribe>
X-List-Received-Date: Mon, 27 Jun 2011 14:35:44 -0000

Hi,

Indeed, the encoder version that Anssi tested was very prone to  
crashing when operating in ranges it didn't like or when fed option  
combinations it didn't like. That has (AFAICT) been fixed now.  
Fortunately, in the cases that did not crash, the behaviour is still  
the same now as it was for the test, so the results are still valid.  
That being said, we welcome feedback on any remaining code quality  
issues. For example, if anyone can still get the encoder or decoder to  
crash using the latest version from the *draft* (I haven't updated the  
website download yet), please let us know.
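The invitation above amounts to a parameter sweep: run the encoder and decoder over combinations of sample rates, channel counts, bitrates, and option flags, and watch for abnormal exits. A minimal sketch of such a harness follows; the `opus_demo` command name, its argument order, and the parameter names are assumptions for illustration, not the real tool's interface.

```python
# Hypothetical crash-hunting harness: sweep a codec command-line tool
# over a grid of parameter combinations and report the combinations
# that make the process die on a signal (e.g. SIGSEGV).
import itertools
import subprocess

def sweep(cmd_template, grid):
    """Run cmd_template (a list of argv strings with {name} holes)
    once per combination in `grid` (a dict of name -> list of values)
    and return the combinations whose process crashed. On POSIX, a
    negative returncode means the child was killed by a signal."""
    crashes = []
    for combo in itertools.product(*grid.values()):
        params = dict(zip(grid.keys(), combo))
        argv = [arg.format(**params) for arg in cmd_template]
        result = subprocess.run(argv, capture_output=True)
        if result.returncode < 0:  # killed by SIGSEGV, SIGABRT, ...
            crashes.append(params)
    return crashes

# Illustrative invocation (the argument order is a guess -- check the
# actual opus_demo usage string before relying on it):
# crashes = sweep(
#     ["./opus_demo", "voip", "{rate}", "{ch}", "{bitrate}",
#      "in.pcm", "out.bin"],
#     {"rate": [8000, 12000, 16000, 24000, 48000],
#      "ch": [1, 2],
#      "bitrate": [6000, 12000, 32000, 64000, 128000]})
```

Any combination reported by `sweep` is worth a bug report; pairing this with random input PCM, and with truncated or corrupted packets on the decoder side, would widen the net.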

Cheers,

     Jean-Marc


Christian Hoene <hoene@uni-tuebingen.de> wrote:

> Hi,
>
> Erik's comments are valid. We should run some tests on the final codec
> specification before WGLC. I mean, in the latest listening tests, which used
> Opus taken from Git on February 16th, 2011, I found the statement that "with
> lower bitrates the codec crashes".
>
> Just checking the implementation against major programming mistakes
> is not a bad thing. And this checking should not be done by the codec
> designers themselves.
>
> With best regards,
>
>  Christian
>
>> -----Original Message-----
>> From: codec-bounces@ietf.org [mailto:codec-bounces@ietf.org] On Behalf
>> Of Koen Vos
>> Sent: Wednesday, June 22, 2011 3:32 PM
>> To: Erik Norvell
>> Cc: codec@ietf.org
>> Subject: Re: [codec] Summary of test results
>>
>> Hi Erik,
>>
>> > However, the only way to make correct statements about the performance
>> > of the final Opus codec is to test this final codec.
>>
>> We took the MPEG approach to standardization (like with MP3/AAC), which
>> means that the codec is defined by its bit-stream rather than its
>> bit-exact behavior.  For this reason, the version tested from February
>> is the final Opus.
>> Besides, the encoder hasn't actually changed in any meaningful way since
>> February.
>>
>> best,
>> koen.
>>
>>
>> ----- Original Message -----
>> From: "Erik Norvell" <erik.norvell@ericsson.com>
>> To: "Jean-Marc Valin" <jean-marc.valin@octasic.com>
>> Cc: codec@ietf.org
>> Sent: Wednesday, June 22, 2011 2:00:10 AM
>> Subject: Re: [codec] Summary of test results
>>
>> > -----Original Message-----
>> > From: Jean-Marc Valin [mailto:jean-marc.valin@octasic.com]
>> > Sent: den 21 juni 2011 21:40
>> > To: Erik Norvell
>> > Cc: codec@ietf.org
>> > Subject: Re: [codec] Summary of test results
>> >
>> > On 11-06-21 09:04 AM, Erik Norvell wrote:
>> > > Thank you for compiling this summary of pre-Opus tests. It should
>> > > definitely help in designing the listening test on the final Opus.
>> >
>> > Just to clarify, the Opus bit-stream *is* final and, as far as these
>> > tests (for both speech and music) are concerned, has been since
>> > February.
>> > The latest draft also has the final stereo bit-stream for voice, but
>> > all the rest is long frozen.
>> >
>>
>> There are a number of tests which are older than that which are still
>> referenced when making statements about Opus performance.
>> In addition, a frozen bit-stream is not equal to frozen quality. If the
>> codec itself is still permitted to change, its quality may be affected.
>>
>>
>> > > One comment to section 3: "While Opus has evolved since these tests
>> > > were conducted, the results should be considered as a _lower bound_
>> > > on the quality of the final codec."
>> > >
>> > > I would like to think that the sum is always greater than its parts,
>> > > but it is definitely possible to make something worse by working on
>> > > it. Hence, statements about Opus performance must be based on tests
>> > > made on the final codec.
>> >
>> > Of course it's not a guarantee, but there's definitely value in those
>> > test results, in that it's unlikely that everything always worked fine
>> > and then we just screwed everything up at the end (if that were the
>> > case, we would have realised it in the other tests).
>> >
>>
>> I agree the tests are valuable as quality indicators for the codec at
>> the time they were conducted. However, the only way to make correct
>> statements about the performance of the final Opus codec is to test
>> this final codec. To deduce that performance from tests of previous
>> versions is bound to include some amount of speculation.
>>
>> Best,
>> Erik
>> _______________________________________________
>> codec mailing list
>> codec@ietf.org
>> https://www.ietf.org/mailman/listinfo/codec
>
> _______________________________________________
> codec mailing list
> codec@ietf.org
> https://www.ietf.org/mailman/listinfo/codec
>
>