Re: [iola-conversion-tool] Bug in Datatracker Agenda Code?

Henrik Levkowetz <> Mon, 05 March 2012 18:06 UTC

From: Henrik Levkowetz <>
Date: Mon, 05 Mar 2012 19:06:55 +0100
To: Ryan Cross <>
Cc: Alexa Morris <>

Hi Ryan, Alexa, all,

I think I've now got this working.  Both the administrative and technical
plenaries show up as they should.

There were some assumptions which didn't hold, which led to some code
changes, and there was an encoding problem with the agenda file on disk,
as follows:

  1. The code assumed that the uploaded document would have a title
     indicating which plenary agenda it was for, but no title was set.
     I've now changed the code to look at the document name instead.

  2. There were actually four different agendas which matched the
     plenaries, and the code didn't check for the active state, so I
     added that check.

  3. Even after I fixed the things above, I went 'round and 'round a couple
     of times, as I only saw a blank space where the agenda should have
     been.  This seems to have been caused by the uploaded document having
     a Mac character-set encoding and Mac line endings -- Ryan, you should
     maybe investigate that further, and see whether you should do
     character-set and line-ending conversion.  I think what's on disk
     should be either ASCII or Unicode -- for drafts we check that it's
     pure ASCII, but the agendas are different, and we should maybe
     enforce Unicode?
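For what it's worth, the selection logic from items 1 and 2 -- match on
document name rather than title, and keep only active documents -- can be
sketched in plain Python.  The record shapes and the `plenary_agenda`
helper below are hypothetical stand-ins, not the actual Datatracker code
or schema:

```python
# Hypothetical stand-ins for the datatracker's Document records.
# Note the empty titles (the bug in item 1) and the duplicate name
# in a non-active state (the bug in item 2).
DOCUMENTS = [
    {"name": "agenda-83-ietf", "title": "", "state": "active"},
    {"name": "agenda-83-iab",  "title": "", "state": "active"},
    {"name": "agenda-83-ietf", "title": "", "state": "deleted"},
]

def plenary_agenda(group):
    """Pick a plenary agenda by document name and active state,
    ignoring the (possibly empty) title."""
    matches = [d for d in DOCUMENTS
               if d["name"] == "agenda-83-%s" % group
               and d["state"] == "active"]
    return matches[0] if matches else None

print(plenary_agenda("ietf"))
print(plenary_agenda("iab"))
```

In the real code this would presumably be a Django queryset filter on the
document name and state fields rather than a list comprehension.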
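And as a starting point for the conversion suggested in item 3, here is a
minimal sketch of character-set and line-ending normalization for an
uploaded agenda.  The function name and the fallback order are assumptions
on my part (trying UTF-8 first, then Mac Roman since that's what the
problematic file appeared to be), not what the upload code actually does:

```python
def normalize_agenda(raw):
    """Decode uploaded agenda bytes and normalize line endings to LF.

    Tries UTF-8 first, then Mac Roman, then Latin-1 (which never
    fails, so we always get *some* text back).
    """
    text = None
    for encoding in ("utf-8", "mac-roman", "latin-1"):
        try:
            text = raw.decode(encoding)
            break
        except UnicodeDecodeError:
            continue
    # Normalize CRLF and bare CR (classic Mac) line endings to LF.
    return text.replace("\r\n", "\n").replace("\r", "\n")

# Example: a classic Mac file -- CR line endings, Mac Roman curly quote.
raw = "line one\rline \u2019two\u2019".encode("mac-roman")
print(normalize_agenda(raw))
```

Detecting the encoding reliably is of course harder than this fallback
chain suggests; enforcing UTF-8 at upload time, as the paragraph above
hints, would sidestep the guessing entirely.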

Best regards,

	Henrik
On 2012-03-05 18:20 Ryan Cross said:
> Hi Henrik,
> I show two plenary Timeslots for meeting 83, both with a Session associated.  And both sessions with agenda material attached.  So the data looks good unless I'm missing some detail of scheduling plenaries.
> In [38]: q=TimeSlot.objects.filter(meeting=83,type='plenary')
> In [39]: q
> Out[39]: [<TimeSlot: 83: 03-28 16:30-19:30 IETF Operations and Administration Plenary, Amphitheatre Bleu>, <TimeSlot: 83: 03-26 16:30-19:30 Technical Plenary, Amphitheatre Bleu>]
> In [40]: q[0].session  
> Out[40]: <Session: IETF-83: iesg 1630>
> In [41]: q[1].session
> Out[41]: <Session: IETF-83: iab 1630>
> Thanks,
> Ryan
> On Mar 5, 2012, at 8:46 AM, Henrik Levkowetz wrote:
>> Mmm, what I see here is "The plenary has not been scheduled", which
>> is a different matter.  I think this may be a failure in the scheduling
>> tool; it is possible that it hasn't connected the session with the meeting
>> slot (maybe because at the time the two plenaries were associated with the
>> same group?).  I'll try to dig a bit in the data to see what's up.