Re: [Json] Limitations on number size?

"Peter F. Patel-Schneider" <> Wed, 10 July 2013 10:07 UTC

Date: Wed, 10 Jul 2013 03:07:04 -0700
From: "Peter F. Patel-Schneider" <>
To: Stephan Beal <>
Cc: "" <>
Subject: Re: [Json] Limitations on number size?
List-Id: "JavaScript Object Notation \(JSON\) WG mailing list" <>

On 07/10/2013 02:47 AM, Stephan Beal wrote:
> On Wed, Jul 10, 2013 at 1:44 AM, Peter F. Patel-Schneider wrote:
>     That's a very unhappy situation.   My interest in JSON is to consume
>     data in JSON documents (mostly to use as input into representation
>     systems that also use the W3C semantic web languages RDF and OWL). If
>     JSON is ambiguous (e.g., as to whether 0.0 and 0 encode/stand
>     for/represent the same thing) then JSON isn't very suitable for
>     transmitting data, at least for me.
> While i think we will all agree that, at a technically pedantic level, 
> you're absolutely right, JSON has been in heavy use for about 10(?) years 
> now with _relatively_ few instances of this causing a problem.

"Relatively" is one of those weasel words that we all use.  I certainly agree 
that JSON is useful for transmitting certain kinds of data.

However, the working group mailing archives contain evidence that there are 
indeed significant problems when using JSON to portably interchange data, 
particularly binary data.
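(For what it's worth, the usual workaround for the binary-data gap is to encode
the bytes as a base64 string before embedding them in JSON; the receiver then has
to know, out of band, which fields are encoded.  A minimal Python sketch, with an
illustrative "data" field name:)

```python
import base64
import json

# JSON has no binary type, so raw bytes must be encoded as text first.
payload = b"\x00\xffbinary\x01"
doc = json.dumps({"data": base64.b64encode(payload).decode("ascii")})

# Nothing in the JSON itself marks "data" as base64; that convention
# must be agreed on separately by sender and receiver.
decoded = base64.b64decode(json.loads(doc)["data"])
assert decoded == payload
```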

> Implementors tend to use whatever default limits the platform provides (e.g. 
> 32-bit on 32-bit platforms and 64 on 64-bit, and 6-digit precision in 
> doubles seems to be conventional in C libraries).
> People using high-precision/very large/very small numbers are certainly 
> aware of the limitations/portability problems, and will (possibly after 
> falling on their face with JSON) pick a different format.

Are they really aware of all the potential problems?  And just what counts as 
high-precision/very large/very small?  Does 0.0 belong to any of these categories?
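(To make "very large" concrete: any consumer that stores numbers as IEEE-754
doubles silently rounds integers above 2^53.  A quick Python illustration --
Python's own json module keeps full integer precision, so the float() call below
stands in for a double-based parser such as JavaScript's JSON.parse:)

```python
import json

# 2**53 + 1 is the first integer a 64-bit double cannot represent exactly.
n = 2**53 + 1            # 9007199254740993
text = json.dumps(n)     # "9007199254740993" -- perfectly valid JSON

# A double-based consumer sees a different number, with no error raised:
as_double = float(text)
print(int(as_double))    # 9007199254740992 -- off by one, silently
```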

> That's all fine and good - i haven't seen anyone here argue that JSON needs 
> to be _the_ data format. It needs to be a _useful_ format for a wide range 
> of applications, and it is that even if it's hard-coded to be limited to 
> 31-bit integer ranges. In my implementations i have had to be very aware of 
> system-level precision limits, but i simply document them, add build options 
> to use, e.g. 64-bit integers if available, and leave it at that. Those 
> details fall comfortably into the normal range of "implementation defined" 
> details, IMO, and do _not_ (IMO) fall into JSON's realm of authority (JSON 
> just needs to tell me the BNF for reading a number, though one could argue 
> that the BNF should/does also imply certain limits). It would be impossible 
> to enforce that arbitrary implementations must support arbitrarily long 
> numbers, just as it would be silly to arbitrarily limit JSON to, say, 20-bit 
> precision.
> Using your case of 0 vs 0.0. The vast, vast majority of JSON consumers are 
> JavaScript, and JS doesn't differentiate between doubles and integers, so 0 
> is, in effect equivalent to 0.0. In fact, there are few real-world 
> applications using JSON where the two are _not_ equivalent (barring 
> scientific, high-precision, math-centric apps, of course, and those should 
> probably be looking for a different format which guarantees them their 
> desired ranges/limits).

I would appreciate some evidence to back up the claim that the vast, vast 
majority of JSON is handled in an environment where the JSON numbers 0 and 0.0 
do indeed represent the same thing.  The RDF W3C working group is in the last 
stages of putting its stamp of approval on JSON-LD, which presents the JSON 
numbers 0 and 0.0 to RDF as being different.
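(The distinction is easy to observe in any implementation that is not 
double-only.  Python's json module, for one, parses the two literals into 
different types, and the difference survives re-serialization:)

```python
import json

# The two literals are lexically different, and some consumers preserve that.
a = json.loads("0")      # int 0
b = json.loads("0.0")    # float 0.0

print(type(a).__name__, type(b).__name__)  # int float
print(a == b)                              # True  -- equal as values...
print(json.dumps(a) == json.dumps(b))      # False -- ...but distinct on the wire
```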

Peter F. Patel-Schneider