Re: [Json] Limitations on number size?

Stephan Beal <> Wed, 10 July 2013 10:15 UTC


On Wed, Jul 10, 2013 at 12:07 PM, Peter F. Patel-Schneider <> wrote:

> Relatively is one of these weasel words that we all use.


> However, the working group mailing archives contain evidence that there
> are indeed significant problems when using JSON to portably interchange
> data, particularly binary data.

Such cases are their own fault: JSON never, ever claimed to be able to
support binary data. Wrong tool for the job, period.
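(To be fair, there is a well-known workaround when binary payloads must ride inside JSON - nothing the JSON spec itself defines, just a convention: base64-encode the bytes into a string field. A minimal Python sketch:)

```python
import base64
import json

# Hypothetical payload: raw bytes that a JSON string cannot carry directly.
blob = bytes([0x00, 0xFF, 0x10, 0x80])

# Base64-encode into plain ASCII text so it is legal inside a JSON string.
doc = json.dumps({"data": base64.b64encode(blob).decode("ascii")})

# The receiver reverses the transformation to recover the original bytes.
restored = base64.b64decode(json.loads(doc)["data"])
assert restored == blob
```

The field name "data" is of course arbitrary; the point is only that the bytes travel as text, at a ~33% size cost.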

> Implementors tend to use whatever default limits the platform provides
> (e.g. 32-bit on 32-bit platforms and 64 on 64-bit, and 6-digit precision in
> doubles seems to be conventional in C libraries).
>> People using high-precision/very large/very small numbers are certainly
>> aware of the limitations/portability problems, and will (possibly after
>> falling on their face with JSON) pick a different format.
> Are they really aware of all the potential problems?   And just what
> counts as high-precision/very large/very small?  Does 0. belong to any of
> these categories?

Excellent point - no, they are most certainly not aware of all potential
problems, and that's why they rely on the system-level defaults - because
they know that very smart people have spent time implementing them and
making sure they are reasonable for a wide variety of use cases. Least
common denominator. It would be unreasonable to expect every JSON parser
implementor to understand the intricacies of binary encodings of
floating-point values. (i've implemented 3 or 4 JSON parsers in C and C++
and have never read any of the relevant IEEE docs - i rely on the system
to define my legal ranges.)
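(For a concrete taste of what those system-defined ranges mean in practice - here sketched in Python, whose floats are the same IEEE-754 doubles a typical C/C++ parser or JavaScript engine would use:)

```python
import json

# Integers above 2**53 cannot be represented exactly in an IEEE-754
# double; the nearest representable value is shared with a neighbor.
n = 2**53 + 1                      # 9007199254740993
assert float(n) == float(n - 1)    # both round to 2**53 as doubles

# Overflow handling is also left to the platform: Python's json module
# maps a too-large literal to infinity rather than rejecting it.
assert json.loads("1e400") == float("inf")
```

Other parsers may instead clamp, error out, or fall back to a bignum type - exactly the kind of divergence being discussed here.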

>> Using your case of 0 vs 0.0. The vast, vast majority of JSON consumers
>> are JavaScript, and JS doesn't differentiate between doubles and integers,
>> so 0 is, in effect, equivalent to 0.0. In fact, there are few real-world
>> applications using JSON where the two are _not_ equivalent (barring
>> scientific, high-precision, math-centric apps, of course, and those should
>> probably be looking for a different format which guarantees them their
>> desired ranges/limits).
> I would appreciate some evidence to back up the claim that the vast, vast
> majority of JSON is handled in an environment where the JSON numbers 0 and
> 0.0 do indeed represent the same thing.

You're right, and i'll see if i can find some properly collected
statistics. My "evidence" is purely anecdotal - 90+% of the applications i
work with which use JSON are www-based or www-bound.
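(To make the 0-vs-0.0 point concrete, a small sketch in Python, which - unlike JavaScript - preserves the int/float distinction and so can show both readings at once:)

```python
import json

# The JavaScript-style view: the two literals parse to numerically
# equal values, so most www-bound consumers never see a difference.
assert json.loads("0") == json.loads("0.0")

# But a consumer that distinguishes types (as the JSON-LD/RDF mapping
# discussed below does) sees an integer versus a floating-point number.
assert type(json.loads("0")) is int
assert type(json.loads("0.0")) is float
```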

> The RDF W3C working group is in the last stages of putting its stamp of
> approval on JSON-LD, which presents the JSON numbers 0 and 0.0 to RDF as
> being different.

Different standard - doesn't interest me ;). i don't have a single
application where the difference between 0 and 0.0 is relevant to the
outcome of a calculation (with the minor exception of maybe output
formatting).

----- stephan beal