Re: [Json] Limitations on number size?

Stephan Beal <> Wed, 10 July 2013 12:49 UTC

List-Id: "JavaScript Object Notation \(JSON\) WG mailing list" <>

On Wed, Jul 10, 2013 at 2:29 PM, Peter F. Patel-Schneider <> wrote:

> On 07/10/2013 03:15 AM, Stephan Beal wrote:
>> Such cases are their own fault: JSON never, ever claimed to be able to
>> support binary data. Wrong tool for the job, period.
> That's one opinion (which I agree with, by the way), but there appears to
> be significant support for a different opinion.

That's the beauty of it - in the context of the JSON RFC it's not an
opinion - it's a technical fact. Unless the WG wants to extend JSON in
incompatible ways (which they are prohibited from doing, from what i
understand), there is zero chance of this revision supporting binary data.
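To make the "wrong tool" point concrete: the usual workaround when binary data must ride inside JSON is to encode it as text first. A minimal sketch (the field name "data_b64" is just an illustration, not anything from the RFC):

```python
import base64
import json

# JSON has no binary type, so the common workaround is to carry bytes
# as base64 text inside an ordinary string value.
payload = {"data_b64": base64.b64encode(b"\x00\xff\x10").decode("ascii")}
doc = json.dumps(payload)

# The receiver must know, out of band, that this string holds base64 -
# nothing in JSON itself marks it as binary.
raw = base64.b64decode(json.loads(doc)["data_b64"])
print(raw == b"\x00\xff\x10")  # True
```

The encoding convention lives entirely in the application, which is exactly why this is a workaround rather than JSON "supporting" binary data.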

> [...]
>> Excellent point - no, they [users] are most certainly not aware of all
>> potential problems, and that's why they rely on the system-level defaults -
>> because they know that very smart people [the IETF JSON working group?]
>> have spent time implementing them and making sure they are reasonable for a
>> wide variety of use cases. Least common denominator.
> But what is this least common denominator?  There was a (probably not
> serious) post that suggested that 5-bit JSON numbers would be viable.
>  There were serious posts suggesting that one should not count on anything
> more than 32-bit (signed) integers, i.e., that 0.1 or even 0.0 might be
> rejected by some JSON implementations.

As long as JSON does not specify a numeric precision (and IMO it "MUST NOT"
;), then any precision which can be represented per the grammar is legal,
i.e. any precision for which we can represent the digits 0 to 9 (which rules
out the 0-bit precision someone suggested ;).
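A quick sketch of that distinction between grammar and implementation, using Python's stdlib parser as one example (other parsers will make different choices):

```python
import json

# The grammar puts no bound on the number of digits, so both of these
# are syntactically legal JSON numbers.
big_int = json.loads("123456789012345678901234567890")
long_frac = json.loads("0.12345678901234567890123456789")

# What precision survives is an implementation choice: Python keeps the
# integer exact (arbitrary-precision int)...
print(big_int == 123456789012345678901234567890)  # True

# ...but folds the fraction into an IEEE-754 double, silently dropping
# digits beyond what a double can represent.
print(long_frac == float("0.12345678901234567890123456789"))  # True
```

Both documents are valid JSON; how much precision the receiver actually sees is decided by the parser, not the grammar.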

>> Different standard - doesn't interest me ;). i don't have a single
>> application where the difference between 0 and 0.0 is relevant to the
>> outcome of a calculation (with the minor exception of maybe output
>> formatting).
> I find this particular attitude very troubling, even spoken in jest.

It was not intended as a jest. The ONLY JSON documents which interest me
are the original RFC and the one being drafted now. i have ZERO
applications for which there is a semantic difference between 0 and 0.0,
-0, +0, and all the other zeroes. Such cases are the proverbial "noisy
minority" on whose behalf much of the fuss about numbers has been raised,
but they have zero impact on "the majority" of applications (== none i've
ever worked on in nearly 30 years of programming).

>  You may not care whether the JSON numbers 0 and 0.0 represent different
> things, but others do, and are about to push an answer into a W3C
> recommendation.

And the current RFC does not distinguish, so what's the problem if it
continues not to? JSON, as it is now, is NOT a format for high-precision
numerics because the RFC _clearly_ and _unambiguously_ does _not_ cover the
details important to such cases (e.g. 0 !== 0.0, and a myriad of others
which have been mentioned in recent weeks). If JSON had been designed for
that then i'm quite certain that Doug would have covered those topics.
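For what it's worth, existing implementations already disagree on this very point - Python's stdlib parser, as one example, distinguishes 0 from 0.0 by type, while a JavaScript engine maps both to the same Number:

```python
import json

# Whether 0 and 0.0 are "different" is an implementation decision, not
# something the RFC settles. Python's parser distinguishes them by type:
print(type(json.loads("0")))    # <class 'int'>
print(type(json.loads("0.0")))  # <class 'float'>

# ...yet the two values still compare equal, and a JavaScript parser
# would produce the very same Number for both.
print(json.loads("0") == json.loads("0.0"))  # True
```

Interop today already depends on not reading meaning into that distinction.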

> In my opinion, it should be the goal of any standard (or quasi-standard)
> setting body to try to cover all the reasonable cases (without, of course,
> getting bogged down on things like how many Unicode surrogate characters
> can dance on the head of a JSON string)

One of the charters of the WG is to NOT break JSON compatibility, so their
hands are tied in this regard. The only way to introduce that level of
detail is to "break" JSON vis-a-vis many of the existing implementations,
or otherwise break them in the sense of "they worked yesterday but are no
longer compliant under the new definition" (kinda like what happened to
poor old Pluto - it's big enough to have more moons than Earth but is no
longer considered a planet).

----- stephan beal