Re: [Cfrg] Fwd: Encryption is less secure than we thought - MIT News Office

Dan Brown <> Fri, 16 August 2013 15:38 UTC

From: Dan Brown <>
List-Id: Crypto Forum Research Group <>

Cryptographers on this list, and cryptographic implementers in general,
should be aware of at least NIST SP 800-90{A,B,C}.


I agree with the article(s) that 


a) parts of industry pay inadequate attention to the randomness on which
keys rely.  Examples of failed RNGs leading to security vulnerabilities
are well known.

b) assumptions about the uniform distribution of keys, e.g. in provable
security, are common (but I view this as forgivable from the standpoint of
modular design)

c) the common perpetuation of the misconception that Shannon entropy is the
right measure for crypto may have contributed, or may risk contributing, to
overestimates of security.
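The distinction in (c) can be made concrete.  For a biased source, Shannon
entropy reports considerably more unpredictability than min-entropy, the
worst-case measure NIST SP 800-90B uses for entropy estimation.  A minimal
sketch (the 0.9/0.1 bias is illustrative only):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits: an average-case measure."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    """Min-entropy H_min = -log2(max(p)), in bits: a worst-case measure."""
    return -math.log2(max(probs))

# A biased bit source: 0 with probability 0.9, 1 with probability 0.1.
biased = [0.9, 0.1]
print(shannon_entropy(biased))  # ~0.469 bits per output bit
print(min_entropy(biased))      # ~0.152 bits per output bit
```

Sizing a key by the Shannon figure would credit this source with roughly
three times the unpredictability that the worst-case measure allows.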


I am baffled by (or else misunderstand) the news article stating that
cryptographers assume content files to be perfectly uniform.  I have never
come across such an assumption: encryption papers often assume that the
plaintext is highly biased, and that encryption is intended to work properly
under a biased message and a uniform key.


The problem of true randomness is difficult, both philosophically and
technically.  Many have written on the subject, including me:


I disagree with the below-cited arXiv article’s use of expected guesswork:
see Remark 3.15 of my own eprint above.
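For readers unfamiliar with the measure under dispute: expected guesswork is
the average number of sequential guesses made by an attacker who tries
candidates in decreasing order of probability.  A minimal sketch (the
distributions are invented for illustration):

```python
def expected_guesswork(probs):
    """Average number of guesses by an optimal sequential guesser:
    G = sum over i of i * p_(i), with p sorted in decreasing order."""
    ordered = sorted(probs, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

uniform = [0.25, 0.25, 0.25, 0.25]
biased = [0.7, 0.1, 0.1, 0.1]
print(expected_guesswork(uniform))  # 2.5
print(expected_guesswork(biased))   # 1.6
```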


Best regards,






From: [] On Behalf Of Ben
Sent: Friday, August 16, 2013 8:35 AM
To: Robert Moskowitz
Subject: Re: [Cfrg] Fwd: Encryption is less secure than we thought - MIT
News Office




On 16 August 2013 07:44, Robert Moskowitz <> wrote:

So what are the thoughts about this here?  I have been asked to do a little
digging amongst my contacts.  It seems to clearly state that the secure
communications stuff we do is not affected by this work, but perhaps secure
data objects and various wireless password-passing technologies might be at
risk.

This seems to be restating essentially what Joe Bonneau said far more
readably a year ago:


-------- Original Message --------





Encryption is less secure than we thought 

For 65 years, most information-theoretic analyses of cryptographic systems
have made a mathematical assumption that turns out to be wrong.

Larry Hardesty, MIT News Office


Muriel Médard is a professor in the MIT Department of Electrical
Engineering. Photo: Bryce Vickmark 

Information theory — the discipline that gave us digital communication and
data compression — also put cryptography on a secure mathematical
foundation. Since 1948, when the seminal paper that created information
theory first appeared, most information-theoretic analyses of secure schemes
have depended on a common assumption.

Unfortunately, as a group of researchers at MIT and the National University
of Ireland (NUI) at Maynooth demonstrated in a paper presented at the
recent International Symposium on Information Theory, that assumption is
false.
In a follow-up paper being presented this fall at the Asilomar Conference on
Signals, Systems, and Computers, the same team shows that, as a consequence, the
wireless card readers used in many keyless-entry systems may not be as
secure as previously thought.

In information theory, the concept of information is intimately entwined
with that of entropy. Two digital files might contain the same amount of
information, but if one is shorter, it has more entropy. If a compression
algorithm — such as WinZip or gzip — worked perfectly, the compressed file
would have the maximum possible entropy. That means that it would have the
same number of 0s and 1s, and the way in which they were distributed would
be totally unpredictable. In information-theoretic parlance, it would be
perfectly uniform.
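The uniformity idea above can be checked empirically: the byte distribution
of well-compressed data is far flatter than that of plain text.  A rough
sketch (the sample text is invented and the figures are illustrative):

```python
import math
import zlib
from collections import Counter

def byte_entropy(data):
    """Empirical Shannon entropy of the byte distribution, in bits per
    byte (8.0 would be a perfectly uniform distribution)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

text = b"".join(("line %d: the quick brown fox\n" % i).encode()
                for i in range(2000))
print(byte_entropy(text))                 # English-like text: well below 8
print(byte_entropy(zlib.compress(text)))  # compressed: much closer to 8
```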

Traditionally, information-theoretic analyses of secure schemes have assumed
that the source files are perfectly uniform. In practice, they rarely are,
but they’re close enough that it appeared that the standard mathematical
analyses still held.

“We thought we’d establish that the basic premise that everyone was using
was fair and reasonable,” says Ken Duffy, one of the researchers at NUI.
“And it turns out that it’s not.” On both papers, Duffy is joined by his
student Mark Christiansen; Muriel Médard, a professor of electrical
engineering at MIT; and her student Flávio du Pin Calmon.

The problem, Médard explains, is that information-theoretic analyses of
secure systems have generally used the wrong notion of entropy. They relied
on so-called Shannon entropy, named after the founder of information theory,
Claude Shannon, who taught at MIT from 1956 to 1978.

Shannon entropy is based on the average probability that a given string of
bits will occur in a particular type of digital file. In a general-purpose
communications system, that’s the right type of entropy to use, because the
characteristics of the data traffic will quickly converge to the statistical
averages. Although Shannon’s seminal 1948 paper dealt with cryptography, it
was primarily concerned with communication, and it used the same measure of
entropy in both discussions.

But in cryptography, the real concern isn’t with the average case but with
the worst case. A codebreaker needs only one reliable correlation between
the encrypted and unencrypted versions of a file in order to begin to deduce
further correlations. In the years since Shannon’s paper, information
theorists have developed other notions of entropy, some of which give
greater weight to improbable outcomes. Those, it turns out, offer a more
accurate picture of the problem of codebreaking.
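One such family is the Rényi entropies H_α, which interpolate between
Shannon entropy (the α → 1 limit) and min-entropy (the α → ∞ limit); the
larger α, the more the measure is dominated by the most probable outcomes,
which is what matters to a guessing attacker.  A sketch (the eight-symbol
distribution is invented):

```python
import math

def renyi_entropy(probs, alpha):
    """Renyi entropy H_alpha in bits; alpha=1 is the Shannon case, and
    large alpha approaches min-entropy, -log2(max(p))."""
    if alpha == 1:
        return -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

# A mildly non-uniform 8-symbol source (uniform would give 3 bits
# at every alpha).
probs = [0.3] + [0.1] * 7
for alpha in (0.5, 1, 2, 100):
    print(alpha, renyi_entropy(probs, alpha))  # non-increasing in alpha
```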

When Médard, Duffy and their students used these alternate measures of
entropy, they found that slight deviations from perfect uniformity in source
files, which seemed trivial in the light of Shannon entropy, suddenly loomed
much larger. The upshot is that a computer turned loose to simply guess
correlations between the encrypted and unencrypted versions of a file would
make headway much faster than previously expected.

“It’s still exponentially hard, but it’s exponentially easier than we
thought,” Duffy says. One implication is that an attacker who simply relied
on the frequencies with which letters occur in English words could probably
guess a user-selected password much more quickly than was previously
thought. “Attackers often use graphics processors to distribute the
problem,” Duffy says. “You’d be surprised at how quickly you can guess

In their Asilomar paper, the researchers apply the same type of mathematical
analysis in a slightly different way. They consider the case in which an
attacker is, from a distance, able to make a “noisy” measurement of the
password stored on a credit card with an embedded chip or a key card used in
a keyless-entry system. 

“Noise” is the engineer’s term for anything that degrades an electromagnetic
signal — such as physical obstructions, out-of-phase reflections or other
electromagnetic interference. Noise comes in lots of different varieties:
The familiar white noise of sleep aids is one, but so is pink noise, black
noise and more exotic-sounding types of noise, such as power-law noise or
Poisson noise.

In this case, rather than prior knowledge about the statistical frequency of
the symbols used in a password, the attacker has prior knowledge about the
probable noise characteristics of the environment: Phase noise with one set
of parameters is more probable than phase noise with another set of
parameters, which in turn is more probable than Brownian noise, and so on.
Armed with these statistics, an attacker could infer the password stored on
the card much more rapidly than was previously thought.

“Some of the approximations that we’re used to making, they make perfect
sense in the context of traditional communication,” says Matthieu Bloch, an
assistant professor of electrical and computer engineering at the Georgia
Institute of Technology. “You design your system in a framework, and then
you test it. But for crypto, you’re actually trying to prove that it’s
robust to things you cannot test. So you have to be sure that your
assumptions make sense from the beginning. And I think that going back to
the assumptions is something people don’t do often enough.”

Bloch doubts that the failure of the uniformity assumption means that
cryptographic systems in wide use today are fundamentally insecure. “My
guess is that it will show that some of them are slightly less secure than
we had hoped, but usually in the process, we’ll also figure out a way of
patching them,” he says. The MIT and NUI researchers’ work, he says, “is
very constructive, because it’s essentially saying, ‘Hey, we have to be
careful.’ But it also provides a methodology to go back and reanalyze all
these things.” 







Cfrg mailing list