[Rats] automobile attestation ... was: driverless cars can't tell the difference between projections and real objects

Michael Richardson <mcr+ietf@sandelman.ca> Thu, 20 February 2020 08:30 UTC

From: Michael Richardson <mcr+ietf@sandelman.ca>
To: rats@ietf.org
In-reply-to: <20200220082318.GA18888@gsp.org>
References: <20200220082318.GA18888@gsp.org>
Comments: In-reply-to Rich Kulawiec via Dumpsterfire <dumpsterfire@firemountain.net> message dated "Thu, 20 Feb 2020 03:23:19 -0500."
X-Mailer: MH-E 8.6; nmh 1.7+dev; GNU Emacs 25.2.1
MIME-Version: 1.0
Content-Type: multipart/signed; boundary="=-=-="; micalg="pgp-sha512"; protocol="application/pgp-signature"
Date: Thu, 20 Feb 2020 09:30:36 +0100
Message-ID: <11273.1582187436@dooku>
Archived-At: <https://mailarchive.ietf.org/arch/msg/rats/D7tc5qJ0nxmpPLElPGDnRCR8kUs>
Subject: [Rats] automobile attestation ... was: driverless cars can't tell the difference between projections and real objects

So, legitimate road signs will need to be attested.

That doesn't solve the problem of projected images of people, though, since real
humans shouldn't need to attest to their own existence.
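As a rough sketch of what sign attestation could look like (purely illustrative;
the claim names, the Ed25519 key choice, and the use of Python's 'cryptography'
package are my assumptions, not anything RATS or EAT defines): a sign-side beacon
signs a small claim set, and the vehicle only accepts it if the signature verifies
against a trust anchor and the assertion is fresh. A projected phantom can't
produce a valid signature, so it fails verification.

    # Hypothetical sketch: a road-sign beacon signs a claim set, a vehicle
    # verifies it.  Claim names, key choice (Ed25519) and the 'cryptography'
    # package are illustrative assumptions, not a RATS/EAT-defined format.
    import json
    import time

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    MAX_AGE_S = 5  # freshness window the verifier will accept

    # --- sign side (provisioned with a key whose public half the vehicle trusts) ---
    sign_key = Ed25519PrivateKey.generate()
    trust_anchor = sign_key.public_key()   # vehicle would learn this out of band

    claims = {
        "sign-type": "speed-limit",
        "value-kph": 50,
        "lat": 31.262,                     # hypothetical sign location
        "lon": 34.801,
        "iat": int(time.time()),           # issued-at, for freshness
    }
    payload = json.dumps(claims, sort_keys=True).encode()
    signature = sign_key.sign(payload)

    # --- vehicle side ---
    def verify_sign_assertion(payload, signature, anchor):
        """Return the claims if the assertion verifies and is fresh, else None."""
        try:
            anchor.verify(signature, payload)
        except InvalidSignature:
            return None                    # a projected "phantom" ends up here
        received = json.loads(payload)
        if time.time() - received["iat"] > MAX_AGE_S:
            return None                    # stale or replayed assertion
        return received

    print(verify_sign_assertion(payload, signature, trust_anchor))
    print(verify_sign_assertion(payload, b"\x00" * 64, trust_anchor))  # forged: None

Replay of a recorded assertion and trust-anchor distribution at road-network
scale are the hard parts this sketch ignores.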

Rich Kulawiec via Dumpsterfire <dumpsterfire@firemountain.net> wrote:
    > (h/t to Dave Farber and Geoff Goodfellow)

    > Begin forwarded message:

    >> From: the keyboard of geoff goodfellow <geoff@iconia.com>
    >> Date: February 20, 2020 1:48:36 JST
    >> Subject: IS: Driverless cars can't tell the difference between projections and real objects
    >>
    >> This could be a concerning development for those who are working on
    >> autonomous vehicles.
    >>
    >> EXCERPT:
    >>
    >> One of the major concerns surrounding the development of driverless
    >> cars is that people might be able to hack into them remotely and take
    >> control. Tech journalists have been investigating ways this could be
    >> done for years, and car manufacturers are working hard to make sure
    >> their vehicles won't let this happen. It turns out you might not need
    >> to go very high tech to stop a driverless car, though, as researchers
    >> in Israel recently managed to stop one by simply projecting "phantoms"
    >> onto the road.
    >>
    >> Researchers from Ben-Gurion University of the Negev's (BGU) Cyber
    >> Security Research Center in Israel found that both semi-autonomous and
    >> fully autonomous cars stopped when they detected what they thought
    >> were humans in the street but were actually projections. They also
    >> projected a street sign onto a tree and fake lane markers onto the
    >> street to trick the cars. The research was published by the
    >> International Association for Cryptologic Research.
    >>
    >> Ben Nassi, lead author and a Ph.D. student, said in a statement that
    >> these types of issues are being overlooked by the companies that are
    >> developing these types of vehicles.
    >>
    >> "This type of attack is currently not being taken into consideration
    >> by the automobile industry. These are not bugs or poor coding errors
    >> but fundamental flaws in object detectors that are not trained to
    >> distinguish between real and fake objects and use feature matching to
    >> detect visual objects," Nassi said...  [...]
    >>
    >> Spooky video shows self-driving cars being tricked by holograms
    >> https://www.inverse.com/innovation/researchers-used-phantom-images-to-trick-driverless-cars-into-stopping
    >>


--
Michael Richardson <mcr+IETF@sandelman.ca>, Sandelman Software Works
 -= IPv6 IoT consulting =-