SELF-EXPLANATION AND SELF-DRIVING
Leilani H. Gilpin, MIT


SLIDE 1

SELF-EXPLANATION AND SELF-DRIVING

Leilani H. Gilpin, MIT

java exception result: no explanation at all → communication to a non-expert → explanation to a human expert

SLIDE 2

WHO’S AT FAULT?

2 L.H. Gilpin

SLIDE 3


WHO’S AT FAULT?

SLIDE 4

WHAT WENT WRONG?

photo courtesy of the New York Times


SLIDE 5

WHAT WENT WRONG?

  • Who’s at fault?
  • Human (safety driver) error
  • Pedestrian error
  • Vehicle error

ABC-15, via Associated Press

SLIDE 6

WHAT WENT WRONG?

  • Unavoidable - No way to detect the pedestrian with enough time to swerve out of the way.
  • Possibly avoidable - Did sensors detect the pedestrian with enough time to swerve out of the way?
  • Internal errors - Sensors, perception mechanisms, etc. not working as expected?

SLIDE 7

EX-POST-FACTO EXPLANATION

Pipeline (four stages):
  1. CAN bus logs (simulated) - behaviors / logs
  2. Intervals of interest - “safe” intervals: sensor hits; ontology
  3. Classification constraints - dependency tracking, propagators
  4. Explanation - a coherent story

SLIDE 8

STORYTELLING FOR SAFETY

  • For autonomous machines to be safe, they need to be able to explain themselves.
  • For autonomous vehicles to be intelligent, they need to understand the actions and behavior of their underlying parts.

SLIDE 9

VEHICLE STORIES

  • Autonomous agents must be able to provide explanations for the following reasons:
  • in order to be audited
  • to provide an understandable and coherent story which justifies their actions
  • to be able to be challenged in an adversarial proceeding
  • if the explanation is inadequate or inappropriate, the agent should be either corrected or disabled

SLIDE 10

3 MAIN AREAS

  • Explanations
    • Machinery / software
    • Machine perception
  • Security
    • How can we strengthen vehicle security?
  • Accountability
    • What are likely [autonomous] vehicle scenarios?
    • How will pedestrians react?
    • How can we use technology to ensure vehicles can provide evidence?

SLIDE 11

OUR RESEARCH

  • Adapted a game simulation to output a “CAN Bus” log
  • Edge detection: When did the operator apply the brakes?
  • Interval analysis: How do the intervals relate?
  • Tell a story of what happened
  • Begin to tell a “why” story


L.H. Gilpin and B.Z. Yuan. “Getting Up to Speed on Vehicle Intelligence.” The AAAI 2017 Spring Symposium on Science of Intelligence: Computational Principles of Natural and Artificial Intelligence.
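The edge-detection step above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the log is assumed to be pre-parsed into (timestamp, brake_on) pairs, which is an assumption about the data layout.

```python
# Hypothetical sketch of "edge detection" on a parsed CAN-style log:
# find the moments the operator newly applied the brakes, i.e. rising
# edges on a boolean brake signal. Field layout is illustrative.

def brake_edges(samples):
    """Return timestamps where the brake signal goes from off to on.

    `samples` is a list of (timestamp, brake_on) pairs in time order.
    """
    edges = []
    prev = False
    for t, brake_on in samples:
        if brake_on and not prev:  # rising edge: brake newly applied
            edges.append(t)
        prev = brake_on
    return edges

log = [(93.0, False), (93.5, False), (93.8, True),
       (94.1, True), (95.0, False), (95.2, True)]
print(brake_edges(log))  # -> [93.8, 95.2]
```

The resulting edge timestamps are what the interval analysis then relates to other sensor intervals.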

SLIDE 12

NEED FOR OPEN SOFTWARE

  • Availability of code/data to be evaluated
  • Software available for accountable development
  • Simulation
  • Error detection and reasoning

SLIDE 13

OUR DATA

  • Controller Area Network log (CAN bus)
  • Easy to hack
  • Simple schema: time stamp, CAN bus code, extra information
  • Connects to all aspects of a car
  • Standard

SLIDE 14

OUR DATA - UP CLOSE

93.795 B1 81.83 81.83
93.795 B3 24.24 24.24
93.795 120 13 04 50

Schema: time stamp, CAN bus code, parameters.

B1 - front wheels; B3 - rear wheels; 120 - drive mode
81.83 / 24.24 - right, left wheel rotation (in km/hr)
13, 50 - Drive; 04 - powered, in seconds
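The schema above (time stamp, CAN bus code, then code-specific parameters) can be parsed with a short sketch. Only the B1/B3/120 meanings come from the slide; treating wheel-speed parameters as floats is an illustrative assumption.

```python
# Sketch of parsing the CAN-bus log schema shown above:
# time stamp, CAN bus code, then code-specific parameters.
# B1 = front wheels, B3 = rear wheels, 120 = drive mode (per the slide);
# the float conversion for wheel records is an assumption.

def parse_can_line(line):
    fields = line.split()
    timestamp = float(fields[0])
    code = fields[1]
    params = fields[2:]
    if code in ("B1", "B3"):  # wheel records: right, left rotation in km/hr
        params = [float(p) for p in params]
    return timestamp, code, params

print(parse_can_line("93.795 B1 81.83 81.83"))
# -> (93.795, 'B1', [81.83, 81.83])
```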

SLIDE 15

MODELING

Mechanical systems

Operator, brake pedal, throttle, steering wheel, booster, brake master cylinder, anti-lock brake module, sensor output, engine control unit, engine, transmission, differential, torque amplifier, rack and pinion, linkage, right and left front wheels, right and left back wheels, slave cylinder.

SLIDE 16

MODELING

Mechanical systems

Operator, brake pedal, throttle, steering wheel, booster, brake master cylinder, anti-lock brake module, sensor output, engine control unit, engine, transmission, differential, torque amplifier, rack and pinion, linkage, right and left front wheels, right and left back wheels, slave cylinder.

Tire pressure sensor is anomalous given current state (snow, chains, engine). Check on right back wheel.

SLIDE 17

MODELING

Physics systems

Free-body diagram: friction, normal forces, weight, inertial force.

==> (explain normal-forces)
REASON: rear-wheels-force decreased AND its magnitude exceeds the traction threshold.
Since the rear wheels lost traction, the friction of the contact patches MUST HAVE decreased;
so, the normal forces MUST HAVE decreased.
Consistent with the accelerometers.
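The chained "MUST HAVE" explanation above comes from dependency tracking. A toy sketch of the idea, with the real propagator machinery reduced to a rule list where each derived fact carries the justifications used to reach it (the rule wording is illustrative, not the deployed system):

```python
# Toy dependency-tracked inference: rules are (antecedent, consequent,
# justification); propagation records, for each derived fact, the chain
# of justifications that supports it. The physics follows the slide.

RULES = [
    ("rear-wheels-force decreased", "friction decreased",
     "the rear wheels lost traction, so contact-patch friction MUST HAVE decreased"),
    ("friction decreased", "normal forces decreased",
     "friction depends on the normal force, so it MUST HAVE decreased"),
]

def propagate(observations, rules):
    """Map each derivable fact to the justifications that support it."""
    facts = {obs: [] for obs in observations}
    changed = True
    while changed:
        changed = False
        for antecedent, consequent, why in rules:
            if antecedent in facts and consequent not in facts:
                facts[consequent] = facts[antecedent] + [why]
                changed = True
    return facts

derived = propagate(["rear-wheels-force decreased"], RULES)
for reason in derived["normal forces decreased"]:
    print("BECAUSE:", reason)
```

Because every conclusion keeps its antecedent chain, the explanation can be replayed back to the raw observation, which is the point of the ex-post-facto approach.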

SLIDE 18

MODELING

Physics systems

Free-body diagram: friction, normal forces, weight, inertial force.

==> (explain normal-forces)
REASON: front-wheels-force decreased AND tire pressure is low.
Checking on mechanical system for anomalies…

SLIDE 19

MODELING

Explanatory parking

==> (explain parking)
Approach - within threshold.
Turn - risky, but within threshold.
S-curve complete.
Parking complete.

Joint work with S. Lu and B.Z. Yuan.

SLIDE 20

WHAT ABOUT SELF-DRIVING?

  • Same mechanics
  • Same physics
  • New perception
  • More sensors

By Guilbert Gates | Source: Google | Note: Car is a Lexus model modified by Google. Uber’s sensing system uses similar technology.

SLIDE 21

SELF-DRIVING SYSTEM DESIGN

System diagram, each subsystem with a local reasonableness monitor: Operator, Route planning, Mapping, Driving tactics (Steering, Power, Braking), Traffic monitoring, Safety, Trajectory planning, Interrupt system, Actuation system, Sensing surroundings; together these produce the result.

SLIDE 22

EXPLAINING PERCEPTION TWO WAYS

  • Motivation - A first step towards understanding machine perception is to constrain the output to be reasonable.
  • Two ideas:
  • Data representation: ConceptNet
  • Structural representation: Conceptual primitives

SLIDE 23

METHODS (I)

Pipeline (four stages):
  1. Scene descriptions - behaviors / logs
  2. IsA hierarchy / anchor points - ontology
  3. Relations are “close enough” - classification constraints, dependency tracking
  4. Explain conflicting relations - a coherent story

SLIDE 24

PRELIMINARY RESULTS(I)

Perception: “A mailbox crossing the street”
Premises: (mailbox, IsA, heavy object); (mailbox, moves, False); (mailbox, LocatedNear, street)
→ Reasonableness monitor

SLIDE 25

PRELIMINARY RESULTS (I)

A mailbox crossing the street

input: “Mailbox crossing the street”
This perception is UNREASONABLE using data from ConceptNet5.
REASONING: A mailbox is an object typically found near a sidewalk. Mailboxes cannot cross a street because mailboxes are objects that do not move on their own.

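The check above can be caricatured in a few lines. This is a minimal sketch, assuming premises are stored as (subject, relation, value) triples in the ConceptNet style; the single rule ("things that cannot move cannot be crossing") is an illustrative assumption, not the monitor's actual rule base, which queries ConceptNet5.

```python
# Minimal sketch of a reasonableness check over commonsense triples.
# The one hard-coded rule is illustrative, not the deployed monitor.

def reasonable(subject, verb, premises):
    facts = {(s, r): v for s, r, v in premises}
    if verb in ("crossing", "moving") and facts.get((subject, "moves")) is False:
        return (False, f"A {subject} is an object that does not move on its own, "
                       f"so a {subject} cannot be {verb} the street.")
    return (True, "No conflicting premise found.")

PREMISES = [
    ("mailbox", "IsA", "heavy object"),
    ("mailbox", "moves", False),
    ("mailbox", "LocatedNear", "street"),
]

ok, reasoning = reasonable("mailbox", "crossing", PREMISES)
print("UNREASONABLE." if not ok else "REASONABLE.", reasoning)
```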

SLIDE 26

LIMITATIONS

input: “A penguin eats food”
This perception is UNREASONABLE.
REASONING: A penguin is an animal that lives in Antarctica and eats enough to eat. Food is an animal that lives in the refrigerator and eats food. So a penguin cannot reasonably be located at the same location as food.

SLIDE 27

METHODS (II)

Pipeline (four stages):
  1. Scene descriptions - behaviors / logs
  2. IsA hierarchy / anchor points - ontology
  3. Built into primitives - classification constraints, dependency tracking
  4. Explain conflicting relations - a coherent story

SLIDE 28

A MAILBOX CROSSING THE STREET

mailbox MOVE across street ?


SLIDE 29

PRELIMINARY WORK

Conceptual primitive frame: object MOVE object

Parse-tree labels: S, NP, VP, NP, V

A mailbox crossing the street

SLIDE 30

EXPLAINING PERCEPTION (II)

A mailbox crossing the street

This perception is unreasonable.
================================
A mailbox is an object or thing that cannot move on its own. So it is unreasonable for a mailbox to cross the street.

SLIDE 31

EXPLAINING PERCEPTION (II)

A mailbox crossing the street during a hurricane

This perception is reasonable.
==========================
Although a mailbox cannot move on its own, a hurricane can propel a stationary object to move. So it is reasonable for a mailbox to cross the street.

SLIDE 32

MONITOR IN DEVELOPMENT


SLIDE 33

INTERNAL STORIES

Perception: “A mailbox crossing the street”
Premises: (mailbox, IsA, heavy object); (mailbox, moves, False); (mailbox, LocatedNear, street)
Weather sensor: Hurricane
Premises: (hurricane, has, high winds)

SLIDE 34

INTERNAL STORIES

Perception: “A mailbox crossing the street”
Premises: (mailbox, IsA, heavy object); (mailbox, moves, False); (mailbox, LocatedNear, street)
Weather sensor: Hurricane
Premises: (hurricane, has, high winds)

SLIDE 35

INTERNAL STORIES

internal story: high winds can cause heavy objects to move

Perception: “A mailbox crossing the street”
Premises: (mailbox, IsA, heavy object); (mailbox, moves, False); (mailbox, LocatedNear, street)
Weather sensor: Hurricane
Premises: (hurricane, has, high winds)
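The internal-story idea can be sketched as a context-sensitive rule: a premise supplied by another subsystem (here, the weather sensor) overrides the default commonsense judgment. The wiring below is an illustrative assumption, not the deployed monitor.

```python
# Sketch of an "internal story": context premises from other subsystems
# can flip the default reasonableness verdict. Rule wiring is illustrative.

def reasonable_in_context(subject, verb, premises, context):
    facts = {(s, r): v for s, r, v in premises}
    if verb == "crossing" and facts.get((subject, "moves")) is False:
        # internal story: high winds can cause heavy objects to move
        if ("hurricane", "has", "high winds") in context:
            return (True, f"Although a {subject} cannot move on its own, "
                          "high winds can propel a stationary object.")
        return (False, f"A {subject} cannot move on its own.")
    return (True, "No conflicting premise found.")

PREMISES = [("mailbox", "IsA", "heavy object"),
            ("mailbox", "moves", False),
            ("mailbox", "LocatedNear", "street")]

print(reasonable_in_context("mailbox", "crossing", PREMISES, []))
print(reasonable_in_context("mailbox", "crossing", PREMISES,
                            [("hurricane", "has", "high winds")]))
```

Without the weather-sensor premise the perception is judged unreasonable; with it, the same perception becomes reasonable, matching the two verdicts on the preceding slides.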

SLIDE 36

FUTURE WORK

Testbeds: simulations and hardware; evaluation dimensions: reasonability and relevance.

  • Explaining non-local inconsistencies
  • Explaining internal stories and premises
  • Incorporating into full-system design
SLIDE 37

CONTRIBUTIONS

  • Ex-post-facto explanations
  • Explanations of reasonableness for language descriptions of perception
  • Incorporating the monitor into a working autonomous simulation

SLIDE 38

LESS FRUSTRATION, MORE EXPLANATION


Tire pressure is low and internal cooling system is overheating. Pull over and check on cooling system.

Too hot to drive. Chains are too loose.

SLIDE 39

SOFTWARE USED

  • Reasoning software
  • MIT/GNU Scheme (free software)
  • Art of the Propagator System (free software)
  • Python (open-source)
  • ConceptNet (CC BY-SA 4.0)
  • Simulation - Unity game engine and Carla (open-source)
