Verification of Robotic Code and Autonomous Systems - Kerstin Eder - PowerPoint PPT Presentation



SLIDE 1

Verification of Robotic Code and Autonomous Systems

Kerstin Eder

University of Bristol and Bristol Robotics Laboratory

SLIDE 2

Verification and Validation for Safety in Robots

Aim: to develop techniques and methodologies that can be used to design autonomous intelligent systems that are verifiably trustworthy.

SLIDE 3

Correctness from specification to implementation

User Requirements

High-level Specification

Optimizer

Design and Analysis (Simulink)

Controller (SW/HW)

e.g. C, C++, RTL (VHDL/Verilog)

Translate Implement

SLIDE 4

What can be done at the code level?

  • P. Trojanek and K. Eder. Verification and testing of mobile robot navigation algorithms: A case study in SPARK. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1489-1494, Sep 2014. http://dx.doi.org/10.1109/IROS.2014.6942753

SLIDE 5

What can go wrong in robot navigation software?

Generic bugs:

§ Array and vector out-of-bounds accesses
§ Null pointer dereferencing
§ Accesses to uninitialized data

Domain-specific bugs:

§ Integer and floating-point arithmetic errors
§ Mathematical function domain errors
§ Dynamic memory allocation and blocking inter-thread communication (not real-time safe)
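Two of these bug classes, and the guards that prevent them, can be sketched in Python (the case study itself uses C/C++ and SPARK; the helper below is purely illustrative, not code from the paper):

```python
import math

def sector_of(angle_cos: float, ranges: list) -> float:
    """Toy navigation helper showing two bug classes and their guards."""
    # Domain-specific bug: acos is only defined on [-1, 1]; accumulated
    # floating-point error can push a normalised value just outside it.
    angle_cos = max(-1.0, min(1.0, angle_cos))
    angle = math.acos(angle_cos)
    # Generic bug: an unguarded index runs one past the end of the
    # range-reading vector exactly when angle == pi.
    i = min(int(angle / math.pi * len(ranges)), len(ranges) - 1)
    return ranges[i]
```

Without the clamp, `math.acos(1.0000000001)` raises a `ValueError` (a domain error); without the `min`, the index can go out of bounds.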

SLIDE 6

Verification Approach

State of the art verification approaches:

§ Model checking: infeasible
§ Static analysis of C++: not possible
§ Static analysis of C: requires verbose and difficult-to-maintain annotations

Our “Design for Verification” approach:

§ SPARK, a verifiable subset of Ada

§ No memory allocation, pointers, or concurrency

§ Required code modifications:

§ Pre- and post-conditions, loop (in)variants
§ Numeric subtypes (e.g. Positive)
§ Formal data containers
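SPARK discharges these contracts statically at analysis time. As an illustration only, the same contract style can be mimicked with runtime checks in Python (the function names and ranges below are invented for the example, not taken from the paper's code):

```python
def positive(n: int) -> int:
    """Analogue of SPARK's Positive subtype: reject values below 1."""
    if n < 1:
        raise ValueError(f"Positive range check failed: {n}")
    return n

def histogram_bin(angle_deg: float, bins: int) -> int:
    """Map an angle to a histogram bin, with explicit contracts."""
    # Precondition (in SPARK, a Pre aspect proved for all callers)
    assert 0.0 <= angle_deg < 360.0, "pre: angle in [0, 360)"
    bins = positive(bins)                      # constrained numeric subtype
    result = int(angle_deg * bins / 360.0)
    # Postcondition: result is a legal index, so no out-of-bounds access
    assert 0 <= result < bins, "post: result is a valid bin index"
    return result
```

The difference in kind: here the checks fire at runtime, whereas SPARK proves once, before execution, that they can never fire.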

SLIDE 7

Results

§ Three open-source implementations of navigation algorithms translated from C/C++ (2.7 kSLOC) to SPARK (3.5 kSLOC)

  • VFH+ (Vector Field Histogram)
  • ND (Nearness Diagram)
  • SND (Smooth Nearness-Diagram) navigation
  • Explicit annotations are less than 5% of the code
  • SPARK code is on average 30% longer than C/C++

§ Several bugs discovered by run-time checks injected by the Ada compiler

  • Fixed code proved to be run-time safe
  • except floating-point over- and underflows
  • These require the use of complementary techniques, e.g. abstract

interpretation.

§ Up to 97% of the verification conditions discharged automatically by SMT solvers in less than 10 minutes
§ Performance of the SPARK and C/C++ code is similar

SLIDE 8

http://github.com/riveras/spark-navigation

  • P. Trojanek and K. Eder. Verification and testing of mobile robot navigation algorithms: A case study in SPARK. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1489-1494, Sep 2014. http://dx.doi.org/10.1109/IROS.2014.6942753

SLIDE 9

Correctness from Specification to Implementation

User Requirements

High-level Specification

Optimizer

Design and Analysis (Simulink)

Controller (SW/HW)

e.g. C, C++, RTL (VHDL/Verilog)

Translate Implement Verification

(IL)

Verification

(OL)

SLIDE 10

What can be done at the design level?

  • D. Araiza Illan, K. Eder, A. Richards. Formal Verification of Control Systems' Properties with Theorem Proving. International Conference on Control (CONTROL), pp. 244-249. IEEE, Jul 2014. http://dx.doi.org/10.1109/CONTROL.2014.6915147

  • D. Araiza Illan, K. Eder, A. Richards. Verification of Control Systems Implemented in Simulink with Assertion Checks and Theorem Proving: A Case Study. European Control Conference (ECC), pp. tbc. Jul 2015. http://arxiv.org/abs/1505.05699

SLIDE 11

Simulink Diagrams in Control Systems

§ Simulating the control systems
§ Analysis techniques from control systems theory (e.g., stability)
§ Serve as requirements/specification
§ For (automatic) code generation

[Diagram: Simulink at the control systems design level; generated Code at the implementation level]

SLIDE 12

Verifying Stability

§ Stability: matrix P > 0 (Lyapunov function)
§ Matrix P − (A−BK)^T P(A−BK) > 0 (Lyapunov function's difference)
§ Equivalence (application of Lyapunov's equation):

V(k) − V(k−1) = x(k−1)^T [(A−BK)^T P(A−BK) − P] x(k−1)

§ Add these as assertions: capture control systems requirements and retain them in the code implementation
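As a numeric sanity check of these two positive-definiteness conditions (not the paper's method, which discharges them by theorem proving), they can be evaluated for a small example system in pure Python; the plant, gain and candidate P below are illustrative values chosen for the sketch:

```python
# Check P > 0 and P - Acl^T P Acl > 0 for a 2-state system, where
# Acl = A - B*K, using Sylvester's criterion on symmetric 2x2 matrices.
def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_sub(X, Y):
    return [[X[i][j] - Y[i][j] for j in range(2)] for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

def is_pos_def(X):
    # Sylvester's criterion: both leading principal minors positive.
    return X[0][0] > 0 and X[0][0] * X[1][1] - X[0][1] * X[1][0] > 0

A = [[0.9, 0.2], [0.0, 0.8]]       # illustrative discrete-time plant
B = [[1.0], [0.0]]
K = [[0.4, 0.1]]                   # stabilising state-feedback gain
BK = [[B[i][0] * K[0][j] for j in range(2)] for i in range(2)]
Acl = mat_sub(A, BK)               # closed loop: A - B*K
P = [[1.0, 0.0], [0.0, 1.0]]       # candidate Lyapunov matrix

Q = mat_sub(P, mat_mul(transpose(Acl), mat_mul(P, Acl)))
assert is_pos_def(P) and is_pos_def(Q)   # V decreases along trajectories
```

If either assertion failed, the candidate P would not certify stability for this closed loop.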

SLIDE 13

Assertion-Based Verification

SLIDE 14

Combining Verification Techniques

§ Stability: matrix P > 0 (Lyapunov function)
§ Matrix P − (A−BK)^T P(A−BK) > 0 (Lyapunov function's difference)
§ Equivalence (application of Lyapunov's equation):

V(k) − V(k−1) = x(k−1)^T [(A−BK)^T P(A−BK) − P] x(k−1)

§ Test in simulation
§ Automatic theorem proving over a first-order logic theory of the Simulink diagram:

Axiom: Bu = B * u
...
Goal: vdiff == vdiff_an
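The goal `vdiff == vdiff_an` can be illustrated numerically: `vdiff` follows the diagram's step-by-step computation of the Lyapunov difference, `vdiff_an` the analytic closed form, and the two routes must agree. A 1-state Python sketch (the names `vdiff`/`vdiff_an` are from the slide; the numeric values are invented, and the paper discharges this goal with a theorem prover, not by sampling):

```python
# 1-state system: plant a, input gain b, feedback k, Lyapunov "matrix" p.
a, b, k, p = 0.9, 1.0, 0.5, 2.0
acl = a - b * k                     # closed loop: A - B*K

for x0 in (-3.0, 0.25, 10.0):
    x1 = acl * x0                                # one simulation step
    vdiff = p * x1 ** 2 - p * x0 ** 2            # V(k) - V(k-1), diagram route
    vdiff_an = x0 ** 2 * (acl * p * acl - p)     # analytic route
    assert abs(vdiff - vdiff_an) < 1e-9          # the goal: vdiff == vdiff_an
```

Sampling a few states only tests the equivalence; the theorem prover establishes it for all states at once.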

SLIDE 15

http://github.com/riveras/simulink

  • D. Araiza Illan, K. Eder, A. Richards. Formal Verification of Control Systems' Properties with Theorem Proving. International Conference on Control (CONTROL), pp. 244-249. IEEE, Jul 2014. http://dx.doi.org/10.1109/CONTROL.2014.6915147

  • D. Araiza Illan, K. Eder, A. Richards. Verification of Control Systems Implemented in Simulink with Assertion Checks and Theorem Proving: A Case Study. European Control Conference (ECC), pp. tbc. Jul 2015. http://arxiv.org/abs/1505.05699

SLIDE 16

Simulation-based testing Why and how?

  • D. Araiza Illan, D. Western, A. Pipe, K. Eder. Coverage-Driven Verification - An approach to verify code for robots that directly interact with humans. In Proceedings of HVC 2015, November 2015.

  • D. Araiza Illan, D. Western, A. Pipe, K. Eder. Model-Based, Coverage-Driven Verification and Validation of Code for Robots in Human-Robot Interactions. Under review for publication at ICRA 2016.

SLIDE 17

Coverage-Driven Verification

[Testbench diagram: SUT]

SLIDE 18

Robotic Code


  • J. Boren and S. Cousins, "The SMACH High-Level Executive," IEEE Robotics & Automation Magazine, vol. 17, no. 4, pp. 18–20, 2010.

SLIDE 19

Coverage-Driven Verification

[Testbench diagram: Test, SUT, Response]

SLIDE 20

Coverage-Driven Verification

[Testbench diagram: Test Generator, Test, SUT, Response]

SLIDE 21

§ Effective tests:

  • legal tests
  • meaningful events
  • interesting events
  • while exploring the system
  • typical vs extreme values

§ Efficient tests:

  • minimal set of tests (regression)

§ Strategies:

  • Pseudorandom (repeatability)
  • Constrained pseudorandom
  • Model-based to target specific scenarios

Test Generator
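A minimal Python sketch of the pseudorandom and constrained strategies, assuming a simple dictionary as the test format (the field names and ranges are invented for illustration, not the paper's actual test format):

```python
import random

def pseudorandom_test(seed: int) -> dict:
    rng = random.Random(seed)        # fixed seed makes the test repeatable
    return {
        "human_activates_robot": rng.random() < 0.5,
        "gaze_angle_deg": rng.uniform(-180.0, 180.0),
        "grip_force_n": rng.uniform(0.0, 100.0),
    }

def constrained_test(seed: int) -> dict:
    # Constraints keep the test legal and meaningful: the human actually
    # takes part in the handover, and gaze stays in a plausible range.
    t = pseudorandom_test(seed)
    t["human_activates_robot"] = True
    t["gaze_angle_deg"] = max(-60.0, min(60.0, t["gaze_angle_deg"]))
    return t

assert pseudorandom_test(42) == pseudorandom_test(42)   # repeatability
```

Model-based generation would replace the unconstrained draws with traces produced from a model of the interaction, targeting specific scenarios.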


SLIDE 23

Model-based Test Generation

SLIDE 24

Model-based Test Generation

SLIDE 25

Coverage-Driven Verification

[Testbench diagram: Test Generator, Test, SUT, Checker, Response]

SLIDE 26

Checker

§ Requirements as assertion monitors:

  • if [precondition], check [postcondition]

“If the robot decides the human is not ready, then the robot never releases an object”.

  • Implemented as automata

§ Continuous monitoring at runtime, self-checking

– High-level requirements
– Lower-level requirements, depending on the simulation's level of detail (e.g., path planning, collision avoidance)

assert {! (robot_3D_space == human_3D_space)}
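Such a monitor can be sketched as a small automaton in Python (the event names are invented; the actual testbench implements its monitors over ROS messages):

```python
class ReleaseMonitor:
    """Monitor: if the human is judged not ready, the robot never releases."""

    def __init__(self):
        self.state = "IDLE"       # IDLE -> ARMED when precondition holds
        self.failed = False

    def on_event(self, event: str):
        if event == "human_not_ready":
            self.state = "ARMED"              # precondition observed
        elif event == "human_ready":
            self.state = "IDLE"               # precondition withdrawn
        elif event == "robot_releases" and self.state == "ARMED":
            self.failed = True                # postcondition violated

m = ReleaseMonitor()
for e in ["human_not_ready", "robot_releases"]:
    m.on_event(e)
assert m.failed   # this trace violates the requirement
```

Running continuously alongside the simulation, such automata make every test self-checking.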

SLIDE 27

Coverage-Driven Verification

[Testbench diagram: Test Generator, Test, SUT, Checker, Response]

SLIDE 28

Coverage-Driven Verification

[Testbench diagram: Test Generator, Test, SUT, Checker, Coverage Collector, Response]

SLIDE 29

Coverage Collector

§ Coverage models:

  • Code coverage, from statement to MC/DC
  • Functional coverage
  • Requirements coverage
  • Cross-product functional coverage
  • Cartesian product of environment actions, sensor states and robot actions

Coverage analysis enables feedback to test generation
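Cross-product functional coverage can be sketched with `itertools.product` (the bin names below are invented for illustration):

```python
from itertools import product

# Coverage model: the Cartesian product of environment actions,
# sensor states and robot actions defines the bins to hit.
env_actions   = ["offer_object", "withdraw_hand"]
sensor_states = ["gaze_ok", "gaze_away"]
robot_actions = ["release", "hold"]

coverage = {b: 0 for b in product(env_actions, sensor_states, robot_actions)}

def record(env, sensor, robot):
    """Called once per test from the simulation trace."""
    coverage[(env, sensor, robot)] += 1

record("offer_object", "gaze_ok", "release")
record("offer_object", "gaze_away", "hold")

hit = sum(1 for n in coverage.values() if n > 0)
print(f"covered {hit}/{len(coverage)} cross-product bins")
```

Bins left at zero are coverage holes; feeding them back to the test generator is what closes the coverage-driven loop.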

SLIDE 30

HRI Handover Scenario


Requirements: § Functional and safety (ISO 13482:2014, ISO 10218-1)

SLIDE 31

Requirements based on ISO 13482 and ISO 10218

SLIDE 32

Requirements based on ISO 13482 and ISO 10218

SLIDE 33


Requirements based on ISO 13482 and ISO 10218

SLIDE 34

HRI Handover Scenario


Coverage models:

§ Code statement coverage (robot high-level control)
§ Requirements coverage in the form of assertions
§ Cross-product functional coverage

SLIDE 35

Situation Coverage

SLIDE 36

Functional Coverage


SLIDE 37

Coverage Results

SLIDE 38

Code Coverage Results

[Chart: code statement coverage for pseudorandom, constrained and model-based test generation; a coverage hole is marked]

SLIDE 39

Assertion Coverage Results


§ 100 pseudorandomly generated tests
§ 100 constrained pseudorandomly generated tests
§ 4 model-based tests

SLIDE 40

Functional Coverage Results


§ 100 pseudorandomly generated tests
§ 160 model-based tests
§ 180 model-based constrained tests
§ 440 tests in total

SLIDE 41

Coverage-Driven Verification

[Testbench diagram: Test Generator, Test, SUT, Checker, Coverage Collector, Response]

Coverage analysis enables feedback to test generation


SLIDE 43

Stimulating the SUT

[Testbench diagram: Test Generator, Test, Driver, SUT, Checker, Coverage Collector, Response]

SLIDE 44

Stimulating the SUT

[Testbench diagram: Test Generator, Test, Driver, Stimulus, SUT, Checker, Coverage Collector, Response]

SLIDE 45

§ Environmental components (models) interacting with the system's control software
§ Examples: humans, actuators (Gazebo), communication signals, sensors

Driver
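A minimal sketch of the driver idea, assuming stimulus is exchanged over a message queue (the names are invented; the real testbench drives the SUT through ROS topics and Gazebo actuation):

```python
import queue

def human_model(actions, stimulus):
    """Environment component: replay the actions a generated test prescribes."""
    for act in actions:
        stimulus.put(act)
    stimulus.put(None)                 # end-of-test marker

def sut_control_loop(stimulus, log):
    """Stand-in for the robot's high-level control code."""
    while (act := stimulus.get()) is not None:
        log.append(f"robot saw: {act}")

q, log = queue.Queue(), []
human_model(["approach", "gaze", "grab"], q)   # driver applies the test
sut_control_loop(q, log)
```

The point of the separation: the SUT only ever sees stimulus, so the same control code runs unchanged against scripted humans, recorded traces, or live sensors.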

SLIDE 47

CDV for Human-Robot Interaction

  • D. Araiza Illan, D. Western, A. Pipe, K. Eder. Model-Based, Coverage-Driven Verification and Validation of Code for Robots in Human-Robot Interactions. Under review for publication at ICRA 2016.

SLIDE 48

http://github.com/robosafe/testbench

  • D. Araiza Illan, D. Western, A. Pipe, K. Eder. Coverage-Driven Verification - An approach to verify code for robots that directly interact with humans. In Proceedings of HVC 2015, November 2015.

  • D. Araiza Illan, D. Western, A. Pipe, K. Eder. Model-Based, Coverage-Driven Verification and Validation of Code for Robots in Human-Robot Interactions. Under review for publication at ICRA 2016.

SLIDE 49

Summary

§ No single technique is adequate for an entire design/system in practice.

– Combine verification techniques

§ Learn from areas where verification techniques are (more) mature.
§ We need to design for verification.

SLIDE 50

Any questions? Kerstin.Eder@bristol.ac.uk

Thank you

Special thanks to Dejanira Araiza Illan, David Western, Arthur Richards, Jonathan Lawry, Trevor Martin, Piotr Trojanek, Yoav Hollander, Yaron Kashai, Mike Bartley, Tony Pipe and Chris Melhuish for their hard work, collaboration, inspiration and the many productive discussions we have had.
