Verification of Robotic Code and Autonomous Systems
Kerstin Eder, University of Bristol and Bristol Robotics Laboratory


  1. Verification of Robotic Code and Autonomous Systems. Kerstin Eder, University of Bristol and Bristol Robotics Laboratory

  2. Verification and Validation for Safety in Robots. Goal: to develop techniques and methodologies that can be used to design autonomous intelligent systems that are verifiably trustworthy.

  3. Correctness from specification to implementation. Design flow: User Requirements → High-level Specification → (translate, optimize) → Design and Analysis (Simulink) → (implement) → Controller (SW/HW), e.g. C, C++, RTL (VHDL/Verilog).

  4. What can be done at the code level? P. Trojanek and K. Eder. Verification and testing of mobile robot navigation algorithms: A case study in SPARK. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1489-1494, Sep 2014. http://dx.doi.org/10.1109/IROS.2014.6942753

  5. What can go wrong in robot navigation software? Generic bugs: § Array and vector out-of-bounds accesses § Null pointer dereferencing § Accesses to uninitialized data. Domain-specific bugs: § Integer and floating-point arithmetic errors § Mathematical function domain errors § Dynamic memory allocation and blocking inter-thread communication (not real-time)
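The domain-error class above is easy to hit in practice. A minimal Python sketch (the case-study code itself is C/C++ and SPARK; this function and its names are hypothetical) of a heading computation that guards against a floating-point domain error:

```python
import math

def heading_to_obstacle(dx, dy):
    """Bearing to an obstacle at offset (dx, dy), in radians.

    math.acos raises ValueError for arguments outside [-1, 1];
    floating-point rounding can push a computed cosine just past
    those bounds, a classic domain-specific navigation bug."""
    r = math.hypot(dx, dy)
    if r == 0.0:
        raise ValueError("obstacle coincides with the robot")
    cos_theta = dx / r
    # Clamp to [-1, 1] so rounding error cannot trigger a domain error.
    cos_theta = max(-1.0, min(1.0, cos_theta))
    return math.acos(cos_theta)
```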

  6. Verification Approach. State-of-the-art verification approaches: § Model checking: infeasible § Static analysis of C++: not possible § Static analysis of C: requires verbose and difficult-to-maintain annotations. Our “Design for Verification” approach: § SPARK, a verifiable subset of Ada § No memory allocation, pointers or concurrency § Required code modifications: pre- and post-conditions, loop (in)variants; numeric subtypes (e.g. Positive); formal data containers
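SPARK expresses these contracts natively (Pre, Post and Loop_Invariant aspects, proved statically). As a rough analogue, kept in Python for consistency with the deck's later ROS examples, the pre/post-condition idea can be sketched with runtime assertions (function and names hypothetical):

```python
def wrap_sector(index, num_sectors):
    """Map an arbitrary sector index onto 0 .. num_sectors - 1,
    as a histogram-based planner such as VFH+ might.

    In SPARK these contracts would be Pre/Post aspects discharged
    by proof; here they are only checked at runtime."""
    # Precondition (SPARK: Pre => Num_Sectors > 0).
    assert num_sectors > 0, "precondition: num_sectors must be positive"
    result = index % num_sectors
    # Postcondition (SPARK: Post => Result in 0 .. Num_Sectors - 1).
    assert 0 <= result < num_sectors, "postcondition violated"
    return result
```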

  7. Results § Three open-source implementations of navigation algorithms translated from C/C++ (2.7 kSLOC) to SPARK (3.5 kSLOC): VFH+ (Vector Field Histogram), ND (Nearness Diagram), and SND (Smooth Nearness-Diagram) navigation. Explicit annotations are less than 5% of the code; the SPARK code is on average 30% longer than the C/C++. § Several bugs discovered by run-time checks injected by the Ada compiler. The fixed code was proved run-time safe, except for floating-point over- and underflows; these require complementary techniques, e.g. abstract interpretation. § Up to 97% of the verification conditions discharged automatically by SMT solvers in less than 10 minutes § Performance of the SPARK and C/C++ code is similar

  8. http://github.com/riveras/spark-navigation P. Trojanek and K. Eder. Verification and testing of mobile robot navigation algorithms: A case study in SPARK. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1489-1494, Sep 2014. http://dx.doi.org/10.1109/IROS.2014.6942753

  9. Correctness from Specification to Implementation. The same design flow with verification steps added: User Requirements → Verification (OL) → High-level Specification → (translate, optimize; Verification, IL) → Design and Analysis (Simulink) → (implement) → Controller (SW/HW), e.g. C, C++, RTL (VHDL/Verilog).

  10. What can be done at the design level? D. Araiza Illan, K. Eder, A. Richards. Formal Verification of Control Systems’ Properties with Theorem Proving. International Conference on Control (CONTROL), pp. 244-249. IEEE, Jul 2014. http://dx.doi.org/10.1109/CONTROL.2014.6915147 D. Araiza Illan, K. Eder, A. Richards. Verification of Control Systems Implemented in Simulink with Assertion Checks and Theorem Proving: A Case Study. European Control Conference (ECC), pp. tbc. Jul 2015. http://arxiv.org/abs/1505.05699

  11. Simulink Diagrams in Control Systems. At the control systems design level (above the implementation-level code), Simulink diagrams serve to: § simulate the control systems § apply analysis techniques from control systems theory (e.g., stability) § serve as requirements/specification § enable (automatic) code generation

  12. Verifying Stability. Stability requirements: P > 0 (Lyapunov function); matrix inequality P − (A − BK)^T P (A − BK) > 0 (application of Lyapunov's equation); Lyapunov function's difference V(k) − V(k−1) = x(k−1)^T [(A − BK)^T P (A − BK) − P] x(k−1). Approach: capture the control system's requirements, retain them in the code implementation, and add them as assertions.
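For a scalar system the same conditions can be checked as runtime assertions. A sketch, assuming a one-dimensional closed loop x(k) = (a − b·k)·x(k−1) with Lyapunov function V(x) = p·x² (all names hypothetical):

```python
def simulate_with_stability_assertions(a, b, k, p, x0, steps=20):
    """Run the closed loop while asserting the stability conditions:
    p > 0 (Lyapunov function) and p - (a - b*k)**2 * p > 0
    (scalar form of P - (A - BK)^T P (A - BK) > 0)."""
    a_cl = a - b * k                    # closed-loop gain
    assert p > 0, "Lyapunov function not positive definite"
    assert p - a_cl ** 2 * p > 0, "Lyapunov decrease condition fails"
    x, v_prev = x0, p * x0 * x0
    for _ in range(steps):
        x = a_cl * x
        v = p * x * x
        # V(k) - V(k-1) = x(k-1)^2 * ((a - b*k)**2 * p - p) <= 0
        assert v <= v_prev, "Lyapunov function increased"
        v_prev = v
    return x
```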

  13. Assertion-Based Verification

  14. Combining Verification Techniques. The stability properties (P > 0; P − (A − BK)^T P (A − BK) > 0; V(k) − V(k−1) = x(k−1)^T [(A − BK)^T P (A − BK) − P] x(k−1)) are encoded in a first-order logic theory of the Simulink diagram, e.g. axiom: Bu = B * u; goal: vdiff == vdiff_an. Automatic theorem proving is combined with testing in simulation.

  15. http://github.com/riveras/simulink D. Araiza Illan, K. Eder, A. Richards. Formal Verification of Control Systems’ Properties with Theorem Proving. International Conference on Control (CONTROL), pp. 244-249. IEEE, Jul 2014. http://dx.doi.org/10.1109/CONTROL.2014.6915147 D. Araiza Illan, K. Eder, A. Richards. Verification of Control Systems Implemented in Simulink with Assertion Checks and Theorem Proving: A Case Study. European Control Conference (ECC), pp. tbc. Jul 2015. http://arxiv.org/abs/1505.05699

  16. Simulation-based testing: why and how? D. Araiza Illan, D. Western, A. Pipe, K. Eder. Coverage-Driven Verification: An approach to verify code for robots that directly interact with humans. In Proceedings of HVC 2015, November 2015. D. Araiza Illan, D. Western, A. Pipe, K. Eder. Model-Based, Coverage-Driven Verification and Validation of Code for Robots in Human-Robot Interactions. Under review for publication at ICRA 2016.

  17. Coverage-Driven Verification testbench: the SUT (system under test).

  18. Robotic Code. J. Boren and S. Cousins, “The SMACH High-Level Executive,” IEEE Robotics & Automation Magazine, vol. 17, no. 4, pp. 18-20, 2010.

  19. Coverage-Driven Verification testbench: Test → SUT → Response.

  20. Coverage-Driven Verification testbench: Test Generator → Test → SUT → Response.

  21. Test Generator § Effective tests: legal tests; meaningful events; interesting events, while exploring the system; typical vs extreme values § Efficient tests: a minimal set of tests (regression) § Strategies: pseudorandom (repeatability); constrained pseudorandom; model-based, to target specific scenarios
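The pseudorandom and constrained strategies can be sketched in a few lines of Python (the project's actual generators drive ROS/Gazebo simulations; the range and names here are hypothetical):

```python
import random

def generate_tests(n, seed=42, legal=(0.0, 2.0), extreme_bias=0.2):
    """Constrained pseudorandom test generation.

    A fixed seed makes runs repeatable; the 'legal' range keeps every
    test valid; a small bias toward the range limits mixes typical
    values with extreme ones."""
    rng = random.Random(seed)           # seeded: repeatable
    lo, hi = legal
    tests = []
    for _ in range(n):
        if rng.random() < extreme_bias:
            tests.append(rng.choice([lo, hi]))   # extreme value
        else:
            tests.append(rng.uniform(lo, hi))    # typical legal value
    return tests
```

Model-based generation would instead walk a model of the interaction to emit event sequences that target specific scenarios.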

  22. Test Generator

  23. Model-based Test Generation

  24. Model-based Test Generation

  25. Coverage-Driven Verification testbench: Test Generator → Test → SUT → Response → Checker.

  26. Checker § Requirements as assertion monitors: if [precondition], check [postcondition], e.g. “If the robot decides the human is not ready, then the robot never releases an object”, or assert {! (robot_3D_space == human_3D_space)}. Implemented as automata § Continuous monitoring at runtime, self-checking: high-level requirements, and lower-level requirements depending on the simulation's detail (e.g., path planning, collision avoidance)
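A minimal Python sketch of such a monitor, for the handover requirement quoted above (class and signal names hypothetical):

```python
class ReleaseMonitor:
    """Two-state assertion automaton for: if the robot decides the
    human is not ready, the robot never releases the object."""

    def __init__(self):
        self.state = "ok"               # "ok" or "failed"

    def step(self, human_is_ready, gripper_released):
        """Called every simulation cycle; returns False on violation."""
        if not human_is_ready and gripper_released:
            self.state = "failed"       # postcondition violated
        return self.state == "ok"
```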

  27. Coverage-Driven Verification

  28. Coverage-Driven Verification testbench: Test Generator → Test → SUT → Response → Checker, plus a Coverage Collector.

  29. Coverage Collector § Coverage models: code coverage, from statement to MC/DC; functional coverage; requirements coverage; cross-product functional coverage, the Cartesian product of environment actions, sensor states and robot actions § Coverage analysis enables feedback to test generation
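A cross-product coverage model can be sketched directly (the bins and names below are hypothetical):

```python
import itertools

class CrossProductCoverage:
    """Cross-product functional coverage: one bin per element of the
    Cartesian product of environment actions, sensor states and
    robot actions; uncovered bins (holes) feed back into test
    generation."""

    def __init__(self, env_actions, sensor_states, robot_actions):
        self.bins = set(itertools.product(env_actions,
                                          sensor_states,
                                          robot_actions))
        self.hit = set()

    def sample(self, env, sensor, robot):
        """Record one observed (environment, sensor, robot) triple."""
        if (env, sensor, robot) in self.bins:
            self.hit.add((env, sensor, robot))

    def coverage(self):
        return len(self.hit) / len(self.bins)

    def holes(self):
        return sorted(self.bins - self.hit)   # uncovered bins
```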

  30. HRI Handover Scenario. Requirements: functional and safety (ISO 13482:2014, ISO 10218-1).

  31. Requirements based on ISO 13482 and ISO 10218

  32. Requirements based on ISO 13482 and ISO 10218

  33. Requirements based on ISO 13482 and ISO 10218

  34. HRI Handover Scenario. Coverage models: § code statement coverage (robot high-level control) § requirements in the form of assertions § cross-product functional coverage

  35. Situation Coverage

  36. Functional Coverage

  37. Coverage Results

  38. Code Coverage Results: constrained, model-based and pseudorandom strategies compared; a coverage hole is highlighted.

  39. Assertion Coverage Results § 100 pseudorandomly generated tests § 100 constrained pseudorandomly generated tests § 4 model-based tests

  40. Functional Coverage Results § 100 pseudorandomly generated tests § 160 model-based tests § 180 model-based constrained tests § 440 tests in total

  41. Coverage-Driven Verification. Coverage analysis enables feedback from the Coverage Collector to the Test Generator (testbench: Test Generator → Test → SUT → Response → Checker, plus Coverage Collector).

  42. Coverage-Driven Verification

  43. Stimulating the SUT: a Driver is added to the testbench (Test Generator → Test → Driver → SUT → Response → Checker, plus Coverage Collector).

  44. Stimulating the SUT: the Driver translates each Test into Stimulus for the SUT (Test Generator → Test → Driver → Stimulus → SUT → Response → Checker, plus Coverage Collector).

  45. Driver § Environmental components (models) interacting with the system's control software § Examples: humans, actuators (Gazebo), communication signals, sensors
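The driver's job can be sketched as expanding an abstract test into timed stimulus for the simulation (the rate and event names below are hypothetical):

```python
def drive(test_events, rate_hz=10.0):
    """Expand a high-level test (a list of environment events, e.g.
    actions of a simulated human in Gazebo) into a timed stimulus
    trace of (timestamp, event) pairs."""
    period = 1.0 / rate_hz
    return [(round(i * period, 3), event)
            for i, event in enumerate(test_events)]
```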
