

  1. SVV.lu – Software Verification & Validation. Testing of Cyber-Physical Systems: Diversity-driven Strategies. Lionel Briand. COW 57, London, UK

  2. Cyber-Physical Systems • A system of collaborating computational elements controlling physical entities

  3. Context • Projects on verification of cyber-physical systems, control systems, autonomous systems, … • Focus on safety, performance, resource usage, … • Automotive, satellite, energy, manufacturing, …

  4. Controllers • [Closed-loop block diagram: reference inputs enter the controller, which drives an actuator acting on the plant; a sensor feeds the plant output back to the controller; disturbances act on the plant]

  5. Decision-Making Components • [Block diagram: a decision component sits above the controller; the controller drives an actuator acting on the plant, and a sensor feeds measurements back]

  6. Development Process • Model-in-the-Loop stage: functional modeling (controllers, plant, decision); continuous and discrete Simulink models; model simulation and testing • Software-in-the-Loop stage: architecture modelling (structure, behavior, traceability); system engineering modeling (SysML); analysis (model execution and testing, model-based testing, traceability and change impact analysis, …); (partial) code generation • Hardware-in-the-Loop stage: deployed executables on the target platform; hardware (sensors, …); analog simulators; testing (expensive)

  7. Simulink Models – Simulation • Simulation models (plant model, software model, network model) are heterogeneous and capture continuous behavior • They are used for algorithm design, testing, and comparing design options

  8. Cruise Control: Plant • [Plant model diagram]

  9. Problem • How do we automatically verify and test CP functional models (e.g., controller, plant, decision) at MiL? • What types of requirements/properties do we check? • Commercial tools: cannot handle continuous operators or floating-point models (e.g., SLDV, Reactis); are based on model structural coverage, which yields low fault detection; can only handle linear systems and specific properties (Linear Analysis Toolbox)

  10. Challenges • Limited work on verification and testing of controllers and decision-making components in CP systems • The space of test input signals is extremely large • Model execution, especially when involving plant models, is extremely expensive

  11. More Challenges • Test oracles are not simple Boolean properties that are known in advance – they involve analyzing changes in values over time (e.g., signal patterns) and assessing levels of risk • A simulatable plant model of the physical environment is not always available, or not fully accurate and precise

  12. Diversity Strategies • Strategy: maximize the diversity of test cases • Assumption: “the more diverse the test cases, the higher their fault-revealing capacity” • Challenge: defining “diversity” in context, computing pair-wise similarity, and selecting tests accordingly (a selection sketch follows below) • Examples of early work: ISSRE 2003, Leon and Podgurski, filtering and prioritizing test cases, relying on code coverage; FSE 2010, Hemmati et al., similarity-based test selection applied to model-based testing based on state models for control systems, with GA-based selection • Full control over the test budget: maximize fault detection for a given test suite size
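
A minimal sketch of diversity-driven test selection under a fixed budget, assuming test cases are encoded as fixed-length numeric vectors and that plain Euclidean distance stands in for the pair-wise similarity measure; the greedy farthest-point strategy below is one common way to maximize suite diversity, not necessarily the algorithm used in the cited work.

```python
import numpy as np

def greedy_diverse_selection(test_vectors, budget):
    """Greedily pick `budget` test cases that maximize the minimum
    pairwise distance within the selected set (farthest-point heuristic)."""
    vectors = np.asarray(test_vectors, dtype=float)
    # Seed with the test case farthest from the centroid of all candidates.
    centroid = vectors.mean(axis=0)
    selected = [int(np.argmax(np.linalg.norm(vectors - centroid, axis=1)))]
    # Track each candidate's distance to its closest already-selected test.
    min_dist = np.linalg.norm(vectors - vectors[selected[0]], axis=1)
    while len(selected) < min(budget, len(vectors)):
        nxt = int(np.argmax(min_dist))  # most dissimilar remaining candidate
        selected.append(nxt)
        min_dist = np.minimum(min_dist, np.linalg.norm(vectors - vectors[nxt], axis=1))
    return selected

# Usage: keep 3 of 100 randomly generated 5-dimensional test inputs.
rng = np.random.default_rng(0)
print(greedy_diverse_selection(rng.random((100, 5)), budget=3))
```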

  13. Testing Controllers

  14. Controllers are Pervasive

  15. Simple Example • Supercharger bypass flap controller: flap position is bounded within [0..1]; implemented in MATLAB/Simulink; 34 (sub-)blocks decomposed into 6 abstraction levels • [Figures: supercharger with the bypass flap at position 0 (open) and at position 1 (closed)]

  16. MiL Test Cases • [Diagram: each test case feeds input signals (S1, S2, S3 over time) into the model simulation, which produces the output signal(s); Test Case 1 and Test Case 2 differ in their input signals]

  17. MiL Testing of Controllers • [Closed-loop diagram: the desired value and the actual value are compared to form an error that feeds the controller (the SUT); the controller drives the plant model, whose system output is fed back as the actual value] • Test input: the desired value steps from an initial desired value to a final desired value at T/2 over the horizon [0, T] • Test output: the actual value over time

  18. Requirements and Test Objectives • [Plot: the desired value (input) steps from the Initial Desired (ID) to the Final Desired (FD) value at T/2; the actual value (output) is observed over [0, T]] • Requirements: responsiveness, smoothness, and stability of the actual value with respect to the desired value (a quantification sketch follows below)
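
A minimal sketch of how such test objectives can be quantified from the simulated signals, assuming uniformly sampled desired/actual values and a single step at `step_time`; the 5% settling tolerance, the overshoot-based smoothness measure, and the tail-variance stability measure are illustrative assumptions, not the exact formulations used in the talk.

```python
import numpy as np

def objectives(t, desired, actual, step_time):
    """Quantify responsiveness, smoothness and stability for a step in the
    desired value at `step_time` (illustrative definitions)."""
    after = t >= step_time
    final_desired = desired[-1]
    err = np.abs(actual[after] - final_desired)

    # Responsiveness: time after the step until the error first drops below
    # 5% of the step size (np.inf if it never settles).
    tol = 0.05 * abs(final_desired - desired[0])
    settled = np.nonzero(err <= tol)[0]
    responsiveness = (t[after][settled[0]] - step_time) if settled.size else np.inf

    # Smoothness: maximum overshoot beyond the final desired value.
    smoothness = float(np.max(actual[after]) - final_desired)

    # Stability: residual oscillation in the last quarter of the run,
    # measured as the standard deviation of the actual value.
    tail = t >= t[0] + 0.75 * (t[-1] - t[0])
    stability = float(np.std(actual[tail]))

    return responsiveness, smoothness, stability
```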

  19. Test Generation Approach • We formalize the controller’s requirements in terms of the desired and actual outputs • We rely on the controller’s feedback loop to automate test oracles: an objective function (e.g., smoothness) computed over the desired value (setpoint) and the actual value (feedback) is checked against a threshold • [Diagram: control system with the controller and plant (environment); the desired and actual value signals feed the oracle]

  20. A Search-Based Test Approach • Search directed by feedback from model execution (see the sketch below) • Goal: finding worst-case inputs, e.g., the initial desired (ID) / final desired (FD) combinations that most stress a requirement • Possible because of the automated oracle (feedback loop) • Different worst cases for different requirements • Worst cases may or may not violate requirements
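
A minimal sketch of search-based worst-case generation, assuming the test input is just an (initial desired, final desired) pair and that the hypothetical `simulate_and_score` wraps a model simulation plus one of the objective functions above; the (1+1) search scheme and this input encoding are simplifying assumptions, not the talk's exact algorithm.

```python
import random

def one_plus_one_search(simulate_and_score, bounds, iterations=200, sigma=0.1):
    """(1+1) search: mutate the current (ID, FD) candidate and keep the
    mutant whenever it yields a worse-case (higher) objective value."""
    lo, hi = bounds
    best = [random.uniform(lo, hi), random.uniform(lo, hi)]   # [ID, FD]
    best_score = simulate_and_score(*best)
    for _ in range(iterations):
        cand = [min(hi, max(lo, x + random.gauss(0, sigma * (hi - lo))))
                for x in best]
        score = simulate_and_score(*cand)
        if score > best_score:                                # worse case found
            best, best_score = cand, score
    return best, best_score

# Usage with a stand-in objective (a real setup would run the Simulink model).
worst, score = one_plus_one_search(lambda id_, fd: abs(fd - id_), bounds=(0.0, 1.0))
```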

  21. Initial Solution • Inputs: objective functions based on the requirements and the controller-plant model • Step 1 – Exploration: a HeatMap diagram of the input space lets the domain expert identify a list of critical regions • Step 2 – Single-state search: a worst-case search within the critical regions produces worst-case scenarios • [Plot: desired vs. actual value over time for one worst-case scenario]

  22. Results • We found much worse scenarios during MiL testing than our partner had found so far • These scenarios are also run at the HiL level, where testing is much more expensive: MiL results drive test selection for HiL • But further research was needed: simulations are expensive, and configuration parameters must be accounted for

  23. Final Solution • Inputs: objective functions and the controller model (Simulink) • Step 1 – Exploration with dimensionality reduction: Elementary Effect Analysis identifies the significant variables, the 8-dimensional space is visualized with regression trees, and the domain expert selects a list of critical partitions • Step 2 – Search with surrogate modeling: machine learning predicts the fitness function to speed up the search, producing worst-case scenarios (a surrogate sketch follows below)
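
A minimal sketch of the surrogate-modeling idea, assuming scikit-learn's regression trees and a user-supplied `expensive_simulation` fitness function (a stand-in for running the Simulink model); the 20-sample warm-up and the 0.9 skip threshold are illustrative assumptions.

```python
from sklearn.tree import DecisionTreeRegressor

class SurrogateFitness:
    """Train a cheap regression-tree predictor on (input, fitness) pairs and
    use it to skip expensive simulations for unpromising candidates."""
    def __init__(self, expensive_simulation):
        self.simulate = expensive_simulation        # e.g., a wrapped model run
        self.X, self.y = [], []
        self.model = DecisionTreeRegressor(max_depth=5)

    def evaluate(self, x, best_so_far):
        if len(self.y) >= 20:                       # enough data to trust the tree
            predicted = self.model.predict([list(x)])[0]
            if predicted < 0.9 * best_so_far:       # unlikely to beat the best: skip
                return predicted
        fitness = self.simulate(x)                  # expensive model execution
        self.X.append(list(x))
        self.y.append(fitness)
        self.model.fit(self.X, self.y)              # refit the surrogate
        return fitness
```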

  24. Open-Loop Controllers • Mixed discrete-continuous behavior: Simulink Stateflows • No plant model: much quicker simulation • No feedback loop -> no automated oracle • The main testing cost is the manual analysis of output signals • Goal: minimize test suites • Challenge: test selection • Entirely different approach to testing • [Stateflow example “Engaging”: states OnMoving, OnSlipping, OnCompleted; transitions guarded by conditions such as [¬(vehspd = 0) ∧ time > 2], [time > 4], and [(vehspd = 0) ∧ time > 3]; actions increment time and set ctrlSig := f(time), g(time), or 1.0; the CtrlSig output switches between On and Off] (a behavioral sketch follows below)
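
A minimal behavioral sketch of such a mixed discrete-continuous component, assuming the transition structure recovered from the figure above; the guard placement, the step size, and the placeholder shapes chosen for f(time) and g(time) are illustrative assumptions rather than the controller's real logic.

```python
import math

# Placeholder shapes for the slide's f(time) and g(time) actions; the real
# functions are not given in the presentation.
f = lambda t: min(1.0, 0.25 * t)
g = lambda t: 0.5 + 0.1 * math.sin(t)

def engaging(vehspd_signal, dt=0.1):
    """Step a Stateflow-like machine (OnMoving -> OnSlipping -> OnCompleted)
    and record the discrete state and the continuous ctrlSig at each step."""
    state, time, trace = "OnMoving", 0.0, []
    for vehspd in vehspd_signal:
        if state == "OnMoving":
            ctrl = f(time)
            if vehspd != 0 and time > 2:                 # [¬(vehspd = 0) ∧ time > 2]
                state = "OnSlipping"
        elif state == "OnSlipping":
            ctrl = g(time)
            if (vehspd == 0 and time > 3) or time > 4:   # completion guards (assumed)
                state = "OnCompleted"
        else:                                            # OnCompleted
            ctrl = 1.0
        trace.append((round(time, 3), state, ctrl))
        time += dt
    return trace

# Usage: a vehicle that moves for 4 s and then stops, sampled at 10 Hz.
print(engaging([30.0] * 40 + [0.0] * 20)[-1])
```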

  25. Selection Strategies Based on Search • White-box structural coverage: state coverage, transition coverage • Input signal diversity • Output signal diversity • Failure-based selection criteria: domain-specific failure patterns, output stability, output continuity

  26. Test Generation Approach • We assume test oracles are manual • We rely on output signals to produce small test suites with high fault-revealing ability • [Diagram: input signals drive the controller and its plant (environment); the output signals are assessed for output diversity and for instability and discontinuity failure patterns]

  27. Output Diversity – Vector-Based • Each output signal is treated as a vector of values over time, and the dissimilarity between two output signals is their normalized Euclidean distance (see the sketch below) • [Plot: output signal 1 and output signal 2 over time]
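
A minimal sketch of the vector-based dissimilarity, assuming the two signals are sampled at the same time points and that normalization is done by the joint signal range and the number of samples; the exact normalization used in the work may differ.

```python
import numpy as np

def normalized_euclidean_distance(sig1, sig2):
    """Normalized Euclidean distance between two equally sampled output
    signals, scaled into [0, 1] by the signal range and number of samples."""
    a, b = np.asarray(sig1, float), np.asarray(sig2, float)
    span = max(a.max(), b.max()) - min(a.min(), b.min())
    if span == 0:
        return 0.0
    return float(np.linalg.norm(a - b) / (span * np.sqrt(len(a))))

# Example: a constant output vs. a ramp over the same time grid.
t = np.linspace(0.0, 2.0, 201)
print(normalized_euclidean_distance(np.zeros_like(t), t / 2.0))
```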

  28. Output Diversity – Feature-Based • A taxonomy of signal features over the value, the derivative, and the second derivative of an output signal: value features – instant-value (v), constant-value (n, v); derivative features – sign-derivative (s, n) with constant (n), increasing (n), and decreasing (n), plus extreme-derivatives with 1-sided continuity, 1-sided discontinuity, discontinuity, and discontinuity with strict local optimum; second-derivative features – increasing with strict local optimum • [Example signals A, B, C illustrating the features]

  29. Output Diversity – Feature-Based (cont.) • Same feature taxonomy as on the previous slide • Similarity: the extent to which any part of a signal matches a feature (a coarse sketch follows below)
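
A minimal sketch of feature-based comparison, assuming a deliberately coarse feature vector (fractions of increasing/decreasing/constant samples, the largest jump as a discontinuity proxy, and a count of strict local optima); the actual feature set and the per-feature similarity computation in the work are considerably richer.

```python
import numpy as np

def feature_vector(signal):
    """Map a sampled signal to a few coarse features: fractions of samples
    where it is increasing / decreasing / constant, the largest jump, and
    the number of strict local optima."""
    s = np.asarray(signal, float)
    d = np.diff(s)
    n = max(len(d), 1)
    increasing = np.sum(d > 0) / n
    decreasing = np.sum(d < 0) / n
    constant = np.sum(d == 0) / n
    largest_jump = float(np.max(np.abs(d))) if len(d) else 0.0
    local_optima = int(np.sum(np.sign(d[1:]) * np.sign(d[:-1]) < 0))
    return np.array([increasing, decreasing, constant, largest_jump, local_optima])

def feature_distance(sig1, sig2):
    """Feature-based dissimilarity: Euclidean distance between feature vectors."""
    return float(np.linalg.norm(feature_vector(sig1) - feature_vector(sig2)))
```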

  30. Failure-Based Test Generation • Search: maximizing the likelihood that specific failure patterns appear in the output signals • Domain-specific failure patterns elicited from engineers • [Plots of the CtrlSig output over time illustrating the two patterns: instability (rapid oscillation) and discontinuity (an abrupt jump)] (fitness sketches follow below)
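
Minimal sketches of fitness functions a search could maximize to surface these two patterns, assuming uniformly sampled output signals; the precise formulations in the underlying work may differ.

```python
import numpy as np

def instability_fitness(output):
    """Higher when the signal oscillates a lot: total variation, i.e. the
    sum of absolute sample-to-sample changes."""
    return float(np.sum(np.abs(np.diff(np.asarray(output, float)))))

def discontinuity_fitness(output, window=3):
    """Higher when the signal contains an abrupt jump: the largest change
    observed over a short window of consecutive samples."""
    s = np.asarray(output, float)
    if len(s) <= window:
        return 0.0
    return float(np.max(np.abs(s[window:] - s[:-window])))
```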

  31. Search • Whole test suite generation approach • Used when the objective functions characterize the test suite as a whole rather than individual test cases • Optimize the test objective for a given test suite size (the budget for manual oracles) • Maximize the minimum distance of each output signal vector from the other output signal vectors • Adaptation of simulated annealing (see the sketch below)
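
A minimal sketch of whole-test-suite generation with simulated annealing, assuming a pool of candidate test inputs, a `simulate` function that returns the corresponding output signal vector, and the minimum-pairwise-distance objective from the slide; the single-swap neighborhood and the geometric cooling schedule are illustrative choices, not the talk's exact adaptation.

```python
import math, random
import numpy as np

def suite_objective(outputs):
    """Minimum pairwise distance among the suite's output signal vectors."""
    return min(np.linalg.norm(a - b) for i, a in enumerate(outputs)
               for b in outputs[i + 1:])

def anneal_suite(candidates, simulate, size, iters=500, t0=1.0, cooling=0.99):
    """Simulated annealing over fixed-size test suites: swap one test case
    per move and accept worse suites with a temperature-dependent probability."""
    suite = random.sample(range(len(candidates)), size)
    outputs = {i: np.asarray(simulate(candidates[i]), float) for i in suite}
    score, temp = suite_objective([outputs[i] for i in suite]), t0
    for _ in range(iters):
        out_idx = random.choice(suite)
        in_idx = random.choice([i for i in range(len(candidates)) if i not in suite])
        new_suite = [in_idx if i == out_idx else i for i in suite]
        new_outputs = dict(outputs)
        new_outputs.pop(out_idx)
        new_outputs[in_idx] = np.asarray(simulate(candidates[in_idx]), float)
        new_score = suite_objective([new_outputs[i] for i in new_suite])
        if new_score > score or random.random() < math.exp((new_score - score) / temp):
            suite, outputs, score = new_suite, new_outputs, new_score
        temp *= cooling
    return [candidates[i] for i in suite], score
```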

  32. Fault-Revealing Ability • [Two plots comparing the faulty and correct model outputs: a test case may cover the fault and be likely to reveal it (the outputs clearly diverge), or cover the fault but be very unlikely to reveal it (the outputs nearly coincide)]

  33. Results • The test cases produced by state/transition coverage algorithms cover the faulty parts of the models • However, they fail to generate output signals that are sufficiently distinct from the expected ones, hence yielding a low fault-revealing rate • Diversity strategies significantly outperform coverage-based and random testing • Output-based algorithms, whether driven by diversity or by failure patterns, are the most effective
