  1. SVV.lu (Software Verification & Validation). Automated Testing of Autonomous Driving Assistance Systems. Lionel Briand, VVIoT, Sweden, 2018

  2. Collaborative Research @ SnT • Research in context • Addresses actual needs • Well-defined problem • Long-term collaborations • Our lab is the industry

  3. Software Verification and Validation @ SnT Centre • Group established in 2012 • Focus: automated, novel, cost-effective V&V solutions • ERC Advanced Grant • ~25 staff members • Industry and public partnerships

  4. Introduction

  5. Autonomous Systems • May be embodied in a device (e.g., a robot) or reside entirely in the cyber world (e.g., financial decisions) • Gaining, encoding, and appropriately using knowledge is a bottleneck for developing intelligent autonomous systems • Machine learning, e.g., deep learning, is often an essential component

  6. Motivations • Dangerous tasks • Tedious, repetitive tasks • Significant improvements in safety • Significant reduction in cost, energy, and resources • Significant optimization of benefits

  7. Autonomous CPS • Read sensors, i.e., collect data about their environment • Make predictions about their environment • Make (optimal) decisions about how to behave to achieve some objective(s), based on predictions • Send commands to actuators according to decisions • Often mission or safety critical

  8. A General and Fundamental Shift • Increasingly, it is easier to learn behavior from data using machine learning than to specify and code it • Deep learning, reinforcement learning … • Assumption: the data captures the desirable behavior in a comprehensive manner • Example: neural networks (deep learning) • Millions of weights learned • No explicit code, no specifications • Verification, testing?

  9. Many Domains • CPS (e.g., robotics) • Visual recognition • Finance, insurance • Speech recognition • Speech synthesis • Machine translation • Games • Learning to produce art

  10. Testing Implications • Test oracles? No explicit, expected test behavior • Test completeness? No source code, no specification

  11. CPS Development Process, spanning the Model-in-the-Loop (MiL), Software-in-the-Loop (SiL), and Hardware-in-the-Loop (HiL) stages: • System engineering modeling (SysML): architecture modelling (structure, behavior, traceability); analysis: model execution and testing, model-based testing, traceability and change impact analysis, … • Functional modeling of controllers, plant, and decision logic: continuous and discrete Simulink models; model simulation and testing; (partial) code generation • Deployed executables on the target platform: hardware (sensors, …), analog simulators, testing (expensive)

  12. MiL Components (block diagram): Plant, Sensor, Decision, Actuator, Controller

  13. Opportunities and Challenges • Early functional models (MiL) offer opportunities for early functional verification and testing • But a challenge for constraint solvers and model checkers: • Continuous mathematical models, e.g., differential equations • Discrete software models for code generation, but with complex operations • Library functions in binary code

  14. Automotive Environment • Highly varied environments, e.g., road topology, weather, buildings, and pedestrians … • Huge number of possible scenarios, e.g., determined by the trajectories of pedestrians and cars • ADAS play an increasingly critical role • A challenge for testing

  15. Testing Advanced Driver Assistance Systems

  16. Objective • Testing ADAS • Identify and characterize the most critical/risky scenarios • Test oracle: safety properties • Need a scalable test strategy due to the large input space
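
With safety properties as the oracle, pass/fail can be decided by a check over the simulation trace rather than by expected output values. A minimal Python sketch of such an oracle, under my own assumptions about the trace format and the thresholds:

    # Minimal sketch of a safety-property oracle over a simulation trace.
    # The trace format and the thresholds are illustrative assumptions,
    # not the actual tool's interface.
    from math import hypot

    def violates_safety(trace, min_distance=1.0, max_impact_speed=0.0):
        """Flag a scenario as critical if the car gets closer to the pedestrian
        than min_distance while moving faster than max_impact_speed (m/s)."""
        for step in trace:  # each step: positions (m) and car speed (m/s)
            d = hypot(step["car_x"] - step["ped_x"], step["car_y"] - step["ped_y"])
            if d < min_distance and step["car_speed"] > max_impact_speed:
                return True
        return False

    # Example: a two-step trace in which the car reaches the pedestrian at 8 m/s.
    trace = [
        {"car_x": 0.0, "car_y": 0.0, "ped_x": 10.0, "ped_y": 0.5, "car_speed": 12.0},
        {"car_x": 9.8, "car_y": 0.0, "ped_x": 10.0, "ped_y": 0.5, "car_speed": 8.0},
    ]
    print(violates_safety(trace))  # True -> the scenario is critical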

  17. Automated Emergency Braking System (AEB): a vision sensor (camera) provides objects' positions and speeds to the decision-making component, which sends a "brake-request" to the brake controller when braking is needed to avoid collisions.
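
The diagram's behaviour can be caricatured as a single decision rule: request braking when a pedestrian is detected with enough certainty and the predicted time-to-collision is small. The thresholds and inputs below are invented for illustration, not taken from the actual AEB:

    # Simplified, illustrative AEB decision rule: issue a "brake-request" when a
    # pedestrian is detected with sufficient certainty and the predicted
    # time-to-collision (TTC) falls below a braking threshold.
    # Thresholds and inputs are assumptions of this sketch.
    def brake_request(detection_certainty: float, ttc: float,
                      certainty_threshold: float = 0.7,
                      ttc_threshold: float = 1.5) -> bool:
        return detection_certainty >= certainty_threshold and ttc <= ttc_threshold

    print(brake_request(detection_certainty=0.9, ttc=1.2))  # True  -> send brake-request
    print(brake_request(detection_certainty=0.4, ttc=1.2))  # False -> detection too uncertain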

  18. Example Critical Situation • "AEB properly detects a pedestrian in front of the car with a high degree of certainty and applies braking, but an accident still happens where the car hits the pedestrian with a relatively high speed"

  19. Testing via Physics-based Simulation

  20. Simulation: the simulator takes as inputs the initial state of the physical plant (ego vehicle) and of the mobile environment objects (pedestrians, other vehicles), plus the static environment aspects (road, traffic signs, weather). The SUT reads the sensors/cameras and drives the actuators, with a feedback loop through the dynamic environment models. Outputs are time-stamped vectors for the SUT outputs and for the states of the physical plant and the mobile environment objects.
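
One way to picture the simulator interface described on this slide: a scenario (static environment plus initial dynamic state) goes in, a time-stamped trace of SUT outputs and plant/environment states comes out. The field names below are assumptions of this sketch, not the real simulator's schema:

    # Illustrative shape of the simulator interface: scenario in, trace out.
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class TimeStep:
        t: float                      # timestamp (s)
        car_state: Dict[str, float]   # e.g., position and speed of the ego vehicle (plant)
        ped_state: Dict[str, float]   # e.g., position and speed of the pedestrian
        sut_output: Dict[str, float]  # e.g., TTC, detection certainty, brake request

    @dataclass
    class SimulationTrace:
        steps: List[TimeStep] = field(default_factory=list)

    def run_simulation(scenario: Dict[str, float]) -> SimulationTrace:
        # Placeholder: in practice this drives the physics-based simulator and the
        # system under test (SUT), closing the sensor -> SUT -> actuator -> plant
        # feedback loop at every time step.
        raise NotImplementedError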

  21. Our Goal • Develop an automated testing technique for ADAS • Help engineers efficiently and effectively explore the complex test input space of ADAS • Identify critical (failure-revealing) test scenarios • Characterize the input conditions that lead to the most critical situations

  22. ADAS Testing Challenges • The test input space is large, complex, and multidimensional • Explaining failures and localizing faults is difficult • Executing physics-based simulation models is computationally expensive

  23. Our Approach • Effectively combine evolutionary computing algorithms and decision tree classification models • Evolutionary computing is used to search the input space for safety violations • We use decision trees to guide the search-based generation of tests more quickly towards the most critical regions and to characterize failures • In turn, we use search algorithms to refine the classification models so that they better characterize the critical regions of the ADAS input space
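
The combination described above can be pictured as an alternation between searching and learning. The sketch below shows only that alternation; every function body is a placeholder (random sampling, random labels, a fixed sub-region) standing in for the real multi-objective genetic algorithm, the physics-based simulation with its safety oracle, and the learned decision tree:

    import random

    def simulate_and_label(scenario):
        # Placeholder for running the physics-based simulation and applying the
        # safety oracle; a random label stands in for the real outcome here.
        return scenario, random.random() < 0.2        # (scenario, is_critical)

    def run_search(region, budget):
        # Placeholder for the evolutionary search restricted to `region`; here we
        # just sample `budget` random points inside a 1-D region's bounds.
        lo, hi = region
        return [simulate_and_label(random.uniform(lo, hi)) for _ in range(budget)]

    def learn_tree_and_regions(labelled_scenarios):
        # Placeholder for fitting a decision tree on the labelled scenarios and
        # reading off the regions whose leaves are mostly critical; a fixed
        # sub-region stands in for that step here.
        return [(0.0, 0.5)]

    def test_adas(iterations=3, budget=60):
        archive, regions = [], [(0.0, 1.0)]            # start from the whole input space
        for _ in range(iterations):
            for region in regions:
                archive += run_search(region, budget // len(regions))
            regions = learn_tree_and_regions(archive)  # focus the next iteration
        return archive

    critical = [s for s, is_critical in test_adas() if is_critical]
    print(len(critical), "critical scenarios found")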

  24. AEB Domain Model (class diagram). A TestScenario combines static inputs: Weather (snow type: ModerateSnow, HeavySnow, VeryHeavySnow, ExtremeSnow; rain type: ModerateRain, HeavyRain, VeryHeavyRain, ExtremeRain; fog and fog color: DimGray, Gray, DarkGray, Silver, LightGray, None; visibility range: 10 to 300, with the OCL constraint that no fog implies visibility = 300 and fog color = None) and Road (Straight, Curved with radius CR in 5-40, or Ramped with height RH in 4-12 m; friction coefficient), plus simulation time and time step; and dynamic inputs for the mobile objects: the ego Vehicle (initial speed v0) and the Pedestrian (initial position x0, y0, orientation θ0, speed v0). The AEB outputs are TTC, certainty of pedestrian detection, and the braking decision (plus output functions F1, F2).
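
Read as data types, the domain model amounts to enumerations for the static environment and numeric attributes for the mobile objects and outputs. A minimal Python sketch; names follow the diagram where it is legible, and everything else (defaults, units) is an illustrative assumption:

    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class RoadType(Enum):
        STRAIGHT = "Straight"
        CURVED = "Curved"    # parameterized by curve radius CR (5-40)
        RAMPED = "Ramped"    # parameterized by ramp height RH (4-12 m)

    @dataclass
    class Weather:
        snow_type: Optional[str] = None   # ModerateSnow .. ExtremeSnow
        rain_type: Optional[str] = None   # ModerateRain .. ExtremeRain
        fog: bool = False
        fog_color: Optional[str] = None   # DimGray, Gray, DarkGray, Silver, LightGray
        visibility: int = 300             # visibility range, 10..300
        # diagram's OCL constraint: not fog implies visibility = 300 and fog_color is None

    @dataclass
    class Road:
        road_type: RoadType
        curve_radius: Optional[float] = None
        ramp_height: Optional[float] = None
        friction_coeff: float = 1.0

    @dataclass
    class TestScenario:                   # static + dynamic inputs of one test
        weather: Weather
        road: Road
        car_speed0: float                 # ego vehicle initial speed
        ped_x0: float                     # pedestrian initial position, orientation, speed
        ped_y0: float
        ped_theta0: float
        ped_speed0: float
        simulation_time: float = 10.0
        time_step: float = 0.1

    @dataclass
    class AEBOutput:                      # outputs named in the diagram
        ttc: float
        certainty_of_detection: float
        braking: bool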

  25. Search-Based Software Testing • Express the test generation problem as a search problem: search for test input data with certain properties, i.e., the portion of the input domain denoting the required test data • Non-linearity of software (if statements, loops, …) leads to complex, discontinuous, non-linear search spaces, where random search may fail to hit the low-probability portion of the input domain denoting the required test data (Baresel) • Many search algorithms (metaheuristics), from local to global search, e.g., Hill Climbing, Simulated Annealing, and Genetic Algorithms; Genetic Algorithms are global searches, sampling many points of the input domain ("Search-Based Software Testing: Past, Present and Future", Phil McMinn)
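
To make "test generation as a search problem" concrete, here is a tiny single-parameter example of my own: the fitness is a stubbed measure of how dangerous a scenario is (standing in for a value computed from the simulation), and a simple hill-climbing search minimizes it:

    # Tiny illustration of search-based test generation: the test input is one
    # scenario parameter (the pedestrian's speed, km/h), the fitness is a stub
    # standing in for "run the simulation and measure how close to an accident
    # the scenario gets", and hill climbing minimizes that fitness.
    import random

    def fitness(ped_speed):
        # Stub fitness: pretend speeds near 7.2 km/h are the most dangerous.
        return abs(ped_speed - 7.2) + 0.5

    def hill_climb(lo=0.0, hi=20.0, steps=200, step_size=0.5):
        best = random.uniform(lo, hi)
        best_fit = fitness(best)
        for _ in range(steps):
            candidate = min(hi, max(lo, best + random.uniform(-step_size, step_size)))
            cand_fit = fitness(candidate)
            if cand_fit < best_fit:        # lower fitness = more critical scenario
                best, best_fit = candidate, cand_fit
        return best, best_fit

    print(hill_climb())  # converges towards the (stubbed) most critical pedestrian speed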

  26. Multiple Objectives: Pareto Front • Individual A Pareto-dominates individual B if A is at least as good as B in every objective and better than B in at least one objective • A multi-objective optimization algorithm (e.g., NSGA-II) must guide the search towards the global Pareto-optimal front and maintain solution diversity in that front (figure: Pareto front and dominated solutions in the F1-F2 objective space)
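
Stated as code, the dominance relation on this slide and the resulting non-dominated (Pareto) front look like this for minimization objectives; the sample points are made up:

    # Pareto dominance for minimization objectives, as defined on the slide:
    # A dominates B if A is at least as good as B in every objective and
    # strictly better in at least one.
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(solutions):
        # Keep the solutions that no other solution dominates.
        return [s for s in solutions
                if not any(dominates(t, s) for t in solutions if t is not s)]

    points = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
    print(pareto_front(points))  # (3.0, 3.0) is dominated by (2.0, 2.0) and is dropped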

  27. Decision Trees: partition the input space into homogeneous regions. Example tree over 1200 simulated scenarios (79% non-critical, 21% critical overall): the first split, on road topology (straight, curved with radius CR, ramped with ramp height RH), separates a 636-scenario region that is 98% non-critical from a 564-scenario region that is 41% critical; the latter is split further on the pedestrian's initial orientation (threshold 218.6°) and then on the pedestrian's initial speed (threshold 7.2 km/h), reaching a leaf of 230 scenarios that is 69% critical.
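
For illustration, a tree like the one above can be learned with an off-the-shelf classifier once each simulated scenario is labelled critical or non-critical by the safety oracle. The data below is randomly generated with a made-up labelling rule, so the learned splits are illustrative only and will not match the slide:

    # Sketch of learning a decision tree that partitions scenario inputs into
    # (mostly) critical vs. non-critical regions, using scikit-learn.
    import random
    from sklearn.tree import DecisionTreeClassifier, export_text

    random.seed(0)
    X, y = [], []
    for _ in range(1200):
        curve_radius = random.choice([5, 10, 20, 30, 40])  # road curve radius CR (m)
        ped_theta0 = random.uniform(0, 360)                # pedestrian orientation (deg)
        ped_speed0 = random.uniform(0, 18)                 # pedestrian speed (km/h)
        # Made-up labelling rule standing in for the safety oracle on simulation runs:
        critical = curve_radius >= 10 and ped_theta0 < 218.6 and ped_speed0 < 7.2
        X.append([curve_radius, ped_theta0, ped_speed0])
        y.append(int(critical))

    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X, y)
    print(export_text(tree, feature_names=["curve_radius", "ped_theta0", "ped_speed0"]))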
