Safety Assurance in Cyber-Physical Systems Built with Learning-Enabled Components (LECs)


  1. Safety Assurance in Cyber-Physical Systems Built with Learning-Enabled Components (LECs). December 12, 2018. Taylor T. Johnson, taylor.johnson@vanderbilt.edu. VeriVITAL - The Verification and Validation for Intelligent and Trustworthy Autonomy Laboratory (http://www.VeriVITAL.com). Institute for Software Integrated Systems (ISIS), Electrical Engineering and Computer Science (EECS).

  2. Cyber-Physical Systems (CPS). [Diagram: cyber components (software, controller) coupled through interfaces, sensors, actuators, and communication networks to the physical plant, environment, and humans, …] All of these examples are safety-critical CPS! Can we bet our lives on autonomous CPS designs?

  3. [Image slide]

  4. [Image slide]

  5. Motivation: Perdix, Autonomous Distributed CPS https://www.youtube.com/watch?v=bsKbGc9TUHc

  6. Motivation: Chinese Boat Swarm, Autonomous Distributed CPS

  7. (Formal) Verification and Validation (V&V) Challenge. Given a system model 𝓑 and a property 𝑄, design an algorithm that decides 𝓑 ⊨ 𝑄 and returns either "Yes" with a proof that 𝓑 satisfies 𝑄, or "No" with a bug explaining why 𝓑 violates 𝑄. An Engineering Grand Challenge:
  • Debugging & verification: ~50%-75% of engineering cost [Beizer 1990]
  • Expensive & life-threatening bugs: ~$60 billion/year [NIST 2002]
  • Fundamentally and foundationally computationally hard: state-space explosion ("curse of dimensionality") & undecidability
  • Roughly: V&V gets exponentially harder in the size of the system
  Here 𝓑 is networked software interacting with the physical world: cyber-physical systems (CPS). 𝑄 ranges over: Safety: something bad never happens. Stability: reach a good state eventually and stay there. Assurance: safety, stability, liveness, mission specs, and other functional & non-functional specs (security, performance, …).
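
To make the property classes concrete, here is a minimal sketch of monitoring a safety property ("something bad never happens") along one simulated trace; the toy dynamics and the bad set x >= 5 are invented for illustration:

    # Minimal sketch: monitoring a safety property over a simulated trace.
    # The dynamics and the bad set are hypothetical, for illustration only.
    def simulate(x0, steps=100, dt=0.1):
        """Euler simulation of a toy stable system x' = -0.5*x + 1."""
        xs = [x0]
        for _ in range(steps):
            xs.append(xs[-1] + dt * (-0.5 * xs[-1] + 1.0))
        return xs

    def safe(trace, bad=lambda x: x >= 5.0):
        """Safety: the bad predicate never holds along the trace."""
        return not any(bad(x) for x in trace)

    print(safe(simulate(0.0)))  # True: the trace converges toward x = 2

Of course, checking finitely many traces is testing, not verification; the reachability methods later in the deck aim to cover all behaviors of 𝓑.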

  8. Challenges for Assurance of LECs
  • Non-transparency: LECs encode information in a complex manner, and it is hard for humans to reason about the encoding. Non-transparency is an obstacle to safety assurance because it is more difficult to develop confidence that the model is operating as intended.
  • Error rate: LECs typically exhibit some nonzero error rate; the true error rate is unknown, and only estimates from statistical processes are known.
  • Training based: the training dataset is necessarily incomplete and may under-represent safety-critical cases.
  • Unpredictable behavior: training is based on non-convex optimization algorithms and may converge to local minima; changing the training dataset may change behaviors.
  • LECs can exhibit unique hazards: adversarial examples (incorrect outputs for given inputs that cannot be discovered at design time) have spawned the whole field of adversarial machine learning, and it may always be possible to find adversarial examples.
  • Perception of the environment is a functionality that is difficult to specify (typically based on examples).
  [https://www.labsix.org]
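
To illustrate the adversarial-example hazard on the smallest possible model: for a linear classifier the gradient of the score is just the weight vector, so an FGSM-style step against the sign of the weights flips the prediction with a small input change. A toy sketch (weights, input, and epsilon all invented for illustration):

    import numpy as np

    # Toy FGSM-style adversarial perturbation on a fixed linear classifier.
    w = np.array([1.0, -2.0, 0.5])
    b = 0.1

    def score(x):
        return float(w @ x + b)  # > 0 means class 1, <= 0 means class 0

    x = np.array([0.5, 0.1, 0.2])   # score(x) = 0.5, classified as class 1
    eps = 0.3
    x_adv = x - eps * np.sign(w)    # step against the class-1 direction
    print(score(x), score(x_adv))   # 0.5 vs -0.55: small change flips the class

For deep networks the same idea uses the sign of the loss gradient rather than the weights themselves, and the perturbation is often imperceptible to humans.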

  9. Are autonomous cars today safer than human drivers?
  • Standard metric: fatalities per mile driven.
  • Humans in the US drive >3 trillion miles (~1/2 a light year!) annually (2016); globally, over a light year. https://www.afdc.energy.gov/data/10315
  • Around 37,000 fatalities (2016). http://www.iihs.org/iihs/topics/t/general-statistics/fatalityfacts/state-by-state-overview
  • Dividing: approximately 1 fatality per 85 million miles driven by humans. https://www.rand.org/blog/articles/2017/11/why-waiting-for-perfect-autonomous-vehicles-may-cost-lives.html
  • Autonomous vehicles, in total across all manufacturers, have driven on the order of ~10 million miles, generally under ideal conditions (good weather, etc.). https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/disengagement_report_2017 https://www.wired.com/story/self-driving-cars-disengagement-reports/ https://medium.com/waymo/waymo-reaches-5-million-self-driven-miles-61fba590fafe
  • Autonomous vehicles: at least one fatality (and probably ~5-10). Dividing: approximately 1 fatality per 1 to 10 million miles driven.
  • Humans today are 1-2 orders of magnitude safer than current autonomous vehicles.
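
A quick back-of-the-envelope check of these rates, using only the figures quoted on the slide:

    # Human driving (US, 2016 figures from the slide).
    human_miles = 3e12          # > 3 trillion miles annually
    human_fatalities = 37_000
    print(human_miles / human_fatalities / 1e6)  # ~81 million miles per fatality

    # Autonomous vehicles: ~10 million total miles, 1 to ~10 fatalities.
    av_miles = 10e6
    for fatalities in (1, 5, 10):
        print(fatalities, av_miles / fatalities / 1e6)  # ~10 down to ~1 million miles per fatality

The ratio of ~81 million to 1-10 million miles per fatality is the 1-2 orders of magnitude cited on the slide.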

  10. Closed-Loop CPS with LECs: Verification Flow and Tools (nnv + nnmt, HyST). [Diagram: closed-loop CPS with LECs among the cyber components/software/controller(s), coupled through interfaces, sensors, actuators, and communication networks to the physical plant, environment, and humans, …]
  • Plant models: hybrid automata, or networks thereof, represented in HyST/SpaceEx/CIF formats.
  • LEC and cyber models: for now, neural networks, represented in ONNX format.
  • Specifications: primarily safety properties for now, some reachability properties.
  • Verification: composed LEC and plant analysis.

  11. Plant Modeling & Verification. HyST: Hybrid Source Transformation and Translation Software Tool. [Diagram: a SpaceEx XML hybrid automaton network (𝓑, 𝑄) undergoes plant model translation and transformation by HyST into tool-specific variants (𝓑_H, 𝓑_SF, 𝓑_O, 𝓑_S, 𝓑_D, 𝓑_F) for dReach, HyComp, SLSF, SpaceEx, Flow*, new algorithms, and other tools, which model check 𝓑 ⊨ 𝑄 and return reachable states, traces, and verdicts.]
  • [Bak, Bogomolov, and Johnson, HSCC 2015]
  • http://verivital.com/hyst/
  • https://github.com/verivital/hyst
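
For readers unfamiliar with the plant formalism: a hybrid automaton combines discrete modes, continuous flows, and guarded mode switches. A minimal simulation sketch of a hypothetical two-mode thermostat (purely illustrative; this is not HyST's representation or API):

    # Minimal hybrid-automaton sketch: a hypothetical thermostat.
    modes = {
        "heating": {"flow": lambda T: 2.0,  "guard": lambda T: T >= 22.0, "jump_to": "cooling"},
        "cooling": {"flow": lambda T: -1.0, "guard": lambda T: T <= 18.0, "jump_to": "heating"},
    }

    def simulate(mode, T, steps=100, dt=0.1):
        trace = []
        for _ in range(steps):
            if modes[mode]["guard"](T):           # discrete jump when the guard holds
                mode = modes[mode]["jump_to"]
            T += dt * modes[mode]["flow"](T)      # continuous evolution in the current mode
            trace.append((mode, round(T, 2)))
        return trace

    print(simulate("heating", 20.0)[:5])

HyST's job is translating such models (in SpaceEx XML and related formats) between the input languages of the different verification tools.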

  12. LEC Verification. nnv: Neural Network Verifier Software Tool.
  • Preliminary software tool now available: a Matlab toolbox for verification of neural networks. Available at: https://github.com/verivital/nnv
  • Additionally, our NNMT tool provides translators for common neural network formats, as well as to several custom input formats required by other LEC verification tools (e.g., ReLUplex, Sherlock, …). Available at: https://github.com/verivital/nnmt
  • Current support: feedforward neural networks with ReLUs, tanh, and other monotonic activation functions; closed-loop systems with LECs.
  • Method: reachability analysis-based verification.
  • Dependencies: Matlab MPT toolbox (https://www.mpt3.org/)
  Example: if the reachable set reaches the unsafe region (z₁ ≥ 5), the FFNN is unsafe.
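
The unsafe-region check in this example amounts to testing whether the computed reachable set intersects the half-space z₁ ≥ 5. A minimal sketch using box over-approximations (nnv itself manipulates polyhedra via the MPT toolbox; boxes and the numbers below are our simplification):

    # Sketch: checking a box over-approximation of the output set against z1 >= 5.
    def may_be_unsafe(boxes, threshold=5.0, dim=0):
        """boxes: list of (lower, upper) bound vectors, one pair per box."""
        return any(ub[dim] >= threshold for _, ub in boxes)

    reach = [([0.0, -1.0], [3.5, 1.0]),   # hypothetical reach-set boxes
             ([2.0, 0.0], [5.2, 2.0])]
    print(may_be_unsafe(reach))           # True: the second box crosses z1 = 5

Because the boxes over-approximate the true output set, an empty intersection proves safety, while a non-empty one only indicates the network may be unsafe and calls for refinement.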

  13. LEC Verification: Reachability Analysis of Feedforward Neural Networks. Given a feedforward neural network F and an input set 𝒴, the output reachable set of the neural network F is defined as 𝒵 = { z | z = F(y₀), y₀ ∈ 𝒴 }. [Figure: input set, output set, and property P; layer-by-layer propagation of polytopes.]
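
As a baseline for intuition (not one of the verification methods here): sampling the input set and pushing the samples through the network under-approximates the output reachable set, so it can find violations but never prove safety. A sketch with an invented random network:

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical 2-layer ReLU network; weights invented for illustration.
    W1, b1 = rng.normal(size=(5, 3)), rng.normal(size=5)
    W2, b2 = rng.normal(size=(2, 5)), rng.normal(size=2)

    def F(x):
        return W2 @ np.maximum(0.0, W1 @ x + b1) + b2

    # Sample the input set Y = [0,1]^3; the image points under-approximate Z.
    samples = rng.uniform(0.0, 1.0, size=(1000, 3))
    outputs = np.array([F(x) for x in samples])
    print(outputs.min(axis=0), outputs.max(axis=0))  # empirical output bounds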

  14. Reachable Set Computation. [Figure: input set propagated through neural network system A to an output set, checked against property P.] Verification problem: will neural network system A satisfy or violate P?

  15. ReLU (Rectified Linear Unit) Neural Network.
  For a single neuron: y_j = f(x) = max(0, Σ_{i=1}^{n} ω_i x_i + θ)
  For a single layer: y = f(x) = max(0, Wx + θ)
  Input set: union of polytopes.
  Theorem: For ReLU neural networks, if the input set is a union of polytopes, then the output sets of each layer are unions of polytopes. We can compute the reachable set layer-by-layer.
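
The two formulas transcribe directly into code; a tiny numpy sketch with invented weights ω, θ, and W:

    import numpy as np

    omega, theta = np.array([0.5, -1.0, 2.0]), 0.1
    x = np.array([1.0, 2.0, 0.5])

    # Single neuron: y = max(0, sum_i omega_i * x_i + theta)
    y_neuron = max(0.0, float(omega @ x + theta))

    # Single layer: y = max(0, W x + theta_vec), applied elementwise
    W = np.array([[0.5, -1.0, 2.0], [1.0, 0.0, -0.5]])
    theta_vec = np.array([0.1, -0.2])
    y_layer = np.maximum(0.0, W @ x + theta_vec)
    print(y_neuron, y_layer)

Because max(0, ·) is piecewise linear, an affine layer followed by ReLU maps each polytope to a union of polytopes (one piece per activation pattern), which is exactly what the theorem exploits.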

  16. Illustrative Example. Network: 3 inputs, 2 outputs, 7 hidden layers of 7 neurons each. Output reachable set: a union of 1250 polytopes, shown against 8000 randomly generated outputs.

  17. LEC Verification: Specification-Guided Verification for Neural Networks. Output set computation. [Figure: input set propagated through the neural network to an output set.]
  Interval-Based Computation Procedure:
  • Partition the input space into sub-intervals.
  • Compute the output range for each sub-interval of the input.
  • The union of the output intervals over-approximates the output set.
  Key question: how to partition the input space?
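
The interval-based procedure is straightforward to sketch: an affine layer's exact interval image splits W into its positive and negative parts, and monotonic activations (ReLU, tanh) only need to be applied to the interval endpoints. A minimal sketch, assuming a hypothetical two-layer tanh network:

    import numpy as np

    def affine_bounds(W, b, lo, hi):
        """Exact interval image of an affine map on a box."""
        Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
        return Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b

    def net_bounds(layers, lo, hi, act=np.tanh):
        """Propagate a box through affine layers + a monotonic activation."""
        for i, (W, b) in enumerate(layers):
            lo, hi = affine_bounds(W, b, lo, hi)
            if i < len(layers) - 1:        # no activation on the output layer
                lo, hi = act(lo), act(hi)  # monotonic: endpoints suffice
        return lo, hi

    rng = np.random.default_rng(1)
    layers = [(rng.normal(size=(10, 2)), rng.normal(size=10)),
              (rng.normal(size=(1, 10)), rng.normal(size=1))]
    print(net_bounds(layers, np.array([-0.1, -0.1]), np.array([0.1, 0.1])))

Partitioning the input box and taking the union of the resulting output intervals tightens this over-approximation, which is what the next slide's comparison is about.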

  18. LEC Verification: Specification-Guided Verification for Neural Networks. Output set computation.
  Uniform partition:
  • Tight over-approximation (length of each sub-interval is small).
  • Computationally expensive (huge number of sub-intervals).
  • Independent of the specification.
  Specification-guided partition:
  • Coarse and fine partitions coexist.
  • Computationally efficient (avoids unnecessary computation).
  • Non-uniform, guided by the specification.
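
One way to realize specification-guided partitioning, sketched under our own simplifications (1-D output, box intervals; this is not the cited tool's implementation): recursively bisect only the sub-intervals whose output interval straddles the specification boundary z < 5, and stop refining sub-intervals already proven safe or unsafe:

    import numpy as np

    rng = np.random.default_rng(2)
    # Hypothetical 1-output tanh network; weights invented for illustration.
    W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
    w2, b2 = rng.normal(size=8), 0.0

    def out_bounds(lo, hi):
        """Interval bounds of the network's output on the box [lo, hi]."""
        Wp, Wn = np.maximum(W1, 0), np.minimum(W1, 0)
        h_lo = np.tanh(Wp @ lo + Wn @ hi + b1)
        h_hi = np.tanh(Wp @ hi + Wn @ lo + b1)
        wp, wn = np.maximum(w2, 0), np.minimum(w2, 0)
        return wp @ h_lo + wn @ h_hi + b2, wp @ h_hi + wn @ h_lo + b2

    def verify(lo, hi, unsafe=5.0, depth=12):
        """Bisect only boxes whose output interval straddles the boundary."""
        zlo, zhi = out_bounds(lo, hi)
        if zhi < unsafe:  return "safe"     # whole box provably satisfies z < 5
        if zlo >= unsafe: return "unsafe"   # whole box provably violates it
        if depth == 0:    return "unknown"
        d = int(np.argmax(hi - lo))         # split the widest input dimension
        mid = lo.copy(); mid[d] = (lo[d] + hi[d]) / 2
        hi2 = hi.copy(); hi2[d] = mid[d]
        r1, r2 = verify(lo, hi2, unsafe, depth - 1), verify(mid, hi, unsafe, depth - 1)
        return "safe" if r1 == r2 == "safe" else ("unsafe" if "unsafe" in (r1, r2) else "unknown")

    print(verify(np.array([-1.0, -1.0]), np.array([1.0, 1.0])))

Regions far from the boundary are settled by one coarse interval, while refinement effort concentrates where the specification is actually at stake: coarse and fine partitions coexist.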

  19. LEC Verification: Specification-Guided Verification for Neural Networks.
  Random neural network (5 layers, 10 neurons per layer, tanh activation):
  Method      | Intervals | Verification Time (s)
  Uniform     | 111556    | 294.37
  Spec-Guided | 4095      | 21.45
  Robotic Arm Example:
  • Uniform partition: 729 partitions, computation ~30 s.
  • Specification-guided partition: 15 partitions, computation ~0.27 s.
  Key results: 1-2 orders of magnitude less runtime and 1-2 orders of magnitude less memory.

  20. LEC Model Formatting and Translation for Benchmarking. [Diagram: models from Matlab, Tensorflow, PyTorch, Keras, and Caffe2 pass through an ONNX converter (.onnx) into the NNMT LEC translation & transformation tool, which emits formats for Sherlock (.txt), Reluplex (.nnet), nnv (.mat / onnx), and others.] https://github.com/verivital/nnmt
  • Standard LEC representations (ONNX) & integration with standard learning frameworks.
  • Challenges: specification & assumption modeling, analysis parameters.
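
As an example of the input side of this pipeline: a trained PyTorch model can be serialized to ONNX with PyTorch's built-in exporter. The architecture and file name below are invented for illustration:

    import torch
    import torch.nn as nn

    # Hypothetical small controller network.
    model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))

    # Export to ONNX so downstream translators (e.g., NNMT) can consume it.
    dummy_input = torch.randn(1, 4)
    torch.onnx.export(model, dummy_input, "controller.onnx")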

  21. Bounded Model Checking with LECs in the Loop. Alternate iterations of reachability analysis for:
  • nnv: machine learning-based / neural network controllers.
  • HyST: plant (and environment, noise, etc.) models.
  Iterate from time 0 to k-1 to compute the reachable set for the closed-loop system.
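
The alternation can be sketched as an interval iteration: bound the controller output on the current state box, push the box through a discretized plant model, and repeat for k steps while checking the safety property. Everything below (plant, controller, and the property x < 5) is invented for illustration; the real flow composes nnv's reach sets with HyST-translated hybrid automata:

    import numpy as np

    def control_bounds(lo, hi):
        """Interval bounds for a toy controller u = clip(-0.8*x, -1, 1)."""
        cand = np.array([-0.8 * lo, -0.8 * hi])
        return np.clip(cand.min(), -1, 1), np.clip(cand.max(), -1, 1)

    def plant_step(lo, hi, ulo, uhi, dt=0.1):
        """Interval Euler step of a toy plant x' = 0.2*x + u."""
        return lo + dt * (0.2 * lo + ulo), hi + dt * (0.2 * hi + uhi)

    lo, hi = 0.5, 1.0                  # initial state box
    for k in range(20):                # bounded model checking to depth k
        ulo, uhi = control_bounds(lo, hi)
        lo, hi = plant_step(lo, hi, ulo, uhi)
        assert hi < 5.0, f"possible violation of x < 5 at step {k}"
    print(lo, hi)                      # final over-approximating interval

If the assertion ever fires, the over-approximation touches the unsafe set and the depth-k check is inconclusive or reveals a bug; if the loop completes, the closed-loop system is safe up to the time bound.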
