Dependable Software

DEPENDABLE SOFTWARE FOR EMBEDDED SYSTEMS, MONIKA HEINER, BTU - PowerPoint PPT Presentation



  1. VW PRESENTATION, WOLFSBURG (embedded systems, 1 / 23)
  DEPENDABLE SOFTWARE FOR EMBEDDED SYSTEMS
  Monika Heiner, BTU Cottbus, Computer Science Institute, Data Structures & Software Dependability
  monika.heiner(at)informatik.tu-cottbus.de, February 2004

  2. PROLOGUE
  ❑ my new car! MSR, ASR, ABS, EBV, ESP, USC
  ❑ my new software toolkit? BOOP, ASPECT, CORE, ADT, OOP, VDM++, TL, SADT, HOL, JSD, LOTOS, VDM, MASCOT, RBD, DFD, CCS, CSP, OBJ, Z, FTA, SA, CTL/LTL, NVP, RBS, MTBF, MTTF, MTTR

  3. DEPENDABLE SOFTWARE - ALLIGATORS
  ❑ There is no such thing as a complete task description.
  ❑ Sw systems tend to be (very) large and inherently complex systems. -> mastering the complexity? But small-system techniques cannot be scaled up easily.
  ❑ Large systems must be developed by large teams. -> communication / organization overhead. But many programmers tend to be lonely workers.
  ❑ Sw systems are abstract, i.e. have no physical form. -> no constraints by manufacturing processes or by materials governed by physical laws -> SE differs from other engineering disciplines. But human skills in abstract reasoning are limited.
  ❑ Sw does not grow old. -> no natural die-out of over-aged sw -> sw cemetery. But "sw mammoths" keep us busy.

  4. OVERVIEW
  ❑ dependability taxonomy: SOFTWARE DEPENDABILITY, split between development phase and operation phase
  ❑ methods to improve dependability:
    • FAULT AVOIDANCE (development phase)
      - fault prevention
      - fault removal: manually, or computer-aided validation (animation / simulation / testing, context checking (static analysis), consistency checking (verification))
    • FAULT TOLERANCE (operation phase)
      - fault masking: defensive diversity
      - fault recovery

  5. STATE OF THE ART
  ❑ natural fault rate of seasoned programmers: about 1-3 % of produced program lines
  ❑ undecidability of basic questions in sw validation:
    • program termination?
    • equivalence of programs?
    • program verification
    • . . .
    -> cleanroom approach
  ❑ Murphy's law: There is always still another fault.
  ❑ validation = testing
  ❑ testing portion of total sw production effort:
    -> standard system: ≥ 50 %
    -> extreme availability demands: ≈ 80 %
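The quoted fault rate translates directly into absolute numbers. A quick back-of-envelope sketch (the 100,000-line system size is an assumption for illustration, not from the talk):

```python
# Fault rate of 1-3 % of produced program lines (from the slide),
# applied to a hypothetical embedded system of 100,000 lines of code.
loc = 100_000  # assumed system size, for illustration only

for rate in (0.01, 0.03):
    faults = int(loc * rate)
    print(f"at {rate:.0%}: ~{faults} latent faults")
```

Even at the optimistic end of the range, a system of this size ships with on the order of a thousand latent faults, which is what motivates the validation effort figures on the slide.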

  6. LIMITATIONS OF TESTING
  ❑ "Testing means the execution of a program in order to find bugs." [Myers 79] -> A test run is called successful if it discovers unknown bugs, else unsuccessful.
  ❑ testing is an inherently destructive task -> most programmers are unable to test their own programs
  ❑ "Program testing can be used to show the presence of bugs, but never to show their absence!" [Dijkstra 72]
  ❑ exhaustive testing is impossible:
    • all valid inputs -> correctness, . . .
    • all invalid inputs -> robustness, security, reliability, . . .
    • state-preserving software (OS/IS): a (trans-)action depends on its predecessors -> all possible state sequences
  ❑ systematic testing of concurrent programs is much more complicated than that of sequential ones
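To make "exhaustive testing is impossible" concrete, here is a hedged back-of-envelope sketch; the function shape and the test throughput are illustrative assumptions:

```python
# Why testing "all valid inputs" is impossible: a pure function of
# just two 32-bit integers already has 2**64 input combinations.
inputs = 2**32 * 2**32            # every (a, b) pair of 32-bit values

tests_per_second = 10**9          # optimistic assumption: 1e9 runs/s
seconds_per_year = 60 * 60 * 24 * 365

years = inputs / (tests_per_second * seconds_per_year)
print(f"2**64 inputs take ~{years:.0f} years at 1e9 tests/s")  # ~585 years
```

And this ignores invalid inputs and state sequences entirely; for state-preserving software the space grows further still.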

  7. TESTING OF CONCURRENT SOFTWARE
  ❑ state space explosion, worst case: product of the sequential state spaces
  ❑ PROBE EFFECT
    • the system exhibits other (less) behavior in test mode than in standard mode
    -> test means (debugger) affect the timing behavior
    • result: masking of certain types of bugs; adding time to a net (PN -> TPN) restricts behavior, RG(pn) ⊇ RG(tpn), so properties need not carry over:
      DSt(pn) -> not DSt(tpn)
      live(pn) -> not live(tpn)
      not BND(pn) -> BND(tpn)
  ❑ non-deterministic behavior -> pn: time-dependent dynamic conflicts
  ❑ dedicated testing techniques to guarantee reproducibility, e.g. Instant Replay
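The worst-case bound named on the slide (product of the sequential state spaces) and the interleaving blow-up behind it can be sketched in a few lines; the process sizes below are illustrative assumptions:

```python
from math import comb

# Worst case from the slide: the global state space of n independent
# processes is the product of the sequential state spaces.
def global_states(states_per_process: int, n_processes: int) -> int:
    return states_per_process ** n_processes

# A related blow-up: two threads executing m and n atomic steps admit
# C(m+n, m) distinct interleavings a tester would have to cover.
def interleavings(m: int, n: int) -> int:
    return comb(m + n, m)

print(global_states(10, 6))   # 10**6 = 1,000,000 global states
print(interleavings(10, 10))  # 184,756 schedules for only 10+10 steps
```

This is why reproducibility techniques such as Instant Replay record one schedule rather than trying to enumerate them all.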

  8. MODEL-BASED SYSTEM VALIDATION
  ❑ general modelling principle: problem / system -> Petri net model; analysing the model yields model properties, from which conclusions about system properties are drawn
  ❑ modelling = abstraction
  ❑ analysis = exhaustive exploration
  ❑ the (amount of) applicable analysis techniques depends on the model type
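"Analysis = exhaustive exploration" can be sketched as a breadth-first construction of the set of reachable states; the two-bit toy model below is a hypothetical stand-in for a Petri net:

```python
from collections import deque

# Exhaustive exploration: breadth-first construction of all reachable
# states of a toy model (a hypothetical two-bit system with two toggle
# actions and a reset; stand-in for a real Petri net).
def successors(state):
    a, b = state
    return {((a + 1) % 2, b),   # toggle first bit
            (a, (b + 1) % 2),   # toggle second bit
            (0, 0)}             # reset

def reachable(initial):
    seen = {initial}
    frontier = deque([initial])
    while frontier:
        s = frontier.popleft()
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

print(sorted(reachable((0, 0))))  # all 4 states of the toy model
```

Real model checkers follow the same loop; the later slides are about what happens when `seen` no longer fits in memory.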

  9. MODEL-BASED SYSTEM VALIDATION
  ❑ process and tools (DFG project)
  ❑ inputs: safety requirements and functional requirements for the controller and its environment; a PLC modelling library (compiler) produces the control model and the environment model, which are composed into the system model; a dedicated technical language for the requirement spec produces a set of temporal logic formulae
  ❑ verification methods (-> toolkit) check the system model against the temporal formulae
  ❑ error message = inconsistency between system model & requirement spec

  10. MODEL-BASED SYSTEM VALIDATION
  ❑ objective: reuse of certified components
  (figure: REAL PROGRAM vs. DREAM PROGRAM, SAFETY REQUIREMENTS, FUNCTIONAL REQUIREMENTS)

  11. MODEL-BASED SYSTEM VALIDATION
  ❑ model classes, analysis methods, analysis objectives
  ❑ MODEL CLASSES: QUALITATIVE MODELS vs. QUANTITATIVE MODELS
    • NON-STOCHASTIC MODELS: context checking, verification by model checking, worst-case evaluation
    • STOCHASTIC MODELS: performance prediction, reliability prediction

  12. STATE SPACE EXPLOSION, POSSIBLE ANSWERS
  BASE CASE TECHNIQUES
  ❑ compositional methods -> simple module interfaces
  ❑ abstraction by ignoring some state information -> conservative approximation
  ❑ compressed state space representations -> symbolic model checking (OxDD)
  ❑ lazy state space construction -> stubborn sets, sleep sets
  PROOF ENGINEERING
  ❑ alternative state spaces (partial order representations) -> finite prefix of branching process, concurrent automaton
  ALTERNATIVE ANALYSIS METHODS
  ❑ structural analysis -> structural properties, reduction
  ❑ Integer Linear Programming
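"Abstraction by ignoring some state information -> conservative approximation" can be illustrated with the classic sign abstraction; this is a minimal sketch of the idea, not a technique taken from the talk:

```python
# Sign abstraction: track only neg / zero / pos instead of the integer
# value.  The abstract analysis may report spurious behaviours
# (over-approximation) but never misses a concrete one.
NEG, ZERO, POS = "neg", "zero", "pos"

def alpha(x):                      # abstraction function: int -> sign
    return NEG if x < 0 else ZERO if x == 0 else POS

def abs_add(a, b):                 # abstract addition over signs
    table = {
        (POS, POS): {POS}, (NEG, NEG): {NEG}, (ZERO, ZERO): {ZERO},
        (POS, ZERO): {POS}, (ZERO, POS): {POS},
        (NEG, ZERO): {NEG}, (ZERO, NEG): {NEG},
        (POS, NEG): {NEG, ZERO, POS},   # sign unknown: keep all options
        (NEG, POS): {NEG, ZERO, POS},
    }
    return table[(a, b)]

# Soundness check: the abstract result always contains the concrete one.
for x in range(-3, 4):
    for y in range(-3, 4):
        assert alpha(x + y) in abs_add(alpha(x), alpha(y))
print("sign abstraction is sound on the sampled range")
```

The price of the smaller state space is the `{NEG, ZERO, POS}` entry: the abstraction cannot rule anything out for `pos + neg`, which is exactly the "conservative" in conservative approximation.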

  13. CASE STUDY - PRODUCTION CELL
  ❑ components: feed belt (belt 1), elevating rotary table, robot with arm 1 and arm 2, press, deposit belt (belt 2), travelling crane
  ❑ 14 sensors, 34 commands

  14. CASE STUDY - DINING PHILOSOPHERS
  BDD ANALYSIS RESULT, PHIL 1000:
  Number of places / marked places / transitions: 7000 / 2000 / 5000
  Number of states: ca. 1.1 * 10^667
  1137517608656205162806720354362767684058541876947800011092858232169918\\
  1599595881220313326411206909717907134074139603793701320514129462357710\\
  2442895227384242418853247239522943007188808619270527555972033293948691\\
  3344982712874090358789533181711372863591957907236895570937383074225421\\
  4932997350559348711208726085116502627818524644762991281238722816835426\\
  4390437022222227167126998740049615901200930144970216630268925118631696\\
  7921927977564308540767556777224220660450294623534355683154921949034887\\
  4138935108726115227535084646719457353408471086965332494805497753382942\\
  1717811011687720510211541690039211766279956422929032376885414750385275\\
  51248819240105363652551190474777411874
  Time to compute P-invariants: 45885.66 sec
  Number of P-invariants: 3000
  Time to compute compact coding: 385.59 sec
  Number of variables: 4000
  Time: 3285.73 sec (ca. 54.75 min)
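As a hedged sanity check on the reported count (assuming, purely for illustration, that the state count grows like c**n for n = 1000 philosophers, which the slide does not state):

```python
import math

# Back out the per-philosopher growth factor c from the reported count
# of ~1.1 * 10**667 reachable states for 1000 philosophers, under the
# illustrative assumption that the count grows roughly like c**1000.
n = 1000
log10_states = math.log10(1.1) + 667     # log10 of ~1.1e667
c = 10 ** (log10_states / n)
print(f"per-philosopher factor c ~ {c:.2f}")   # about 4.65

# The reported count has floor(log10) + 1 = 668 decimal digits, which
# matches the length of the number printed by the BDD tool.
print(math.floor(log10_states) + 1)            # 668
```

A growth factor under 5 per philosopher still yields a number that no explicit enumeration could ever touch; the BDD stores the state set symbolically, and only the count is ever materialised as a big integer.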

  15. SUMMARY - SOFTWARE VALIDATION
  ❑ validation can only be as good as the requirement specification -> readable <-> unambiguous, complete <-> limited size
  ❑ validation is extremely time and resource consuming -> 'external' quality pressure?
  ❑ sophisticated validation is not manageable without theory & tool support
  ❑ validation needs knowledgeable professionals -> study / job specialization -> profession of "software validator"
  ❑ validation is no substitute for thinking
  ❑ There is no such thing as a fault-free program! -> sufficient dependability for a given user profile

  16. ANOTHER SUMMARY - DOUBTS

